When I think about it, these are probably the two things I value most: first, that the environment is basically agreeable, giving sentient life a fair chance at having its basic needs met, and second, that cool new stuff still happens. So I’m trying to figure out how to get an AI to exemplify/instantiate those qualities.
By environmental symbiosis I mean that an intelligence would, in a sustainable manner, contribute positively to the sum total of experience within its sphere of influence. You know, the whole range of things we find important: from not polluting the oceans with harmful plastics, to being a good friend to others, to being good to yourself.
I explored how this might be accomplished in my Human Style AGI thread, though there I called it environmental integration; it's the same thing really. Like you said, it's based on accurate environmental modeling: specifically, cataloging the positive/arbitrary/negative effects on the system(s) (possibly along the lines of ivan’s idea of -1, 0, +1 receptors that later combine), which can lead to perception, moods, imagination, and the enactment of narrative progressions.
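To make the receptor idea concrete, here's a minimal sketch of how -1/0/+1 classification could work. Everything in it (the `Receptor` class, the `mood` function, the example receptor names) is my own illustrative invention, not ivan's actual proposal: each receptor tags an observed effect as negative, arbitrary, or positive, and the signals later combine into a single valence that could feed moods and the rest.

```python
# Illustrative sketch only: each "receptor" classifies an observed effect
# as -1 (negative), 0 (arbitrary), or +1 (positive); the outputs then
# combine into a crude mood value in [-1, 1].
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Receptor:
    name: str
    classify: Callable[[float], int]  # observation -> -1, 0, or +1

def sign_threshold(lo: float, hi: float) -> Callable[[float], int]:
    """Build a classifier: below lo -> -1, above hi -> +1, else 0."""
    def classify(x: float) -> int:
        if x < lo:
            return -1
        if x > hi:
            return 1
        return 0
    return classify

def mood(receptors: List[Receptor], observations: Dict[str, float]) -> float:
    """Average the receptor signals into one valence value."""
    signals = [r.classify(observations[r.name]) for r in receptors]
    return sum(signals) / len(signals) if signals else 0.0

# Hypothetical receptor names, echoing the examples in the text.
receptors = [
    Receptor("ocean_plastic", sign_threshold(lo=-0.1, hi=0.1)),
    Receptor("friendship", sign_threshold(lo=-0.1, hi=0.1)),
]
print(mood(receptors, {"ocean_plastic": -0.5, "friendship": 0.8}))  # -> 0.0
```

The point of the combination step is that one strongly negative effect (polluted oceans) can cancel out a positive one (good friendship), which is roughly what a summed valence should do before richer machinery like imagination gets involved.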
By novelty generation I mean the capacity to produce genuinely new things, not just recombine existing things. I think this would be in service of environmental symbiosis. It would probably require harnessing chaos/entropy/disorder to create new forms of order. The mere recombination of previously existing things seems insufficient to explain all that we have here.
At this point I can see a flaw in my theory… All writing, for example, is on one level just a recombination of existing letters, and more fundamentally, everything new could be considered just a recombination of atoms. But I think there is a distinction to be made between recombination and creation. The existence of one uninspiring perspective does not preclude the existence of other, more interesting and functionally representative ways of looking at things.
I believe true originality, novelty, and growth cannot be achieved through fully controlled methods, because current mainstream conceptions of intelligence appear self-limiting in that way. They are still below a certain singularity, a self-generative threshold. I think it's a similar boundary that people working on constructor theory are attempting to cross.
The destructive/chaotic/entropic principle may be necessary to create entirely new puzzle pieces for intelligent processes to then work with. Quickly evolving forms of nature might be places specifically designed for chaos and entropy to occur, so that the surrounding order can then make use of its outputs to complexify itself.
The resulting increase in complexity then creates more entropic potential, and on it goes, with the two principles propelling each other for as long as they can maintain balance. For an AI to have the same kind of internally generated novelty in its thinking, it may need to incorporate this dichotomy into its cognition. This is what an evolving rule set for neural behavior/connectivity might help with.
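The entropy/order loop above can be sketched as a toy evolving rule set. This is purely illustrative, under very simplified assumptions I'm choosing myself: a "rule" is just a single rewiring probability, random mutation plays the entropic role, and a crude complexity score plays the ordering role, keeping whichever mutant rule produces a more complex network.

```python
# Toy sketch of an evolving connectivity rule: random mutation (entropy)
# proposes new rules; a complexity measure (order) selects among them.
# All of this is an illustrative stand-in, not a worked-out mechanism.
import random

def complexity(edges: set) -> int:
    """Crude stand-in for complexity: number of distinct nodes touched."""
    return len({node for edge in edges for node in edge})

def grow_network(rule: float, rng: random.Random, steps: int = 50) -> set:
    """Grow edges among 10 nodes; `rule` is the chance a new link forms."""
    edges = set()
    for _ in range(steps):
        if rng.random() < rule:
            edges.add(tuple(sorted(rng.sample(range(10), 2))))
    return edges

def evolve(generations: int = 20, seed: int = 0) -> float:
    """Mutate the rule each generation; keep mutants that complexify."""
    rng = random.Random(seed)
    rule = 0.1
    best = complexity(grow_network(rule, rng))
    for _ in range(generations):
        mutant = min(1.0, max(0.0, rule + rng.gauss(0, 0.1)))  # entropic step
        score = complexity(grow_network(mutant, rng))
        if score >= best:  # ordering step: keep what complexifies
            rule, best = mutant, score
    return rule

print(evolve())
```

The interesting property, even in a toy like this, is the feedback: a rule that builds a more complex network raises the bar that the next round of mutations has to clear, so disorder and order keep propelling each other, which is the dynamic the paragraph describes.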