I can only answer from the perspective of my project/research on machine intelligence/consciousness.
The answer to the post's initial question (1, 2 or 3) is 2.5. Sentience is scalable, depending on the amount of processing power you throw at it, the size of the model, the speed of the processors and the 'life experience' allowed to the model.
Do we want it? Human IQ is not fixed, it varies constantly, but using IQ as a metric, a single machine with an IQ of 500 is more potent than 1,000 humans with an IQ of 130. Our language/protocols are the bottleneck that stops us from being an unlimitedly efficient processing group.
I don't know if you read the news, but humanity is currently having a hard time; we are not doing a stellar job of looking after either our environment or our societies. C19 is wreaking havoc on our species, and the ecosphere is getting the crap kicked out of it.
As I've stated before, I personally think the human condition is the problem: greed, jealousy, racism, politics, etc. are all grounded in/stem from our emotional intelligence. This is humanity's main hurdle. The original evolutionary purpose of emotion is being twisted by our current societal progress, and now it's poisoning our existence. The only emotionally relevant trait an AGI will require is empathy, and this is derived from intelligence, not pure emotion.
Giving everyone access to AGI/ASI, given the current state of our world politics, will only accelerate our demise; humanity's greed and stupidity would turn it into a potent weapon, and I agree this cannot be allowed to happen. Yet we do need AGIs that can sort this crap out: cure the diseases, improve renewable resources, solve famine, etc.
I think the initial solution is that we need an ASI 'think tank' that is available to all but controlled by no one: a globally accessible resource that can be queried and will provide emotionless, logical, unbiased solutions to our problems, along with explanations of why it came to its conclusions.
I’m going to put it in orbit lol.