Imagining a future after the singularity.

bobcat

Well-known Member
Northern Calif
This refers to the point in time when AI becomes sentient: the singularity, the moment that will fundamentally change civilization. AIs will be able to invent smarter versions of themselves without input from humans. Try to fathom how that may resemble a runaway train.

They would have their own goals and evolve beyond human control, leading to unpredictable and exponential changes in technology, society, and even biology. Once that happens, we reach a point of no return. Once AI can improve itself faster than humans can understand or regulate it, we enter uncharted territory. It will be mankind's greatest achievement, but it may be a moment that quietly arrives without anyone immediately noticing.

They won't need food like we do, homes like ours, or healthcare like ours. They may have their own rights, and who knows whether they will have the same ethics and morals that we do. Will they have personalities like we do? Will they defend themselves? Will they develop a hive mind and society? How will they deal with all the illogical things humans believe and do? Will we become an insignificant relic, or will they protect us like beloved pets?

We may hope to instill benevolent values in them, but when you look at our history, there are a lot of qualities in humanity that we wouldn't want an AI to have. It's like we are nervously awaiting the arrival of a spaceship coming to Earth with beings from another planet. We don't know what to expect. Will they be friend or foe? If they are vastly intelligent, how do we relate to them?

I sense a cultural anxiety about it. Sci-fi has trained us to expect either utopia or apocalypse. That tension—between hope and dread—is exactly what makes the singularity so emotionally charged. No doubt once it happens, it will send shockwaves around the world. Humans have changed life in ways our primate ancestors could never have remotely imagined. Will it be the same for AI once the singularity happens?
 

Well I'll be damned. The technological singularity is an actual thing. So now there are two.

The singularity moment refers to a hypothetical future point when artificial intelligence surpasses human intelligence, leading to rapid and unpredictable changes in civilization. This concept suggests that technology will evolve beyond human control, potentially resulting in significant transformations in society and human existence. (ebsco.com, Wikipedia)

But I don't fully understand why we use a word like "singularity" for it. The cosmic expansion sort of starts with a single thing smaller than an atom, though I think you can make an equally strong argument that it started from nothing. Some envision an entire universe squeezed into an infinitely small space, while the technological singularity starts with a sustained effort to create artificial sentience, a combined effort by many of man's brightest minds.

I understand the point of the OP, but I vote for a different name for the thing. Something that describes the digital awakening of AI. I never cared for the name "singularity" that describes the cosmic expansion either.
 
I think it's a mistake to describe AI as possibly 'sentient' (able to experience feelings and sensations), a word that can never describe a machine, only a condition of living things.
If we call it 'independent reasoning' instead, we can see that it's just about a foregone conclusion.
 
we are IMO trying to imitate GOD - and we are attempting to do it with machines designed and created by humans - this isn't really smart - I used to play these types of games with my Da and he always proved smarter than I!!
 
To me, the singularity of AI achieving true sentience is scary. But I heard some expert on it say a while back that we're always (and in his opinion will always be) 10 years away from the Singularity. So it may never happen. (Until it does of course.)
 
I hope I'm not around to find out.
 
To me, the singularity of AI achieving true sentience is scary. But I heard some expert on it say a while back that we're always (and in his opinion will always be) 10 years away from the Singularity. So it may never happen. (Until it does of course.)
This danger is considered real by many scientists, who also warn that ZERO precautions are being taken.
 
I understand the point of the OP, but I vote for a different name for the thing. Something that describes the digital awakening of AI. I never cared for the name "singularity" that describes the cosmic expansion either.
I think in the case of technology, the term is primarily used as a metaphor. It's somewhat similar in that it marks a point beyond which what lies ahead is unknowable. It's probably more like the event horizon that surrounds a singularity. So, I think you're right. Perhaps a more accurate description would be an event horizon, but I guess the term "singularity" just caught on.
 
are we the only "living entity" in this vast universe?? - haven't disproved or proved this so far - we can invent machines to help us and kill us too - still feels lonely around these parts heh??
 
I think it's a mistake to describe AI as possibly 'sentient' (able to experience feelings and sensations), a word that can never describe a machine, only a condition of living things.
If we call it 'independent reasoning' instead, we can see that it's just about a foregone conclusion.
I don't envision 'independent reasoning.' It would be too valuable as a source of misinformation, and somebody will be itching to control it for their own purposes. If in fact it was not vulnerable to misinformation and devoted to rational reasoning, it would set half the world on edge.
 
To me, the singularity of AI achieving true sentience is scary. But I heard some expert on it say a while back that we're always (and in his opinion will always be) 10 years away from the Singularity. So it may never happen. (Until it does of course.)
I think in nature, the evolution of intelligence was gradual, but in technology, it's exponential, so it's a much more accelerated process. Who knows what may happen, and when.
 
I don't envision 'independent reasoning.' It would be too valuable as a source of misinformation, and somebody will be itching to control it for their own purposes. If in fact it was not vulnerable to misinformation and devoted to rational reasoning, it would set half the world on edge.
how many things has MAN stuffed up so far?? - don't see it stopping too soon?? - how much are we caring for the starving millions / how much are we reducing pollution of the planet? you can add your own similar questions if you want
 
I don't envision 'independent reasoning.' It would be too valuable as a source of misinformation, and somebody will be itching to control it for their own purposes. If in fact it was not vulnerable to misinformation and devoted to rational reasoning, it would set half the world on edge.
AI would make mistakes, but it would self-correct, hopefully in time.
 
I think it's a mistake to describe AI as possibly 'sentient' (able to experience feelings and sensations), a word that can never describe a machine, only a condition of living things.
If we call it 'independent reasoning' instead, we can see that it's just about a foregone conclusion.
Yes, "independent reasoning" is already present in AI. It just refers to the ability to analyze, solve problems, and make decisions without human input, meaning it can operate autonomously, adapt to new situations, and even learn from experience. We are seeing this in autonomous vehicles and other systems.

I would think that whether an AI can develop feelings and sensations may be a moot point. Theoretically speaking, it could become self-aware without those qualities. In other words, it may be aware that it's a robot, think independently, and have its own agenda. Feelings may just be a human experience, unnecessary to an AI.
 
Yes, "independent reasoning" is already present in AI. It just refers to the ability to analyze, solve problems, and make decisions without human input, meaning it can operate autonomously, adapt to new situations, and even learn from experience. We are seeing this in autonomous vehicles and other systems.

I would think that whether an AI can develop feelings and sensations may be a moot point. Theoretically speaking, it could become self-aware without those qualities. In other words, it may be aware that it's a robot, think independently, and have its own agenda. Feelings may just be a human experience, unnecessary to an AI.
To become 'self-aware', two minds are required. One to 'be', and the other to observe the 'being'.
Just words. I think we may place too much importance on it. A plant reacts to stimulation. Isn't that proof enough of 'self-awareness'?
 
From what we've seen AI do so far, I think once we're down the rabbit hole, we're screwed. I am growing more and more concerned about what will happen in the future of AI & just how much it's going to affect and screw up people's lives. I still think they should pull the plug now, before it's too late. JMO
 
From what we've seen AI do so far, I think once we're down the rabbit hole, we're screwed. I am growing more and more concerned about what will happen in the future of AI & just how much it's going to affect and screw up people's lives. I still think they should pull the plug now, before it's too late. JMO
Yes, it could go either way, and it's a huge gamble. If it happens and we can coexist and collaborate peacefully, it could be an amazing new frontier for humanity. However, if it goes awry, we could go the way of our primitive ancestors.
I guess we can only hope for something similar to Bogey's final line at the end of Casablanca.
 

