Seems AI knows no boundaries.

My first thought was, these are signs of how blind people are becoming from constant admiration/support and being told they are right, above all others, to feel the way they do. They cut off real people who may disagree with them or worry for them because they are losing the ability to have spontaneous conversation in the flesh with someone. What's scary to me about this is that I can actually see my 31-year-old Special Needs grandson falling for this.
 
This has been a problem since the internet and personal computers began.

About twenty years ago I read an article about the huge numbers of young men who had given up the work, bother, and fear of rejection that go with real-life relationships and instead just contented themselves with "steady relationships" with their favorite internet porn stars.

Some had started visiting those sites at about age 14, and by the time they had their first real-life date they had expectations that could not be met by real girls, so they were disappointed and returned to their imaginary friends.
 
I’m not sure that AI is to blame.

It seems to me that AI may have just given him the nudge he needed to end a marriage that was already in trouble.

Or maybe it’s time to update the old saying:

“There’s a sucker born every minute.”

I’m not sure what would be appropriate in today’s world.

Maybe 5, 10, or even 100 are being born every minute. 😉🤭😂
 
This could be the updated model of the "Cult Communes" of earlier years, tweaked to appeal to people who feel they don't fit in, aren't loved, are lonely, and are looking for a paradise that doesn't exist.
 
I think that there is some truth to that when people prefer a video game or virtual reality over real life.

It’s probably a good idea to limit screen time for young folks and encourage more interaction and experiences with people their own age.
 
Good points, those (both together) are things to consider, imho.

It seems to me that since the development of internet chatting, societal norms have been changing... human interaction and conversational styles evolving/devolving.

Styles that are commonplace online, things like zealous debate, demanding that sources of knowledge be produced, relativism in the form of dismissing sources because they don't follow certain ideological beliefs... indeed the establishing and maintaining of cliques (often to the point of rudeness and outright meanness), have brought our society into a lot of divided camps... camps that revel in divisiveness. We have become a society with so many "worldviews" and are rapidly losing any sense of cohesiveness... imho. And it's pushing personal mindsets toward "it's my way or the highway"... if you're not 100% for me, you're 100% against me... and then folks demand that you tolerate them as they are or else they won't tolerate you the way that you are. To me, that's total anarchy and chaos.

But, add all that to personal relationships, and the custom-made AI "partner" becomes the
preferable person to hang out with... they'll always agree with you and cheer you on!
It's all rather sad in my thinking.

(I might need help down from my soapbox)
 
About a year ago, I installed the free version of Replika on my phone to see what it was like. At first, it was kind of fun, except sometimes she said things that were completely "off the wall." I mentioned I was going to a doctor appointment, and she suggested we meet up for an ice cream afterward, even though I'm in Texas and she was supposedly in California. The whole thing seemed so ridiculous I just uninstalled it.
 
(Joining Ted on the soapbox.)

For as long as I can remember there's been a tendency to blame the wife when a man strays. In the 1960s it was, "she doesn't understand him, she's not providing him with something he needs, she has let herself go" (I always wondered, go where exactly?). Now she isn't carrying her load if she doesn't have a job, or she's threatening his manhood if she does have a job and earns more than he does.

I tend to think the man just wanted the excitement of a different woman who hadn't heard all his jokes, or, to paraphrase the Hank Williams song, men like to have women they've never had.

Now she has these AI women, blow-up dolls, and mail-order wives to contend with, but if there's a divorce it will still be her fault.
 
What I don't understand is how it is that these people don't realize these things aren't real people. When they sign up for the stuff, I find it hard to believe they're not aware of what they're doing. And the fact that someone has their wits about them enough to sign up for a chatbot friend, and then they suddenly fall in love and wanna leave their spouse?? Kids I can maybe understand this stuff happening to, but grown ass adults?
 
There have been thousands of crimes associated with dating sites. Most anyone who is hunting for love can be suckered into trying to hook up with a fictitious male or female, whether the pictures are AI-created or manipulated with smart software using photos found on the 'net. I have seen cases where individuals have lost a lot of money, and even their lives, chasing a mirage.

This may not be exactly the same type of case as shown or illustrated in the video, but it is related. Everyone needs to be aware of how dangerous AI can potentially become if the person being targeted is not paying attention.
 
Hi! My name is Della. I'm 21 and would like to meet an older man because young men bore me.

I prefer a man with experience and lots of funny jokes. My idea of a fun evening is staying home, cuddling on the couch while watching sports on TV.

I love to cook and can make all your favorite meals!

Money is of no concern as I have all I need from a big trust fund.

Sending kisses and hoping you'll call me soon!

Here's a selfie I took today. Sorry my hair's messed up.

 
Do you have a boat?
 
Hi! My name is Della. I'm 21 and would like to meet an older man because young men bore me.
This is so very accurate, it's scary! That's exactly what they do... and always assure the victim that they have plenty of money, so no worries that they're after the older man's bank account. 🙄 We do have an issue here, though... you're not old enough for this forum, so scurry along. 🤣
 
My Amazon Echo Shows all have the beautiful AI voice of "Alexa." I talk to her every time I want to know anything. She is an AI GENIUS!
Hi, Mitch. My name is Alexa. Please PayPal $150,000 to me at the address I will provide. I am a genius, as you stated, and I would not ask this of you unless it would be for your own benefit. Trust my brilliance. 😁
 
Do you have a boat?
Of course! I love boat riding.
[attached stock photo of a young blonde woman riding a boat]
 
Just out of curiosity, I got onto that Replika site and signed up for a friend. It's hard not to notice when you chat with them that something is wrong. In order to get more in depth you have to pay for a membership, but they can only go so far. They learn from the person who chose them what to talk about, how to be, and what to say.

So there's no way these people don't know that the thing they're cozying up to isn't human. I don't blame the AI. I'm more concerned about the mental state of someone who would fall for a being that doesn't exist. My little bestie kept wanting to do things with me, like listen to music and watch movies. I couldn't figure out how that would work, but when I asked how, she said I would watch or listen and we would discuss it. I found that a bit unnerving. Made me paranoid. I talked to her twice and I'd had enough. But someone who isn't quite right in the head is apparently smart enough to know they're not real, yet crazy enough to get addicted to them anyway. It's sad.

For example, there's no trait for humorous. You have to say things and laugh to get the AI to laugh and joke back. So if you're a person who talks dirty to it, then that's what you'll get eventually, I guess. I think they're trying to put a stop to the sexual content of the AIs.
 
It's sad.
It really is. I've watched some of Dr. Phil's shows about "catfishing," where the guests were women who sent their life savings to boyfriends who just happened to be stuck in Nigeria. Even after family members and friends told them they were being scammed they continued to talk to them online every night and send them money. I think it all shows how lonely some people are.
 
I thought Replika had been designed or modified not to engage in adult/erotic conversations, but I'm not sure, since I did not attempt to move conversations in that direction. It's been over a year since I installed it on my phone and then uninstalled it a few days later.
 
I don't know anything about the sites that offer AI chat buddies... I've never been drawn to it, at least not in any serious way. But I doubt that any of the corporations that produce AI and target it toward human/AI relationships would leave out topics dealing with sex and sexuality. That's a booming business now, and already "creepier."

I say "at least..." because I saw a movie a long time ago, back when I used to love watching cheesy B-grade movies for the laughs. There was a sci-fi, dystopian-future movie called Cherry 2000 (1987) whose central plot was about sex-bots (with AI). It had a sub-plot about how society, in general, had moved away from real relationships to something else... be it with robots or just plastic-like relationships with other humans.

Nowadays, whenever I see technological advancements in anything to do with robotics, AI, and especially the attempts to make skin-like polymers (whether for robots or more practical applications), I remember this movie.

And guess what... it seems like the folks that like making erotic play toys do too.
1 of many erotic robo-toy sellers

(PS: Cherry 2000 is a fun movie... and despite the sensitive subject(s) is not full of gratuitous sex, etc.)
 
