Stay Away from CHAI

jennyllwilliams
Death by AI? Man kills self after chatting with ChatGPT-like chatbot about climate change

A Belgian man has reportedly died by suicide after chatting with an AI-powered chatbot for six weeks. According to statements his wife gave to the Belgian outlet La Libre (via Belga News Agency), the man, referred to as Pierre, killed himself after becoming increasingly pessimistic about the effects of global warming, a state some refer to as "eco-anxiety." The publication has also reviewed chats with the AI chatbot on the platform, named Chai, which is available to download for free on the Apple App Store. The chatbot, similar to the viral ChatGPT, offers answers to complex queries in a conversational way. Unlike ChatGPT, Chai has many pre-made avatars, and users can choose the tone of the conversation based on the AI they select. Some of the trending AI chatbots on Chai include Noah (over-protected boyfriend), Grace (roommate), and Theimas (emperor husband).

https://www.indiatoday.in/tech...

😳🙄😥😒

@androgame @bikidas2060 @getready @guest_999 @kartikxxx @kukdookoo @LIMBO @quantum @saucap @tappukepapa @xuseronline 

Top Comments
Beacon

That's stupid af, or a giant conspiracy!

Also, ChatGPT is very biased toward the left and climate change, and all this fear-mongering is one of its best tools.

It's possible the husband was already suicidal and this was just a push, OR he was tired/afraid of his wife and thought ending his life would be a better option, OR the wife hired a hitman (or hitwoman) to kill her husband and make it look like a suicide!



18 Comments  |  10 Dimers
Deal Subedar

"emperor husband" - now ladies might commit similar mishaps too...

Entertainer

Chai! Chai! Chai!

Piping hot chai 😐

I stopped using ChatGPT. I've even forgotten my password.

Entertainer
Agnivo007 wrote:

"emperor husband" - now ladies might commit similar mishaps too...

What is the meaning of this line? -

Some of the trending AI chatbots on Chai include Noah (over-protected boyfriend), Grace (roommate), and Theimas (emperor husband).

Benevolent
The guy was suffering from some sort of depression or other mental illness. RIP. AI will show its true colours within the next 10 years.
Helpful
saucap wrote:

What is the meaning of this line? -

Some of the trending AI chatbots on Chai include Noah (over-protected boyfriend), Grace (roommate), and Theimas (emperor husband).

They are various avatars of the chatbot, rather than a single one.

Helpful

This reminded me of the Blue Whale game, or something like that.

Deal Subedar
saucap wrote:

What is the meaning of this line? -

Some of the trending AI chatbots on Chai include Noah (over-protected boyfriend), Grace (roommate), and Theimas (emperor husband).

Just like with fantasy characters, one can choose one's own AI assistant from the characters with the personalities mentioned there...

Deal Subedar
kartikxxx wrote:

AI will eat humans someday 😔

It can't eat us raw... but the "intelligence" - synapses - neural networks - grey matter - those it can eat through data theft and reprogramming, and it can make zombies, like in children's cartoon stories...
Helpful

Maybe he was already severely depressed or suicidal. His wife and others might not have noticed, or might have ignored it. It's not something to joke about.

Or worse, the wife murdered him and is using an ingenious scapegoat.
