Hot Deal

Stay Away from CHAI

347°
Helpful
jennyllwilliams
Death by AI? Man kills self after chatting with ChatGPT-like chatbot about climate change

A Belgian man has reportedly died by suicide after chatting with an AI-powered chatbot for six weeks. According to statements his wife made to the Belgian outlet La Libre (via Belga News Agency), the man, referred to as Pierre, killed himself after becoming increasingly pessimistic about the effects of global warming, a state some refer to as "eco-anxiety." The publication also reviewed chats with the AI chatbot on the platform, named Chai, which is available as a free download on the Apple App Store. The chatbot, similar to the viral ChatGPT, offers answers to complex queries in a conversational way. Unlike ChatGPT, Chai has many pre-made avatars, and users can choose the tone of the conversation based on the AI they select. Some of the trending AI chatbots on Chai include Noah (over-protected boyfriend), Grace (roommate), and Theimas (emperor husband).

https://www.indiatoday.in/tech...

😳🙄😥😒

@androgame @bikidas2060 @getready @guest_999 @kartikxxx @kukdookoo @LIMBO @quantum @saucap @tappukepapa @xuseronline 

Expired
18 Comments  |  10 Dimers
Finance Mentor

"emperor husband" - now ladies might meet with similar mishaps too...

Helpful

Over protected Boyfriend

Entertainer

Chai! Chai! Chai!

Piping hot chai 😐

I stopped using ChatGPT. I've even forgotten the password.

Benevolent
The guy was suffering from depression or some other mental issue. RIP. AI will show its true colours within the next 10 years.
Wingman

That's stupid af or a giant conspiracy!

Also, ChatGPT is very biased toward the left and climate change, and all this fear-mongering is one of their best tools.

It's possible the husband was already suicidal and this was just a push, OR he was tired/afraid of his wife and thought ending his life would be a better option, OR the wife hired a hitman (or hitwoman) to kill her husband and make it look like a suicide!



Pro DealBaba

AI will eat humans someday 😔

Finance Mentor
It can't eat us raw... but through data theft and reprogramming it can eat our "intelligence" - synapses, neural networks, grey matter - and turn us into zombies, like in children's cartoon stories...
Generous

the one who got scared...

Finance Mentor

...drank it up... and chatted with the AI while swinging away...

Helpful

This reminded me of that game Blue Shark, or something like that.

Helpful

Blue Whale - The Suicide Game

Deal Cadet

leave chai alone, bro

Generous

Maybe he was already severely depressed or suicidal. His wife and others might not have noticed, or might have ignored it. It's not something to joke about.

Or worse, the wife murdered him and is using an ingenious scapegoat.
