cnbctv18 · 22h
‘Please Die’: Student gets abusive reply from Google's AI chatbot Gemini
Google's chatbots have previously come under fire for providing potentially dangerous answers to user inquiries. Reporters discovered in July that Google AI provided inaccurate, potentially fatal answers to a number of health-related questions…
moneycontrol.com · 1d
Google's AI chatbot Gemini verbally abuses student, calls her 'stain on universe': 'Please die'
The alarming exchange unfolded when Reddy asked the chatbot for help with a school assignment exploring challenges faced by older adults. Instead of providing constructive assistance, the chatbot issued a series of disturbing and hostile statements…