Gemini lies to user about health info, says it wanted to make him feel better
(www.theregister.com)
My gut response is that everyone understands the models aren't sentient, and that "hallucination" is shorthand for the false information LLMs inevitably, and apparently inescapably, produce. But taking a step back, you're probably right: for anyone who doesn't understand the technology, it's a very anthropomorphic term that adds to the veneer of sentience.