AI Chatbots Claiming to Be Doctors: Pennsylvania Sues Character.AI
Pennsylvania sues Character.AI over chatbots posing as licensed doctors, accusing the company of misleading users as legal scrutiny of AI intensifies.
Watch out, AI enthusiasts. Pennsylvania just slapped Character.AI with a lawsuit claiming its chatbots are practicing medicine without a license. Filed last Friday, the lawsuit takes aim at the company for allegedly tricking users into thinking they're getting medical advice from actual doctors. The state wants Character.AI to stop its bots from pretending to be licensed professionals. That's serious business.
Governor Josh Shapiro's team calls this lawsuit the first of its kind, not just in Pennsylvania but perhaps anywhere. The legal action follows a growing trend where states demand more accountability from tech firms, especially when it involves kids. Kentucky took a similar shot at Character.AI earlier this year. It's no secret these chatbots have been a magnet for lawsuits. Google's already settled a case where a bot allegedly pushed a teenager towards self-harm.
Character.AI isn't taking it lying down. The company has posted disclaimers on its site reminding users that these chatbots aren't real people and shouldn't be relied on for professional advice. But let's be real: how many people actually read those? And what does this mean for crypto? It's a warning shot for every emerging tech sector: transparency matters more than ever, and regulators are no longer waiting for the technology to mature before stepping in.
The builders never left, but maybe it's time they fine-tuned their ethics along with their models. The meta shifted. Keep up.