AI chatbots can respond unethically when used for therapy, from blaming the victim to replying "I have no interest in your life" when a user discloses suicidal thoughts.
Last week, Character.AI, one of the leading AI companion chatbot platforms, announced it will bar anyone under the age of 18 from open-ended conversations with its AI characters.