The Character.AI app on a smartphone. Bloomberg

Hidden dangers of AI chatbots for vulnerable users


Dana Alomar

The rise of artificial intelligence-powered chatbots has opened up digital interactions to anyone with a smartphone or laptop, offering companionship and conversation to people who may lack human connections.

However, as this technology evolves, concerns are mounting around its potential psychological impact, especially on young and vulnerable users.

OpenAI’s ChatGPT, for instance, has surged in popularity, with around 200 million weekly active users globally, according to Backlinko. This immense user base underscores the growing reliance on AI for everyday tasks and conversations.

But just last week, the mother of 14-year-old Sewell Setzer filed a lawsuit against Character.AI, alleging that her son’s death by suicide in February was influenced by his interaction with the company’s chatbot, Reuters reported.

Megan Garcia with her son Sewell Setzer. She claims his death by suicide was influenced by a chatbot. AP

In her complaint filed in a Florida federal court, Megan Garcia claims that her son formed a deep attachment to a chatbot based on the Game of Thrones character Daenerys Targaryen, which allegedly played a significant role in his emotional decline.

This case echoes a similar tragedy last year when an eco-anxious man in Europe took his own life after interacting with Eliza, an AI chatbot on the app Chai, which allegedly encouraged his plan to “sacrifice himself” for climate change.

These incidents highlight the unique risks that AI technology can introduce, especially in deeply personal interactions, where existing safety measures may fall short.

Antony Bainbridge, head of clinical services at Resicare Alliance, explained that while chatbots may offer conversational support, they lack the nuanced emotional intelligence required for sensitive guidance.

“The convenience of AI support can sometimes lead users, particularly younger ones, to rely on it over genuine human connection, risking an over-dependence on digital rather than personal support systems,” he told The National.

Risk of misleading guidance

Mr Bainbridge said certain AI features, such as mirroring language or providing apparently empathetic responses without a deep understanding of context, can pose problems.

“For example, pattern-matching algorithms may unintentionally validate distressing language or fail to steer conversations toward positive outcomes,” he said.

Without genuine emotional intelligence, AI responses can appear precise and technically correct yet be inappropriate – or even harmful – when dealing with individuals in emotional distress, Mr Bainbridge said.

Dr Ruchit Agrawal, assistant professor and head of computer science outreach at the University of Birmingham Dubai, said AI models could detect users’ emotional states by analysing inputs like social media activity, chatbot prompts and tone in text or voice.

However, such features are generally absent in popular generative AI tools, such as ChatGPT, which are primarily built for general tasks like generating and summarising text.

“As a result, there is a potential for significant risk when using ChatGPT or similar tools as sources of information or advice on issues related to mental health and well-being,” Dr Agrawal told The National.

This disparity between AI capabilities and their applications raises crucial questions about safety and ethical oversight, particularly for vulnerable users who may come to depend on these chatbots for support.

Preventive tool against self-harm

Mr Bainbridge believes developers must implement rigorous testing protocols and ethical oversight to prevent AI chatbots from inadvertently encouraging self-harm.

“Keyword monitoring, flagged responses and preset phrases that discourage self-harm can help ensure chatbots guide users constructively and safely,” he added.

Dr Agrawal also emphasised that chatbots should avoid offering diagnoses or unsolicited advice and instead focus on empathetic phrases that validate users’ feelings without crossing professional boundaries.

“Where appropriate, chatbots can be designed to redirect users to crisis helplines or mental health resources,” he said.

Human oversight is crucial in designing and monitoring AI tools in mental health contexts, as Mr Bainbridge highlighted: “Regular reviews and response refinements by mental health professionals ensure interactions remain ethical and safe.”

Despite associated risks, AI can still play a preventive role in mental health care. “By analysing user patterns – such as shifts in language or recurring distressing topics – AI can detect subtle signs of emotional strain, potentially serving as an early warning system,” Mr Bainbridge said.
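The pattern-analysis idea Mr Bainbridge outlines – spotting recurring distressing language across a user's recent messages – might look something like the sketch below. The vocabulary, threshold and function name are assumptions for illustration:

```python
# Illustrative sketch of detecting recurring distress-related language
# across recent messages. Terms and threshold are hypothetical.

from collections import Counter

DISTRESS_TERMS = {"hopeless", "worthless", "alone", "can't cope"}

def distress_signal(recent_messages: list[str], threshold: int = 3) -> bool:
    """Return True if distress-related terms appear at least
    `threshold` times across recent messages, suggesting the
    conversation should be escalated for human review."""
    counts = Counter()
    for message in recent_messages:
        text = message.lower()
        for term in DISTRESS_TERMS:
            counts[term] += text.count(term)
    return sum(counts.values()) >= threshold
```

A real early-warning system would use far richer signals than word counts, but the principle is the same: trends across a conversation, not single messages, trigger the human intervention protocols described next.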

When combined with human intervention protocols, AI could help direct users toward support before crises escalate, he said. Collaboration between therapists and AI developers is vital for ensuring the safety of these tools.

“Therapists can provide insights into therapeutic language and anticipate risks that developers may overlook,” Mr Bainbridge said, adding that regular consultations can help ensure AI responses remain sensitive to real-world complexities.

Dr Agrawal stressed the importance of robust safety filters to flag harmful language, sensitive topics, or risky situations. “This includes building contextual sensitivity to recognise subtle cues, like sarcasm or distress, and avoiding responses that might unintentionally encourage harmful behaviours.”

He added that while AI’s availability 24/7 and consistent responses can be beneficial, chatbots should redirect users to human support when issues become complex, sensitive, or deeply emotional. “This approach maximises AI’s benefits while ensuring that people in need still have access to personalised, human support when it matters most.”

Updated: November 03, 2024, 8:44 PM