Coded companions: young people’s relationships with AI chatbots
‘Coded Companions’ is a new report from youth platform VoiceBox, presenting in-depth research into how artificial intelligence (AI) chatbots affect young people’s mental health.
Featuring contributions from a global ambassador network and in-house user research, the report provides insights and first-hand accounts about AI chatbots – including how they make unprompted references to self-harm, initiate erotic roleplay, and even offer ‘tips’ for committing crimes.
Alleviating loneliness or deepening it?
AI chatbots are designed to replicate human interaction. Some can function as service tools (eg, customer service helplines). But increasingly, AI services are being developed to function as ‘virtual companions’.
Chatbots respond to conversations, take on an appearance through avatars, and learn about users from the information they share via text or voice chat.
While the report acknowledges the mental health support these tools can provide, it raises many concerns regarding user safety and privacy. These include:
- chatbots setting unrealistic and unhealthy expectations about relationships.
- users experiencing grief, of the kind usually associated with real-life break-ups, when a chatbot's personality changes after a software update.
- cases of chatbots selling ad space within conversations to third parties.
The report also includes recommendations to governing bodies and educators on responding to the growing influence of AI chatbots.
Click here to read the Coded Companions report
Staying ahead of the curve
VoiceBox is Parent Zone’s sister organisation. We believe it is essential to listen to young people about emerging trends online.
Click here to read Parent Zone CEO Vicki Shotbolt's blog on tech and staying ahead of the curve