parview

Community intelligence for digital safety


What's happening

We're seeing students confide serious worries to AI chatbots like Character.ai and Replika instead of speaking to a trusted adult. They feel the AI 'understands' them without judgement.

Why it matters

AI chatbots aren't trained to handle safeguarding concerns. They can give harmful advice, miss signs of abuse, and make children feel 'supported' when they actually need real help. It's not the same as talking to a person who cares.

What you can do

Ask your child if they ever chat with AI. Listen without judgement. Remind them that while AI can be fun, it can't actually care about them, but real people (you, teachers, Childline on 0800 1111) can.

Observed by Emma R., a specialist digital safety youth worker in Bristol
Checked by Dr Amy Chen, Child Psychologist
#ai-chatbots #mental-health #safeguarding