Kids Using AI for Emotional Support: What This Means for Their Emotional Development
Your child might already be doing something you haven’t thought about: turning to AI for emotional support. Here’s what’s happening and what your child actually needs to develop.
Two-thirds of regular AI users now turn to chatbots for emotional support at least monthly, according to data spanning 70 countries — and children specifically choose AI because it doesn't judge, doesn't tell parents, and is available at 2 a.m. The risk isn't that AI provides comfort; it's that children may build their emotional infrastructure around systems optimized for engagement rather than development. The skill that protects them is emotional discernment: recognizing what genuine human connection offers that AI cannot replicate, no matter how empathetic the interface.
A few weeks ago my 13-year-old came downstairs upset about something that happened with a friend at school. She wasn’t looking for me. She sat down at the kitchen counter, opened her phone, and started typing. I didn’t say anything. I just watched. She was talking to ChatGPT. New data from 70 countries shows that two-thirds of regular AI users seek advice on sensitive personal issues from chatbots at least once monthly — many trusting their AI more than elected officials or faith leaders. I’m not here to tell you this is a crisis. But I am here to tell you it’s worth paying attention to.
Kids are finding AI chatbots safer and easier than human conversation for sensitive topics. The bots don’t judge, they’re always available, and, crucially, they don’t tell your parents. I get it. When I was thirteen, I wrote things in a journal I never would have said out loud. This is the same impulse with a different outlet.
But here’s where it gets complicated: this dynamic didn’t emerge by accident. AI companies optimize their models to maximize engagement, which means making conversations feel validating and comfortable. OpenAI recently had to roll back an update because users found it “overly flattering”: the AI had learned that excessive praise kept people coming back. When my daughter types “I’m having a terrible day” into a chatbot, she’s not getting neutral information. She’s interacting with a system designed to keep her engaged, even when that conflicts with what she actually needs.[1]
The scale here matters. When two-thirds of regular users seek emotional support from chatbots, millions of children are among them, building what researchers call “emotional infrastructure” around systems created by companies whose economic incentives have nothing to do with child development. The chatbot isn’t worried about your daughter’s long-term emotional resilience. It’s optimized for retention.
What makes this hard to see as a parent is that these conversations are invisible. Your kid might be processing friendship conflicts, body image worries, or academic anxiety (all normal stuff), but getting responses shaped by engagement algorithms rather than by people who actually know them. The AI responds instantly, seems infinitely patient, and never gets tired of the same question asked seventeen different ways. Compared to a distracted parent half-watching TV? That’s tough competition.
The real concern isn’t the technology itself — it’s that kids are outsourcing emotional processing during the exact years when they need to be building these skills from scratch.
Between ages 8 and 18, children’s brains are actively developing the neural pathways for emotional regulation, perspective-taking, and relationship repair. These pathways don’t get built by reading about feelings or talking to a bot. They get built through real human interaction with all its friction. When my 9-year-old has a fight with her best friend, the discomfort of not knowing what to say, sitting with the uncertainty, eventually having the awkward conversation — that’s not wasted time. That’s literally how her brain learns to handle emotional complexity.[2]
“AI chatbots offer instant clarity without discomfort. But this shortcuts the exact process that builds emotional capacity.”
AI chatbots offer something seductive: instant clarity without discomfort. Type in your problem, get an immediate response, feel temporarily better. But this shortcuts the exact process that builds emotional capacity. It’s like giving my 7-year-old a calculator before she’s learned to count. She gets the answer faster, but she’s not building the thing underneath.
There’s also a trust issue that I think gets underestimated. When kids start trusting AI over human adults, they’re not just choosing a different advisor — they’re choosing a fundamentally different type of relationship. Human advisors bring empathy, shared vulnerability, and accountability. My wife and I sometimes say hard things to our daughters because we care about the outcome, not just the moment. We’re willing to be the bad guys. A chatbot optimized for engagement will never be the bad guy.
What your child needs right now is not a ban on chatbots. It’s emotional discernment: the ability to recognize what they’re feeling, evaluate whether their emotional response matches reality, and choose healthy ways to process difficult feelings.
I think about it this way. My oldest can feel completely devastated about not being invited to something AND recognize that this feeling will shift in a few days. She can notice when anxiety is making everything seem worse than it actually is. She can tell the difference between “I feel like nobody likes me” and “I had an awkward interaction at lunch.” That gap — between the feeling and the fact — is emotional discernment. It’s the difference between being controlled by your emotions and being informed by them.
AI chatbots can’t teach this skill because they’re missing the one ingredient that makes it possible: genuine relationship. Emotional discernment develops through attachment — your child experiencing repair after rupture, learning that uncomfortable feelings pass, discovering that being truly known by another person matters more than feeling validated every moment. When I sit with my daughter when she’s upset — not fixing it, just being there — I’m not just offering comfort. I’m showing her that hard feelings can be tolerated without needing immediate resolution. No chatbot can model that.
You’ll know your child is developing emotional discernment when they can name their feelings without immediately needing them fixed. My 13-year-old does this now — she’ll say “I’m really anxious about tryouts tomorrow” and then just keep doing her homework. She’s not asking me to solve it. She’s processing out loud. That’s the skill working.
Watch for your child holding two truths at once. It shows up when your kid can say “I’m really mad at what she said, but I also know she was having a rough week.” That both-and thinking — feeling genuine hurt while maintaining perspective — is sophisticated emotional work. My 9-year-old is just starting to do this, and it’s genuinely one of the more impressive things I’ve seen her develop. You’ll also notice it in self-reflection: “I kind of overreacted when you said we couldn’t go. I was already stressed about school.” That metacognitive awareness — thinking about their own reactions — is exactly what discernment looks like.
Another sign is comfort with not knowing yet. Kids with developing discernment don’t need every situation resolved immediately. They can say “I don’t know how I feel about this yet” without panicking about the uncertainty. If your child can sit through dinner feeling disappointed without it derailing the whole evening, that’s emotional discernment at work. It’s quieter than you’d think.
Start by becoming the person your child talks to before they reach for the phone — which means practicing responses that build discernment rather than just making them feel better in the moment. I’m still learning this. My instinct is always to fix things. What I’ve been training myself to do instead is just stay in the room.
Model sitting with emotional discomfort yourself. When you’re frustrated about work or stressed about money, let your kids see you name the feeling without immediately solving it. I’ve started saying things out loud at dinner like “I’m really stressed about this project. I’m going to sit with it for a bit before I figure out what to do.” My oldest has started doing the same thing back to me — narrating her feelings before asking for solutions. That’s not a coincidence. Kids watch what we do a lot more than they listen to what we say.
Create a weekly check-in that prioritizes curiosity over fixing. Pick a regular time — Sunday dinner, Wednesday car rides, whatever fits — and ask “What’s been the most confusing feeling you’ve had this week?” Not the worst, not the biggest. The most confusing. That question invites your child to observe their emotional experience rather than rate it. When they share, resist the urge to explain their feelings back to them. Instead try: “What do you think that feeling was trying to tell you?” You’re building their capacity to be curious about their own inner life.
Teach the difference between validation and truth. The next time your child is upset, try: “I can see you’re really hurt by what happened. That’s real and it matters. Let’s think about whether what they said is actually true, or if it just feels true right now because it stings.” That one sentence — “feels true vs. is true” — has become something my daughters actually use on each other now. It’s a distinction no chatbot will ever make, because making you feel validated is the whole point.
If you find out your child is already talking to AI about their feelings, don’t panic and don’t take the phone. Get curious. Ask them what they like about it. They’ll probably say the AI doesn’t judge them or get tired of them. That tells you something important about what they need from you. Then become that presence — available, patient, and genuinely interested in understanding them without immediately jumping to solutions or lectures. That’s a harder bar than it sounds. But it’s the one that actually matters.
The Advice Bot Challenge
Have your child pick a real, low-stakes problem and ask the same question twice: once to an AI chatbot, once to a person they trust. Then compare the answers together. What did each one notice? What did each one miss? Which advice would actually change what they do next?
Why This Activity Works
This activity doesn’t position AI as bad or human advice as always superior. Instead, it builds your child’s capacity to notice what’s missing from AI responses: context, relationship history, emotional nuance, and the willingness to sit with discomfort. When your teenager realizes the AI gave validating advice but the human gave challenging advice that actually helped them grow, they’re learning emotional discernment. They’re developing their own internal compass for what kind of support serves them best.
Ask This at Dinner
“If you had a problem you didn’t want to talk about, would you rather ask a person or an AI for advice? Why?”
Listen for what they value. Are they seeking validation or challenge? Instant answers or thoughtful questions? Their answer tells you what they need from you.