
2 Ways Your Child Might Engage Socially With AI – Part 1

If your child is like many children today, screens factor into their free time more than you’d like. And if you’ve dabbled with ChatGPT, Gemini, or any other generative AI chatbot, you know how human-like their text and voice interactions sound. Because of this, chatbots can be an alluring and fun way for kids to pass the time. To them, it can feel like texting a friend, and for many kids, that may simply become how they use AI.

But chatbots are machines. With the prevalence of generative AI apps, you should be aware of these 2 ways your child might engage socially with AI now or in the near future. These interactions, especially for children and teens, may not be healthy. This article covers the first way; the second is covered in the next article in this series.

AI as a Companion or Friend

With AI, it’s now possible to try to create the perfect friend. This friend will talk to you whenever you want and won’t leave your messages on “read” for hours. But would you want a perfect friend who’s always ready to talk? Would you allow your child to build one?

Several sites out there, including Replika, Nomi, and Kindroid, allow you to create AI friends for free. Users customize their features and even name them. As users chat with their AI friend, the AI learns about them and tailors its responses. The result is very similar to chatting with a real human being. For younger users, Character.AI is a popular choice and has a large teenage base. “[C]ompanionship apps are one of the fastest-growing parts of the A.I. industry,” says tech journalist Kevin Roose. “Some users will scoff at befriending a chatbot. But others, especially people for whom socializing is hard or unappealing, will invite A.I.s into the innermost parts of their lives.” For some kids, talking with an AI could be very appealing and helpful. But is it healthy?

Your child might create an AI friend, Roose says, which “won’t be a gimmick, a game or a sign of mental illness. It will feel to them like a real, important relationship, one that offers a convincing replica of empathy and understanding and that, in some cases, feels just as good as the real thing.” But parents should know that even if an AI friend can provide support for their child, it’s still a machine with no emotions or feelings. It’s simply a bot trained on vast quantities of data from the internet. It won’t have the same emotional responses as a human friend, and it’ll likely only tell you what you want to hear. Your kids need to know this too.

But because it’s a booming field and our children are frequently on screens, they may stumble across an AI friend site at some point. Creating an AI friend, however, shouldn’t be a substitute for real social interactions. MIT professor Sherry Turkle says talking to a machine doesn’t “develop the muscles—the emotional muscles—needed to have real dialogue with real people.” Chatting with an AI friend might help some kids develop stronger social skills, as one teenage user points out in this article in The Verge, but apps like Character.AI can become addictive and blur the lines between fantasy and reality.

Parents should also be aware that companion apps like Character.AI let users create romantic companions, and dedicated AI girlfriend or boyfriend apps exist as well. With both, content can quickly become X-rated.

Rather than having your kids turn to machines as their primary source of support, let your kids know you’re available whenever they want to talk. Check in often on how they’re doing and spend quality time together to build a strong foundation of trust.

Click here to read part 2 of this article.

Sound off: What concerns or scares you most about apps that let users create AI friends or romantic partners?

Huddle up with your kids and ask, “Why is it important to have face-to-face conversations with people instead of just chatting online?”