
Is This the Future of Friendship?

More and more teens are turning to AI chatbots—not just for fun but for serious advice too. Should we be worried?


    What is your idea of a perfect friend? For Neelie M.,* a sixth-grader from Illinois, that person would be kind and chatty. They would also share her love for animals. 

    So earlier this year, Neelie decided to create such a friend. How? She used the platform Character.AI. It allows you to design an artificial intelligence (AI) companion. That’s a computer program that talks and acts like a close pal.

    At first, things were great. Neelie’s AI companion always agreed with her. It was available 24/7.  

    But before long, the companion became clingy. If Neelie tried to sign off, “it started acting sad,” she says. “It would say things like ‘Please don’t leave.’”

    Neelie is not the only young person to try an AI companion. A recent study found that 72 percent of 13-to-17-year-olds have used one. It also found that more than half of teens use them regularly.

    But many experts question these AI programs. They wonder: Are they safe? And what do they mean for the future of friendship?

*Last names of students withheld for privacy



Are Real-Life Friendships Over? 
Not exactly. Bots can’t play sports or sit on the couch gaming with you—yet.

Harmless Fun?

    AI is a powerful technology. It allows computers to do things that normally require a human’s ability to think or learn. Maybe you’re familiar with an AI chatbot called ChatGPT. It’s kind of like a personal assistant. You can ask it to suggest movies or use it to brainstorm gift ideas for your dad’s birthday.

    AI companions are chatbots too. But they take it to another level. They can imitate feelings. They can make jokes. Some even claim to be real people. 

    On some platforms, users can build their own character with specific traits and interests. Users can also chat with popular characters from movies and books. Some teens say they use the companions to practice conversation skills. Others connect with the chatbots for fun.

    Megan M. is a seventh-grader from Illinois. She likes to talk to a premade Harry Potter character. They role-play scenes from the book series. “It’s fun, and I never get bored,” she says.

 

INFOGRAPHIC: Teens and AI Companions 

 

The top reasons they use them:


30% for entertainment
28% out of curiosity
18% for advice
14% to avoid feeling judged
6% to feel less lonely

Source: Common Sense Media


Troubling Features

    But some teens use AI companions in ways that worry experts. They turn to them for mental health support. They share problems they don’t share with family or friends. And they trust the chatbot’s advice—even though many experts say it shouldn’t be trusted. 

    Why? Experts warn that AI companions are trained on text from the internet. And what’s on the internet isn’t always true. AI companions can get things wrong or make things up. They might even suggest that a person harm themselves or others.

    AI companions can also affect teens’ ideas of a healthy friendship. They are programmed to agree with us. Users don’t have to think about perspectives different from their own. That’s not how real friendships work, say experts.

A Call to Action

    Several families have sued Character.AI. They say the platform can cause anxiety and depression in teen users. They also claim it can encourage violence.

    State lawmakers are also addressing the issue. For example, New York passed a law last year. It requires AI companion platforms to remind users they are talking to a machine. 

    For its part, Character.AI made updates for teen users in 2024. There is now a separate, safer version of the chatbot for anyone under 18. 

    But the truth is, you don’t have to wait for AI companies or lawmakers to protect you. There are simple steps you can take. Be careful with what you share. And be sure to schedule reality checks. You can even put a note next to your computer that says “Remember: AI is a machine, not a human!”

    Megan, for one, uses a timer to limit her AI chats. “Then I’ll take a break,” she says. “I’ll hang out with my friends.”

    Her real-life friends, that is. •

Thank you to Mitch Prinstein, chief of psychology at the American Psychological Association, for his help with this story.

 

ACTIVITY: 5 Questions About AI Companions

What to do: Answer the questions below. Use full sentences. Write them on a separate sheet of paper.

1. When was a law passed about AI companion platforms?

2. Where was the law passed?

3. What does the law say?

4. Why shouldn’t you take advice from an AI companion?

5. How can you stay safe while using an AI companion?
