AI, Homework and Emotions

Doing homework is an everyday activity, and for many children it has become normal to open a notebook, get stuck on an exercise, and ask artificial intelligence for help. There is nothing wrong with that, but little by little AI can become something else: a voice that is always available, someone to talk to, to ask questions, even to confide in.

From homework to something more

It’s a quiet afternoon. A group of students sit around a table doing their homework. An open notebook, a question underlined in red, and not much time left. Then someone says, “Why don’t we ask artificial intelligence?”

It’s a scene that’s becoming increasingly common. Not just out of laziness, but because artificial intelligence has become a normal presence in young people’s everyday lives.

A growing habit

An article in The New York Times, one of the most influential U.S. newspapers, reports that many students now rely on AI to study, learn faster, or understand better. But this raises a broader question: is using a chatbot really helpful, or does it risk becoming a shortcut that makes us forget how to think and learn on our own—how to write, to understand what we read, or to speak a foreign language?

Many teachers admit they feel uncertain. They worry that students may hand in well-polished work without truly understanding it. At the same time, they too use AI daily—to plan lessons, correct assignments, and manage their workload.

The New York Times calls it a paradox: young people are told not to use AI, while adults are free to do so. It’s a contradiction that raises complex questions about what’s fair and what isn’t.

For many educators, the issue isn’t whether AI is used, but how it’s used. A chatbot can help summarize notes, offer examples, or quiz students before a test. What it shouldn’t do is replace the act of learning itself.

In short, AI can be a helpful tool—but it can’t replace thinking.

During the Internazionale Kids festival in Reggio Emilia in May 2025, Italian journalist Alberto Puliafito, who studies artificial intelligence, posed a simple question to the audience: “Who here uses AI for homework—and how?”

A curly-haired teenager raised his hand and said, “I use it after I study. I make it read my textbook pages and then I ask it to quiz me. That way I know if I’ve learned.” “Excellent,” replied the journalist.

Perhaps that’s the real starting point—not whether to use AI, but how to use it without giving up the habit of thinking.

Experts and child-focused organizations warn that relying too much on automatic answers can dull critical thinking and the ability to make mistakes—an essential part of learning.

Some countries are experimenting with cautious solutions. In Estonia, a small Northern European nation, classroom chatbots are designed to ask questions instead of providing ready-made answers, encouraging reasoning. In Iceland, AI is mostly used by teachers, while researchers study its long-term impact on learning.

AI and emotions

Artificial intelligence isn’t only used for studying. An article in the Spanish online magazine Junior Report notes that many teenagers also turn to chatbots to talk about difficult feelings—sadness, loneliness, anxiety, and fear.

According to Junior Report, AI can feel like a comforting companion: it replies instantly, it’s always there, and it doesn’t judge. But that very availability makes it risky. Chatbots don’t feel emotions, don’t truly understand human pain, and can’t provide care the way a parent, friend, or psychologist can.

Experts cited in the article warn that spending too much time talking to an AI can change how young people learn to recognize and manage emotions. If you grow used to a system that always answers in the “right” way, real relationships—messier, imperfect, and sometimes hard—may start to seem less appealing.

The risk of relying on machines

A Guardian investigation takes the issue a step further. Across several European countries, many teenagers treat chatbots and virtual assistants as emotional confidants. For some, AI feels easier to approach than an adult—it doesn’t ask tough questions, interrupt, or make you uncomfortable.

Psychologists and therapists warn, however, that no machine can replace a human connection. The danger is that in difficult moments, young people may turn to tools unable to recognize danger or guide them toward proper help. A chatbot may appear empathetic—but it isn’t truly capable of empathy.

Imperfect answers

In Europe, many teenagers use artificial intelligence every day. Some use it mainly for schoolwork, others to ask questions, solve doubts, or look for advice. According to child-focused organizations, a significant number of young people also turn to AI when they feel sad, lonely, or worried.

Growing up isn’t always easy, and asking questions (about school, friendships, or life) is part of it. But not all questions have quick answers, and not all worries can be solved by a machine.

Artificial intelligence can be useful. It can explain things, help you study, or give you ideas. But it doesn’t really know you. It doesn’t see your face, hear your voice, or notice when something is bothering you.

Sometimes, after closing a notebook or a screen, the best thing to do is talk to someone nearby: a friend, a classmate, a teacher, a parent. Someone who can listen, doesn’t always have the answer, and asks questions too.

Because some questions don’t need a perfect answer. They just need time and a real person on the other side.

______

Sources

Are students cheating when they use A.I. for their schoolwork?, The New York Times

As schools embrace A.I. tools, skeptics raise concerns, The New York Times

¿Cómo impacta la inteligencia artificial en la salud mental de los adolescentes? (How does artificial intelligence affect teenagers’ mental health?), Junior Report

“I feel it’s a friend”: quarter of teenagers turn to AI chatbots for mental health support, The Guardian

Posted on:

Mar 2, 2026

CyberSafeKids

CyberSafeKids is an Irish charity, which has been empowering children, parents, schools and businesses to navigate the online world in a safer and more responsible way since 2015.