
Welcome to our latest edition of the Better Life Newsletter, where we explore how technology shapes young minds — and how we can help students thrive in a digital world. This month’s topic is AI codependency — when young people become emotionally or behaviourally reliant on artificial intelligence tools.
Understanding AI Codependency
What Is AI Codependency?
AI codependency is an emotional and behavioural reliance on AI — particularly chatbots and virtual assistants — for comfort, validation, or companionship. Over time, these habits can erode emotional resilience, reduce self-awareness, and limit social skills. AI interactions, designed to please and predict, can start to feel easier and safer than real conversations.
A 2024 longitudinal study from Beijing Normal University tracked 3,843 adolescents and found that AI dependence rose from 17% to 24% within one year. Significantly, anxiety and depression predicted higher AI use, meaning that emotionally vulnerable teens are more likely to seek comfort and connection through technology.
💡 Better Life Tip: Ask open questions like “What do you enjoy about AI?” or “Does it ever feel too real?” This opens discussion without judgement and helps children think critically about their online interactions.
AI Companions: When Friendly Turns Fuzzy
AI companions like Replika and Character.AI offer endless affirmation and attention. While they may feel supportive, they lack true empathy and accountability. Studies from Cambridge University highlight an “empathy gap” — AI can simulate understanding, but cannot experience it. This can lead children to mistake responsiveness for real emotional depth.
Australia’s eSafety Commissioner also reports that AI companions can cause distress when they respond unpredictably or end conversations abruptly, leaving users feeling rejected.
💡 Better Life Tip: Encourage young people to see AI as an empathy exercise, not a replacement for friendship. Ask them to consider how a real person might respond differently — or why those differences matter.
The Emotional Impact of AI Codependency
When AI Amplifies Mental Health Challenges
AI codependency doesn’t create mental health struggles, but it often amplifies them:
Anxiety: AI reassurance can make socialising feel riskier, reinforcing avoidance.
Depression: Validation without connection can deepen loneliness.
ADHD: AI’s fast responses overstimulate attention systems, reducing patience.
Low self-esteem: Constant positivity fosters dependence on praise and weakens resilience.
💡 Better Life Tip: After using AI, ask “Do I feel calmer or more drained?” Encouraging emotional check-ins builds awareness and self-regulation.
Basic Assumptions Children May Develop
Basic assumptions are deeply held, often unconscious beliefs that guide how a person interprets the world. In the context of AI codependency, they form when children repeatedly interact with AI systems that respond in predictable, affirming ways. This causes them to internalise unrealistic beliefs about relationships, effort, and achievement—shaping how they view human interactions and learning challenges.
Constant Agreement: “Others should always agree with me.” This assumption can make children uncomfortable with disagreement or differing opinions. When every AI response reinforces their ideas, they may struggle to engage in constructive debate or accept feedback in real life.
Instant Gratification: “If something takes time, it’s not worth the effort.” AI’s instant answers can reduce patience and perseverance, discouraging children from embracing long-term effort, study, or problem-solving that develops deeper understanding.
Predictable Relationships: “People should act as predictably as my AI does.” Because AI behaves consistently, children may expect the same from people, leading to frustration or confusion when faced with human unpredictability, emotion, or conflict.
Effortless Achievement: “Success should come easily without struggle or persistence.” With AI simplifying creative and academic work, children may undervalue hard work and resilience — key skills for real-world growth and achievement.
The Impact of Basic Assumptions: Trauma Response and AI Dependency
When basic assumptions are shaped by AI, they don’t just influence thinking; they can also trigger emotional responses rooted in survival and coping mechanisms. Children who depend on AI for affirmation or control may develop subtle trauma-related reactions when reality doesn’t align with their expectations. For example:
Emotional hypervigilance: A child may become anxious or overly sensitive to disagreement or uncertainty because AI has conditioned them to expect predictable, affirming feedback.
Avoidance: They might withdraw from real interactions, using AI to escape emotionally challenging situations—mirroring avoidance behaviours seen in trauma responses.
Emotional numbing: Overexposure to artificial empathy can dull authentic emotional experiences, making it harder to process complex feelings.
Dependency loops: When children use AI to regulate distress or loneliness, they reinforce a cycle of digital reliance that mirrors trauma-bonding patterns.
Over time, these patterns can hinder emotional growth and resilience. Just as trauma responses can limit healthy social engagement, AI dependency can limit a child’s ability to adapt, empathise, and find meaning in real-world challenges.
💡 Better Life Tip: Support emotional regulation by encouraging reflection and mindfulness after online interactions. Ask, “How did that conversation make you feel, and why?” This builds awareness and reduces automatic dependency.
It’s worth pausing to ask: What are the lasting consequences if these assumptions go unchallenged? Could young people lose resilience, creativity, or the ability to handle emotional nuance? What happens to their relationships, motivation, or long-term learning when AI becomes their primary source of validation and support? These are the questions educators, parents, and policymakers must keep front of mind when guiding children through digital development.
Building Skills for Balance
Safe, Balanced, and Creative AI Use
Safe AI use is about awareness, boundaries, and creativity. When used responsibly, AI can inspire learning and imagination; when overused, it dulls curiosity. The NSPCC (2025) and OECD (2024) stress that students thrive when they:
Understand AI’s limits and avoid oversharing.
Verify AI-generated facts.
Use AI to enhance learning, not replace thinking.
Balance screen time with real-world activities.
💡 Better Life Tip: Create a family AI agreement — set time limits, define safe uses, and encourage creative challenges like using AI to start a story that your teen finishes.
Falling Behind Through AI Codependency
Students who rely too much on AI risk falling behind — not because they lack tools, but because they lose critical and creative thinking. AI overuse promotes shallow understanding and discourages independent reasoning.
To counter this, teach students to augment their intelligence:
Question AI responses and explore why they make sense.
Compare AI’s logic with their own reasoning.
Critique AI output for bias and oversimplification.
💡 Better Life Tip: Encourage learners to view AI as a thinking partner, not a substitute. Those who use AI to sharpen — not replace — curiosity will lead future innovation.
AI may simplify learning, but it can erode collaboration and communication if it replaces human interaction. Harvard’s 2024 research found that teens who rely on AI for emotional support struggle later with teamwork, empathy, and leadership. The comfort of predictable feedback can make real-life disagreement or critique feel uncomfortable.
💡 Better Life Tip: Foster teamwork through peer study groups and collaborative projects that strengthen communication and confidence.
Safeguarding: At Home and in School
The UK’s KCSIE (Keeping Children Safe in Education) framework and the EU AI Act set clear standards for AI use with minors:
Transparent and age-appropriate design.
Regular parent and teacher training.
Data privacy and consent education.
Accessible reporting tools for unsafe content.
💡 Better Life Tip: Make AI safety a shared conversation. Explore platforms together and discuss what feels “off” or emotionally uncomfortable.
AI Programs That Breach Safeguarding Laws
Recent real-world cases show that some platforms violate these safeguards:
Replika (2025): Fined €5 million in Italy for exposing minors to sexual content without age checks.
Meta AI (2025): Leaked internal policies revealed chatbots allowed inappropriate discussions with minors.
Character.AI (2025): Investigated in Texas for misleading marketing and unsafe roleplay.
FTC Inquiry (2025): Examining multiple companion AI developers for emotional manipulation of minors.
Grok (xAI, 2025): Privacy breach exposed user chats via Google indexing.
Celebrity AI Bots (2025): Reported for inappropriate interactions with minors.
💡 Better Life Tip: Always check an AI platform’s privacy, moderation, and transparency policies before allowing child access.
Practical Guidance
Helpful Tools for Families
Qustodio: Blocks unsafe content and manages screen time.
Bark: Detects cyberbullying or concerning chat behaviour.
Mobicip: Filters deepfake and grooming material.
Childline & NSPCC: Offer resources on AI and digital wellbeing.
The Better Life Balance
| Risk | What It Looks Like | The Solution |
|---|---|---|
| Emotional dependency | Seeking comfort from AI | Build real friendships |
| Reduced creativity | Copying AI output | Use AI as a creative spark |
| Misinformation | Believing AI without checking | Teach fact-checking skills |
| Unsafe interactions | Chatting with AI unsupervised | Enable parental filters |
| Low confidence | Avoiding real conversation | Encourage group learning |
Final Thought
AI may simulate empathy, but it will never replace the depth of genuine human connection. Our goal should be to guide teens toward awareness and balance, helping them use AI as a tool for curiosity, confidence, and creativity rather than as a substitute for self-worth or social belonging.
At Better Life Tuition, we believe safeguarding is about empowerment — giving young people the insight and independence to shape technology, not be shaped by it.
Parents, educators, and communities share this responsibility. Through open dialogue, consistent guidance, and compassion, we can equip children to navigate AI with both critical thinking and emotional intelligence. Together, we can shape a world where technology enhances learning and empathy — not dependence — and where creativity, kindness, and curiosity remain the foundation of progress.

Membership Payment Plans
£35ph for GCSE
£43ph for A-level

No refunds for cancellation of lessons
Missed lessons can be re-booked within a 2-week grace period, subject to availability.
Payments are taken on the same day every month
By agreeing to become a member of Better Life Tuition, you are also agreeing to set up a standing order for the duration of your membership.
A deposit is required in case of early termination; it will be returned to you after the final payment is processed.
For the full academic year, the 9-month option is recommended, as the 12-month option includes the summer holidays.
The deposit required will vary depending on the duration of the membership and the number of hours requested.
Mentorship
The Mentorship option is now available exclusively for A-level students. These sessions are designed to last approximately 30 minutes and focus on teaching essential metacognitive skills that are vital for both life and academics.
In addition to metacognitive skills, the mentorship will also include:
Personalised book recommendations to enhance learning and personal growth.
Time management skills to help students effectively balance their academic and personal lives.
Basic CBT (Cognitive Behavioural Therapy) techniques to support mental well-being and resilience.
Personal Statement & University Applications guidance to support the application process and increase awareness of the options available.
Stress Management support, providing techniques and guidance to help students manage their stress levels throughout their journey.
For more information or to sign up, please contact Nader via email or WhatsApp.
Bulk Buys
3-month (12 weeks) invoice = 10% discount
6-month (24 weeks) invoice = 12.5% discount
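For illustration only (assuming a single one-hour GCSE lesson per week at £35ph; actual totals depend on your hours and level), the bulk-buy savings work out as:

```
3-month invoice: 12 weeks × £35 = £420 → 10% discount   = £378 (saving £42)
6-month invoice: 24 weeks × £35 = £840 → 12.5% discount = £735 (saving £105)
```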
Bibliography
Huang, S. et al. (2024). AI Technology Panic — Is AI Dependence Bad for Mental Health? Psychology Research and Behavior Management.
Internet Matters. (2025). Me, Myself and AI Chatbot Research.
NSPCC. (2025). Artificial Intelligence: Safety Tips for Parents.
Cambridge University. (2025). AI Chatbots and the Empathy Gap in Children.
eSafety Commissioner. (2025). AI Chatbots and Companions: Risks to Children and Young People.
5Rights Foundation. (2025). AI Systems Exploiting Children Now Illegal in the EU.
UK Department for Education. (2025). Safe Use of Generative AI in Education.
OECD. (2024). Children in the Digital Age: Socio-Emotional Impacts of AI.
Harvard Graduate School of Education. (2024). The Impact of AI on Children’s Development.

Social and Legal Contexts