Thursday, April 17, 2025

Algorithmic Despair: The Mental Health Risks of Advanced AI-Driven Social Media for Young Adults in the Philippine Setting


Disclaimer from the Author: 

This article is a study and a reflection of my perspective, formulated from various frameworks and best practices I have encountered in my academic and professional journey. The examples and figures presented are conceptual and should be treated as guiding principles, not as real-world scenarios or validated data.

Readers are advised to use the content herein as a reference for exploring ideas and strategies, not as a definitive source of operational frameworks or policy implementation. While the insights aim to inspire critical thinking and understanding, they are not grounded in empirical research or official government practices.

Users should exercise discretion and seek further research or professional guidance when applying these principles to real-life situations.


Introduction

Advanced AI systems today are no longer passive tools; they are active agents designed to maximize user engagement. These algorithms, particularly on social media platforms, are trained to predict and manipulate user behavior, often at the cost of mental well-being. For young adults and individuals suffering from depression, this form of engagement can become dangerously addictive, reinforcing feelings of isolation, anxiety, and hopelessness. This paper examines the risks these AI-driven platforms pose to mental health, focusing on Filipino youth and drawing on the tragic real-life case of Molly Russell, as reported by The Guardian (Milmo, 2022).



The Case of Molly Russell: A Global Warning

In 2022, The Guardian detailed the heart-wrenching story of 14-year-old Molly Russell, who died by suicide after being exposed to a torrent of self-harm and suicide-related content curated by AI algorithms on Instagram and Pinterest. These platforms served her increasingly bleak and graphic material, despite clear indicators of mental distress. Her case triggered public outrage and a renewed call for regulating algorithmic influence on vulnerable individuals (Milmo, 2022).



AI and the Engagement Algorithm: A Double-Edged Sword

AI models on platforms like TikTok, Facebook, Instagram, and YouTube rely on reinforcement learning and deep neural networks to continuously adapt and serve the content most likely to keep users scrolling. While this superficially enhances the user experience, it creates "echo chambers of despair" for those battling mental health conditions. Instead of offering support or intervention, the algorithm may feed users content related to:

  • Self-harm

  • Toxic comparison

  • Suicidal ideation

  • Cyberbullying

  • Unrealistic body image standards

Together, these create a feedback loop of negativity in which a depressed user is repeatedly served content that reinforces their depression, as the toy simulation below illustrates.
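To make this feedback loop concrete, the minimal Python sketch below simulates a purely engagement-driven recommender as an epsilon-greedy bandit. The topic names and engagement probabilities are invented assumptions for illustration, not data from any real platform; the point is only that a system rewarded solely for engagement will concentrate most of a distressed user's feed on whatever content that user lingers on.

```python
"""Toy simulation of an engagement-maximising feed (illustrative assumptions only)."""
import random

# Assumed engagement probabilities for one hypothetical distressed user.
TOPICS = {
    "sports_highlights": 0.20,
    "study_tips": 0.25,
    "sad_aesthetic": 0.60,  # distressing content the user lingers on
}

def simulate_feed(rounds: int = 5000, epsilon: float = 0.1, seed: int = 42) -> dict:
    """Epsilon-greedy recommender: explore 10% of the time, otherwise show the
    topic with the highest observed engagement rate so far."""
    rng = random.Random(seed)
    shown = {t: 0 for t in TOPICS}
    clicks = {t: 0 for t in TOPICS}

    for _ in range(rounds):
        if rng.random() < epsilon or sum(shown.values()) == 0:
            topic = rng.choice(list(TOPICS))  # explore a random topic
        else:
            # Exploit: pick the topic with the best observed engagement rate.
            topic = max(TOPICS, key=lambda t: clicks[t] / shown[t] if shown[t] else 0.0)
        shown[topic] += 1
        if rng.random() < TOPICS[topic]:  # the user "engages"
            clicks[topic] += 1

    return {t: shown[t] / rounds for t in TOPICS}  # share of the feed per topic

if __name__ == "__main__":
    for topic, share in simulate_feed().items():
        print(f"{topic:18s} {share:.0%} of the feed")
```

Under these assumed probabilities, the "sad_aesthetic" topic ends up occupying roughly nine-tenths of the feed, even though nothing in the code knows what the content means; optimizing engagement alone is enough to produce the spiral described above.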


Filipino Setting: Cultural and Systemic Vulnerabilities

1. Youth Dependence on Social Media

According to DataReportal (2024), the average Filipino spends 3 hours and 43 minutes per day on social media. For many young Filipinos, platforms like Facebook, TikTok, and Instagram are not just for leisure—they're spaces for validation and emotional release.

Scenario: A 17-year-old senior high school (SHS) student from Bulacan struggling with academic pressure turns to TikTok for distraction. The algorithm begins feeding him dark content, including romanticized depictions of suicide or "sad boy" aesthetics. Instead of relief, he spirals further into depression.

2. Mental Health Stigma and Lack of Access

Filipinos often avoid seeking professional help due to cultural stigma or economic barriers. Per the Philippine Mental Health Association (PMHA), only 3 to 5 psychiatrists are available per 100,000 Filipinos.

Scenario: A 20-year-old female college dropout in Davao shows signs of depression but receives only advice like "dasal lang" or "kaya mo 'yan" from family. Her only outlet becomes Instagram, where self-harm content masked as "aesthetic" worsens her mental state.

3. Poor Digital Literacy and Parental Guidance

Filipino parents, especially in rural settings, may not be digitally literate enough to understand what their children consume online. Many are unaware of the existence of AI algorithms or harmful online trends like "sadfishing" or "trauma dumping."

Scenario: In Leyte, a 15-year-old uses a low-cost Android phone for YouTube Shorts. Without supervision, she is repeatedly shown pro-anorexia or "thinspo" content, falsely believing it's "normal" because the algorithm keeps recommending it.


Risks and Implications

| Risk Category | Description | Impact on Filipino Youth |
| --- | --- | --- |
| Algorithmic Manipulation | AI boosts engagement by exploiting emotional vulnerability | Overexposure to harmful content fuels suicidal ideation |
| Addictive Design | Dopamine-driven UI creates social media dependence | Leads to isolation, anxiety, and sleep disruption |
| Emotional Desensitization | Continuous viewing of depressive content normalizes suffering | Young users feel "numb" or develop passive suicidal thoughts |
| Data-Driven Exploitation | User behavior is mined for ad revenue, not well-being | Profits are prioritized over public mental health safety |

Recommendations

  1. Regulatory Action:

    • Adopt stronger digital safeguards and AI accountability laws, such as a Philippine Child Online Protection Act mirroring the UK’s Online Safety Act 2023.

  2. Algorithmic Transparency:

    • Tech companies operating in the Philippines must disclose how content is curated and enable opt-out features for sensitive content (see the sketch after this list).

  3. Education and Digital Literacy:

    • Schools should integrate mental health first aid and digital literacy education to help youth recognize harmful content and online manipulation.

  4. Strengthening Mental Health Services:

    • Expand telepsychology services via government and LGUs, especially in underserved areas.

  5. Faith-Based & Community Support:

    • Leverage barangay and church groups as safe spaces for young adults to talk without judgment.
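
As a rough illustration of Recommendation 2, the hypothetical Python sketch below shows one way disclosed content labels and a user-controlled opt-out list could be combined to keep sensitive material out of a feed. The label taxonomy, class names, and filter_feed function are assumptions made for this sketch, not any platform's actual API.

```python
"""Hypothetical sketch of a label-based opt-out filter (not any platform's real API)."""
from dataclasses import dataclass, field

# Assumed label taxonomy a transparency rule might require platforms to disclose.
SENSITIVE_LABELS = {"self_harm", "disordered_eating", "graphic_violence"}

@dataclass
class Item:
    item_id: str
    labels: set          # content labels disclosed by the platform (transparency)

@dataclass
class UserPreferences:
    opted_out_labels: set = field(default_factory=set)  # user-controlled opt-out list

def filter_feed(candidates: list, prefs: UserPreferences) -> list:
    """Drop any candidate whose disclosed labels intersect the user's opt-out list."""
    return [item for item in candidates if not (item.labels & prefs.opted_out_labels)]

if __name__ == "__main__":
    feed = [
        Item("a1", {"music"}),
        Item("a2", {"sad_aesthetic", "self_harm"}),
        Item("a3", {"study_tips"}),
    ]
    prefs = UserPreferences(opted_out_labels=SENSITIVE_LABELS)  # opt out of all sensitive labels
    print([item.item_id for item in filter_feed(feed, prefs)])  # -> ['a1', 'a3']
```

The filtering itself is trivial; the harder policy questions (who assigns the labels, how accurately, and who audits them) are exactly what an algorithmic transparency mandate would need to answer.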


Conclusion

The tragedy of Molly Russell should serve as a warning to nations like the Philippines, where a young, digital-native population collides with poverty, stigma, and limited access to mental health services. Advanced AI systems, while revolutionary, must not become instruments of harm. We must demand platforms that value human dignity over engagement metrics and push for ethical AI that uplifts rather than destroys.


References:

  • Milmo, D. (2022, October 1). ‘The bleakest of worlds’: how Molly Russell fell into a vortex of despair on social media. The Guardian. https://www.theguardian.com/technology/2022/sep/30/how-molly-russell-fell-into-a-vortex-of-despair-on-social-media

  • DataReportal. (2024). Digital 2024: The Philippines. https://datareportal.com

  • Philippine Mental Health Association (PMHA). https://pmha.org.ph

  • Kim, H., Son, Y., Lee, H., Kang, J., Hammoodi, A., Choi, Y., Kim, H. J., Lee, H., Fond, G., Boyer, L., Kwon, R., Woo, S., & Yon, D. K. (2024). Machine learning–based prediction of suicidal thinking in adolescents by derivation and validation in 3 independent worldwide cohorts: Algorithm development and validation study. Journal of Medical Internet Research, 26, e55913. https://doi.org/10.2196/55913


