Meta's AR Filter Ban: A Band-Aid Solution for Mental Health?
Meta has announced that third-party augmented reality (AR) filters will be removed from its platforms by January 2025, affecting over two million user-made filters. While the filters have been linked to mental health harms among young women, critics warn the removal may push the technology onto less regulated platforms, where it could become harder to manage.
Meta has revealed plans to eliminate third-party augmented reality (AR) filters from its apps, including Facebook, WhatsApp, and Instagram, by January 2025. This decision will impact more than two million user-created filters, sparking debate on its implications for mental health and beauty standards.
AR filters have been criticized for promoting unrealistic beauty ideals, with young women's mental health particularly affected. While removing them could reduce that harm, critics argue the move is a superficial measure that may simply push users toward similar technologies elsewhere, exacerbating the problem.
Meta's stated reason for the removal is a shift toward other company priorities, including substantial investment in artificial intelligence. Although first-party filters will remain available, the diversity and sophistication that user-generated filters brought will be lost, raising concerns about the future of visual literacy and mental health on social media.
(With inputs from agencies.)