In a bid to enhance child safety online, YouTube is updating its content recommendation algorithm, primarily for teenage users. The move comes amid criticism of social media platforms, including Meta, the parent company of Facebook and Instagram, for profiting from children’s distress and spreading misinformation about platform safety.
The updates, announced in a blog post last Thursday and set to take effect on November 2, are designed to reduce recommendations of harmful videos, including those that idealize certain body weights or fitness levels, compare physical features, or display social aggression such as “non-contact fights” and intimidation. YouTube has faced backlash over its content management practices, with critics arguing that its video recommendation engine can steer young users toward damaging content.
James Beser, YouTube’s director of youth and kids product, and an advisory committee identified these potentially harmful content categories. YouTube has now limited repeated recommendations of such content for U.S. teenagers and plans to extend the changes globally by 2024.
YouTube is also rolling out more prominent “take a break” reminders and improved crisis resource panels with live support from crisis service partners. In partnership with the World Health Organization (WHO) and Common Sense Networks, YouTube is creating resources on safe, empathetic online video creation and effective comment management. The platform’s “take a break” and “bedtime” reminders will now appear as full-screen takeovers on both YouTube Shorts and long-form videos. Crisis resource panels will become full screen when users search for topics related to suicide, self-harm, and eating disorders, and will suggest more positive search terms such as “self-compassion” and “grounding exercises”.
Allison Briscoe-Smith of the Youth and Family Advisory Committee highlighted the importance of these “guardrails” in helping teens form healthy self-images. Jennifer Kelman, a JustAnswer therapist on YouTube’s advisory board of wellness experts, also stressed the need for time restrictions and protective measures to shield young users from constant exposure to harmful content.
Despite YouTube’s efforts to improve its safety measures and parental controls, a Pew Research Center survey found that an estimated 95% of teens use the platform, underscoring both YouTube’s influence on this demographic and the importance of the new safety measures. The changes follow a lawsuit against Meta by multiple states for allegedly promoting harmful content that contributed to a youth mental health crisis. Other platforms, including Google, Snap, and TikTok, have also faced legal challenges for purportedly aggravating student mental health problems.
This article was generated with the support of AI and reviewed by an editor. For more information see our T&C.