Toxic Videos and Ads Give More Reason to Wait or Abstain from Social Media in America

The Wall Street Journal’s investigative reporting continues to do a great job unveiling the harmful content on social media, most recently in the article published today, “Instagram Serves Up Toxic Video Mix.” Parents should take notice and opt to have their children and teens delay using this platform as long as possible, ideally until age 18, given the inappropriate content, predators, and proven mental health harms that have plagued American youth, and adults, due to addictive design, constant self-comparison, and triggering content. Even adults should approach social media with care and weigh the mental health costs carefully against the perceived benefits.

The article describes how, in the digital era, social media platforms play a significant role in shaping our online experiences, with young users being the most active. Instagram, known for its TikTok-like Reels video service, uses algorithms to curate content, tailoring it to whatever engages users, regardless of their stated interests. This recent investigation by The Wall Street Journal (like others it conducted in 2020) uncovered concerning findings about Instagram’s algorithms and the disturbing content they promote, especially involving minors.

The Journal’s investigation involved setting up test accounts focused on following young gymnasts, cheerleaders, and preteen influencers. Shockingly, these accounts were inundated with explicit and sexualized content, including videos of children, alongside advertisements from prominent brands. The platform’s algorithm seemingly directed such inappropriate content to these accounts.

Moreover, the investigation discovered a disturbing trend: accounts following these young influencers often included a significant number of adult men, many of whom exhibited an interest in sexual content related to both children and adults. The platform subsequently began suggesting more disturbing content interspersed with ads, creating a deeply concerning juxtaposition.

Companies whose ads appeared alongside this inappropriate content, including Disney, Walmart, and the dating-app company Match Group, expressed dismay and moved to halt their advertising on Meta’s platforms, citing concerns about associating their brands with such content.

Meta, Instagram’s parent company, responded by highlighting its efforts to reduce harmful content and its investments in safety measures. However, the platform’s association with inappropriate material persisted, raising questions about the effectiveness of its safety controls.

One alarming aspect was the consistent pairing of major brands’ ads with explicit content. The test accounts revealed ads for family-oriented brands appearing adjacent to videos containing explicit sexual content, raising deep concern about the platform’s ad-placement policies.

The complexity of this issue lies in Instagram’s algorithm, which seems to target niche interests, directing related content to users based on their interactions. This has resulted in the inadvertent promotion of sexualized content involving minors to users following young influencers.

Furthermore, internal concerns among current and former Meta employees indicated that the algorithm’s aggregation of child-sexualization content across the platform was a known problem. Despite efforts, changes to the recommendation algorithms that could mitigate these issues are difficult to implement without significantly impacting engagement metrics.

While Meta asserts its commitment to addressing these concerns, the platform’s failure to effectively restrict the promotion of inappropriate content, particularly involving children, remains a critical issue. Additionally, the Canadian Centre for Child Protection highlighted Instagram’s regular recommendation of videos featuring clothed children, raising alarm bells about potential exploitation by online predators.

In conclusion, the alarming trend of inappropriate content surfacing on Instagram, especially involving minors, demands urgent attention and gives parents good reason to wait until children are 18 before encouraging or allowing participation on these platforms. If your child is a minor and already using social media, it is critical to monitor content and limit time to reduce exposure to harmful material and its mental health consequences. See the full list of harms from social media outlined below and the research in the articles section of our website: http://4×8.96d.myftpupload.com/articles/, and seek professional help from digital wellness coaches trained in best practices for reducing screen time or rethinking technology access while preserving relationships and helping kids thrive in the digital age.

🧠 Universal Social Media Harms and Contributing Factors Identified by PlayFair® and Screen Time Clinic®:

● Depression

● Anxiety

● Isolation

● Diminished focus stamina

● Stress

● Self-loathing and self-harm

● False peer influence and faux friendships

● Deteriorates real social skills and displaces the appropriate real-life risk-taking that builds confidence and character

● Promotes isolation

● Promotes an unhealthy, co-dependent relationship with technology

● Lowers self-esteem

● Sleep deprivation and disruption

● Addictive design

● Algorithms targeting young users without transparency or oversight

● Algorithms that spread posts based on salacious or outrageous content, fake news, violent and depressing content, and deepfakes rather than truth or the common good

● Preference for content that engages (sexual, violent, shocking, or preying on vulnerabilities) in search results and served content: once you search for a topic, it is nearly impossible to retrain the algorithm so it stops delivering that topic and tangential sensational topics

● Limited and inconsistent federal and state standards and legislation in the USA, with widespread protections likely many years away

● Inability to sue for harms due to Section 230, an antiquated loophole that shields online platforms from responsibility for their content or users

● Targeted advertising to minors and inappropriate ads

● Never-ending autoplay videos and unintended content

● No platform transparency

● Lack of social empathy behind a screen

● No independent oversight or audits, apart from watchdog organizations that Big Tech works to discredit

● Limited K-12 and parent education in media literacy, digital citizenship, and online harms

● Parental controls that are weak or useless, even as platforms claim they are doing a lot for safety

● Limited ways for parents to effectively supervise content, even with education

● No “Duty of Care” toward minors required in platform design

● Allowing access to content without an account, and allowing content links to be sent to non-account holders (there is no way to say “My child doesn’t have TikTok” when they can access TikTok through a link or a web browser without an account)

● The process for reporting harmful content does not work, and platforms have no incentive to change it

● Reporting harmful content or harassment is difficult and not transparent

● Reported harmful content is rarely taken down

● Reported accounts (drug dealers, CSAM) can easily create new accounts and return to the platform; the only option is to block them, with no effective reporting

● Platforms apply their Terms of Service (TOS) inconsistently to avoid lawsuits and liability for harm, leaving no recourse for your child if they suffer harm or death

● Social contagion: the spread of inappropriate or false content, behaviors, and negative emotions

● COVID contributed to excessive screen time, increased access to platforms, and stronger, more sophisticated algorithms

● EdTech apps and Chromebooks allow users to be pulled away to gaming or social media when they should be learning

● Advertisers targeting kids with inappropriate and/or adult content (movies, products)

● Anonymity and anonymous apps (e.g., YOLO) lead to increased bullying and hate speech, with offenders difficult to identify or prosecute

● Emoji meanings, text codes, and internet slang hide intent from parents

● Persuasive design is stronger than human intention

● Physical location: allowing devices in private spaces like a child’s bedroom or during school leads to lack of focus
