Whistleblower claims Meta and TikTok prioritized algorithmic profit over user safety
Social media platforms including Facebook (Meta) and TikTok are facing accusations from whistleblowers who claim the companies knowingly allowed harmful content onto user feeds, prioritizing an algorithmic arms race over safety measures after internal research showed that outrage significantly boosts engagement metrics. Dozens of former employees say these decisions knowingly exposed people to toxic material in a global effort to maximize profits.
Key Points
- Whistleblower allegations state that TikTok and Meta knowingly released toxic ads onto their platforms to increase user interaction.
- Internal algorithmic studies revealed that these social media giants prioritized content designed to provoke outrage over safety measures, leading to more harmful material on users' feeds.
Developments
Perspectives
Some users have knowingly uploaded dangerous content to the platforms.
— [Mar 17, 05:56] Social media | BBC — Social media giants made decisions that allowed more harmful content onto people's feeds after internal research showed how outrage fueled engagement; whistleblowers told the BBC that TikTok and Meta risked safety to win an algorithm arms race.
— [Mar 17, 05:49] Feeds.bbci.co.uk — Whistleblowers say 'Meta' and 'TikTok' allowed engagement to grow by publishing dangerous content.
— (Economedia.ro)