Editorial: Blowing the whistle on Facebook 

Facebook is deliberately designed to make us angry – or afraid or devastated or elated. The kind of emotions that keep users deeply engaged and coming back for more. Studies and our own eyes have observed toxicity on the social network, including vitriol and the rapid dissemination of misinformation. Instagram, which Facebook owns, can have a harmful effect on teens’ self-esteem. 

Still, the roughly 2.8 billion active users on Facebook and 1 billion on Instagram have found plenty to like. Both platforms offer forums to stay connected, share ideas and gather information – some of it nonsense. More on that later.

No one understands those strengths and weaknesses better than Facebook itself. The company has amassed vast amounts of data yet opted against installing safeguards on its platforms, favoring rapid growth and profit instead.

Testimony before a U.S. Senate panel by whistleblower Frances Haugen, a former data scientist at the tech giant, prompted bipartisan outrage. Among the findings from data she leaked was that Facebook’s newsfeed algorithm – the tool that determines the posts users see when they open and scroll through Facebook – amplified misinformation. Other studies showed dangerous trends among teenage girls engaging in the photo-centric, highly filtered world of Instagram. 

In one survey, 13.5 percent of U.K. teen girls reported that their suicidal thoughts became more frequent after they started using the platform. Another leaked study found that 17 percent of teen girls said Instagram exacerbated their eating disorders. Meanwhile, Facebook has been working to groom a new generation of users.

Even personal posts shared for the benefit of friends tend to be polished for public consumption. We’ll post glossy family photos but not the umpteen scowling outtakes on the road to perfection. 

But it’s the posts that provoke an instant, strong reaction, be it delight or rage, that tend to go viral. Facebook’s algorithms reward engagement: posts that draw a lot of comments and “likes” are shown to more and more people, which in turn draws still more engagement. Provocative content spreads far and wide. Even Facebook struggles to tame the beast. CEO Mark Zuckerberg’s goal of encouraging COVID-19 vaccination has been undermined by doubt sown on his own platform, even as the company works to crack down on the spread of misinformation.

In May, Facebook announced it would test a new feature encouraging users to (gasp) actually open and read articles before sharing them. If only the same barrier could be applied to the comment section. 

Adjusting the newsfeed algorithm so that users see recent posts in chronological order, rather than ranked by how much engagement they generate, could improve the environment considerably.
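To make that contrast concrete, here is a minimal sketch in Python of the two orderings as this editorial describes them. The post fields and scoring weights are illustrative assumptions for the sake of the example, not Facebook’s actual ranking model, which relies on far more signals than any outsider can see.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List


@dataclass
class Post:
    author: str
    created_at: datetime
    likes: int      # illustrative engagement signals only; the real feed
    comments: int   # uses many more (and private) features
    shares: int


def engagement_ranked(posts: List[Post]) -> List[Post]:
    """Engagement-weighted feed: high-reaction posts float to the top,
    the amplification dynamic criticized above."""
    def score(p: Post) -> float:
        # Hypothetical weights: comments and shares count more than likes.
        return p.likes + 3 * p.comments + 5 * p.shares
    return sorted(posts, key=score, reverse=True)


def chronological(posts: List[Post]) -> List[Post]:
    """Chronological feed: newest posts first, regardless of how much
    reaction they provoke -- the alternative suggested above."""
    return sorted(posts, key=lambda p: p.created_at, reverse=True)
```

Under the first ordering, a provocative post with thousands of comments keeps resurfacing at the top; under the second, it simply ages out of the feed like everything else.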

Short of Facebook or Congress hitting the brakes, we’ll have to hit them ourselves. Look critically at your newsfeed. Make use of the optional time limit settings on your social media apps. Never forget that Facebook is free because users are the commodity.  
