Regulating Facebook (*ahem* Meta)

Roddy Lindsay, a former Facebook data scientist, argues in his New York Times Op-Ed that Congress should pass a simple reform: hold social media companies accountable for the content their algorithms promote. His call comes in light of Frances Haugen’s Senate testimony, in which the former Facebook product manager revealed that the tech giant prioritizes profits over user safety.

What’s wrong?

Lindsay writes that two technological developments adopted by social media companies are largely responsible for the fringe content that appears in feeds:

  • Personalization: the “mass collection of user data through web cookies and Big Data systems.”
  • Algorithmic amplification: “the use of powerful artificial intelligence to select the content shown to users.”

While personalization and algorithmic amplification can benefit consumers, they are harmful because they “perpetuate biases and affect society in ways that are barely understood by their creators, much less users or regulators.”

Engagement is central to social media companies’ business models, which incentivizes feeds to promote inflammatory and harmful content.

What is Section 230 and why is it important?

To address personalized algorithmic amplification, Lindsay suggests changing Section 230 of the Communications Decency Act of 1996.

Section 230 states that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

The law essentially allows social media companies to host user-generated content without consequences. These platforms are not legally responsible for the actions of their users. According to the Electronic Frontier Foundation, a nonprofit dedicated to protecting digital privacy and free speech, Section 230 is “perhaps the most influential law to protect the kind of innovation that has allowed the Internet to thrive since 1996.”

What’s the solution?

During her testimony, Haugen said reforming Section 230 to hold Facebook accountable would cause the social media company to remove engagement-based ranking. Removing this system, Lindsay argues, would address political bias, reduce fringe content, and still allow social media companies to be successful and profitable.

“Social media feeds would be free of the unavoidable biases that AI-based systems often introduce. Any algorithmic ranking of user-generated content could be limited to non-personalized features like ‘most popular’ lists or simply be customized for particular geographies or languages.”
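To make the distinction concrete, here is a toy sketch of the two ranking approaches the quote contrasts. This is a hypothetical illustration with made-up data, not Facebook’s actual system: engagement-based ranking orders posts by a per-user engagement prediction, while a non-personalized “most popular” list uses the same public signal (e.g., like counts) for every user.

```python
# Hypothetical posts: (post_id, like_count, predicted_engagement_for_this_user).
# The engagement score is what a personalization model might predict; it can
# favor fringe content that hooks a particular user despite low broad appeal.
posts = [
    ("a", 500, 0.10),  # broadly popular, low predicted engagement for this user
    ("b", 20, 0.95),   # fringe post the model predicts will hook this user
    ("c", 300, 0.40),
]

def personalized_feed(posts):
    """Engagement-based ranking: order by the per-user engagement prediction."""
    return [pid for pid, _, _ in sorted(posts, key=lambda p: p[2], reverse=True)]

def most_popular_feed(posts):
    """Non-personalized ranking: the same 'most popular' list for every user."""
    return [pid for pid, _, _ in sorted(posts, key=lambda p: p[1], reverse=True)]

print(personalized_feed(posts))  # the fringe post "b" ranks first
print(most_popular_feed(posts))  # "a" leads for everyone
```

Under Lindsay’s proposal, only something like the second function would remain: a single ranking shared by all users (perhaps varied by geography or language), with no per-user engagement prediction in the loop.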

Lindsay admits that reforming Section 230 will be challenging, but argues that the urgency of reform outweighs the difficulty.


Link to original article published 11/2/2021
