Aug 6, 2020

Disinformation, Regulation, and the Future of Sec 230

Professor Farid Goes to Washington — Sort Of.

Should big tech companies be reined in? A US News and World Report survey found that roughly 74% of respondents believe technology giants should have their power limited. The flashpoint in the regulation debate is Section 230 of the Communications Decency Act. Enacted in 1996 — long before Facebook, Google, and Twitter — Section 230 shields websites, in part, from liability for content created and shared on their sites by users.

While some things aren’t protected, like child pornography and intellectual property violations, for the most part websites aren’t responsible for posts by users because, in the eyes of the law, the website isn’t a publisher; it’s simply the host. And this, some members of Congress argue, is a problem.

I School Professor Hany Farid thinks Section 230 should be changed to remove the blanket protection from liability these companies currently enjoy. On June 24, 2020, he testified before a joint hearing of the House Subcommittee on Communications and Technology and the Subcommittee on Consumer Protection and Commerce entitled “A Country in Crisis: How Disinformation Online is Dividing the Nation.”

Consumer Protection Subcommittee chair Jan Schakowsky (D-Ill.) opened her remarks by stating that Congress must step in because “the American people are dying and suffering as a result of online disinformation.”

Mike Doyle (D-Pa.), Communications Subcommittee chair, said that the pandemic highlighted “the flood of disinformation online — principally distributed by social media companies — and the dangerous and divisive impact it is having on our nation.”

“We can and we must do better when it comes to contending with the misinformation apocalypse that has emerged over the past few years.”

In his opening statement to the subcommittees, Farid asked: “How, in 20 short years, did we go from the promise of the internet to democratize access to knowledge and make the world more understanding and enlightened, to this litany of daily horrors? Due to a combination of naivete, ideology, willful ignorance, and a mentality of growth at all costs, the titans of tech have simply failed to install proper safeguards on their services.”

“We can and we must do better when it comes to contending with some of the most violent, harmful, dangerous, hateful, and fraudulent content online. We can and we must do better when it comes to contending with the misinformation apocalypse that has emerged over the past few years.”

The hearing was held remotely, with Rep. Doyle presiding over a Webex gallery of congressional members.

I School: You’ve testified in person before Congress in the past. Was this time a little surreal? 

HF: Yes! Usually, it’s a very serious affair — you know, the halls are hushed, and it’s very formal. This time, members of Congress forgot to unmute their mics, some people left them on and there was interference...it was definitely different. 

I School: In the Q&A period (committee members were each allowed 5 minutes to question the experts), one of the points you made repeatedly is that there are a few pressure points that can force change on the tech giants.

HF: Yes. There are three pressure points:

The first is regulatory pressure. Name any industry — food, financial, automotive — and when regulation comes in, those changes result in safer products. I’m old enough to remember when car manufacturers screamed that being required to install airbags would put them out of business… of course, they were wrong. And what happened? Car companies actually started marketing their safety records! That only happened because of regulatory pressure.

Next is customers — the marketplace. Normally, if you’re not happy with a service or product, you can vote with your feet and buy something else. But with social media, we’re not the customers, we’re the product! We’re not paying for these services — I don’t pay a fee to use Facebook. And because of this, we don’t have the power we’d usually have as consumers...it’s not like we can just hop over to MySpace. The nature of the service, and the stranglehold these companies have, makes it very difficult for us as customers.

The last is advertising. We’re seeing this [what it looks like when companies withhold advertising dollars] play out in real time. Disney just announced that they’re the latest of the 1,000 or so companies boycotting Facebook and Instagram ads in a campaign led by Color of Change.¹

This can be incredibly effective, and it has been in the past. Disney, for instance, is one of the largest advertisers on social media. They’ve stopped advertising on YouTube before, when YouTube was incredibly unsafe for young children. They said: we’re out. And they pulled their ads until YouTube changed its policies.


Amplification & Better Business Models 

One of the arguments against changing Section 230 is that removing the protections afforded under the law would stifle innovation and thereby cripple tech companies. The way Farid sees it, regulatory action on the market-dominance side might actually allow a healthier tech ecosystem to thrive.

Chair Schakowsky asked Dr. Farid to explain why the big platforms “allow amplification of conspiracy theories and disinformation, and how this business model benefits them.” 

“Social media,” Farid explained, “is in the engagement and attention business. They profit when we spend more time on the platform and they deliver ads. The companies didn’t set out to fuel disinformation. But that’s what the algorithms learned.” He went on to say that the business model is ‘poison’ and that it’s “fundamentally at odds with our societal and democratic goals.”

I School:  You’re not buying the claim that changes to 230 will kill creativity and stifle innovation?

HF: Google seems to forget that they were the beneficiary of regulation 25 years ago, when Microsoft ruled the earth and Google was a little company trying to break in. The Department of Justice came in and told Microsoft — which was bundling its browser, Internet Explorer, with Windows, making it difficult to install another — to knock it off!² And what happened when Microsoft was forced to make changes? It allowed oxygen into the room so a little search engine called Google could flourish! Now they want to keep out competitors so they can dominate.

I School: Can you elaborate on the amplification of the algorithms that drive social media?

HF: Mark Zuckerberg (CEO of Facebook) would have you believe: ‘Oh, you don’t want me arbitrating what’s true and what’s not.’ But he needs to be reminded that right now, he’s the arbiter of what’s relevant.

When you go to Facebook, your news feed is highly curated. Facebook is acting like an editor, picking what you see out of the volumes of things you could see... and they’re choosing based on what they think will keep you on the platform longer.

YouTube (parent company Google) is no different: 70% of videos watched on the platform are ones promoted by YouTube itself.
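To make that mechanism concrete, here is a minimal sketch, in Python, of the kind of engagement-optimized ranking Farid describes. It is purely illustrative: the post fields and scoring function are hypothetical stand-ins, not any platform’s actual system.

```python
# Illustrative toy only: engagement-optimized feed ranking in the spirit
# Farid describes. The fields and scoring are hypothetical, not any
# platform's real system.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_click_prob: float     # model's guess that you'll click/react
    predicted_watch_seconds: float  # model's guess at how long you'll engage

def engagement_score(post: Post) -> float:
    # A feed optimized purely for attention scores by expected engagement;
    # note there is no term for accuracy, safety, or social harm.
    return post.predicted_click_prob * post.predicted_watch_seconds

def rank_feed(candidates: list[Post]) -> list[Post]:
    # The "editor" role: from everything you *could* see, surface whatever
    # is most likely to keep you on the platform.
    return sorted(candidates, key=engagement_score, reverse=True)
```

The point of the toy is the objective function: nothing in it rewards truth or penalizes harm, so whatever maximizes engagement — accurate or not — rises to the top.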

I School: I was stunned by that figure when you mentioned it in testimony! And then I thought about how I might visit YouTube for a specific video, and then immediately I’m sucked into watching the next five videos…

HF: This isn’t an accident. And you can find some really bad stuff. Recent studies have shown that half of people who joined Facebook groups associated with white nationalists were recommended those groups by Facebook.

“We don’t hold the telephone company responsible when a phone is used to conspire to commit a crime, right...? What happens when Facebook drives you into the group and then keeps showing you things that are more and more radicalized? Are they not somehow culpable?”

I School: Say you’re ‘suggested’ into one of these groups of bad actors by algorithmic amplification; you become heavily involved and eventually take action, go out and shoot someone. Do you think we should hold these companies culpable?

HF: That’s the right question to ask. We don’t hold the telephone company responsible when a phone is used to conspire to commit a crime, right?

But what happens when it’s not a neutral platform? What happens when Facebook drives you into the group and then keeps showing you things that are more and more radicalized? Are they not somehow culpable? Morally, I think they are. But legally they are not, because of Section 230 of the Communications Decency Act.


Moderation & Monstrosity

Rep. Brett Guthrie (R-Ky.) asked Farid if he believed that technology companies possess the technological means to better moderate illicit content on their platforms, “and if they do, why aren’t they using them?”

“I don’t think they have the means,” Farid said. “And they don’t have the means because they haven’t prioritized it.” He pointed to the DMCA (Digital Millennium Copyright Act of 1998) as a counterexample: “Companies got very good at spotting and removing copyright infringement because of the law.”
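The matching Farid alludes to can be sketched simply. Production systems — such as YouTube’s Content ID for copyright, or PhotoDNA (which Farid helped develop) for child sexual abuse imagery — use robust perceptual fingerprints that survive re-encoding and cropping; the exact-hash version below is a deliberate simplification to show the basic screen-on-upload flow.

```python
# A minimal sketch of hash-based upload screening. Real systems (Content ID,
# PhotoDNA) use robust perceptual fingerprints; the exact SHA-256 match here
# is a simplification for illustration.
import hashlib

known_flagged_hashes: set[str] = set()  # fingerprints of previously flagged content

def fingerprint(data: bytes) -> str:
    # Stand-in for a perceptual fingerprint of an image or video.
    return hashlib.sha256(data).hexdigest()

def flag_content(data: bytes) -> None:
    # Called when a reviewer or rights holder identifies bad content.
    known_flagged_hashes.add(fingerprint(data))

def screen_upload(data: bytes) -> bool:
    # Returns True if the upload matches known flagged content and
    # should be blocked before it is distributed.
    return fingerprint(data) in known_flagged_hashes
```

The design point mirrors Farid’s argument: once the DMCA created a legal obligation, this kind of screening became standard for copyright, and the same engineering could be applied to other harms.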

I School: If you had your druthers, what would moderation on these platforms look like?

HF: First of all, these trillion-dollar companies need to stop throwing their hands up and saying: the Internet is big, there’s nothing we can do. You built this monstrosity. You can’t then turn around and say: Oh, I can’t control this.

“These trillion-dollar companies need to stop throwing their hands up and saying, the Internet is big, there’s nothing we can do.”

I’d like to see two things. One, we simply have to change the blanket liability protection found in Section 230... it’s simply too broad. It gave sites like Backpage³, which was knowingly trafficking young children, protection from liability. That’s insane! There are clearly consequences to Section 230 that we did not anticipate.

And the second thing is that I certainly don’t want to hold social media companies responsible for every post and every video that their millions of users upload. That’s not reasonable. But we can hold them responsible for radicalizing people: driving them to extremism, driving them to sexual abuse material, knowingly allowing this material to be uploaded and then distributed through their networks. They should be held liable for that.

My hope is that if you just crack the door open a bit, they’re going to change; it will take just a little movement to get there.


¹ Grassroots organization Color of Change is actively pressuring tech companies to take a stand against white nationalist hate and voter suppression on their platforms.

² In 1998, the Department of Justice sued Microsoft; a judge found the company had violated parts of the Sherman Antitrust Act, and Microsoft and the DOJ eventually settled.

³ Backpage.com was a classifieds site that was shut down by the Justice Department in 2018.
