After a nearly eight-hour hearing last week in which they weighed four bills to limit liability protections for large Internet platforms under Section 230 of the Communications Decency Act, members of the House Subcommittee on Consumer Protection and Commerce are set to take up seven more bills on Thursday that would regulate Internet platforms on consumer-protection grounds.
The bills up for debate on Thursday would impose an array of transparency requirements on Internet advertisers, force websites and apps to disclose any ownership ties to China, subject algorithms to scrutiny aimed at preventing discrimination or bias, and tighten regulation of content geared toward children.
Thursday’s session will be the second in a series of hearings framed as “holding big tech accountable.” In the first hearing last week, lawmakers and witnesses—including Facebook whistleblower Frances Haugen—voiced support for revising Section 230, though they disagreed over the breadth and type of potential reforms. The subcommittee considered bills that would limit platforms’ liability protection, on civil-rights or consumer-protection grounds, when their algorithms amplify certain types of content.
In an opening statement, Committee Chairman Rep. Frank Pallone (D-NJ) insisted the goal is not to do away with Section 230 liability protections, but to narrow them.
“To be clear, Section 230 is critically important to promoting a vibrant and free internet, but I agree with those who suggest the courts have allowed it to stray too far,” said Pallone. “These targeted proposals for reform are intended to balance the benefits of vibrant, free expression online while ensuring that platforms cannot hide behind Section 230 when their business practices meaningfully contribute to real harm.”
The subcommittee’s ranking member, Rep. Bob Latta (R-OH), echoed Pallone in arguing there should be limits to liability protection under Section 230, but expressed skepticism of some proposed reforms, warning they “could lead to unintended consequences like curtailing free speech or innovation.”
Three of the four bills up for discussion focused on removing protection in circumstances involving violations of civil rights laws or the promotion of terrorism. The SAFE TECH Act, in addition to addressing civil rights concerns, would also remove Section 230 protections for paid advertisements and for other actions that have the effect of violating antitrust, harassment, or international human rights laws.
The last bill up for consideration was the Justice Against Malicious Algorithms Act, which drew particularly sharp skepticism. The bill, introduced in October by Pallone, would take away protections for large Internet companies whose algorithms make personalized recommendations that contribute to “physical or severe emotional injury to any person.” Republicans immediately raised questions about the term “emotional injury,” warning it is too vague. Rep. Dan Crenshaw (R-TX) said the term is “undefined and open to interpretation.”
Witnesses expressed strong support for Section 230 reforms but conceded that changing the law would not be a panacea for every societal problem critics have blamed on the tech industry.
Among the witnesses were Matthew Wood, vice president of policy and general counsel for Free Press Action, and Mary Anne Franks, professor at the University of Miami and president of the Cyber Civil Rights Initiative.
“The changes proposed to Section 230’s text in the bills subject to this legislative hearing—as well as in many other bills and even more numerous scholarly proposals—could not solve all of these problems attributed to social media nor other societal harms more generally,” said Wood.
Franks concurred, noting that Section 230 reform would not address abuses such as nonconsensual pornography, sexual extortion, doxing, or deepfakes. For those, Franks urged lawmakers to enact federal criminal legislation.
Subcommittee lawmakers will likely hear from a new set of witnesses on Thursday when they discuss bills Pallone said would focus on consumer protections to increase platforms’ accountability to the public.