Schatz, Thune Reintroduce Legislation To Strengthen Rules, Transparency For Online Content Moderation, Hold Internet Companies Accountable
Bipartisan Internet PACT Act Requires Social Media Companies To Establish Clear Content Moderation Policies, Holds Them Accountable For Not Protecting Consumers
WASHINGTON – U.S. Senators Brian Schatz (D-Hawai‘i) and John Thune (R-S.D.) today reintroduced the Internet Platform Accountability and Consumer Transparency (Internet PACT) Act, bipartisan legislation that updates the Communications Act of 1934 by requiring social media companies to establish clear content moderation policies and by holding them accountable for content that violates their own policies or is illegal. The bill is cosponsored by U.S. Senators Tammy Baldwin (D-Wis.), John Barrasso (R-Wyo.), Ben Ray Luján (D-N.M.), Bill Cassidy (R-La.), John Hickenlooper (D-Colo.), and Shelley Moore Capito (R-W.Va.).
“Online companies need to establish clear content moderation policies, actually follow those policies, and respond to consumers when they raise concerns about their implementation,” said Senator Schatz. “By requiring these simple things, our bipartisan bill will better protect consumers and hold companies more accountable.”
“This bipartisan legislation is a common-sense approach to preserving user-generated content and free speech on the internet and holding Big Tech accountable by providing much-needed transparency to online consumers,” said Senator Thune. “In order to keep up with America’s ever-expanding digital landscape, and all of the consumers who depend on it, it’s important to pursue policies – like the Internet PACT Act – that protect online consumers by giving them more control of their online experience.”
There is widespread bipartisan agreement that social media platforms have inconsistent and opaque content moderation practices due to a lack of accountability. To address this, the Schatz-Thune Internet PACT Act creates more transparency by:
- Requiring online platforms to explain their content moderation practices in an acceptable use policy that is easily accessible to consumers;
- Implementing biannual reporting requirements for online platforms that include disaggregated statistics on content that has been removed, demonetized, or deprioritized; and
- Promoting open collaboration and sharing of industry best practices and guidelines through a National Institute of Standards and Technology-led voluntary framework.
The Internet PACT Act holds platforms accountable by:
- Requiring large online platforms to provide due process protections to consumers through a defined complaint system that processes reports, notifies users of moderation decisions within twenty-one days, and allows consumers to appeal those decisions;
- Amending Section 230 to require that large online platforms remove court-determined illegal content and activity within four days; and
- Allowing smaller online platforms more flexibility in responding to user complaints, removing illegal content, and acting on illegal activity, based on their size and capacity.
The Internet PACT Act protects consumers by:
- Exempting the enforcement of federal civil laws from Section 230 so that online platforms cannot use it as a defense when federal regulators, like the Department of Justice or Federal Trade Commission (FTC), pursue civil actions online;
- Allowing state attorneys general to enforce federal civil laws against online platforms; and
- Requiring the Government Accountability Office to study and report on the viability of an FTC-administered whistleblower program for employees or contractors of online platforms.
###