| Section 230 Reform | Introductory Language | Key Attributes |
|---|---|---|
| Preserving constitutionally protected speech, led by Republican Leaders Cathy McMorris Rodgers (R-WA) and Jim Jordan (R-OH), to remove liability protections for companies that censor constitutionally protected speech on their platforms, require appeals processes, and require transparency for content enforcement decisions. | To amend section 230 of the Communications Act of 1934 to provide that immunity under such section does not apply to certain companies and to require internet platform companies to implement and maintain reasonable and user-friendly appeals processes for decisions about content on the platforms of such companies and to submit quarterly filings to the Federal Trade Commission regarding content enforcement decisions and appeals, and for other purposes. | Introduces a carve out for "constitutionally protected" speech. Threshold for application: companies with $3 billion in revenue or 300 million active users. Requires transparency on content moderation practices, communication on account suspensions, etc. |
| Bad Samaritan carve out, led by Rep. Bob Latta (R-OH), to amend section 230 to remove liability protections from companies that act as Bad Samaritans and knowingly promote, solicit, or facilitate illegal activity. | To amend section 230 of the Communications Act of 1934 to provide for a ‘‘bad Samaritan’’ exclusion from immunity under such section, and for other purposes. | Removes liability if a company is found to "promote, solicit, or facilitate material or activity by another information content provider that such interactive computer service provider knew or had reason to believe would violate Federal criminal law, if knowingly disseminated or engaged in." |
| Chinese Communist Party carve out, led by Rep. Neal Dunn (R-FL), to exclude companies with direct or indirect ties to the Chinese Communist Party from section 230. | To amend section 230 of the Communications Act of 1934 to exclude from the application of such section persons or entities that are owned by the Chinese Communist Party or certain other entities, and for other purposes. | Does what it says here. |
| Nondiscrimination carve out, led by Rep. Dan Crenshaw (R-TX), to remove liability protections from companies that take action against a user on racial, sexual, political affiliation, or ethnic grounds. | To amend section 230 of the Communications Act of 1934 to limit immunity under such section for actions based on racial, sexual, political affiliation, or ethnic grounds, and to preserve access to lawful content and prevent discrimination and unfair methods of competition on the internet, and for other purposes. | Introduces an exception to 230 if a platform over a certain threshold is found to be discriminating. |
| FTC exemption, led by Rep. Fred Upton (R-MI), to amend section 230 to remove liability protections for actions brought against a company by the FTC. | To provide exceptions to immunity under section 230 of the Communications Act of 1934 for the enforcement of Federal civil law, and for other purposes. | Seems to clarify the power of state AGs to enforce federal law (same as the Crenshaw bill below). |
| Cyberbullying carve out, led by Rep. Tim Walberg (R-MI), to amend section 230 to remove liability protections for claims based on cyberbullying. | To amend section 230 of the Communications Act of 1934 to provide that such section has no effect on claims relating to cyberbullying of users under the age of 18, and for other purposes. | "(A) places an individual in reasonable fear of death or serious bodily injury; and causes, attempts to cause, or would be reasonably expected to cause an individual to commit suicide." |
| Doxxing carve out, led by Rep. Jeff Duncan (R-SC), to amend section 230 to remove liability protections for claims based on doxxing. | To amend section 230 of the Communications Act of 1934 to provide that such section has no effect on claims relating to doxxing, and for other purposes. | |
| Terrorism carve out, led by Rep. Gary Palmer (R-AL), to amend section 230 to remove liability protections for claims based on foreign terrorism content. | To amend section 230 of the Communications Act of 1934 to provide that such section has no effect on claims relating to terrorism content, and for other purposes. | Relies on US Dept of State Foreign Terrorist Org designations; targets any content "shared or distributed by entities with direct or indirect ties to countries determined to be state sponsors..." or any "state-affiliated content that promotes terrorism, violent extremism and genocide..." |
| Child exploitation, including pornography carve out, led by Gus Bilirakis (R-FL), to amend section 230 to remove liability protections for claims based on child exploitation, including child pornography. | To amend section 230 of the Communications Act of 1934 to provide that such section has no effect on claims relating to child exploitation, including child pornography, and to require the Comptroller General to submit a report on big tech and law enforcement, and for other purposes. | Focus is on removing liability protections for child exploitation and child pornography; but also directs the GAO Comptroller General to do a study on "How social media companies currently communicate, consult, and coordinate with Federal, State, and local law enforcement to address illegal content and activity online." |
| Counterfeit products carve out, led by Rep. Richard Hudson (R-NC), to amend section 230 to remove liability protections for claims related to counterfeit products. | To amend section 230 of the Communications Act of 1934 to provide that such section has no effect on claims relating to counterfeit products, and for other purposes. | Possibly targeting Amazon primarily? |
| Illegal drugs carve out, led by Rep. David McKinley (R-WV), to amend section 230 to remove liability protections for claims based on the illegal sale of drugs and the sale of illegal drugs. | To amend section 230 of the Communications Act of 1934 to provide that such section has no effect on certain laws relating to controlled substances and drugs, and for other purposes. | "based on advertising or offering to sell, deliver, distribute, dispense, or introduce into interstate commerce a controlled substance" |
| Product liability carve out, led by Rep. Kelly Armstrong (R-ND), to preserve claims relating to product liability, for any instance in which an interactive computer service has physical possession or control of a product at issue. | To amend the Communications Act of 1934 to preserve claims relating to product liability, for any instance in which an interactive computer service has physical possession or control of a product at issue, and for other purposes. | "a product at issue at any time, including during warehousing, handling, distribution, shipment, and fulfillment of the product order." So if Amazon sells something and delivers it, it would not have Section 230 protection for the product description, for instance. |

| Content Moderation Practices to Address Certain Content | Introductory Language | Key Attributes |
|---|---|---|
| Lawful content protection, led by Rep. Dan Crenshaw (R-TX), to prevent companies from blocking or preventing access to lawful content, as well as degrading or impairing access to such content. | To amend section 230 of the Communications Act of 1934 to limit immunity under such section for actions based on racial, sexual, political affiliation, or ethnic grounds, and to preserve access to lawful content and prevent discrimination and unfair methods of competition on the internet, and for other purposes. | Identical introductory language to the nondiscrimination carve out led by Rep. Crenshaw above. |
| Content moderation on child pornography, led by Rep. Morgan Griffith (R-VA), to require companies to implement reasonable content moderation practices to address child pornography. | To require internet platform companies to implement and maintain reasonable content moderation policies and practices to address child pornography on the platforms of such companies, and for other purposes. | Requires clear policies on child pornography: naming a responsible officer, defining preventative measures, employee training, and a requirement to monitor, evaluate, and adjust; companies must submit their content moderation guidelines to the FTC, with an opportunity for public comment. |
| Content moderation on counterfeit and stolen products, led by Rep. Greg Pence (R-IN), to require companies to implement reasonable content moderation practices to address the sale of counterfeit and stolen products. | To require internet platform companies to implement and maintain reasonable content moderation policies and practices to address the sale of counterfeit products, illegal products, and stolen products and materials on the platforms of such companies, and for other purposes. | Similar to above; requires compliance and initial approval, including public comment. |
| Content moderation on illegal drugs, led by Rep. Brett Guthrie (R-KY), to require companies to implement reasonable content moderation practices to address the illegal sale of drugs and the sale of illegal drugs. | To require internet platform companies to implement and maintain reasonable content moderation policies and practices to address the illegal sale of drugs on the platforms of such companies, and for other purposes. | Similar to above; requires compliance and initial approval, including public comment. |
| Content moderation on terrorism, led by Rep. Debbie Lesko (R-AZ), to require companies to implement reasonable content moderation practices to address foreign terrorism content. | To require internet platform companies to implement and maintain reasonable content moderation policies and practices to address terrorism content on the platforms of such companies, and for other purposes. | Similar to above; requires compliance and initial approval, including public comment. |
| Content moderation on child trafficking, led by Rep. Richard Hudson (R-NC), to require companies to implement reasonable content moderation practices to address child trafficking. | To require internet platform companies to implement and maintain reasonable content moderation policies and practices to address trafficking of persons, including children under the age of 18, on the platforms of such companies, and for other purposes. | Similar to above; requires compliance and initial approval, including public comment. |
| Content moderation on cyberbullying, led by Rep. Larry Bucshon (R-IN), to require companies to implement reasonable content moderation practices to address cyberbullying. | To require internet platform companies to implement and maintain reasonable content moderation policies and practices to address cyberbullying on the platforms of such companies, and for other purposes. | Similar to above; requires compliance and initial approval, including public comment. |
| Content moderation on revenge porn, led by Rep. Kinzinger (R-IL), to require companies to implement reasonable content moderation practices to address revenge porn. | To require internet platform companies to implement and maintain reasonable content moderation policies and practices to address revenge porn on the platforms of such companies, and for other purposes. | Defines revenge porn, and defines depiction of "sexually explicit conduct" to include the appearance of "the postpubescent female nipple." Similar to above; requires compliance and initial approval, including public comment. |
| Content moderation on doxxing, led by Rep. Buddy Carter (R-GA), to require companies to implement reasonable content moderation practices to address doxxing. | To require internet platform companies to implement and maintain reasonable content moderation policies and practices to address doxxing on the platforms of such companies, and for other purposes. | Similar to above; requires compliance and initial approval, including public comment. |

| Protecting Children from Mental Health Harms and Cyberbullying | Introductory Language | Key Attributes |
|---|---|---|
| Process for parents to protect their kids, led by Rep. John Joyce (R-PA), to require companies to maintain a user-friendly process for parents to report cyberbullying. | To require internet platform companies to implement and maintain reasonable and user-friendly processes to report cyberbullying, and for other purposes. | Requires a public mechanism for reporting; similar to above; requires compliance and initial approval, including public comment. |
| Mental health impact disclosure, led by Rep. Bill Johnson (R-OH), to require companies to disclose the mental health impact their products and services have on children, and to require an NIH study to review whether warning labels about such risks should be required on such products and services. | To require internet platform companies to submit to the Federal Trade Commission biannual filings regarding the impact such companies’ products or services have on users and to require the Director of the National Institutes of Health to conduct a study related to the mental health impact of social media on children, and for other purposes. | Remit is to file on mental health impacts "for users who are under the age of 13, users who are age 13 or older but under the age of 18, and users who are age 18 or older;" also requires public disclosure of "the results of any studies or research such company has conducted or contracted with third parties to conduct on the role and impact such company’s products or services have on the mental health of users" |
| Consumer education on mental health effects, led by Rep. Neal Dunn (R-FL), to require the NIH and FTC to develop an annual education campaign about the mental health risks of social media. | To require the National Institutes of Health and the Federal Trade Commission to conduct an educational campaign on the mental health risks related to children’s use of social media platforms, and for other purposes. | In consultation with "technology industry representatives, academic researchers, and consumer advocacy groups," the NIH issues a report on the mental health risks of social media for kids... |

| Improving Transparency | Introductory Language | Key Attributes |
|---|---|---|
| Content policies, led by Rep. Billy Long (R-MO), to require companies to disclose how they develop their content moderation policies. | To require internet platform companies to submit to the Federal Trade Commission biannual filings regarding the content management policies of such companies, and for other purposes. | Introduces a filing procedure and a requirement of public disclosure of more detail on content moderation/management policies. |
| Appeals policies, led by Rep. Michael Burgess (R-TX), to require companies to disclose how they develop their appeals processes. | To require internet platform companies to submit to the Federal Trade Commission biannual filings regarding the appeals process of such companies, and for other purposes. | Similar to above but with regard to the appeals process. |
| App store policies, led by Rep. Steve Scalise (R-LA), to require companies to disclose how they develop and implement their app store policies. | To require app store operators to submit to the Federal Trade Commission annual filings regarding the app store conduct policies of such operators, and for other purposes. | Similar to above but with regard to app store management and disclosures. |
| Content enforcement, led by Rep. Latta (R-OH), to require companies to disclose their content enforcement decisions related to child pornography, child trafficking, cyberbullying, illegal sale of drugs, foreign terrorism content, counterfeit products, revenge porn, and doxxing. | To require internet platform companies to submit to the Federal Trade Commission quarterly filings regarding the content enforcement decisions of such companies, and for other purposes. | Requires increased transparency on moderation decisions related to (A) child pornography; (B) child trafficking; (C) cyberbullying; (D) illegal sale of drugs; (E) terrorism content; (F) counterfeit products, illegal products, and stolen products and materials; (G) revenge porn; (H) doxxing. |

| Additional Accountability Bills | Introductory Language | Key Attributes |
|---|---|---|
| Law enforcement study, led by Rep. Gus Bilirakis (R-FL), to direct the GAO to conduct a study on how platforms can better work with law enforcement to address illegal content and crimes on their platforms. | To amend section 230 of the Communications Act of 1934 to provide that such section has no effect on claims relating to child exploitation, including child pornography, and to require the Comptroller General to submit a report on big tech and law enforcement, and for other purposes. | Same bill as the child exploitation carve out led by Rep. Bilirakis above. |
| Consumer education on law enforcement, led by Rep. Markwayne Mullin (R-OK), to require annual education campaigns to inform the public about the resources available to them when their safety and security have been violated online. | To require the Federal Trade Commission to conduct an education campaign to inform the public about the resources available when their safety and security has been violated online, and for other purposes. | The Federal Trade Commission, the Attorney General, and the head of any other appropriate Federal agency shall develop an educational program and related resources to inform the public about the resources available to them when their safety and security have been violated online. |
| Universal Service Fund contributions, led by Rep. Markwayne Mullin (R-OK), to require a study on the feasibility of requiring Big Tech to contribute to the Universal Service Fund. | To require the Federal Communications Commission to conduct a study and submit to Congress a report examining the feasibility of funding the Universal Service Fund through contributions supplied by edge providers, and for other purposes. | ‘‘Funding Affordable Internet with Reliable Contributions Act’’ or the ‘‘FAIR Contributions Act’’. Defines an edge provider as "a provider of online content or services, such as a search engine, a social media platform, a streaming service, an app store, a cloud computing service, or an e-commerce platform." |
| ID verification, led by Rep. John Curtis (R-UT), to require social media companies to verify the identity of users prior to use of their platform. | To require a provider of a social media service to verify the identity of users of the service, and for other purposes. | Requires individuals and organizations to verify their identity. Individuals can seek an exemption, but then the content they see cannot be algorithmically ranked (?) |