Forbes and Fifth

Anti-Democratic Practices in Protection of Free Speech Online in the United States and the European Union: Survey and Analysis

Introduction

Deluges of invasively targeted advertiser content, democracy-challenging misinformation, and vitriolic hate speech fill the social networking platforms that have become a public square for the globalized era. Consumer content moderation is not a conversation merely between the citizen and the state; intermediaries, the companies that host the platforms consumers access and use, exist between the two, uniquely positioned both to shape regulation and to be shaped by it. Meta, Twitter, and Alphabet, the corporations behind the world’s largest social media platforms, enjoyed vastly unchecked freedom to develop over the past twenty years.1 Now, foreign election interference, the rise of alt-right rhetoric, and increasingly narrow options for consumers have aged these companies out of this unregulated playground. As the largest economic bloc in the world, the European Union leverages a unique ability to shape regulation around the world, known as the “Brussels effect.”2 When companies adapt their product to comply with EU policies, consumers in other countries where that product is available are also affected by this regulatory action. In the content moderation sphere, the “Brussels effect” creates both a standard of compliance for companies and an exemplar for countries the world over to follow. America, long the leader in ideological and technological innovation, seems to have ceded any claim to power as the global referee for Big Tech. But is the European model durable? Does the United States’ neoliberal wait-and-see strategy hold water in the face of growing concerns that platforms limit freedom of expression?

This paper will examine American and European regulatory action towards social media platforms from 1990 onwards, and analyze the anti-democratic practices therein with respect to the protection of freedom of speech. The First Amendment will be examined for its establishment of America’s strict and expansive protection of freedom of speech. Section 230 of the Communications Decency Act of 1996 will also be evaluated for its role in establishing intermediary liability protection, both a boon and a blow to free speech online. Platform law and the internal regulatory controls of American corporations will also be discussed as major influences on the global understanding of tech regulation. The American approach to regulatory action will be considered alongside the European Union’s differing perspective on the importance of regulating hate speech and the necessity of government intervention. The E-Commerce Directive will be considered for its equivalencies with and discrepancies from American digital rights protections. The 2014 Court of Justice of the European Union case Google Spain v. AEPD & González and the 2017 German law NetzDG will be examined for their contributing effects of granting private corporations powers of public adjudication.

The internet is one of humanity’s greatest innovations—bridging the limitations of geographical and ideological space to bring people together in expansive, multifaceted networks of communication. Its impacts on the human psyche, inevitable as they are, remain to be seen; only one generation in high-income countries has grown up with ready access to the internet, and access to it grows in low-income countries year after year. Unlike any other advancement in communication, economics, or governance, the internet holds unimpeachable and unmatched power to shape the very structures of governance that aim to regulate it—whether to strengthen or subvert democratic rule.

Consumer Content Moderation in the United States

The First Amendment & Foundational Philosophies of Free Speech

In the United States, the conversation about what speech is free to access often begins and ends with the First Amendment to the Constitution of the United States: “Congress shall make no law [...] abridging the freedom of speech, or of the press.”3 Ratified in 1791 as part of the Bill of Rights, it was part of a wholesale establishment of the basic rights of the American people. Its wording is extremely ambiguous and seems to create an umbrella protection of the right to access, produce, and distribute speech without restriction by the government; it is generally held that this was the aspirational direction the framers hoped to point the new republic towards, with detailed understanding and jurisprudence to develop over time.4

Some 200 years later, the widespread adoption of the internet prompted lawmakers to contend with the Founding Fathers’ expansive vision in the context of a new, untested medium. First Amendment issues in regulation mostly concern the rights that companies themselves hold; online platforms are not bound against abridgment of speech as the federal government is, and can present or restrict user speech as they see fit. Content curation decisions are expressions of their own rights as entities to freely express their company’s values and sentiments. On the consumer level, this means users are not entitled to unrestricted freedom of speech on private platforms.5

While private entities are not explicitly bound by First Amendment restrictions, most social media platforms are American companies staffed with American lawyers: thus, most decisions of corporate governance are made with the First Amendment philosophy of infringing as little as possible on users’ speech.6 There is an uncodified but understood right to free speech—the power structure created by the philosophy of the First Amendment is telegraphed from public governance to private self-regulation, underpinning how these companies operate. As such, the reach of the First Amendment extends far beyond American legal jurisdiction to practically anywhere these platforms operate. Through the same mechanism that gives the governing bodies of the European Union their “Brussels effect,” content moderation grounded in the First Amendment becomes the worldwide standard simply because American companies are the largest and most powerful voices in the space.

Section 230 & the Balance Between In­novation and Consumer Protection

In 1996, the United States Congress passed the Communications Decency Act in response to the unmitigated exposure of minors to pornography on the nascent internet—the first major overhaul of communications law since the 1930s. As part of a larger effort to remove obscenity from the digital space, lawmakers sought to encourage companies to clean up the defamatory, offensive, and illegal speech users produced on their platforms.7 They did so with Section 230, one of the cornerstones of the information ecosystem and the online experience as it is known today.

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.8

Section 230 employs a two-pronged approach as both a “shield” and a “sword.” It shields companies from liability for illegal user content—only the speaker themself bears any legal consequence, not the platform.9 In creating this protection, Section 230 broke the publisher-distributor paradigm that governed the production of traditional media and established online platforms as conduits for content rather than originators in and of themselves.10 Companies also expressly wield the power to actively moderate content on their platforms as they see fit—the “sword” Section 230’s protections afford. Alongside the protection afforded by their “conduit” status, platforms also cannot be found liable for removing user content. This gives platforms the right to take down speech that is not per se illegal but is held by the platform to be inappropriate.11 While broadly applied, Section 230 draws distinct limitations to liability protection; companies are liable for illegal content they developed directly or whose illegal aspect they encouraged users to produce.12

Section 230 is broadly credited with creating the regulatory environment conducive to the innovation and rapid growth in the early internet that came to define Silicon Valley. Its liability shield encouraged platforms hosting user-generated content—now some of the largest and most valuable companies in the world—to expand both their market share and the array of services offered to users, without threat of civilian or state legal action.13 By providing freedom of choice as to what content is hosted and what is taken down in good faith, companies can develop and test moderation policies that are flexible and can respond quickly to rapidly changing public sentiment or legal standards for free speech.14 Additionally, this power of curation curtails over-regulation; under a greater threat of legal action, platforms may choose to err on the side of caution and proactively remove content that could pose any legal risk, regardless of its actual legality.15

In the 25 years since its passage, Section 230 has proven to be vitally important to the information ecosystem and business models of online intermediaries, establishing structures for these companies to actively moderate with impunity from undue legal pressures.

Federal Inaction and the Role of Corpo­rations in Internet Governance

Though its laissez-faire regulatory approach enabled the innovation that positioned America as a thought leader in the internet sector, it has also cost the United States federal government its position as a true force in internet governance. Though popular with both Democrats and Republicans, meaningful internet regulation has lagged amidst debate over concerns of conservative censorship, the potential for antitrust reform, and nearly $100 million spent by major tech firms in lobbying efforts.16 Meta alone spent over $20 million in 2021, inserting itself as a central player in the legislative debate over antitrust, cybersecurity, and offline infrastructure.17

Following the probe into Russian interference in the 2016 election of President Donald Trump, which determined that Facebook’s lax digital advertising policies enabled significant foreign disinformation and propaganda to flood American voters’ timelines, Congressional Republicans have become even more evasive on the topic of regulating content online. Eager to avoid rehashing Special Counsel Robert Mueller’s report and opening a debate into the legitimacy of their own advertising, Republican leadership has whipped their party to generally oppose significant debate in this area.18 Now, after the 2020 election and the use of social media to coordinate and broadcast the January 6th Capitol insurrection, it is likely this pattern will only continue.

In response to a vacuum of external regulation, tech companies have turned inward to regulate themselves with so-called “platform law”: the self-developed and self-imposed regulations by which companies govern their platforms. While not legally binding, these codes look and act like legislation, with broad jurisdiction over every facet of a company’s governance.19 With miniature bureaucratic, judicial, and analytical apparatuses within these companies themselves, the private sector has filled the void of government inaction by designing internal governing structures that mimic the public institutions that have yet to regulate them. These tend to resemble the American three-branch model: companies’ executive C-level positions respond to the legislative recommendations of dedicated policy teams while deferring to jurisprudential experts to evaluate their soundness. As discussed above, the mere fact that these are American companies, with largely American CEOs and high-level leadership, means that liberal ideals of unabridged freedom of speech and restricted interference from governing entities are intrinsic to these quasi-institutions.

As perhaps the most dynamic and attuned regulatory movement active at present, platform law is thus an inextricable and crucial component to analyze alongside federal policies.20 Facebook in particular has a well-developed and powerful body of platform law. The creation of its Oversight Board in 2020 drew the role of internet corporations as structures of governance into public debate. Composed of a panel of distinguished legal scholars, Nobel Prize laureates, and an ex-prime minister, the Oversight Board is a sui generis institution that acts as the final arbiter of any disputed internal or external action on its platform and offers policy recommendations that steer Facebook’s leadership.21 As part of a greater push to increase transparency in the protections of freedom of expression for two billion people around the globe, Facebook has taken the unique step, as a private entity, of enlisting respected public leaders in a role of arguably public leadership—a position that is hotly contested for its infringement on democratic rule of law.

Consumer Content Moderation in the European Union

The E-Commerce Directive & Intermediary Liability Protection

Outside the American model, longstanding questions in content regulation challenge intermediary liability: to what extent should companies be able to claim free speech or other rights to be legally protected from the content their users publish? The European Union contends with this question under the legal framework of the Electronic Commerce Directive of 2000, stylized as the E-Commerce Directive.22 As a directive, it is an instruction to member states to produce an intended outcome, not a centralized executive action in itself: member states are free to enact whatever national policies they see fit to this end.23 In line with the broader pattern through the 1990s of harmonizing the patchwork of network regulation enacted by member states, the E-Commerce Directive sought to integrate the growing field of international commerce conducted over the internet into Europe’s Single Market economic policy. In doing so, it created a legal standard under which online intermediaries are protected from liability for illegal user content if they can prove they acted adequately to remove said material as soon as they were aware of its illegality.24

Where an information society service is provided that consists of the trans­mission in a communication network of information provided by a recipient of the service [...] Member States shall ensure that the service provider is not liable for the information transmitted[...]25

If a Facebook user in Germany publishes a status update expressing hate towards the recent migrants in their community, Facebook has the right to remove that status update, as such hate speech is illegal under German law. It is protected both from any law enforcement action for enabling hate speech and from a lawsuit by the user arguing restriction of their freedom of speech. Implementation of these “safe harbor” protections was left to the discretion of member states themselves, and many chose to do so by forming voluntary agreements with intermediaries to create Codes of Conduct for their platforms. In the decades that followed, this was expressed in many different EU-wide acts of soft law, including the formation of a common code of conduct and various Parliamentary taskforces.26 Much like Section 230, this directive recognized information service providers as conduits for online material, as opposed to publishers of original content themselves. It gives companies the license to write their own rules while shielding them from liability both for user content itself and for any resulting actions to take down that content. These freedoms gave burgeoning social media platforms through the 2000s the flexibility to be created within the EU, and allowed multinational internet companies to expand into the European market without the constant threat of legal pressure.27 An established understanding of intermediary liability protection forms the legal basis for much of the ensuing European regulatory action in the 2010s—lawmakers are restrained in their ability to target companies directly but can instead saddle them with additional duties to moderate content on their platforms.

“The Right to be Forgotten” & Contention Between Public Interest and Individual Rights

Free speech in Europe is grounded in Article 11 of the Charter of Fundamental Rights of the European Union, made legally binding by the 2009 Treaty of Lisbon, which grants all European citizens the right to freedom of expression.28 This differs crucially from freedom of speech in America by empowering the government to intervene in speech when there is a significant public interest in doing so—a broad prescription resulting in a manifestly different understanding of the relationship between government and speech. As such, European jurisprudence in issues of speech tends to be more heavy-handed and public-facing than in America.29

Article 11 protects not only the right to create speech but also the right to access it. Thus, search engines invariably become major players in the free speech debate. Nowhere in case law is this more apparent than in the 2014 case establishing a European citizen’s “right to be forgotten,” and the responsibility of companies to protect said right.30 In 2012, Spanish citizen Mario Costeja González brought a claim against Google Spain that reached the Court of Justice of the European Union. He claimed that the platform’s search return of a 1998 newspaper article detailing a seizure of his assets in a bankruptcy proceeding violated his right to privacy, as the information listed in this search was outdated and irrelevant. The court sided with González and ruled the article must be delisted from Google search results.31

This provision permits the processing of personal data where it is necessary for the purposes of the legitimate interests pursued by the controller or by the third party or parties to whom the data are disclosed, except where such interests are overridden by the interests or fundamen­tal rights and freedoms of the data subject[...]32

This decision overruled the notion that the neutrality of search algorithms was sacrosanct and unalterable; here, the CJEU decreed that a citizen’s interest in delisting irrelevant information about themselves superseded the search engine’s economic interest and the public’s right to access that information.33 It firmly placed individual privacy rights over corporate operations and created an important precedent for the agency of an individual to change how a platform processes third-party information. However, in protecting and enabling individuals, it simultaneously undermines the basic principle of public, free access to information, and places the power of arbitrating what information is and is not deemed “relevant” in the hands of a private actor.34 Google Spain is thus a double-edged sword—delineating the rights of the citizen established in European law yet placing the power for their adjudication outside public institutions, and instead with private companies.

NetzDG & the Rise of Self-Regulation

Pressure on intermediary platforms to effectively moderate content has only grown in recent years. In 2017, the German Parliament passed the Netzwerkdurchsetzungsgesetz, or NetzDG. This law formally decreed that social media companies had a legal obligation to remove content that was illegal under German law.35 NetzDG applies to any for-profit internet platform with over two million users where users share content with other users.36 While implied, this responsibility had yet to be enshrined in hard law. It built off soft law agreed to by the Task Force Against Illegal Online Speech, composed of German free speech advocates, government agencies, and representatives from Facebook, Google, and Twitter.

The procedure shall ensure that the provider of the social network [...] removes or blocks access to content that is manifestly unlawful within 24 hours of receiving the complaint[...]37 

Platforms were now shouldered with the responsibility to identify not only content that violated their company codes of conduct but also that which violated German hate speech and disinformation law. Such “manifestly unlawful” content had to be removed within 24 hours of its posting.38 Moreover, NetzDG added real teeth to enforcement: companies had to comply under threat of heavy sanctions of up to 50 million euros. It is seen as a framework for a potential EU-wide legislative agenda. Immediately upon its passage, it came under fire from international observers and free speech advocates; they argued it abridged consumers’ right to free expression by creating perverse economic incentives for platforms to take down borderline or questionable speech out of an abundance of caution to avoid heavy fines.39 The right of a user to post content that may or may not be legal, and the rights of other users to access it, was now solely adjudicated by a company with a profit motive, rather than a public court of law. NetzDG highlights how the line between protected speech and moderated content becomes ever finer when deciding who draws it.

Democracy in Contention in American & European Regulatory Action

Through the above regulatory framework on both sides of the Atlantic, distinct practices emerge. Both the American and European approaches to regulating online platforms are marked by a sidestepping of the legal argument surrounding government restriction of freedom of expression altogether. Instead, these governing entities rely on platforms to self-regulate with impunity. With the adoption of NetzDG, Europe has taken it a step further by also delegating the power to adjudicate hate speech and disinformation to the private companies hosting these platforms, with full discretionary power. Critics of the combined global effort find fault in the anti-democratic vesting of regulatory power out of the public sphere and into private industry: powers of arbitration in public law, no appropriate means of legal petition, and a corrupted motive to govern in the best interests of the public.40

To fill the void left by lacking American leadership, companies have had to create legal frameworks of their own and rely on internal controls rather than external ones. When these frameworks are entrusted with the power to determine gray-area freedom of speech concerns, it is not apparent that they can be effective without real public involvement.

Often, the EU is lauded for its future-facing leadership and recognition of the real threat hate speech and misinformation can pose to democratic society. Tech regulation is only meaningful if it is enforced, which requires the governing body to have significant global economic heft to leverage an effective threat of excluding these companies from its market.41 China operates outside global digital infrastructure, and America is reluctant to contend with even the potential of regulation.42 No other global power has the economic power to make a credible threat against these companies—thus, by enacting any action whatsoever, Europe comes in first place in a competition with only one player.

The direction Europe has steered the global conversation can easily be seen as anti-democratic. When governments promote self-regulation instead of imposing external controls, social media platforms take on the power of the rule of law itself; the responsibility of social media platforms to arbitrate speech not only for its compliance with codes of conduct but for its legality was codified into hard law in NetzDG.43 A user’s speech may be removed from a platform even if they are able to argue its legality as protected speech in a court of law—the option to do so has been foreclosed to them by the automated methods companies rely on to identify illegal content amid vast quantities of legal content.44 Facebook and Twitter are not democratic institutions, nor do they aim or pretend to be. Consumers do not have any real say in how these companies operate. Voting by choice does not necessarily apply here—in most cases, the multinational market share of these platforms is so large that no meaningful, equivalent alternative exists. If a user disagrees with Instagram’s content regulation policies, there is no alternative platform that encompasses the whole of the network that the user can reach on Instagram for that same specific purpose. Even in Facebook’s mini-judiciary, there is no meaningful power of petition. The average user is not only deterred from action but shut out from any real understanding of how such a petition is addressed. Platform law is broadly opaque, and users are subject to guidelines that are inaccessible, incomprehensible, or both.45

It cannot be ignored that these companies operate with a hardline profit motive. Nothing explicit in their assumed role in civic life obliges these publicly traded companies to owe citizens anything beyond the fiduciary responsibility they owe their shareholders. In 2020, the combined value of Google, Amazon, Facebook, Apple, and Microsoft was over $4.5 trillion.46 When compared to national economies, the five largest tech companies roughly match Japan’s GDP, the 3rd largest economy in the world.47 Moreover, their core business models benefit from more speech and more data. While vague notions of corporate responsibility compound the First Amendment philosophy of truly free speech, at the end of the day more content equals more clicks equals more revenue: a clear motive placed above public good in these companies’ operations. They also rely on predatory data collection to attain advertising revenue—placing certain human rights, such as privacy, at odds with this baseline profit motive.48 There is nothing wrong with providing services for profit; but when such great powers to shape how billions of people employ their rights have been delegated to private entities en masse, it must be questioned whether these are the correct institutions to govern with appropriate respect to the people’s best interests.

With a delegated right to adjudicate public law, no meaningful right of petition, and regulation under an unflinching profit motive, this method of governance is neither sustainable nor appropriate for the pervasiveness with which these platforms affect daily life for American and European citizens.

Conclusion

The United States has long been at the forefront of thought leadership in the tech sector—from the Silicon Valley start-ups in garages to the multinational conglomerates they grew into, America has long trod a path into cyberspace for others to follow. American legislators enjoy a dialogue with these companies that Europe does not; a common ground built upon the freedoms of the First Amendment enabled the construction of structures such as Section 230. Congress deals with these players as domestic constituents—a shared interest in strong American GDP enables a uniquely liberal and laissez-faire regulatory approach, vesting great internal power over moderation of public speech in private players. Especially as the role of social media in American electoral politics develops, this mutual understanding becomes marred by a vector of distrust towards these tech platforms for purportedly undue abridgement of speech. Perhaps the real threat to democracy lies not in San Francisco skyrises but in the warped legal structures formed in their partnership with Capitol Hill.

Europe, meanwhile, has emerged in recent years as a credible check on the unchecked power with which America allows these companies to operate. Contemporary history, with virulent Soviet and Nazi propaganda still in living memory, lends Europe a more pragmatic view of free speech as well as a grave understanding of the danger hate speech can pose. The government is empowered as nowhere else in the world to take necessary action against speech to preserve democracy; yet through a burgeoning legal movement towards internal regulation, it now cedes this power to private entities—perhaps countering the very objective of the right to expression as established in the Charter. Through Google Spain and NetzDG, companies have the right to assign legality and delineate the rights of EU citizens without being democratic institutions themselves or answerable in any real way to the body politic. Is free speech truly free if it lives under the rule of private power?

None of the legislation or legal thought I have discussed here is prima facie good or bad, effective or ineffective. Both America and Europe have taken credible steps in contending with the power of the internet as a force threatening to destabilize centuries of established rights enjoyed by people the world over. The internet is itself a new phenomenon, still in the nascent stages of being understood and of making its impact on the world. None of these issues are ours entirely to solve. The questions I have posed here will be grappled with for generations—but we owe it to future understandings of free speech, and of the very nature of how forces outside government and civics affect human rights, to begin to find their answers.


Bibliography

Article 11 - Freedom of expression and information. (n.d.). European Union Agency for Fundamental Rights. https://fra.europa.eu/en/eu-charter/article/11-freedom-expression-and-information

Berkowitz, E. (2022). Dangerous Ideas: A Brief History of Censorship in the West, from the Ancients to Fake News. Beacon Press.

Brand value of the most valuable brands in the world 2022. (2022, June 24). Statista. https://www.statista.com/statistics/269444/brand-value-of-the-most-valuable-companiesworldwide/

Case C-131/12 Google Spain SL and Google Inc. v Agencia Española de Protección de Datos (AEPD) and Mario Costeja González (2014) Official Journal C212, p. 4-5.

Client Profile: Meta. (2022, April 22). OpenSecrets. https://www.opensecrets.org/federal-lobbying/clients/bills?cycle=2021&id=D000033563

Communications Decency Act of 1996, 47 U.S.C. § 230(c) (1996).

Cornell Law School. (n.d.). Procedural Matters and Freedom of Speech: State Action. Legal Information Institute. https://www.law.cornell.edu/constitution-conan/amendment-1/procedural-matters-and-freedom-of-speech-state-action

‘Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’)’ (2000) Official Journal L178, p. 12.

D’Souza, D. (2019, June 25). Tech Lobby: Internet Giants Spend Record Amounts, Elec­tronics Firms Trim Budgets. Investopedia. https://www.investopedia.com/tech/what-are-tech-giants-lobbying-trump-era/

European External Action Service. (2021, September 28). Freedom of expression. EEAS. https://www.eeas.europa.eu/eeas/freedom-expression_en

Feiner, L., & Graham, M. (2020, October 31). Congress fails to pass Big Tech legislation ahead of election. CNBC. https://www.cnbc.com/2020/10/31/congress-fails-to-pass-big-tech-legislation-ahead-of-election.html

‘Gesetz zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken’ (2017) Federal Law Gazette I, p. 3352.

Global Freedom of Expression | Columbia University. (2022). Google Spain SL v. Agen­cia Española de Protección de Datos - Global Freedom of Expression. Retrieved June 30, 2022, from https://globalfreedomofexpression.columbia.edu/cases/google-spain-sl-v-agencia-espanola-de-proteccion-de-datos-aepd/

Goodman, E. P., & Whittington, R. (2019, August). Section 230 of the Communications Decency Act and the Future of Online Speech. German Marshall Fund of the United States, (20). JSTOR. http://www.jstor.com/stable/resrep21228

Haupt, C. E. (2022, January 22). Regulating Speech Online: Free Speech Values in Constitutional Frames. Washington University Law Review. https://wustllawreview.org/2022/01/22/regulating-speech-online-free-speech-values-in-constitutional-frames/

Kaye, D. (2019). Speech Police: The Global Struggle to Govern the Internet. Columbia Global Reports.

Kelley, J. (2020, December 3). Section 230 is Good, Actually. Electronic Frontier Foundation. https://www.eff.org/deeplinks/2020/12/section-230-good-actually

Klonick, K. (2017, March 20). The New Governors: The People, Rules, and Processes Governing Online Speech. Harvard Law Review, 131, 1598-1670. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2937985

Klonick, K. (2021, February 12). Inside the Making of Facebook’s Supreme Court. The New Yorker. https://www.newyorker.com/tech/annals-of-technology/inside-the-making-of-facebooks-supreme-court

Laskai, L. (2017). ‘Nailing Jello to a Wall’. In J. Golley, L. Jaivin, & L. Tomba (Eds.), Control (pp. 191-208). ANU Press. http://www.jstor.org/stable/j.ctt1sq5tvf.21

Madiega, T. (2020, May). Reform of the EU liability regime for online intermediaries: Background on the forthcoming digital services act. EPRS | European Parliamentary Research Service, 649(404). https://www.europarl.europa.eu/thinktank/en/document/EPRS_IDA(2020)649404

Marolda, G. (2022, May 18). The European Union: History, Institutions, Decision-Making and Policy-Making [Lecture]. University of Pittsburgh.

Overview of the NetzDG Network Enforcement Law. (2017, July 17). Center for Democracy and Technology. https://cdt.org/insights/overview-of-the-netzdg-network-enforcement-law/

Rasure, E. (2022, June 27). Countries by GDP: The Top 25 Economies in the World. Investopedia. https://www.investopedia.com/insights/worlds-top-economies/

Rosenberg, I. (2021). The Fight for Free Speech: Ten Cases That Define Our First Amendment Freedoms. NYU Press.

Savin, A. (2014, February 26). How Europe formulates internet policy. Internet Policy Review, 3(1). https://policyreview.info/articles/analysis/how-europe-formulates-internet-policy

Savin, A. (2020). EU Internet Law (3rd ed.). Edward Elgar Publishing.

U.S. Const. amend. I. (1791).

Zandt, F. (2022, January 25). Chart: Big Tech Goes Big on Lobbying Efforts. Statista. https://www.statista.com/chart/26673/highest-lobbying-spending-in-the-tech-industry-in-the-us/

1 Kaye, D. (2019). Speech Police: The Global Struggle to Govern the Internet. Columbia Global Reports.

2 Marolda, G. (2022, May 18). The European Union: History, Institutions, Decision-Making and Policy-Making [Lecture]. University of Pittsburgh.

3 U.S. Const. amend. I. (1791).

4 Berkowitz, E. (2022). Dangerous Ideas: A Brief History of Censorship in the West, from the Ancients to Fake News. Beacon Press.

5 Cornell Law School. (n.d.). Procedural Matters and Freedom of Speech: State Action. Legal Information Institute.

6 Kaye, D. (2019). Speech Police: The Global Struggle to Govern the Internet. Columbia Global Reports.

7 Goodman, E. P., & Whittington, R. (2019, August). Section 230 of the Communications Decency Act and the Future of Online Speech. German Marshall Fund of the United States, (20). JSTOR.

8 47 U.S.C. § 230

9 Goodman, E. P., & Whittington, R. (2019, August). Section 230 of the Communications Decency Act and the Future of Online Speech. German Marshall Fund of the United States, (20). JSTOR.

10 Kaye, D. (2019). Speech Police: The Global Struggle to Govern the Internet. Columbia Global Reports.

11 Goodman, E. P., & Whittington, R. (2019, August). Section 230 of the Communications Decency Act and the Future of Online Speech. German Marshall Fund of the United States, (20). JSTOR.

12 Ibid.

13 Kelley, J. (2020, December 3). Section 230 is Good, Actually. Electronic Frontier Foundation.

14 Ibid.

15 Ibid.

16 Zandt, F. (2022, January 25). Chart: Big Tech Goes Big on Lobbying Efforts. Statista.

D’Souza, D. (2019, June 25). Tech Lobby: Internet Giants Spend Record Amounts, Electronics Firms Trim Budgets. Investopedia.

17 Client Profile: Meta. (2022, April 22). OpenSecrets.

18 Feiner, L., & Graham, M. (2020, October 31). Congress fails to pass Big Tech legislation ahead of election. CNBC.

19 Kaye, D. (2019). Speech Police: The Global Struggle to Govern the Internet. Columbia Global Reports.

20 Klonick, K. (2017, March 20). The New Governors: The People, Rules, and Processes Governing Online Speech. Harvard Law Review, 131, 1598-1670.

21 Ibid.

22 Madiega, T. (2020, May). Reform of the EU liability regime for online intermediaries: Background on the forthcoming digital services act. EPRS | European Parliamentary Research Service, 649(404).

23 Marolda, G. (2022, May 18). The European Union: History, Institutions, Decision-Making and Policy-Making [Lecture]. University of Pittsburgh.

24 Kaye, D. (2019). Speech Police: The Global Struggle to Govern the Internet. Columbia Global Reports.

25 ‘Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’)’ (2000) Official Journal L178, p. 12.

26 Madiega, T. (2020, May). Reform of the EU liability regime for online intermediaries: Background on the forthcoming digital services act. EPRS | European Parliamentary Research Service, 649(404).

27 Kaye, D. (2019). Speech Police: The Global Struggle to Govern the Internet. Columbia Global Reports.

28 Article 11 - Freedom of expression and information. (n.d.). European Union Agency for Fundamental Rights.

29 Kaye, D. (2019). Speech Police: The Global Struggle to Govern the Internet. Columbia Global Reports.

30 Ibid.

31 Global Freedom of Expression | Columbia University. (2022). Google Spain SL v. Agencia Española de Protección de Datos - Global Freedom of Expression.

32 Case C-131/12 Google Spain SL and Google Inc. v Agencia Española de Protección de Datos (AEPD) and Mario Costeja González (2014) Official Journal C212, p. 4-5.

33 Kaye, D. (2019). Speech Police: The Global Struggle to Govern the Internet. Columbia Global Reports.

34 Ibid.

35 Ibid.

36 Overview of the NetzDG Network Enforcement Law. (2017, July 17). Center for Democracy and Technology.

37 Netzwerkdurchsetzungsgesetz, Art. 1 § 3(2)(2).

38 Kaye, D. (2019). Speech Police: The Global Struggle to Govern the Internet. Columbia Global Reports.

39 Ibid. 

40 Ibid.

41 Savin, A. (2020). EU Internet Law (3rd ed.). Edward Elgar Publishing.

42 Laskai, L. (2017). ‘Nailing Jello to a Wall’. In J. Golley, L. Jaivin, & L. Tomba (Eds.), Control (pp. 191-208). ANU Press.

43 Kaye, D. (2019). Speech Police: The Global Struggle to Govern the Internet. Columbia Global Reports.

44 Ibid.

45 Klonick, K. (2017, March 20). The New Governors: The People, Rules, and Processes Governing Online Speech. Harvard Law Review, 131, 1598-1670.

46 Brand value of the most valuable brands in the world 2022. (2022, June 24). Statista.

47 Rasure, E. (2022, June 27). Countries by GDP: The Top 25 Economies in the World. Investopedia.

48 Kaye, D. (2019). Speech Police: The Global Struggle to Govern the Internet. Columbia Global Reports.

Volume 21, Fall 2022