TikTok and Free Speech: TikTok’s First Amendment Case Against the Government

by Erin Gray, Associate Member, University of Cincinnati Law Review Vol. 93

I. Introduction

On January 19, 2025, TikTok, a social media platform used by 170 million Americans, may cease to exist within the United States.[1] In April 2024, Congress passed the Protecting Americans from Foreign Adversary Controlled Applications Act (“Act”), which could ban TikTok from operating within the United States.[2] TikTok, however, has challenged the constitutionality of the Act. TikTok and several TikTok content creators have sued the Attorney General of the United States in the United States Court of Appeals for the District of Columbia Circuit (“Court of Appeals”), alleging that the Act violates their First Amendment rights.[3]

This article explores TikTok’s case against the Government and how the Court of Appeals may rule on the First Amendment arguments raised by TikTok. Part II examines case law the Court of Appeals may consider. Part III discusses the Act and the parties’ arguments and argues how the Court of Appeals should decide the case. Lastly, Part IV summarizes those conclusions and considers the future implications of the Court of Appeals’ ruling.

II. Background

A. First Amendment

The First Amendment provides that “Congress shall make no law . . . abridging the freedom of speech.”[4] In Lamont v. Postmaster General, the Supreme Court first recognized that recipients of speech have a First Amendment right to receive it.[5] In an effort to control Communist propaganda, Congress had passed a law requiring postal officials to detain such mail and to deliver it only if the addressee affirmatively requested it.[6] The Court held that this was an unconstitutional abridgment of First Amendment rights because it chilled speech by placing an affirmative burden on recipients to obtain their mail.[7] Additionally, in Agency for International Development v. Alliance for Open Society International, the Court held that First Amendment protections do not extend to foreign organizations and foreign citizens outside the United States.[8]

Furthermore, in United States v. O’Brien, the Court explained how to analyze First Amendment challenges to laws that regulate conduct containing both speech and non-speech elements.[9] The Court stated that “[w]hen ‘speech’ and ‘nonspeech’ elements are combined in the same course of conduct, a sufficiently important governmental interest in regulating the nonspeech element can justify incidental limitations on First Amendment freedoms.”[10] The Court articulated the test to be applied: “a government regulation is sufficiently justified if it is within the constitutional power of the Government; if it furthers an important or substantial governmental interest; if the governmental interest is unrelated to the suppression of free expression; and if the incidental restriction on alleged First Amendment freedoms is no greater than is essential to the furtherance of that interest.”[11] Applying this test, the Court upheld a law that prohibited the destruction of Selective Service registration certificates.[12] The Court found that the law did not violate the defendant’s First Amendment rights when he burned his draft card in an antiwar protest.[13] First, the Court found that, on its face, the law did not prohibit expressive conduct.[14] Second, the government had a substantial interest in ensuring that draft cards remained available, and no alternative means would have achieved that objective as effectively.[15]

In Reed v. Town of Gilbert, the Court articulated when a law is considered content-based and how that distinction factors into First Amendment scrutiny.[16] The Court explained that a court must first examine whether the law distinguishes on its face among subject matters or viewpoints.[17] If the law is facially neutral, the court must then examine whether the purpose or justification for the law is content-based.[18] If either the law itself or its purpose and justification is content-based, the law must pass strict scrutiny.[19]

In McCullen v. Coakley, the Court described what strict scrutiny entails.[20] To pass strict scrutiny, a law must further a compelling interest and be narrowly tailored to further that interest.[21] To be narrowly tailored, the law must use “the least restrictive means of achieving a compelling interest.”[22] Ultimately, the Court concluded that a law barring pedestrians from standing within thirty-five feet of abortion clinic entrances was not narrowly tailored to the compelling interest in ensuring unobstructed access to healthcare because the state could have pursued less restrictive alternatives before excluding speakers from those buffer zones.[23]

As the internet has grown, the Court has confronted novel First Amendment issues. In Moody v. NetChoice, the Court held that American social media companies that moderate content engage in protected First Amendment activity.[24] The Court stated, “[d]eciding on the third-party speech that will be included in or excluded from a compilation—and then organizing and presenting the included items—is expressive activity of its own. And that activity results in a distinctive expressive product.”[25] Two trade associations representing American social media platforms had challenged, as unconstitutional under the First Amendment, two state laws limiting the platforms’ ability to engage in content moderation.[26] Although the Court remanded the case on other grounds, it emphasized that content moderation is a protected First Amendment activity.[27]

B. Foreign Affairs and National Security

In Bank Markazi v. Peterson, the Court acknowledged that, in foreign affairs, Congress and the President have broad power to conduct policy as they deem necessary and proper.[28] Similarly, in Ziglar v. Abbasi, the Court stated that “[n]ational-security policy is the prerogative of the Congress and President.”[29] And “[c]ourts have shown deference to what the Executive Branch ‘has determined . . . is essential to national security.’”[30] In Holder v. Humanitarian Law Project, the Court again cautioned courts against second-guessing the political branches’ judgments on matters of national security.[31] The Court emphasized that when confronting an evolving threat, the political branches may rely “on informed judgment rather than concrete evidence” when taking “preventative measure[s].”[32] The Court has also emphasized that national security is a compelling governmental interest.[33] In Haig v. Agee, the Court stated that “[i]t is obvious and unarguable that no governmental interest is more compelling than the security of the Nation.”[34]

III. Discussion

The Act prohibits third parties from distributing, maintaining, or updating Foreign Adversary Controlled Applications (“FACAs”) in the United States.[35] Under the Act, any application “operated, directly or indirectly,” by “ByteDance, Ltd.” or “TikTok” is automatically deemed a FACA.[36] The Act further provides that any successors or subsidiaries of those companies are likewise deemed FACAs.[37] Applications other than TikTok can also be designated as FACAs.[38] This category includes any application operated by a “covered company” that is “controlled by a foreign adversary” and is “determined by the President to present a significant threat to the national security of the United States” following an administrative process.[39] The Act does provide a pathway through which a FACA can continue to operate within the United States: a “qualified divestiture.”[40] The President would determine, through an interagency process, that the application is no longer controlled by a foreign adversary and that there is no “operational relationship” between the application and any “formerly affiliated entities controlled by a foreign adversary.”[41]

TikTok challenges the constitutionality of the Act, arguing that the company engages in protected First Amendment activities that the Act unlawfully restricts.[42] TikTok alleges that the Act is content-based because it draws content-based distinctions.[43] As such, TikTok argues, the Act is subject to strict scrutiny.[44] TikTok contends that the Act fails strict scrutiny because it is not narrowly tailored to a compelling governmental interest, given that the Government could have accepted an agreement with the company that addressed its national security concerns.[45]

The Government responds that the Act is the narrowest option available to fully address the threat TikTok poses to national security.[46] Additionally, the Government contends that the Act does not target protected expressive activity, but rather the conduct of a foreign company.[47] As a result, the Government insists that the First Amendment does not apply.[48] Even if the First Amendment did apply, the Government argues, the Act passes strict scrutiny because courts must defer to the judgments of Congress and the President regarding national security.[49]

A. TikTok Does Not Have First Amendment Protections

The threshold question is whether TikTok engages in protected First Amendment activities. At the outset, this case and NetChoice appear quite similar. Here, TikTok engages in content moderation.[50] TikTok controls content through its community guidelines, which prohibit content such as nudity, hateful ideologies, and harmful behavior.[51] The social media companies in NetChoice likewise engaged in content moderation, which the Court deemed protected under the First Amendment.[52] While it may appear that TikTok’s content moderation is therefore an expressive product protected by the First Amendment, that conclusion does not hold. TikTok’s parent company, ByteDance, is a foreign company that operates through multiple global subsidiaries.[53] As a result, TikTok is controlled by foreign organizations, and the content moderation TikTok engages in is the speech of a foreign organization. Under Agency for International Development, foreign-controlled organizations do not have First Amendment protections in the United States.[54] Therefore, TikTok has no First Amendment protection for its content moderation. In NetChoice, by contrast, the social media companies were American companies.[55] Thus, while the First Amendment protected content moderation in NetChoice, it does not protect TikTok’s content moderation here.[56] As a result, the Act, which explicitly targets the foreign-controlled aspect of TikTok, regulates a non-expressive activity.

B. American Users of TikTok Have First Amendment Protections

Next, the Court of Appeals must address whether American users of TikTok have any First Amendment interests at stake. TikTok has 170 million American users who are recipients of TikTok’s content moderation.[57] In Lamont, the Court recognized that recipients of speech have a protected First Amendment interest in receiving it.[58] Like the recipients of physical mail in Lamont, American users receive TikTok’s content moderation through the application.[59] Just as the Court did in Lamont, the Court of Appeals should recognize that American TikTok users have a First Amendment interest in receiving TikTok’s content moderation.[60] Thus, the Act does burden the First Amendment rights of American TikTok users.

C. The Act Fails the O’Brien Test

The Court of Appeals must then examine whether the Act passes the O’Brien test, because the Act targets both speech (American users’ First Amendment interests) and non-speech (TikTok’s foreign content moderation) elements. Under the first prong of the O’Brien test, the Act must fall within the constitutional powers of the Government.[61] Here, the Act concerns foreign policy and national security, matters the Court has long recognized as within the domain of the Executive and Legislative branches.[62] Therefore, the Act falls within the constitutional powers of Congress and the President. Under the second prong, the Act must further an important or substantial governmental interest.[63] Here, the Act furthers the nation’s security because it ensures that sensitive American data is protected from access by foreign adversaries.[64] The Court has recognized that it is “obvious and unarguable” that no governmental interest is more compelling than national security.[65] Thus, the Act furthers, at a minimum, a substantial governmental interest.

Under the third prong, the governmental interest must be unrelated to the suppression of free expression.[66] First, the Court of Appeals must examine the face of the Act for any content-based restrictions.[67] On its face, the Act is not content-based because it does not target a particular subject matter or viewpoint.[68] Instead, the Act targets the non-expressive product of foreign adversary controlled applications.[69] Under Town of Gilbert, however, the Court of Appeals must also examine whether the purpose or justification of the Act is content-based.[70] The Government cited a March 23, 2023, hearing before the House Energy and Commerce Committee that discussed Congress’s concerns about TikTok.[71] The opening statement summarized the dangers of TikTok as follows:

National security experts are sounding the alarm, warning that the Chinese Communist government could require TikTok to compromise device security, maliciously access American user data, promote pro-Communist propaganda, and undermine American interests. Disinformation campaigns could be launched by the Chinese Communist government through TikTok, which has already become rife with misinformation and disinformation, illegal activities, and hate speech.[72]

While Congress is concerned about national security and safeguarding American data, it is also apparent that Congress is concerned about the type of speech in which TikTok users are engaging. Because Congress singled out disfavored speech in its justification for the Act, the Act is content-based.[73] As a result, the Act fails the third prong of the O’Brien test, and the fourth prong need not be analyzed. Instead, the Act must be examined under strict scrutiny.

D. The Act Passes Strict Scrutiny

Because the Act is content-based, the Court of Appeals must assess it under strict scrutiny. To pass strict scrutiny, a law must further a compelling interest and be narrowly tailored to further that interest.[74] As discussed above, the Court has recognized that national security is a compelling interest.[75] The Act therefore satisfies the first requirement of strict scrutiny. Next, the Court of Appeals must analyze whether the Act is narrowly tailored to further the Government’s compelling interest.[76]

Petitioners allege that the Government could have accepted a national security agreement that would insulate TikTok from Chinese influence.[77] Under the agreement, American user data would be stored with an American cloud-computing company to ensure its security.[78] Additionally, a third party would monitor TikTok’s content moderation practices to ensure that there is no foreign manipulation of content.[79] The Government, however, demanded divestment because it determined that the national security agreement was an inadequate solution.[80] The Government noted that although the agreement claimed to insulate the application from Chinese influence, TikTok itself maintained that divestment would be infeasible because of its reliance on ByteDance.[81] Specifically, by TikTok’s own admission, “a new owner of TikTok in the United States would at a minimum require a data-sharing agreement with ByteDance.”[82] The Government thus concluded that divestment was the only option to address the security concerns created by TikTok’s intertwinement with the Chinese-owned ByteDance.[83]

The Court has previously cautioned judges against second-guessing the conclusions of Congress and the President on matters of national security.[84] Specifically, the Court has explained that, to confront evolving national security threats, Congress and the President may rely “on informed judgment rather than concrete evidence” when taking “preventative measure[s].”[85] Here, Congress and the President determined that divestment was the only way to protect American data and national security.[86] The Court of Appeals should not second-guess that conclusion and should instead defer to the political branches. In particular, it should accept that the agreement offered by TikTok was not a viable option to protect national security. Given that deference, the Court of Appeals should find that the Act is the least speech-restrictive means of furthering the Government’s interest, and therefore that the Act passes strict scrutiny.

IV. Conclusion

The Court of Appeals may well find that the Act passes strict scrutiny and is therefore permissible under the First Amendment. Unless TikTok divests from ByteDance, the application may no longer operate in the United States after January 19, 2025. Such a decision would be a historic free speech ruling. As new threats to national security develop, other foreign-based applications may also be subject to bans to protect American interests.


[1] Brief of Petitioner-Appellant at 1, TikTok v. Garland, No. 24-1113 (D.C. Cir. June 20, 2024).

[2] Id.

[3] Id.

[4]  U.S. Const. amend. I.

[5] Lamont v. Postmaster General, 381 U.S. 301, 306 (1965).

[6] Id. at 302.

[7] Id. at 306-07.

[8] Agency for Int’l Dev. v. All. for Open Soc’y Int’l, Inc., 591 U.S. 430, 436 (2020).

[9] United States v. O’Brien, 391 U.S. 367, 376 (1968).

[10] Id.

[11] Id. at 377.

[12] Id. at 369.

[13] Id.

[14] Id. at 382.

[15] Id.

[16] Reed v. Town of Gilbert, 576 U.S. 155, 159 (2015) (finding that an aesthetic interest was not compelling or narrowly tailored to support a content-based restriction).

[17] Id. at 165-67.

[18] Id.

[19] Id.; see United States v. Eichman, 496 U.S. 310, 317 (1990) (holding that “[a]lthough the Flag Protection Act contains no explicit content-based limitation on the scope of prohibited conduct, it is nevertheless clear that the Government’s asserted interest is ‘related “to the suppression of free expression.”’”).

[20] McCullen v. Coakley, 573 U.S. 464, 478 (2014).

[21] Id.

[22] Id.

[23] Id. at 496.

[24] Moody v. NetChoice, 144 S.Ct. 2383, 2402 (2024).

[25] Id.

[26] Id. at 2388.

[27] Id.

[28] Bank Markazi v. Peterson, 578 U.S. 212, 325 (2016).

[29] Ziglar v. Abbasi, 582 U.S. 120, 142 (2017); See U.S. Const., Art. I, §8; Art. II, §§1, 2.

[30] Ziglar, 582 U.S. at 142 (quoting Winter v. Nat. Res. Def. Council, Inc., 555 U.S. 7, 25-26 (2008)).

[31] Holder v. Humanitarian L. Project, 561 U.S. 1, 34-35 (2010).

[32] Id.

[33] Haig v. Agee, 453 U.S. 280, 307 (1981).

[34] Id.

[35] Protecting Americans from Foreign Adversary Controlled Applications Act, Pub. L. No. 118-50, div. H, 138 Stat. 955, § 2(a)(1) (2024).

[36] Id. § 2(g)(3).

[37] Id.

[38] Id. § 2(g)(3)(B).

[39] Id.

[40] Id. § 2(g)(6).

[41] Id.

[42] Brief of Petitioner-Appellant, supra note 1, at 29.

[43] Id. at 33.

[44] Id. at 47.

[45] Id. at 54.

[46] Brief of Respondent-Appellee at 57, TikTok v. Garland, No. 24-1113 (D.C. Cir. July 26, 2024).

[47] Id. at 59.

[48] Id.

[49] Id. at 65.

[50] Brief of Petitioner-Appellant, supra note 1, at 8.

[51] Id.

[52] Moody v. NetChoice, 144 S.Ct. 2383, 2402 (2024).

[53] Brief of Petitioner-Appellant, supra note 1, at 7.

[54] Agency for Int’l Dev. v. All. for Open Soc’y Int’l, Inc., 591 U.S. 430, 436 (2020).

[55] NetChoice, 144 S.Ct. at 2402.

[56] Id.

[57] Brief of Petitioner-Appellant, supra note 1, at 1.

[58] Lamont v. Postmaster General, 381 U.S. 301, 306 (1965).

[59] Id.

[60] Id.

[61] United States v. O’Brien, 391 U.S. 367, 377 (1968).

[62] See Bank Markazi v. Peterson, 578 U.S. 212, 325 (2016).

[63] O’Brien, 391 U.S. at 377.

[64] Protecting Americans from Foreign Adversary Controlled Applications Act, Pub. L. No. 118-50, div. H, 138 Stat. 955, 956 (2024).

[65] See Haig v. Agee, 453 U.S. 280, 307 (1981).

[66] O’Brien, 391 U.S. at 377.

[67] See Reed v. Town of Gilbert, 576 U.S. 155, 165-67 (2015).

[68] Protecting Americans from Foreign Adversary Controlled Applications Act of 2024, §§ 1, 2, 3.

[69] Id.

[70] Reed, 576 U.S. at 165-67.

[71] Brief of Respondent-Appellee, supra note 46, at 57.

[72] TikTok: How Congress Can Safeguard American Data Privacy and Protect Children from Online Harms: Hearing Before the H. Comm. on Energy and Commerce, 118th Cong. 8 (2023) (opening statement of Rep. Frank Pallone, Jr.).

[73] See United States v. Eichman, 496 U.S. 310, 317 (1990) (holding that “[a]lthough the Flag Protection Act contains no explicit content-based limitation on the scope of prohibited conduct, it is nevertheless clear that the Government’s asserted interest is ‘related “to the suppression of free expression.”’”).

[74] Reed, 576 U.S. at 165-67.

[75] See Haig v. Agee, 453 U.S. 280, 307 (1981).

[76] See Reed, 576 U.S. at 165-67.

[77] Brief of Petitioner-Appellant, supra note 1, at 15.

[78] Id.

[79] Id. at 16.

[80] Brief of Respondent-Appellee, supra note 46, at 50.

[81] Id.

[82] Brief of Petitioner-Appellant, supra note 1, at 23.

[83] Brief of Respondent-Appellee, supra note 46, at 50.

[84]  Holder v. Humanitarian L. Project, 561 U.S. 1, 34-35 (2010).

[85] Id.

[86] Brief of Respondent-Appellee, supra note 46, at 50.


Cover Photo by Oliver Bergeron on Unsplash.

Authors

  • Erin Gray is a 2L at the University of Cincinnati College of Law and an Associate Member of the Law Review. Before starting law school, Erin received a degree in International Studies and Liberal Arts with minors in Spanish and History from the University of Cincinnati. She is primarily interested in pursuing litigation work.
