The TikTok Paradox: National Security, Digital Sovereignty, and the Forging of U.S. Tech Policy

[Header image: the U.S. Capitol at dusk with a glowing TikTok logo projected onto its dome, a crowd holding smartphones in the foreground, and a stylized dragon emerging from a cloud of data beside the building.]

David’s Note: This article was substantially revised on October 10, 2025 to incorporate new research and provide a more comprehensive analysis.

On January 17, 2025, the U.S. Supreme Court upheld a landmark law that forces TikTok, a platform used by over 170 million Americans, to be sold or face a nationwide ban.1 This decision highlighted a central paradox in modern American policy. TikTok is at once a legislative target, condemned as a grave national security threat, and an indispensable campaign tool, actively leveraged by the political actors who seek to regulate it.

This paper argues that this apparent contradiction is not a sign of policy incoherence. Instead, it reveals an evolving and deliberate strategy to confront a novel threat to the nation’s digital sovereignty. Digital sovereignty is a nation’s ability to control its own digital destiny—the data, hardware, and software it relies upon.3 In this context, it means securing the digital infrastructure and information environment within its borders from the control of a strategic adversary.4

The core of this argument is that the threat posed by TikTok is fundamentally structural. It is rooted in the legal and operational subordination of its parent company, ByteDance, to the government of the People’s Republic of China (PRC). This structural risk is distinct from the commercial data practices of domestic social media companies. It has compelled the U.S. to forge a new national security doctrine for the digital age.

To develop this thesis, this paper will proceed in four parts.

  • Section I will establish that TikTok represents a structural national security threat due to its data collection capabilities under PRC law and its potential for algorithmic manipulation.
  • Section II will trace the evolution of U.S. legal strategy, from the failure of broad executive orders to the crafting of a targeted, constitutionally sound legislative solution.
  • Section III will systematically deconstruct the primary counterarguments against this policy, including those based on the First Amendment, economic disruption, and false equivalencies with U.S. tech firms.
  • Section IV will analyze the political realities that create the central paradox, examining how electoral pragmatism and divided public opinion coexist with the national security consensus.

Ultimately, this analysis will demonstrate that the TikTok dilemma is a landmark case in how a liberal democracy is adapting its legal and political tools to defend its sovereignty in an era of weaponized information.

Section I: TikTok as a Structural National Security Threat

The United States’ policy toward TikTok is often misconstrued as incoherent. A deeper analysis, however, reveals a deliberate strategy aimed at confronting a new type of national security risk: a structural threat vector embedded within the nation’s information ecosystem.

The concern is not merely about the platform’s content or its data collection practices in isolation. The true issue is the non-negotiable legal and operational subordination of its parent company, ByteDance, to the government of the People’s Republic of China (PRC), a strategic adversary. This section establishes the foundational argument that the threat posed by TikTok is fundamentally structural. It is therefore distinct from the risks associated with domestically-owned social media companies.

1.1 The Data Nexus and Sovereign Compulsion

The national security argument centers on the combination of TikTok’s vast data-gathering capabilities and the legal framework governing its parent company. While many social media platforms engage in extensive data collection, the legal obligations imposed on ByteDance by the Chinese government create a unique and unacceptable risk.

TikTok’s application automatically captures a wide array of user information. This includes not only user-generated content but also sensitive metadata such as location data, browsing histories, and even keystroke patterns.6 The platform’s privacy policy discloses, and multiple lawsuits allege, the collection of biometric identifiers, including “faceprints and voiceprints,” which are of high intelligence value.7 While this data collection is often described as “comparable to what other social media companies gather,” that comparison obscures a critical distinction: the legal authority to compel its disclosure.6

The central pillar of the U.S. government’s case is China’s 2017 National Intelligence Law. This statute requires any Chinese organization or citizen to “support, assist and cooperate with the state intelligence work” upon request.6 This legal mandate places ByteDance in a position of unavoidable subservience to the PRC’s intelligence apparatus. Unlike in the United States, where government access to private data requires a legal process with judicial oversight, the Chinese framework provides no such protection.6 This difference transforms TikTok’s data repository from a commercial asset into a potential state intelligence asset.

This is not a theoretical vulnerability. Top U.S. intelligence and law enforcement officials have repeatedly articulated this threat. FBI Director Christopher Wray has explicitly warned that the Chinese government could leverage its legal authority over ByteDance to control the data of millions of American users.1 Officials have raised two primary concerns:

  • The potential for mass data collection to build dossiers on Americans for espionage or foreign influence targeting.1
  • The ability to manipulate the platform’s powerful recommendation algorithm.1

These warnings, echoed by the Federal Communications Commission (FCC), have formed the basis of a rare bipartisan consensus in Congress.1 The American intelligence community has also pointed to specific attempts by the PRC to influence U.S. elections through platforms like TikTok, solidifying the view that the app represents a direct threat to American democratic processes.11

1.2 The Algorithmic Battleground

Beyond data exfiltration, the second part of the structural threat involves TikTok’s potential use as a tool for covert influence and censorship. The platform’s core feature—its highly effective and opaque recommendation algorithm—is seen as a powerful instrument for shaping public perception on a massive scale. The ability of the Chinese Communist Party (CCP) to “drive the app’s recommendation algorithm to ‘manipulate content’” is a primary concern for U.S. national security officials.8 This control could be used to amplify narratives that align with Beijing’s objectives or to suppress viewpoints it deems threatening.

Evidence suggests this is more than a hypothetical risk. In 2019, leaked internal documents from TikTok revealed explicit instructions for moderators to censor videos on topics politically sensitive to the CCP, including the 1989 Tiananmen Square massacre, Tibetan independence, and the Falun Gong spiritual movement.1 More recently, the platform has faced accusations of suppressing content related to the 2019–2020 pro-democracy protests in Hong Kong and was criticized for removing a video about the CCP’s repression of the Uyghur Muslim population in Xinjiang.6 These actions demonstrate a pattern of content moderation that aligns with the political interests of the Chinese state.

The PRC’s direct intervention in the U.S. legislative process further illuminated the relationship between ByteDance and the Chinese state. As Congress considered the divestiture bill, officials from the Chinese embassy in Washington, D.C., engaged in a direct lobbying campaign, meeting with congressional staffers to argue against the legislation.13 Chinese diplomats reportedly argued that a forced sale would harm U.S. investors and that TikTok was being treated unfairly.14

Many U.S. lawmakers saw this direct advocacy as a telling confirmation of the state control they sought to mitigate. As Senator John Cornyn noted, the secret meetings provided “solid evidence” that TikTok is not independent of the CCP.13 This lobbying effort ultimately reinforced the argument that ByteDance’s ownership of TikTok represents an extension of Chinese state power into the American information space.13

1.3 A Pattern of Precedent: Securing the Digital Supply Chain

The U.S. government’s actions against TikTok are not an isolated event. They are part of a broader strategic effort to secure the nation’s Information and Communications Technology and Services (ICTS) supply chain from foreign adversaries. This policy is guided by frameworks like Executive Order 13873, which empowers the Department of Commerce to prohibit ICTS transactions that present an “undue or unacceptable risk” to U.S. national security.15 Prior cases reveal a consistent pattern of targeting structural risks from companies legally beholden to adversarial governments.

The campaign against Chinese telecommunications giant Huawei is the most direct precedent. In 2019, the U.S. government placed Huawei on the Commerce Department’s “Entity List,” severely restricting its business with American companies.18 The rationale was that Huawei’s close ties to the Chinese government created an unacceptable risk that its 5G networking hardware could be used for espionage or to disrupt critical communications.18 The action targeted the company’s fundamental structure, mirroring the logic applied to TikTok.

The government applied similar logic to Kaspersky Lab, a Russian cybersecurity firm. Citing the company’s ties to Russian intelligence, the Department of Homeland Security banned Kaspersky software from all federal systems in 2017. This culminated in a June 2024 Commerce Department decision to prohibit all sales of Kaspersky software in the United States.21 Again, the action was preemptive and structural.

Even U.S.-based companies have faced scrutiny. Zoom, the video conferencing platform, came under examination when researchers discovered that encryption keys for some meetings were being routed through servers in China, where Zoom has significant development operations.23 Later, it was revealed that Zoom had complied with Chinese government demands to shut down meetings of U.S.-based human rights activists commemorating the Tiananmen Square massacre. This case illustrated how operational exposure to an authoritarian legal regime can compel even an American company to act as an agent of foreign censorship.

These precedents demonstrate a clear trajectory in U.S. policy. The definition of “critical infrastructure” is expanding from physical hardware (Huawei) to cybersecurity software (Kaspersky) and now to the dominant application-layer platforms where public discourse is shaped (TikTok). This evolution suggests a new doctrine of digital sovereignty, where control over the primary channels of public information is an integral component of national security. This emerging doctrine provides the direct impetus for the legislative evolution detailed in the following section.

Section II: Legislative Evolution: From Executive Fiat to Surgical Law

The U.S. government’s path to regulating TikTok reveals a clear evolution in legal strategy. The Trump administration’s initial, heavy-handed attempts to ban the app were swiftly rebuffed by the judiciary. This failure, however, prompted a strategic pivot. The result was the Protecting Americans from Foreign Adversary Controlled Applications Act (PAFACA)—a law meticulously engineered to withstand the constitutional challenges that had doomed its predecessors. This section traces that evolution.

2.1 The Failure of Executive Fiat

The first major attempt to neutralize the TikTok threat came in August 2020, when President Donald Trump issued executive orders aimed at the platform. Invoking his authority under the International Emergency Economic Powers Act (IEEPA), he sought to prohibit all transactions with ByteDance, which would have amounted to a de facto ban.1

These actions met with immediate and successful legal challenges. Federal district courts quickly issued preliminary injunctions halting the orders.25 The courts’ reasoning rested on the limits written into the IEEPA statute itself. Plaintiffs successfully argued that the President’s orders exceeded his authority because they violated IEEPA’s “informational materials” exception, which expressly withholds the power to regulate the flow of information.25 The courts concluded that an app for sharing user-generated videos falls squarely within that exception.

The Biden administration recognized the legal futility of these orders and formally withdrew them in June 2021.25 This move, however, did not signal an abandonment of the underlying security concerns. While the IEEPA cases were dropped, a separate legal challenge to a divestment order was “held in abeyance”.26 This legal pause allowed the administration and TikTok to enter into more than a year of intense negotiations.

The proposed solution, dubbed “Project Texas,” was a complex plan to create a U.S. subsidiary, U.S. Data Security (USDS), to manage all U.S. user data on Oracle’s cloud infrastructure, with board members approved by the U.S. government. Ultimately, these negotiations collapsed. The executive branch concluded that the safeguards were insufficient because certain U.S. user data would still flow to China, ByteDance would retain control, and the U.S. government lacked sufficient visibility to monitor compliance.27 With both executive fiat and private negotiation having failed, the stage was set for a new approach: targeted legislation.

2.2 Crafting PAFACA: A Targeted Legislative Solution

Learning from the judicial defeats of the IEEPA orders, Congress crafted a new legal instrument designed to be constitutionally sound. The result, the Protecting Americans from Foreign Adversary Controlled Applications Act (PAFACA), represents a masterclass in legislative engineering.

The most critical design feature of PAFACA is its strategic shift from regulating content to regulating commercial conduct. The law’s text does not mention speech. Instead, it makes it unlawful for entities like mobile app stores and internet hosting services to “distribute, maintain, or update” a “foreign adversary controlled application”.28 This focus on the commercial conduct of intermediary companies moved the legal basis from shaky emergency powers to Congress’s much firmer authority to regulate commerce.

Crucially, the law was structured not as an absolute prohibition but as a conditional one. It provides a clear “off-ramp”: the prohibitions are nullified if the company executes a “qualified divestiture” that results in the application no longer being controlled by a foreign adversary. This provision allowed proponents to frame the legislation not as a “ban,” but as a structural regulation. The bill’s initial 180-day deadline was later extended to 270 days, with a presidential option for a 90-day extension, providing a more realistic timeframe for a potential sale.29

To avoid accusations of arbitrary enforcement, PAFACA provides specific and narrow definitions.

  • “Foreign adversary” is explicitly limited to China, Russia, Iran, and North Korea.29
  • “Control by a foreign adversary” is defined by a clear threshold: an entity is under such control if a person or entity from a designated adversary nation directly or indirectly owns at least a 20 percent stake.29

Finally, the bill’s core provisions were incorporated into a larger, must-pass foreign aid package, a shrewd political maneuver that ensured its swift passage and enactment into law on April 24, 2024.1 This process illustrates a dynamic dialogue between the branches of government, where judicial constraint compelled the legislative branch to devise a more precise and constitutionally durable instrument. It established a powerful new template for U.S. technology policy and a key tool in its exercise of digital sovereignty.

Section III: Deconstructing the Counterarguments

The passage of PAFACA has ignited a fierce debate, pitting national security against free expression and economic interests. Opponents have mounted powerful counterarguments, contending that the law is an unconstitutional infringement on speech, a devastating economic blow, and a hypocritical policy. However, a thorough examination reveals these claims are largely based on misinterpretations of the law and false equivalencies between different categories of risk.

3.1 The First Amendment: Speech vs. Structure

The most formidable challenge to PAFACA has been on First Amendment grounds. This argument posits that the law is an unprecedented government ban on an entire communications platform, violating the rights of more than 170 million Americans. Civil liberties organizations, including the ACLU, have argued that to justify such a sweeping restraint on speech, the government must demonstrate a “serious, imminent harm to national security,” a bar they contend has not been met.33

This powerful argument, however, was ultimately rejected by the U.S. Supreme Court. In its unanimous opinion in TikTok Inc. v. Garland, the Court upheld the constitutionality of PAFACA. The Court’s reasoning hinged on a critical distinction: regulating speech versus regulating corporate structure. It determined that PAFACA is “facially content neutral” because its prohibitions are triggered not by the content on the platform, but by its control by a foreign adversary.26

Because the law was not aimed at the message but at the ownership, the Court applied “intermediate scrutiny.” This standard requires the law to further an important government interest without burdening substantially more speech than necessary.26 The Court found that preventing a foreign adversary from collecting sensitive personal data of millions of Americans was an important government interest.26

The law was deemed sufficiently tailored precisely because of its divestiture provision. The Court emphasized that PAFACA does not mandate a shutdown; it offers a choice. A new, non-adversary owner “could circulate the same mix of content as before without running afoul of the Act”.32 This “off-ramp” was dispositive. It legally recast the law as a regulation of foreign ownership that only incidentally burdens speech, rather than a direct and impermissible regulation of expression.26 The ruling thus affirmed that national security concerns related to foreign control can justify structural regulations, a key tenet of digital sovereignty.

3.2 Economic Disruption: A False Dichotomy

A second line of opposition focuses on the profound economic harm a “ban” on TikTok would inflict upon its ecosystem of creators and small businesses. This argument leverages compelling data to paint a picture of widespread financial devastation.

According to a 2023 study from Oxford Economics, the platform had a significant economic impact:

  • Contributed $24.2 billion to the U.S. GDP.
  • Supported at least 224,000 American jobs.
  • Generated $14.7 billion in revenue for small- and medium-sized businesses.36

Many argue that the platform’s unique algorithmic reach cannot be easily replicated, meaning a shutdown would be financially catastrophic.36

While the economic data is compelling, the argument rests on a flawed premise: that PAFACA mandates a ban. The legislation does not present a choice between national security and economic prosperity; it presents a path to achieve both. The law’s primary goal is to force a sale, not to shutter the platform. The divestiture provision was designed to preserve the economic ecosystem while surgically removing the national security risk.

The operative choice, therefore, belongs not to the U.S. government but to TikTok’s parent company, ByteDance. The law creates a clear binary: divest control and continue operations, or refuse to divest and trigger the prohibitions. By framing the issue as a “ban,” opponents misplace the agency for any potential economic disruption. If ByteDance refuses a sale, it is that corporate decision—not the U.S. law—that would cause the feared economic harm. As such, the economic arguments serve less as a case against PAFACA and more as a compelling reason for ByteDance to comply with the law.38

3.3 The “Whataboutism” Fallacy: Commercial vs. State Risk

The third major counterargument is a form of “whataboutism” that equates the risks posed by TikTok with those of U.S.-based tech companies. This critique asserts that singling out TikTok is hypocritical when domestic platforms like Meta and Google also engage in mass data collection and have been exploited for foreign disinformation campaigns.2 Proponents of this view argue that the real issue is the unregulated data broker market, through which any foreign intelligence service can simply purchase vast quantities of Americans’ personal data.39

This argument, while appealing, fundamentally conflates two distinct categories of risk: commercial exploitation and state-level espionage. While U.S. tech companies engage in aggressive data collection for commercial purposes, they operate within U.S. jurisdiction. Any government attempt to access the data they hold is constrained by constitutional protections and requires due process.6

ByteDance, in stark contrast, is subject to the jurisdiction of the PRC and its 2017 National Intelligence Law. This law creates a legal obligation of sovereign compulsion, requiring the company to cooperate with state intelligence services upon demand, with no recourse to an independent judiciary.6 The threat is not one of commercial surveillance for targeted advertising. It is the threat of state-directed espionage, blackmail, and influence operations conducted by a geopolitical adversary with legally mandated access to sensitive data on American citizens.6 The risk differential is not a matter of degree but of kind. This justifies the tailored legislative response required to assert digital sovereignty.

Section IV: The Political Reality: Pragmatism and Public Perception

While legal and national security arguments have been debated in courtrooms and classified briefings, TikTok’s fate is also being shaped by public opinion and the realities of modern political campaigning. A profound paradox has emerged: the same government that legislated against TikTok as a threat is populated by political actors who leverage it as an indispensable tool for electoral success. This disconnect is not hypocrisy but a rational response to a fractured media environment.

4.1 The Campaigner’s Dilemma

In a move that starkly illustrated the paradox, President Joe Biden’s 2024 reelection campaign launched an official account, @BidenHQ, in February 2024. The campaign’s rationale was pragmatic: in an “evolving, fragmented, and increasingly personalized media environment,” it was essential to “continue meeting voters where they are”. With over 170 million American users, heavily skewed toward younger demographics, TikTok represented a vital communication channel.40 Following President Biden’s withdrawal from the race, the account was rebranded to @kamalahq for Vice President Kamala Harris, who also launched her own personal account, quickly amassing millions of followers.11

This decision immediately drew criticism for sending a “mixed message”. Senator Mark Warner, the Democratic Chair of the Senate Intelligence Committee, expressed concern about the contradictory signals. The White House and the campaign stressed that the official policy—the ban on TikTok on all federal government devices—remained firmly in place. Campaign officials stated they were taking “advanced safety precautions,” such as using dedicated cell phones for all TikTok activity.

This political pragmatism is bipartisan. Former President Donald Trump, who initiated the first attempt to ban the app, also joined TikTok for his 2024 campaign, recognizing its immense power.11 This shared behavior demonstrates a consensus among political strategists that the electoral cost of ceding such a vast audience is higher than the abstract national security risk.

4.2 A Nation Divided: Public Opinion

The strategic calculations of political campaigns reflect a complex and divided public. While a strong consensus about the TikTok threat exists within the national security establishment, the American public is far more ambivalent, with views fractured along partisan, generational, and user-status lines.

Polling data from the Pew Research Center reveals a clear trend of declining public support for a government ban on TikTok. In March 2023, 50% of U.S. adults supported a ban. By early 2025, that support had fallen to just 34%.43 During the same period, opposition to a ban grew from 22% to 32%.44

The divides become even starker when broken down by demographics. Republicans are consistently more supportive of a ban than Democrats.43 The most significant cleavage, however, is generational. Adults under 30 are the only age group where opposition to a ban consistently outweighs support.44 This aligns with usage patterns, as TikTok users themselves are overwhelmingly opposed to a ban (61% oppose vs. 10% support), while a plurality of non-users favor one (42% support vs. 15% oppose).45

These differing views are rooted in different perceptions of the issue. Supporters of a ban cite data security (83%) and Chinese ownership (75%) as major reasons. Opponents prioritize the restriction of free speech (74%) and a lack of evidence of a threat (61%).43

Survey Date | All Adults | Republicans | Democrats | Ages 18-29 | Ages 65+ | TikTok Users | Non-Users
March 2023 | 50% / 22% | 60% / 15% | 43% / 28% | 29% / 46% | 71% / 8% | 19% / 56% | 60% / 11%
Fall 2023 | 38% / 27% | 50% / 22% | 29% / 33% | 29% / 41% | 49% / 19% | 16% / 56% | 47% / 16%
Summer 2024 | 32% / 28% | 42% / 22% | 24% / 33% | n/a | n/a | 10% / 61% | 42% / 15%
Feb 2025 | 34% / 32% | 39% / 28% | 30% / 37% | n/a | n/a | 12% / 66% | 45% / 19%
Each cell shows the share who support / oppose a ban.
Data compiled from multiple Pew Research Center surveys.43 Note: Not all surveys included breakdowns for all age groups.

4.3 The Influencer-Politician: A Microcosm

The experience of Representative Jeff Jackson, a Democrat from North Carolina, perfectly encapsulated this balancing act. Jackson had become a political phenomenon on TikTok, amassing over two million followers by providing candid explanations of complex political events.46

This relationship shattered when Jackson, along with a large bipartisan majority, voted in favor of PAFACA. The backlash was immediate. He was branded a “hypocrite” and a “sellout,” and his comment sections were flooded with accusations of betrayal.47 In the days following the vote, he lost over 200,000 followers.47

Forced to respond, Jackson posted a video of contrition and justification. He began with a direct apology:

“I did not handle this situation well from top to bottom, and that is why I have been completely roasted on this app”.46

He apologized for his communication strategy but not for his vote. To justify his decision, he alluded to his responsibilities as a lawmaker, stating he had been part of “some briefings about this app that were genuinely alarming”—a clear reference to classified national security information.46 At the same time, he tried to downplay the law’s impact, arguing that he believed the chance of an actual ban was “practically zero”.46

Representative Jackson’s predicament serves as a microcosm of the entire TikTok paradox. He was caught between his duty as a member of Congress privy to sensitive threat intelligence and his persona as an “influencer-politician” accountable to a digital constituency deeply skeptical of that intelligence. This disconnect makes coherent and publicly supported governance on technology and security issues exceptionally difficult.

Conclusion: Forging a Doctrine of Digital Sovereignty

The United States’ policy toward TikTok is not a portrait of incoherence. It is the visible manifestation of a nation grappling with the imperatives of a new era, one in which the battlefield for geopolitical influence has expanded decisively into the digital realm. The seemingly contradictory actions—legislating against the platform while using it as a campaign tool—are the predictable frictions that arise when a liberal democracy confronts an asymmetric threat that weaponizes the very principles of openness it is sworn to protect. In this context, weaponized information refers to the deliberate use of information—whether true, false, or a mix—to manipulate perceptions, sow discord, and achieve strategic goals by exploiting digital platforms.

The journey from the failed IEEPA executive orders to the constitutionally upheld PAFACA represents a landmark case in the development of a 21st-century doctrine of digital sovereignty. It demonstrates a government learning in real-time. PAFACA’s focus on corporate structure rather than content, and its provision of a divestiture remedy, allowed it to surgically target the core national security risk—control by a foreign adversary—while respecting constitutional boundaries. This approach signifies a crucial doctrinal shift: the recognition that a dominant, foreign-controlled information ecosystem can itself be considered critical national infrastructure.

The political paradoxes are a rational adaptation to a fractured media environment where the cost of digital absence is perceived as greater than the abstract risk of engagement. This tension highlights a growing gap between the threat assessments of the national security elite and the perceptions of a public, particularly younger generations, who view the platform primarily as a space for culture, community, and commerce.

Looking Forward: A Two-Pronged Defense for Digital Sovereignty

While targeted interventions like PAFACA are a necessary scalpel for addressing acute threats, they are not a comprehensive solution. The “whataboutism” counterargument, though flawed, correctly identifies a broader, systemic vulnerability: the lack of a robust federal data privacy framework in the United States. The vast quantities of sensitive data collected by all technology companies create a rich and easily accessible marketplace for intelligence gathering that extends far beyond any single application.

Therefore, the most effective long-term defense for America’s emerging digital sovereignty is a multi-layered one. It must combine structural interventions like PAFACA with a foundational legislative effort to secure the entire ecosystem. The passage of a comprehensive federal data privacy law, centered on principles of data minimization, would be the single most impactful step. By strictly limiting the amount of personal data any company can collect, such a law would drastically shrink the available attack surface for all actors. A strong privacy law would not replace the need for tools like PAFACA, but it would complement them, creating a resilient and defensible digital infrastructure for the 21st century. The TikTok paradox has forced a necessary confrontation with these challenges; the ultimate coherence of U.S. policy will be judged by whether it seizes this moment to build a lasting and comprehensive defense.

Works Cited

  1. Associated Press. “A timeline of the TikTok ban.” AP News, January 18, 2025. https://apnews.com/article/tiktok-ban-biden-timeline-india-119969bfc584e92d47baa189a3e1c4fc
  2. Akoto, William. “National Security and the TikTok Ban.” American University School of International Service, January 23, 2025. https://www.american.edu/sis/news/20250123-national-security-and-the-tik-tok-ban.cfm
  3. World Economic Forum. “What is digital sovereignty and how are countries approaching it?” weforum.org, January 2025. https://www.weforum.org/stories/2025/01/europe-digital-sovereignty/
  4. Stefanini Group. “What is Digital Sovereignty and Why Does it Matter for Your Business?” stefanini.com, 2024. https://stefanini.com/en/insights/news/what-is-digital-sovereignty-why-does-it-matter-for-your-business
  5. Apizee. “What is Digital Sovereignty and Why Is It Important?” apizee.com, 2024. https://www.apizee.com/digital-sovereignty.php
  6. Kiely, Eugene. “TikTok and U.S. National Security.” FactCheck.org, February 3, 2025. https://www.factcheck.org/2025/02/tiktok-and-u-s-national-security/
  7. Center for Internet Security. “Why TikTok is the Latest Security Threat.” cisecurity.org, October 27, 2020. https://www.cisecurity.org/insights/blog/why-tiktok-is-the-latest-security-threat
  8. Center for Internet Security. “TikTok Influence Ops, Data Practices Threaten U.S. Security.” cisecurity.org, March 21, 2023. https://www.cisecurity.org/insights/blog/tiktok-influence-ops-data-practices-threaten-us-security
  9. Lewis, James Andrew. “TikTok and National Security.” Center for Strategic and International Studies, March 13, 2024. https://www.csis.org/analysis/tiktok-and-national-security
  10. Associated Press. “Biden campaign joins TikTok, but administration still warns of app’s national security concerns.” PBS NewsHour, February 12, 2024. https://www.pbs.org/newshour/politics/biden-campaign-joins-tiktok-but-administration-still-warns-of-apps-national-security-concerns
  11. German Marshall Fund. “TikTok Tactics: 2024 U.S. Candidates Dance Around Security Risks.” gmfus.org, September 2024. https://www.gmfus.org/news/tiktok-tactics-2024-us-candidates-dance-around-security-risks
  12. The Guardian. “Biden campaign decision to join TikTok raises national security concerns.” theguardian.com, February 13, 2024. https://www.theguardian.com/technology/2024/feb/13/joe-biden-tiktok-campaign-national-security-social-media
  13. Mause, Ben T.N. “Senators Say China’s TikTok Lobbying Efforts Could Backfire.” Notus, April 19, 2024. https://www.notus.org/senate/senate-tiktok-bill
  14. Creitz, Charles. “Chinese Embassy defends TikTok against potential forced sale in meeting with congressional staffers: report.” Fox News, April 18, 2024. https://www.foxnews.com/politics/chinese-embassy-defends-tiktok-potential-forced-sale-meeting-congressional-staffers-report
  15. Congressional Research Service. “Securing the Information and Communications Technology and Services (ICTS) Supply Chain.” congress.gov, July 26, 2023. https://www.congress.gov/crs_external_products/IF/HTML/IF11760.web.html
  16. U.S. Department of Energy. “Securing the Information and Communications Technology and Services Supply Chain (EO 13873).” energy.gov, 2023. https://www.energy.gov/ceser/securing-information-and-communications-technology-and-services-supply-chain-eo-13873
  17. Covington & Burling LLP. “Department of Commerce Releases Final Rule on Securing the Information and Communications Technology and Services Supply Chain.” cov.com, June 21, 2023. https://www.cov.com/en/news-and-insights/insights/2023/06/department-of-commerce-releases-final-rule-on-securing-the-information-and-communications-technology-and-services-supply-chain
  18. Congressional Research Service. “U.S. Government Actions Against Huawei: A Timeline.” congress.gov, January 29, 2021. https://www.congress.gov/crs-product/R46693
  19. Brown, C. Scott. “The HUAWEI ban explained: A complete timeline and everything you need to know.” Android Authority, May 27, 2025. https://www.androidauthority.com/huawei-google-android-ban-988382/
  20. Wikipedia. “Huawei.” en.wikipedia.org, 2025. https://en.wikipedia.org/wiki/Huawei
  21. The Cyber Express. “US Banning Kaspersky in a Decisive Fight for Cybersecurity.” thecyberexpress.com, June 21, 2024. https://thecyberexpress.com/us-banning-kaspersky-fight-for-cybersecurity/
  22. Olcott, Jake, and Pedro Umbelino. “The Aftermath of the Kaspersky Ban.” BitSight, December 18, 2024. https://www.bitsight.com/blog/aftermath-kaspersky-ban
  23. Groll, Elias. “Zoom executives knew about key elements of plan to censor Chinese activists.” CyberScoop, May 17, 2023. https://cyberscoop.com/zoom-china-doj-eric-yuan/
  24. Marczak, Bill, et al. “Move Fast & Roll Your Own Crypto: A Quick Look at the Confidentiality of Zoom Meetings.” The Citizen Lab, April 3, 2020. https://citizenlab.ca/2020/04/move-fast-roll-your-own-crypto-a-quick-look-at-the-confidentiality-of-zoom-meetings/
  25. Congressional Research Service. “TikTok and U.S. National Security Law: The International Emergency Economic Powers Act (IEEPA).” congress.gov, September 28, 2023. https://www.congress.gov/crs-product/LSB10940
  26. Congressional Research Service. “TikTok Inc. v. Garland: Supreme Court Rejects Challenge to TikTok Divestiture Law.” congress.gov, January 22, 2025. https://www.congress.gov/crs-product/LSB11261
  27. U.S. Supreme Court. “Joint Appendix Volume I, TikTok Inc. v. Garland.” supremecourt.gov, December 27, 2024. https://www.supremecourt.gov/DocketPDF/24/24-656/336140/20241227160916329_24-656%20and%2024-657%20JA%20volume%20I.pdf
  28. Santa Clara Law Digital Commons. “H. R. 815.” digitalcommons.law.scu.edu, 2024. https://digitalcommons.law.scu.edu/cgi/viewcontent.cgi?article=3894&context=historical
  29. U.S. Congress. “Text – H.R.7521 – 118th Congress (2023-2024): Protecting Americans from Foreign Adversary Controlled Applications Act.” congress.gov, March 13, 2024. https://www.congress.gov/bill/118th-congress/house-bill/7521/text
  30. Wikipedia. “Protecting Americans from Foreign Adversary Controlled Applications Act.” en.wikipedia.org, 2025. https://en.wikipedia.org/wiki/Protecting_Americans_from_Foreign_Adversary_Controlled_Applications_Act
  31. House Committee on Energy and Commerce. “FACT CHECK: The Truth About H.R. 7521, the Protecting Americans from Foreign Adversary Controlled Applications Act.” energycommerce.house.gov, March 2024. https://energycommerce.house.gov/HR7521
  32. EveryCRSReport.com. “The Protecting Americans from Foreign Adversary Controlled Applications Act (PAFACAA): A Legal Overview.” everycrsreport.com, May 1, 2024. https://www.everycrsreport.com/reports/LSB11252.html
  33. American Civil Liberties Union. “Banning TikTok is Unconstitutional. The Supreme Court Must Step In.” aclu.org, January 16, 2025. https://www.aclu.org/news/national-security/banning-tiktok-is-unconstitutional-the-supreme-court-must-step-in
  34. American Civil Liberties Union. “TikTok Inc., et al. v. Garland (Amicus).” aclu.org, January 17, 2025. https://www.aclu.org/cases/tiktok-inc-et-al-v-garland-amicus
  35. Pacific Legal Foundation. “SCOTUS Scoop: TikTok, Texas, and President Trump’s Executive Orders.” pacificlegal.org, January 26, 2025. https://pacificlegal.org/scotus-scoop-tiktok-texas-and-president-trumps-executive-orders/
  36. Office of Representative Robert Garcia. “TikTok creators say House bill’s passage threatens lives and livelihoods.” robertgarcia.house.gov, March 13, 2024. https://robertgarcia.house.gov/media/in-the-news/washington-post-tiktok-creators-say-house-bills-passage-threatens-lives-and
  37. Global Trade Magazine. “The Economic Impact of a TikTok Ban in the U.S.” globaltrademag.com, May 20, 2024. https://www.globaltrademag.com/the-economic-impact-of-a-tiktok-ban-in-the-u-s/
  38. Moolenaar, John. “TikTok can still save itself. Here’s how.” selectcommitteeontheccp.house.gov, January 19, 2025. https://selectcommitteeontheccp.house.gov/media/editorial/tiktok-can-still-save-itself-heres-how
  39. Britannica. “The Great TikTok Debate.” britannica.com, 2025. https://www.britannica.com/procon/TikTok-debate
  40. Pew Research Center. “8 facts about Americans and TikTok.” pewresearch.org, December 20, 2024. https://www.pewresearch.org/short-reads/2024/12/20/8-facts-about-americans-and-tiktok/
  41. Dean, Brian. “TikTok Users, Stats, Data & Trends (2025).” Backlinko, September 12, 2025. https://backlinko.com/tiktok-users
  42. Voice of America. “Despite Security Concerns, TikTok Still Plays Key Role in 2024 Race.” voanews.com, August 12, 2024. https://www.voanews.com/a/despite-security-concerns-tiktok-still-plays-key-role-2024-race-/7731015.html
  43. McClain, Colleen. “Fewer Americans now support TikTok ban, see the platform as a national security threat than in spring 2023.” Pew Research Center, March 25, 2025. https://www.pewresearch.org/short-reads/2025/03/25/fewer-americans-now-support-tiktok-ban-see-the-platform-as-a-national-security-threat-than-in-spring-2023/
  44. McClain, Colleen. “A declining share of adults, and few teens, support a U.S. TikTok ban.” Pew Research Center, December 11, 2023. https://www.pewresearch.org/short-reads/2023/12/11/a-declining-share-of-adults-and-few-teens-support-a-us-tiktok-ban/
  45. McClain, Colleen, and Wyatt Dawson. “Support for a U.S. TikTok ban continues to decline, and half of adults doubt it will happen.” Pew Research Center, September 5, 2024. https://www.pewresearch.org/short-reads/2024/09/05/support-for-a-us-tiktok-ban-continues-to-decline-and-half-of-adults-doubt-it-will-happen/
  46. Merlan, Anna. “A congressman went on TikTok to apologize for voting to ban TikTok.” Quartz, March 18, 2024. https://qz.com/jeff-jackson-congressman-apologizes-for-tiktok-ban-1851345288
  47. Mendez, Moises. “A Congressman Lost 200,000 Followers After He Voted for a TikTok Ban.” Time Magazine, March 19, 2024. https://time.com/6958140/tiktok-ban-jeff-jackson-vote-apology/
  48. Harrison, Steve. “After being ‘roasted,’ Jeff Jackson apologizes for how he handled TikTok ban vote.” WFAE 90.7, March 17, 2024. https://www.wfae.org/politics/2024-03-17/after-being-roasted-jeff-jackson-apologizes-for-how-he-handled-tiktok-ban-vote