
On December 5, 2025, in Brussels, European regulators delivered a landmark verdict on Elon Musk’s social platform. The European Commission announced a €120 million ($140 million) penalty against X, formerly Twitter, accusing the company of turning its signature blue checkmark into a tool for deception and of breaking key transparency rules under the European Union’s new Digital Services Act (DSA).
Once a badge of authenticity, the blue symbol had become, in the Commission’s view, a gateway for impersonation and fraud affecting hundreds of millions of users in the EU. The decision marked the first formal non-compliance ruling under the DSA, setting a precedent for how large online platforms will be policed across the bloc.
The Checkmark Reversed

Before Musk’s takeover, Twitter’s blue check was reserved for celebrities, officials, journalists, and other notable figures who passed stringent, manual identity checks. The European Commission’s investigation found that this system helped the public “distinguish genuine notable account holders from impostors and parodies,” making the mark a practical shorthand for trust.
That changed after Musk acquired Twitter in October 2022 and later rebranded it as X in July 2023. The company began selling “verified” status via an $8-per-month subscription, without what regulators describe as “meaningful verification of who is behind the account.” Paying, not proving identity, became the path to the blue symbol.
According to the Commission, this shift meant a scammer could purchase credibility faster than a victim or legitimate account could flag abuse. What had been earned validation, anchored in verification, became a paid feature that, in the EU’s view, misled people about who was real and who was not.
Three Breaches, One Pattern

In its December 5 ruling, the Commission detailed three distinct infringements. Of the €120 million total, €45 million was linked to what it labeled a deceptive verification system. Another €35 million addressed failures in advertising transparency. The remaining €40 million penalized X for obstructing independent researchers’ access to public data.
Taken together, regulators said these problems formed a pattern. Rather than seeing them as isolated glitches, the Commission concluded they revealed a systemic refusal to comply with EU obligations designed to protect users from scams, disinformation, and other online harms. The breaches, officials argued, made it harder for regulators, civil society groups, and academic researchers to identify risks and hold the platform to account.
Under the DSA, major platforms must be clear about who is paying for ads, how targeting works, and how they manage systemic risks such as misinformation and manipulation. In the X case, the Commission found gaps in each of these areas.
Ad Transparency and Research Access

Behind X’s public interface sits an advertising repository meant to show which messages are being promoted, who paid for them, and which users they are aimed at. This type of database is central to the DSA’s transparency goals, giving watchdogs and researchers the tools to trace scams, track misleading promotions, and monitor influence campaigns.
The Commission found that X’s implementation fell short. Officials said the company “imposes design features and access barriers,” including long delays in processing researchers’ requests. More significantly, they concluded that the ads library “lacks critical information, such as content and the legal entity paying for advertisements,” undermining its usefulness for oversight.
The third violation focused on how X treats independent research on its public data. According to the ruling, “X does not allow researchers to access its public data independently” and instead relies on internal processes that create “unnecessary barriers.” For academics, nonprofits, and civil society organizations studying how algorithms spread misinformation, target young users, or shape elections in the EU, these barriers amounted to being shut out of a key arena of public scrutiny.
Comparison With TikTok and Political Fallout

The same day it ruled on X, the Commission reached a different outcome with TikTok. The video platform, owned by China-based ByteDance, moved quickly to settle its DSA case, committing to provide ad repositories, keep them updated within 24 hours, and disclose targeting criteria in line with the law’s transparency requirements. In public comments, EU officials highlighted that they had “secured TikTok’s commitment to enhance ad transparency,” underscoring the contrast with X’s stance.
In Washington, the decision sparked immediate political reaction. Vice President JD Vance criticized the move, saying the EU should support free expression rather than penalize American companies. Secretary of State Marco Rubio went further, calling the fine “an attack on all American tech platforms and the American people” and linking it to broader concerns about overseas restrictions on U.S.-based online speech. Musk replied “Absolutely” on X, aligning himself with the view that the enforcement action amounted to censorship.
The confrontation escalated when Musk barred the European Commission from advertising on X. The move came after EU executive vice-president Henna Virkkunen stated that “deceiving users with blue checkmarks, obscuring information on ads, and shutting out researchers have no place online in the EU.” By blocking the regulator from spending public money on the platform, Musk turned a regulatory clash into an open standoff.
What Comes Next for X and the DSA

Under the Commission’s order, X now faces strict timelines. The company has 60 working days to demonstrate how it will correct what regulators describe as blue-check fraud, or risk “periodic penalty payments.” It has 90 days to submit and begin implementing an action plan on advertising transparency and researcher access.
The DSA allows for far larger penalties in serious or repeated cases, up to 6 percent of a company’s global annual revenue. For X, depending on how turnover is calculated, that ceiling could exceed $500 million. X can appeal the decision in EU courts or propose detailed remedial steps; for now, the platform has not set out a comprehensive public compliance strategy.
Beyond X, the ruling is a test of whether the DSA can become a global benchmark for regulating large online platforms. European officials argue that, just as other sectors accept local safety and consumer rules, tech giants operating in the bloc must respect its “sovereign right” to set standards for digital services used by its 450 million residents.
The blue checkmark, once a simple symbol of authenticity, now sits at the center of this broader struggle over transparency, accountability, and who defines trust online. Whether X restores verification to a proof-of-identity role, opens its advertising and data systems to scrutiny, and cooperates with independent research will help determine not only the future of one platform, but the real-world force of the EU’s new digital rulebook.
Sources:
European Commission Press Release IP/25/2934, December 5, 2025; European Commission Digital Services Act enforcement decision
Wikipedia Twitter Verification article; Business Insider timeline of Elon Musk’s Twitter Blue verification rollout (2022)
Reuters/NDTV reporting on political responses from JD Vance and Marco Rubio; BBC reporting on X’s ban of European Commission advertising (December 8, 2025)
TechCrunch/Al Jazeera reporting on DSA penalties; European Union Digital Services Act enforcement documentation and transparency requirements