
Brussels, 9 April 2026

A new analysis, based on data freshly obtained by MCC Brussels on Facebook activity during the final week of March 2026, highlights significant anomalies in the performance of political content posted by Péter Magyar and Viktor Orbán, raising questions about how platform rules and regulatory pressures may be shaping the digital playing field.

Key Finding 1: Divergent Rule Regimes Within Facebook

A central factor may lie in how Facebook differentiates between Pages and personal Profiles:

Viktor Orbán operates primarily through a Facebook Page, classified as a political actor and therefore subject to advertising restrictions, reduced reach, and transparency requirements.

Péter Magyar operates through a personal Profile, classified as a “public figure” and not subject to the same regulatory and algorithmic constraints.

This creates a situation where two leading political actors in the same country operate under fundamentally different rule sets on the same platform, with potentially significant implications for visibility and engagement.

Key Finding 2: Timing and Regulatory Context

The observed divergence in performance coincides with a period of intensified regulatory enforcement affecting political content and advertising in the European Union. The European Commission has confirmed the activation of the Rapid Response System (RRS) ahead of the Hungarian parliamentary elections. This mechanism enables coordinated, real-time intervention in the online information environment through cooperation between platforms, NGOs, and institutional actors.

While this temporal alignment does not establish causation, it is notable that the shift occurs precisely at a time when platforms are under increasing pressure to moderate political content and comply with new regulatory frameworks.

This raises the possibility that changes in platform operations, whether through automated systems, policy adjustments, or accelerated moderation mechanisms, may be influencing the visibility of political content and the interactions it generates.

Key Finding 3: The Disappearing Comments Phenomenon

An analysis of the Facebook pages of Fidesz candidates between March 26 and 28, 2026, identified a systemic anomaly: in numerous cases, comments remained visible to page administrators while being entirely invisible to ordinary users. This is not a case of deletion or moderation; rather, the comments simply “disappear” from public view.

During the period examined, this phenomenon was observed in 6,509 of the 6,607 posts (98.5%) published by 106 candidates, indicating a clear systemic pattern. The issue affects not only official candidate pages but also affiliated communication channels, suggesting a platform-level malfunction or restriction.

By contrast, no similar phenomenon was observed on opposition party-affiliated pages, including those of the Tisza Party. Comment visibility on the same platform, during the same period, therefore diverged significantly between the two camps.

Key Finding 4: Same Reach, Fundamentally Different Outcomes

Despite nearly identical video reach of approximately 2.02 million views for Péter Magyar and 1.95 million for Viktor Orbán, Magyar generated over three times the level of interaction (approx. 825,000 vs. 267,000).

This corresponds to a conversion rate of roughly 40.9% of viewers engaging for Magyar, compared to roughly 13.7% for Orbán.

This suggests that nearly identical exposure can lead to radically different behavioural outcomes, a divergence that reach alone cannot explain.

To contextualise these figures, engagement rates for political content, even during campaign periods, typically vary within a relatively limited range, and differences of this magnitude are uncommon between actors operating within the same national context and time frame. While content quality, tone, and audience mobilisation can influence interaction levels, such factors alone are generally not expected to produce disparities of this scale when overall reach remains comparable. This suggests that additional factors may be contributing to the observed divergence.
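The conversion figures above follow directly from the reported totals. A minimal sketch of the arithmetic, using the reach and interaction numbers cited in this analysis (the variable names are illustrative, not from any dataset schema):

```python
# Reported video reach and total interactions (figures cited above)
reach = {"Magyar": 2_020_000, "Orban": 1_950_000}
interactions = {"Magyar": 825_000, "Orban": 267_000}

# Conversion rate: the share of viewers who went on to interact
for name, views in reach.items():
    rate = interactions[name] / views
    print(f"{name}: {rate:.1%} of viewers engaged")
```

With comparable denominators, the roughly threefold gap in interactions translates directly into the roughly threefold gap in conversion rates.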

Key Finding 5: Engagement Rate as a Statistical Outlier

Péter Magyar’s engagement rate is approximately 4.5%, compared to a widely accepted Facebook benchmark of around 0.15%.

This represents a deviation of roughly 30 times the norm, positioning his account as a clear outlier within the platform ecosystem.
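The outlier claim rests on the same simple arithmetic: dividing the observed rate by the benchmark yields the ~30x deviation. A sketch, using only the two rates cited above:

```python
# Observed engagement rate vs. a widely cited Facebook benchmark (per the analysis)
observed_rate = 0.045     # ~4.5% (Péter Magyar)
benchmark_rate = 0.0015   # ~0.15% (typical platform-wide figure)

deviation = observed_rate / benchmark_rate
print(f"Deviation from benchmark: ~{deviation:.0f}x the norm")
```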

Context: Rapid Response System (RRS) and the Digital Information Environment

MCC Brussels notes that the European Commission has activated the Rapid Response System (RRS) ahead of the Hungarian parliamentary elections. This mechanism, developed under the EU’s Code of Practice on Disinformation, linked to the Digital Services Act (DSA), and forming part of the wider Democracy Shield project, enables coordinated, real-time intervention in the online information environment by the European Commission, NGOs, and major technology platforms.

In practice, the RRS allows NGOs, or "fact-checkers", to flag content for accelerated moderation, potentially leading to its demotion, restriction, or removal. This creates a fast-track intervention architecture capable of shaping political communication during an active electoral period.

Facebook, as a signatory to the relevant Code of Conduct, is part of this system. Public reporting has indicated that content associated with Viktor Orbán has already been subject to restrictions. At the same time, a network of EU-linked and EU-funded organisations participates in moderation and flagging processes, raising questions about independence and neutrality. This architecture not only outsources censorship to NGOs but also encourages preemptive censorship by the platforms themselves.

Precedents from recent European elections, such as in Romania, suggest that such systems can operate with limited transparency, with no comprehensive public record of flagged or removed content, or of the actors responsible. Requests for disclosure regarding the application of these mechanisms have, in several cases, not been fulfilled.

Taken together, these factors point to a highly coordinated but opaque moderation environment in which regulatory frameworks, platform policies, and third-party actors intersect during a sensitive electoral period.

Richard Schenk, Research Fellow at MCC Brussels, leading the DIO initiative, said:

Hungarian voters are increasingly exposed to escalating EU regulatory interventions, such as the Rapid Response System. They deserve trust and clarity, not top-down, paternalistic oversight of political discourse. Yet instead of greater transparency, what we observe is increasing opacity, where seemingly identical exposure of candidates results in systematically different outcomes.

The data points to significant deviations from expected platform behaviour in the case of Péter Magyar and Viktor Orbán. While it is possible that organic factors, such as content dynamics, audience behaviour, or campaign effects, contribute to these differences, the presence of structural asymmetries within the platform itself makes it essential that Meta clarify these patterns transparently.

Responsibility for the implementation of these policies in Europe ultimately sits with Meta’s senior leadership, including Oskar Braszczyński, who oversees the company’s public policy and regulatory engagement with EU institutions. Braszczyński’s own Facebook profile raises questions about his political neutrality. Based on his public statements, the regional leader clearly sympathises with the Hungarian opposition, as well as with left-wing and pro-Ukrainian political movements.

While not constituting direct evidence of algorithmic favouritism or external influence, these patterns, combined with the broader regulatory and operational context, point to dynamics that demand closer examination and accountability from both the platform and its regulators.