Márton Bene
martonbene.bsky.social
HUN-REN CSS & ELTE - Budapest
political communication | political behavior | social media | punk music
Leader of the PRiSMa research group
https://bsky.app/profile/prisma-hungary.bsky.social
https://prisma-rg.hu/
That means negative campaigning is not just louder—it’s also safer. Not only are attack messages more likely to get media coverage (as previous research has shown), but they are also less likely to be reinterpreted critically by journalists. 8/8
May 28, 2025 at 7:56 PM
Negative attacks on opponents are framed neutrally—whether they come from populists or mainstream parties, incumbents or challengers. Journalists may be skeptical of some actors, but their response depends more on message type than sender. 7/8
May 28, 2025 at 7:56 PM
In contrast, negative campaign messages are not more likely to trigger journalistic criticism. When politicians act as critics, journalists step aside—but when politicians praise themselves, the media steps in. The press plays devil’s advocate only when no one else does. 6/8
May 28, 2025 at 7:55 PM
According to the results, all types of self-promotional messages are more likely to be negatively framed by journalists—especially those focusing on the politician’s character, and less so those centered on policy issues. 5/8
May 28, 2025 at 7:55 PM
We answered this question using data from the Comparative Campaign Dynamics dataset, based on manual content analysis of 16 election campaigns across 10 countries. 4/8
May 28, 2025 at 7:55 PM
To explore this question, we used Benoit’s typology to distinguish six types of campaign messages and analyzed how journalists frame each of them. 3/8
May 28, 2025 at 7:54 PM
🗞️ Journalists’ negative framing can reshape how voters perceive campaign messages—so it’s crucial for political actors to know which messages are more media-proof, and which are more exposed to criticism. 2/8
May 28, 2025 at 7:53 PM
These findings suggest that Facebook users may be operating under a false sense of security. They trust a platform with a questionable track record and overestimate their own ability to stay safe—creating a potentially fragile foundation for digital trust. 9/9

www.tandfonline.com/doi/full/10....
False sense of security and a flurry of misplaced trust: the construction of trust in and by Facebook
April 24, 2025 at 10:15 AM
These sources of trust don’t substitute for each other. Rather than one pillar compensating for another, they reinforce each other: trust is strongest when users believe in both Facebook’s safeguards and their own ability to manage risks. 8/9
April 24, 2025 at 10:14 AM
Belief in government regulation doesn’t significantly affect trust. Users don’t seem to associate platform safety with external oversight. 7/9
April 24, 2025 at 10:14 AM
A secondary factor is users’ self-confidence. Those who believe they can manage risks—by recognizing and avoiding manipulation—report higher trust. Interestingly, merely identifying risks doesn’t boost trust; it’s action and perceived ability that matter more than vigilance. 6/9
April 24, 2025 at 10:14 AM
The strongest factor influencing trust is the perceived effectiveness of Facebook’s self-regulation. When users believe the platform actively protects them—through moderation or algorithmic controls—their trust increases significantly. 5/9
April 24, 2025 at 10:13 AM
Users who perceive high risk are less likely to trust Facebook as a platform—but this distrust doesn't extend to other users or to the content. People tend to blame the company, yet still trust what they see and who they interact with. 4/9
April 24, 2025 at 10:13 AM
To answer this, we conducted a large-scale survey in 2022 across seven European countries: Estonia, France, Germany, Greece, Hungary, Portugal, and the Netherlands. 3/9
April 24, 2025 at 10:12 AM
In this paper, we investigated how trust on Facebook is influenced by users' perceptions of risk and three potential "pillars" of trust: (1) confidence in their own ability to recognize and avoid harm, (2) belief in Facebook’s self-regulation, and (3) belief in state regulation. 2/9
April 24, 2025 at 10:11 AM