
cross-posted from: https://mander.xyz/post/51052747

Ukraine has accused Russia of organizing staged “anti-war” protests in Cyprus in an effort to disrupt discussions among European leaders over continued financial and military support for Kyiv, according to Ukraine’s Center for Countering Disinformation on April 24.

Russian-backed actors are attempting to influence the outcome of a high-level EU meeting taking place on the island, where President Volodymyr Zelenskyy is presenting proposals for deeper defense integration across Europe.

...

The protests, expected to continue through April 25, are reportedly coordinated by Russian diplomatic and cultural structures operating in Cyprus, including the embassy, Rossotrudnichestvo-linked networks, and a so-called Russian cultural center. Ukrainian officials allege these groups routinely promote Moscow’s influence abroad under the cover of cultural initiatives.

“These actions are presented as a ‘struggle for peace,’ while in reality they push for ending EU financial support to Ukraine,” the Center said. It added that the campaign also promotes narratives suggesting that military assistance to Kyiv “prolongs the war,” a claim Ukrainian officials describe as an attempt to shift responsibility for Russia’s aggression.

...


cross-posted from: https://mander.xyz/post/50989159

Russia is increasingly deploying videos and deepfakes generated by artificial intelligence (AI) in what Ukrainian officials describe as a coordinated psychological warfare campaign designed to distort reality and manipulate target audiences.

Researchers have identified more than 1,000 synthetic videos forming part of a structured “narrative kill chain” – a modular disinformation system built to target soldiers, civilians, and Western audiences with tailored messaging.

...

According to Ukraine’s Center for Countering Disinformation (CCD), citing research by Sensity AI, the scale and structure of these operations show how generative AI is now being used as a mass tool of cognitive warfare.

The content is carefully segmented. For military audiences, it pushes messages about a “failing war effort,” a “collapsing front line,” and seeks to undermine trust in commanders.

Civilian-facing videos are designed to drain morale, normalize Russian-imposed conditions, and weaken confidence in state institutions and the army.

For Western audiences, the focus shifts to discrediting Ukraine, portraying Ukrainian refugees negatively, and arguing against continued support for Kyiv.

...

The goal, analysts say, is not just persuasion but chaos: a reality in which so much synthetic content circulates that even genuine evidence can be dismissed as fake or AI-generated.

...

Officials warned that it would allow Moscow to muddy accountability for real-world atrocities by eroding trust in any form of visual proof.

...

Web Archive link


cross-posted from: https://mander.xyz/post/50954866

Ukrainian Foreign Minister Andrii Sybiha has announced that Ukraine will assist Latvia, Lithuania, and Estonia in countering Russian disinformation and accusations surrounding the use of airspace by Ukrainian drones.

...

Sybiha emphasized Russia’s growing information warfare in the Baltic countries.

“The Russians are particularly active in spreading disinformation in the Baltic states. They are sending the message that you have provided your airspace for the use of Ukrainian drones. There are also other aspects,” he said.

The Minister further stated that the Baltic nations have requested Ukraine’s help in de-escalating the tension caused by Russia’s threats.

“Of course, we will help our friends,” Sybiha confirmed.

...

In a related development, Estonia’s foreign intelligence service has issued a warning about the need for Europe to enhance its defense and internal security to prevent Russia from perceiving an opportunity to challenge NATO.

Kaupo Rosin, head of the Estonian intelligence service, stressed that “Europe must invest in defense and internal security so that... in the future, Russia would conclude it has no chance against NATO countries.”

...


article: https://www.theatlantic.com/science/2026/04/missing-scientists/686885/

The Atlantic has a long article on the story of missing scientists recently featured here on Slashdot. In short, it is an incoherent conspiracy theory that spreads wide and far, not paying any attention to boundaries of time, space, or area of expertise. "Which is all to say that another piece of flagrant nonsense has ascended to the highest levels of U.S. politics and media," writes the Atlantic's Daniel Engber. "To call it a conspiracy theory would be far too kind, because no comprehensive theory has been floated to explain the pattern of events. But then, even the phrase pattern of events is imprecise, because there is no pattern here at all. Given all the people who could have been roped into this narrative but weren't, any hope of finding meaning falls away. Barring any dramatic new disclosures, the mystery of the missing scientists has the dubious honor of being a sham in every way at once." - https://news.slashdot.org/story/26/04/22/1934234/the-missing-scientist-story-is-unbelievably-dumb

(I would post archive link but I'm blocked because of DNS, VPN, or something else...)


Adam Johnson, media critic and co-host of Citations Needed, joins Mondoweiss US correspondent Michael Arria to discuss his new book, How to Sell a Genocide: The Media's Complicity in the Destruction of Gaza. All royalties from the book are being donated to the Middle East Children's Alliance.

Adam breaks down how US media manufactured consent for the genocide in Gaza, from the ISIS-ification of Hamas to the fake ceasefire theater that let Biden run out the clock. He explains how liberal media created an alternate reality where Biden was secretly furious with Netanyahu while never once threatening to cut weapons, how campus protests were smeared as antisemitic pogroms while 3,000 children were being killed in a single month, and why the Claudine Gay scandal got more coverage on MSNBC than the murder of Hind Rajab. He also looks ahead at how liberal Zionism will try to co-opt growing opposition to Israel with better PR instead of real policy change.

submitted 2 days ago* (last edited 2 days ago) by supersquirrel@sopuli.xyz to c/fediverse_vs_disinfo@lemmy.dbzer0.com

Culturally speaking, these tech elites are coded very differently from charismatic Holy Rollers who have had a long tradition of promising their followers that adherence to Christian faith and practices will yield material wealth. But essentially, they are offering a similar, though slightly inverted proposition:

Tech can make you rich and a good Christian. Call it the prosperity gospel of technology. Much in the way they have shaped culture with social media algorithms, tech evangelists now are attempting to normalize the use and acceptance of AI by wrapping it in a spiritual message. They also have explicit policy goals, and the Trump administration appears to be heeding their call, with new federal efforts aimed at unshackling AI from safety regulations.


A new investigation by Reporters Without Borders (RSF) reveals how certain Facebook pages posing as lifestyle hubs for hobbies and health tips inject Beijing’s political narratives into their feeds to influence public opinion in Taiwan. These pages are tied to the China-based digital marketing company Wubianjie, which blends entertainment, disinformation and political messaging in cognitive warfare campaigns that are difficult to detect.

...

Web Archive link


cross-posted from: https://scribe.disroot.org/post/8497014

The European Council is listing:

  • Euromore, a media platform operating within the pro-Kremlin information architecture as an unofficial media relay. Euromore amplifies, recycles, and legitimises Russian narratives and disinformation targeting European audiences. Furthermore, it recurrently disseminates content challenging the legitimacy of EU institutions and justifying Russia’s war of aggression against Ukraine.

  • The Foundation for the Support and Protection of the Rights of Compatriots Living Abroad (Pravfond). This is a core instrument of the Russian Federation’s foreign influence and propaganda strategy, which is founded and financed by the Russian state. Pravfond’s legal and analytical output is systematically used to reinforce key Kremlin disinformation points, notably allegations of the 'nazification' of Ukraine, claims of widespread 'Russophobia', and assertions of 'systematic persecution of Russian-speaking populations in neighbouring states'.

The designation subjects the entities to an asset freeze, and EU citizens and companies are forbidden from making funds, financial assets or economic resources available to them.

With this decision, restrictive measures in view of Russia’s destabilising activities now apply to a total of 69 individuals and 19 entities.

submitted 3 days ago* (last edited 3 days ago) by supersquirrel@sopuli.xyz to c/fediverse_vs_disinfo@lemmy.dbzer0.com

Adam Johnson is a media analyst and co-host of the podcast Citations Needed, whose work has appeared in The Nation, The Intercept, and other major outlets. He joined Current Affairs Editor-in-Chief Nathan Robinson to discuss his new book How to Sell a Genocide: The Media’s Complicity in the Destruction of Gaza, an analysis of how U.S. corporate media covered Israel’s assault on Gaza.

https://www.plutobooks.com/product/how-to-sell-a-genocide/

Great very timely discussion!


I am getting really tired of lazy, hyperbolic attacks on Piker. Read his words before you listen to the distracting noise.


Al-Fassel and Pishtaz News look like typical news websites. They have neatly designed homepages and active social media accounts, where they share reporting and videos on Middle Eastern geopolitics in Arabic and Farsi, respectively, as well as English. Al-Fassel’s X account states the publication’s mission is “to investigate events of great significance that are often overlooked by local and regional media, and to shed light on them.” The Pishtaz News X account says it was established “to investigate and expand upon important news that local and regional media often overlook.”

These overlooked stories share the same ideological slant and editorial voice: that of the White House. Al-Fassel’s YouTube account, for instance, has racked up millions of views on Arabic-language videos praising the Trump administration’s Gaza policy and exhorting Hamas to cease “taking orders from the Iranian regime” and release Israeli prisoners. On Pishtaz News, a poll on the homepage recently asked: “[H]ow would you describe your belief about the Supreme Leader’s current health status and whereabouts?” Possible answers range from “In good health but hiding” to “Disfigured” or “Dead.” The excellence of Saudi and Emirati leadership, both close military partners of the U.S., is a recurring theme.

There’s a reason this coverage echoes American foreign policy talking points. Al-Fassel and Pishtaz News are part of a network of websites and social media accounts that purport to be legitimate Middle Eastern news outlets but are in fact propaganda mills funded by the United States government, The Intercept has found.

...

submitted 5 days ago* (last edited 5 days ago) by supersquirrel@sopuli.xyz to c/fediverse_vs_disinfo@lemmy.dbzer0.com

Between February 28 (the start of the war) and April 10, DeSmog tracked 22 instances in which Canadian news media gave space to individuals representing Atlas Network-affiliated organizations arguing Trump’s war on Iran justifies the expansion of Canada’s oil and gas sector. In eight of those examples, Atlas-affiliated groups were quoted alongside industry associations, fossil fuel lobby groups, oil companies, or banks with considerable investments in Canada’s oil and gas sector.

These include appearances or attributions in the Canadian Broadcasting Corporation (CBC), the Globe and Mail, the National Post, the Financial Post, as well as CTV and Global television networks, among others.


What can be said is that the Times’ silence mirrors that of Democratic leaders in Congress, who also barely let out a peep during this period. For their part, it is clear that they aimed to conceal their support for the war from their base, who overwhelmingly oppose it. Within that dynamic, congressional Democrats waited until after the war began to propose a war powers resolution—demonstrating their issues, if any, were about process, not substance.

The Times likewise saved its feckless criticism until after the war began, penning an editorial (2/28/26) the day Trump launched the war (proving their capacity to move quickly when convenient) voicing process concerns: Trump lacked clear achievable objectives, threatened to mire the US in another “endless war,” and failed to consult Congress. Like Democratic leaders, the Times failed to reject—and indeed reiterated—the logic of the war itself: that article of faith that Iran is an intolerably evil and belligerent state (FAIR.org, 3/13/26).

Just like Democratic leaders, the New York Times failed to use its outsized influence to challenge this monstrous war. Instead, it participated in its genesis, through cowardice as much as through sanctimony.

submitted 6 days ago* (last edited 6 days ago) by supersquirrel@sopuli.xyz to c/fediverse_vs_disinfo@lemmy.dbzer0.com

Although the authors’ personal narratives give the book urgency and immediacy, its larger ambition is diagnostic. Mann and Hotez seek to explain how we arrived at a moment in which anti-science disinformation has become, in their words, “orchestrated,” and credible scientists are recast as public enemies.

Seventy years ago, federally supported research was often celebrated as creating an “endless frontier”—a driver of economic growth, national strength, and democratic vitality. Today, the incentives often run in the opposite direction. When scientific findings threaten entrenched economic or political interests, science itself becomes the target.

To explain this inversion, the authors identify five reinforcing forces that together generate sustained pressure on publicly funded science—what they alliteratively call plutocrats, petro-states, pros, protagonists, and the press. Rather than treating attacks as episodic or personality-driven, they map them as systemic features of the modern political economy and information environment.

submitted 6 days ago* (last edited 6 days ago) by supersquirrel@sopuli.xyz to c/fediverse_vs_disinfo@lemmy.dbzer0.com

To evaluate the Bubblemetrix Index, a multidimensional instrument was constructed and pre-tested, with the initial version including five factors: “Echo Chamber Exposure”, “Algorithmic Curation Awareness”, “Source Diversity Perception”, “Ideological Homogeneity & Confirmation Bias”, and “Repetitive Interaction & Engagement”. As preliminary data indicated a lack of consistency in the “Echo Chamber Exposure” factor, and removing individual items did not yield acceptable reliability, this factor was excluded from the instrument. The final instrument demonstrated good metric properties in pre-testing.
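Internal-consistency checks of this kind are typically done with a reliability coefficient such as Cronbach’s alpha. A minimal sketch of how a factor’s reliability might be computed (the item responses here are hypothetical 5-point Likert scores, not the study’s data):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for one factor.

    `items` is a list of per-item score lists: one list of respondent
    scores for each questionnaire item in the factor.
    """
    k = len(items)                                    # number of items
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent sum
    item_var = sum(pvariance(scores) for scores in items)
    return (k / (k - 1)) * (1 - item_var / pvariance(totals))

# Hypothetical responses from five people to a three-item factor
factor = [
    [4, 5, 3, 4, 5],
    [4, 4, 3, 5, 5],
    [3, 5, 2, 4, 4],
]
print(round(cronbach_alpha(factor), 2))
```

A factor whose items do not move together yields a low alpha, which is the kind of signal that would justify dropping a factor, as described above (a common rule of thumb treats values below about 0.7 as unacceptable).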

...

A central theoretical implication of the results is that user agency alone cannot account for the observed patterns of ideological reinforcement on social media platforms. While prior research has variously emphasised users’ capacity to seek diverse information or the constraining role of algorithmic curation, the present findings suggest that these dimensions are closely intertwined. Users with high levels of algorithmic awareness, that is, those who explicitly recognise personalisation mechanisms, are not necessarily less vulnerable to ideological encapsulation. On the contrary, one of the most prominent user profiles identified in this study combines strong algorithmic awareness with elevated confirmation bias and repetitive engagement. This counterintuitive pattern challenges optimistic assumptions underlying media literacy–based approaches to mitigating polarisation and calls into question policy strategies that prioritize user education as a standalone solution (Bechmann & Nielbo, 2018; Dubois & Blank, 2018).

...

This study set out to examine how algorithmic personalisation shapes electoral perceptions by focusing on users’ experiences within filter bubbles. Through the development and application of the Bubblemetrix framework, the research provides empirical evidence that filter bubbles are structured, differentiated, and politically consequential phenomena. Rather than affecting users uniformly, algorithmic reinforcement emerges through specific configurations of awareness, engagement, and ideological alignment.

One of the most significant conclusions is that algorithmic awareness does not function as a protective factor against ideological encapsulation. Users who recognise personalisation mechanisms may nonetheless remain deeply embedded in self-reinforcing informational environments. This finding directly challenges policy approaches that prioritise media literacy and transparency as sufficient tools for mitigating polarisation (Bechmann & Nielbo, 2018; Dubois & Blank, 2018). While such measures are important, they are insufficient to counter structurally embedded dynamics of amplification.

The study further demonstrates that perceived source diversity does not necessarily weaken filter bubble effects. Exposure to multiple outlets can coexist with pronounced ideological homogeneity when algorithmic systems consistently prioritise engagement-congruent content. This insight has direct implications for regulatory strategies that equate diversity with pluralism without addressing how content is algorithmically selected and amplified (Bruns, 2019; Kaiser & Rauchfleisch, 2020).

From a governance standpoint, the findings underscore the need to reconceptualise filter bubbles as systemic risks rather than individual shortcomings. Approaches centred exclusively on user responsibility overlook the structural role of platform architectures in shaping political discourse. Effective mitigation therefore requires interventions that address engagement-driven ranking systems and feedback loops that entrench ideological segmentation.

The identification of distinct user profiles further suggests that regulatory approaches should account for heterogeneity in vulnerability and engagement patterns. Highly engaged users exhibiting strong ideological reinforcement may require more robust safeguards than users who interact selectively with political content. This observation raises broader questions about proportionality, responsibility, and personalisation in platform governance.

Beyond its substantive findings, the study offers a methodological contribution by demonstrating how psychometric instruments and person-centred analyses can inform policy debates. The Bubblemetrix framework provides a means of assessing algorithmic risks in a manner that is empirically grounded and normatively relevant, supporting the development of evidence-based governance strategies.

https://en.wikipedia.org/wiki/Latent_and_observable_variables

In statistics, latent variables (from Latin lateo, ‘lie hidden’) are variables that cannot be observed directly and can only be inferred, through a mathematical model, from other variables that are directly observed or measured. Such latent variable models are used in many disciplines, including engineering, medicine, ecology, physics, machine learning/artificial intelligence, natural language processing, bioinformatics, chemometrics, demography, economics, management, political science, psychology and the social sciences.


cross-posted from: https://scribe.disroot.org/post/8405152

Here is the full report: MANUFACTURING IMPUNITY: RUSSIAN INFORMATION OPERATIONS IN UKRAINE - (pdf)

Global Rights Compliance (GRC) lawyers and researchers at the Reckoning Project worked to expose how information operations and digitally enhanced propaganda were used as an integral part of criminal acts.

This report provides an in-depth examination of “information alibis”, a distinct form of disinformation employed by the Russian Federation (‘Russia’) in armed conflicts. Information alibis involve the preemptive dissemination of false information, carefully crafted to deflect responsibility for crimes committed by the actual perpetrator or perpetrating State.

By constructing these deceptive narratives, Russia aims to absolve its leadership of culpability and manipulate the information landscape to its advantage. This tactic represents a cynical weaponisation of rhetoric as part of Russia’s broader military strategy.

The report analyses the structure, purpose, and objectives of information alibis within the wider context of Russian information operations. It reveals a coherent strategy rooted in state doctrine, evident in operations within Ukraine, Syria, and other regions.

This strategy involves a complex interplay of state actors and carefully orchestrated disinformation campaigns designed to mislead both domestic and international audiences.

The report establishes a foundational understanding of how information alibis function as an important tool for a group of persons acting with a common purpose. Future efforts will focus on expanding case-specific investigations, strengthening attribution to individual perpetrators, and translating findings into probative evidence for accountability mechanisms.

The project aimed to combine investigations into the command structure of Russian information operations, detailed content analysis of propaganda, data analysis of online behaviour, and other evidence to make a compelling case that:

– Shifted the public debate to clearly mark the difference between even the most extreme types of abhorrent but legal speech and the use of information operations as part of criminal conduct;

– Established a foundation for stronger sanctions and other forms of public pressure to disrupt the funding and functioning of Russian information operations, thus weakening Russia’s military and ‘hybrid war’ machine in Ukraine and across the world;

– Advanced legal principles and built the evidentiary basis necessary to pursue criminal cases against those responsible for integrating information operations into criminal acts.

...


cross-posted from: https://scribe.disroot.org/post/8404214

Archived version

...

For journalists, the space to operate—already constrained in much of the Gulf—is narrowing further. Across the region, several countries (including the UAE, Qatar, and Jordan) have restricted access to conflict areas, warned of legal consequences for publishing footage, and drawn red lines around wartime reporting. These measures weaken independent coverage, elevate official narratives, and make it harder for the public to get an accurate account of events on the ground.

...

For ordinary internet users, the restrictions are just as severe. Since February, hundreds of people have reportedly been arrested across the region for social media activity linked to the war. In many Gulf states, the legal infrastructure enabling this is already well-established: expansive cybercrime and media laws criminalize vaguely defined offenses such as “spreading rumors,” “undermining public order,” or “insulting the state”. In wartime, these provisions become catch-all tools: flexible enough to apply to nearly any form of dissent.

...

In Bahrain, authorities have reportedly cracked down on people who protested or shared footage of the conflict online. The Gulf Centre for Human Rights has reported 168 arrests in the country tied to protests and online expression, with defendants potentially facing serious prison terms if convicted.

In the UAE, authorities have arrested nearly 400 people for recording events related to the conflict and for circulating information they described as misleading or fabricated. Police have claimed this material could stir public anxiety and spread rumors, and state-linked reporting has described the crackdown as part of a broader effort to defend the country from digital misinformation.

Saudi Arabia has also intensified restrictions, issuing a statement on March 2 banning the sharing of rumors or videos of unknown origin, and issuing a campaign discouraging residents from taking or posting photos. The campaign included a hashtag that reads “photography serves the enemy.” Journalists have been prevented from documenting the aftermath of airstrikes on the country. Kuwait, Qatar, and Jordan have adopted similar restrictions on wartime imagery and reporting.

Qatar’s Interior Ministry has arrested more than 300 people for filming, circulating, or publishing what the ministry deemed to be misleading information. Taken together, these measures show how quickly wartime speech is being folded into existing legal systems designed to punish dissent.

...

This is not just a series of isolated incidents. It is a regional playbook for silencing critics and narrowing the public record. Gulf states have long relied on censorship and surveillance; the war has simply made those methods easier to justify and harder to challenge.

...

It may be tempting to see these measures as temporary, but emergency powers—like the one enacted in Egypt following the 1981 assassination of Anwar Sadat that lasted for more than three decades—have a way of sticking around. Legal precedents that are set during wartime often become normalized—or reinvoked during times of crisis, as occurred in 2015, when France brought back a 1955 law related to the Algerian War of Independence amidst the Paris attacks.

...


[Former Forward reporter Larry Cohler-Esses] was hopeful that coverage of the synagogue’s destruction in Israeli press had the “potential to make Jewish readers of Jewish media outlets go, ‘Oh, they have synagogues there.’” But with the underplaying of the story in US media, it’s a missed teachable moment for news consumers generally.

More robust press coverage of the attack could have taught Americans that the Jews of Iran do have something to fear: Israel.


cross-posted from: https://mander.xyz/post/50631734

...

"We have created a huge team of kids, who understand how to broadcast government values and our organisation's values," Vladislav Golovin, a former soldier and chief of the general staff of Russia's Young Army cadets movement, said in a statement released by the group.

In a promotional video from the event, children were shown cheering a cadet racing against Golovin to see who could reload a sniper rifle the fastest.

Another organisation, the Movement of the First, runs competitions offering rewards for teenagers with the best blogs and biggest followings.

'Easy to radicalise'

The training camps are part of what Keir Giles, director of the UK-based Conflict Studies Research Centre, calls a "concentrated campaign to restore the prestige of the Russian military."

...

"These 14–16-year-olds have grown up in an environment where they have never known anything other than Putinism. This is their reality, and so we should not be surprised if these new efforts to spread information reflect that reality," he told AFP.

...

Social media content can be "direct and radical" or "very subtle, aimed not at generating support for Russia, but at decreasing solidarity with Ukraine," said Dietmar Pichler, a disinformation and propaganda analyst at INVED.

At the training camp in Moscow, the Young Army cadets were quick to grasp the power of their new skills.

"When you are the one behind the camera filming the entire process, making audiences happy, you realise ... you are the one who has aroused these emotions in people," a girl said in a promotional clip published by the organisers.

...


cross-posted from: https://mander.xyz/post/50629515

  • Chinese state media amplify Taiwan opposition voices to undermine DPP government, IORG data show
  • Campaign aims in part to undercut Taiwan's push to lift defense spending, Taiwan officials say
  • Taiwan counters with media-literacy efforts, DPP stresses strength over concessions to China
  • China’s Taiwan Affairs Office didn’t respond to questions about information warfare

...

As Chinese warships and fighter jets staged massive drills around Taiwan in December, a parallel action was unfolding on smartphone screens.

On Douyin, China’s version of TikTok, a news outlet run by the Chinese Communist Party posted a 51-second video of Taiwan opposition leader Cheng Li-wun accusing President Lai Ching-te of inviting Chinese aggression. Lai, Cheng said, was “dragging all 23 million of us” in Taiwan into a “dead end, a road to death” by pursuing independence. The clip quickly surfaced on Facebook, YouTube and other platforms popular in Taiwan.

...

Chinese state media outlets are increasingly amplifying Taiwanese critics of the island’s ruling Democratic Progressive Party (DPP), including influencers and politicians linked to the opposition Kuomintang (KMT), according to five Taiwanese security officials and data from Taipei-based research group IORG.

China imports the public statements of leading KMT and other opposition figures that are critical of the Taiwan government and pumps them out in a torrent of anti-DPP messaging in Chinese state media and on social media platforms in China, according to the data and sources. Those clips are then reshared and often repackaged for consumption on platforms popular in Taiwan, including Facebook, TikTok and YouTube, as well as on Douyin, sometimes embellished or presented in ways that obscure China’s hand.

While China has in the past employed Taiwanese figures in its propaganda, it has turbocharged this information-warfare tactic, the Taiwan security officials said: Familiar voices and accents can sound more credible.

The goal is to discredit a government Beijing accuses of seeking independence, the officials said. And, with the DPP seeking $40 billion in extra defense outlays, the campaign also appears aimed at convincing Taiwanese that China’s military power is so overwhelming that it is futile for Taiwan to spend heavily on more American weapons, according to IORG and three of the security officials.

...

While Chinese preparations for military action against Taiwan continue, the information warfare is part of Beijing’s strategy of wearing down Taiwan without resorting to force. In this regard, Taiwan’s opposition KMT provides a valuable opening for China: The party has moved to seek closer ties with Beijing in a bid to head off what it says is a crisis made worse by the DPP government’s provocation of China.

...

Cheng, the KMT leader, was the top-ranked Taiwanese figure in the Chinese clips, featuring in 460 videos across 68 Douyin accounts and generating more than five million interactions, including likes, comments and shares. The videos amplified her calls for “peace” with China, her criticism of President Lai as a “pawn” of external forces, and her characterization of the DPP’s stance on Taiwan independence as destructive. Once aired on Chinese state media and social media platforms, some of the clips were repackaged and posted on platforms popular in Taiwan.

...

Various influencers were also heavily cited by the Chinese outlets. Among them were Holger Chen Chih-han, a bodybuilder popular with younger audiences, and five retired senior military officials known for criticizing the DPP and Taiwan’s defenses.

“Happy birthday, motherland,” Chen said on a YouTube livestream in late September, ahead of China’s National Day. Short clips of the broadcast, in which he also said the people of Taiwan and China were “one family,” were later shared by Chinese state media outlets, including China News Service. Chen didn’t respond to a request for comment.

...

Taiwan’s intelligence officials recorded over 45,000 sets of inauthentic social-media accounts and 2.3 million pieces of disinformation on China-Taiwan issues last year, a January report by Taiwan’s National Security Bureau said. It described the goals of Beijing’s information warfare: to exacerbate divisions within Taiwan, weaken Taiwanese people’s will to resist, and win support for China’s stance.

...


cross-posted from: https://scribe.disroot.org/post/8380095


Under the pretext of cultural cooperation, China sends propaganda officials to Europe and Estonia to promote a carefully crafted "China story." The ISS [Estonian Internal Security Service] wrote that this cultural engagement diverts attention from human rights abuses, unfair economic practices, and growing support for Russia in the war against Ukraine.

The Chinese Embassy in Estonia has stepped up cultural outreach by organizing events with local governments. Alongside these events, meetings are held with local political and business elites, providing favorable opportunities to establish contacts and shape attitudes toward China.

According to the ISS, efforts to cultivate a positive attitude begin with young people. For example, the Chinese ambassador is a frequent guest at Estonian general education schools and universities.

...

The ISS noted that last year the embassy published several native advertising articles in the media of Estonia and other Baltic states about "democracy," "fair" trade and Taiwan's "historical" status as part of China, even though the Chinese Communist Party has never governed Taiwan.

...

The embassy also pays local companies for media and public relations services to mediate contacts with Estonian journalists and find outlets willing to publish its articles.

...

According to the ISS, Chinese intelligence activities focus on science, technology and security. After establishing contact via LinkedIn, Chinese intelligence services attempt to lure influential individuals by inviting them to China or its neighboring countries. Travel to China creates recruitment opportunities.

...

The ISS separately highlighted the Estonian-Chinese Chamber of Commerce established in 2022, which has targeted Estonia's technology sector and contacted several startup and technology companies with proposals for lucrative cooperation with China. The proposals were not sent to company executives or general addresses but specifically to individuals responsible for technology. The founders of the chamber also sought to establish contacts with European Union institutions.

...

In 2025, the use of professional networking platforms such as LinkedIn to identify intelligence targets intensified, the ISS noted. As a new trend, instead of sending personalized messages, recruiters post job advertisements and then contact potential candidates who respond to them.

Among candidates, work experience in government service and in the fields of foreign or security policy is particularly valued. Cooperation offers have been made to Estonian politicians, diplomats, ministry officials, members of the Defense League, researchers and others, the ISS wrote.

China has also begun using job advertisements more broadly. For example, in November last year, the United Kingdom's security service MI5 issued a spying alert to the UK Parliament and its staff, warning about Chinese recruiters and citing two LinkedIn accounts as examples. As early as October 2023, the head of MI5 stated that nearly 20,000 Britons had been contacted via LinkedIn.


Fediverse vs Disinformation


Pointing out, debunking, and spreading awareness about state- and company-sponsored astroturfing on Lemmy and elsewhere. This includes social media manipulation, propaganda, and disinformation campaigns, among others.

Propaganda and disinformation are a big problem on the internet, and the Fediverse is no exception.

What's the difference between misinformation and disinformation? The inadvertent spread of false information is misinformation. Disinformation is the intentional spread of falsehoods.

By equipping yourself with knowledge of current disinformation campaigns by state actors, corporations and their cheerleaders, you will be better able to identify, report and (hopefully) remove content matching known disinformation campaigns.


Community rules

Same as instance rules, plus:

  1. No disinformation
  2. Posts must be relevant to the topic of astroturfing, propaganda and/or disinformation


founded 2 years ago