Misinformation practices have grown in complexity as digital and social media technologies have become more sophisticated. It is therefore vital to redefine misinformation and assess how far initiatives in the field have adapted to this novel context of false information distribution.
Based on observations in the field, the boundary between correct and incorrect information has become more fluid in many misinformation practices, including in what counts as fake news or political propaganda.
There is also the issue of defining "fake news," especially regarding the parameters used for labeling content on social media and online media, whether by individual journalists, fact-checkers, editorial boards, platforms, or the government.
Another challenge is determining who can be held responsible for spreading misinformation. Appelman et al. (2022) argued that, from a legal perspective, proving someone's intention in sharing misinformation is very difficult: evidence of awareness that one is spreading misinformation is required, regardless of whether the intent is to deceive, mislead, or cause greater harm. From EngageMedia's perspective, deciphering someone's intent to spread misinformation does not offer solutions for future action.
The contrast between misinformation and disinformation makes disinformation rooted in political propaganda even harder to understand. One example is the "Russian disinformation" campaign, in which the Russian government hired companies to spread misinformation, racial sentiments, and damaging statements about political opponents. There was also the "fake campaign without fake news" that occurred during the 2024 presidential election in Indonesia (Broda & Strömbäck, 2024; Tapsell, 2024).
There is no reason, however, for fact-checkers to be overly concerned with the intent behind a misinformation campaign. Larger social media platforms can distinguish unintended misinformation distribution from orchestrated operations involving buzzers, bots, or political trolls. The fact-checker interviewed by EngageMedia said that social media platforms have the capacity to monitor these activities and should be able to mitigate them:
"Yes, so I was once invited to the Twitter headquarters in the US in 2018. Over there, they explained that Twitter's engines let them see things we couldn't. Including whether this person is a buzzer and that this is their network. It turns out they are capable of seeing these things." - Fact-Checker from Civil Society Organization.
However, are the platforms truly fulfilling their responsibilities as mitigators? Frequent changes in platform rules, coupled with ownership changes motivated by political and business interests, will certainly influence a platform's future decisions. Unfortunately, this situation lies beyond the control of fact-checkers:
"Since 2018, Twitter has been equipped with a monitoring tool. So this is something that their engines should be capable of [doing]. This is a new feature, however, ... it may not be perfect from the get-go. So of course they will continue to refine it as things move along. Despite the significant recent layoffs at Twitter, they possess sufficient resources to fix this, hahaha. So that is the actual question." - Fact-Checker from Civil Society Organization.
The issue of content monitoring should be considered in the larger context of social media platform policy. For example, TikTok empowers the community to report content that violates the platform's regulations. However, TikTok continues to encounter problems due to the huge number of local languages in Southeast Asia, making violations harder to detect by its limited number of content moderators (Jalli, 2024).
In the context of state-sponsored political propaganda, the fact-checkers interviewed by EngageMedia recognized that the state could use misinformation to undermine democracy:
"Now the trend is shifting; [misinformation] is being used as a weapon for undermining democracy. We see in Indonesia that the government is becoming stronger, having a more dominant presence in digital spaces to exert greater influence on public opinion. Well, it turns out they are doing the same thing by hiring cyber troops; they produced fake accounts and ... sent out messages of disinformation to manipulate public information related to important issues, such as the revision of the KPK Law [law on Indonesia's anti-corruption institution], the omnibus law, the Criminal Code, and Papua." - Media.
Undoubtedly, the current spread of disinformation is more than just a source of polarization; it is also a tool of digital authoritarianism, as its use to silence activists and suppress freedom of expression and of the press shows. In the Fatia-Haris case, state actors silenced two human rights defenders who voiced concerns about oligarchs (DA, 2024). Likewise, reporting on a sexual violence case in East Luwu, South Sulawesi, published by Project Multatuli, was labeled a "hoax" by the state (Maharani & Krisiandi, 2021). Both cases are forms of repression: they silence those who seek to expose major scandals involving state actors.
Singapore, which has established the Protection from Online Falsehoods and Manipulation Act, provides another example of digital authoritarianism. The policy authorizes the Prime Minister to regulate internet content (Singapore Statutes Online, 2020). It restricts the distribution of content that does not align with government policies, allowing the government to correct, remove, and even block access to content deemed falsehoods, including content that could spark debate or protest against the government (Han, 2019).
Information Muddle Wrapped in Influence Operations
One important aspect of the misinformation trend during the 2024 presidential election in Indonesia is that election-related material was "packaged in light, entertaining content" so that it did not feel like disinformation. The AI-assisted campaign content of Prabowo Subianto and Gibran Rakabuming, such as "Joget Gemoy" and "Oke Gas Oke Gas", did not actually represent Prabowo; it was part of the campaign team's strategy to get the public to see Prabowo and Gibran as cool, casual figures (Garnesia, 2024).
This contrasts with analysts' assessment that Prabowo was unable to control his emotions during the final debate on 7 January 2024, when he struck a power pose with hands on his hips and interrupted so often that the moderator had to step in to calm the situation (Santosa, 2024).
The conditions of the 2024 presidential campaign were completely different from 2019. Five years earlier, the presidential campaign was dominated by polarization between nationalist and religious groups, as well as a theater of slander and lies that both elevated and denigrated certain candidates (Ronaldo & Darmaiza, 2021). Although many expected religious polarization to return in the 2024 election, it did not (Pattisina, 2023). People rarely encountered misinformation exploiting the issues or groups framed as "enemies" in preceding elections, such as Chinese migrant workers, the Indonesian Communist Party (PKI), Shia Muslims, Hizbut Tahrir Indonesia, or even the LGBT community.
The "Gemoy Campaign" signaled a significant shift for a society weary of family WhatsApp groups' conflicts and social media disturbances, commonly referred to as "serious politics" (Nababan & Rahayu, 2024). The Gemoy Dance also sought to exploit the people's fondness for political entertainment on social media, which is, in fact, not a novel aspect of the national political scene (Schultz, 2012). The phenomenon has long been observed, dating back to Golkar's 1971 campaign, which fielded 324 artists in its Artistic Safari Team. The United Development Party (PPP) continued the practice in 1977 by including the popular dangdut artist Rhoma Irama in its campaign (Wibisono, 2017). Political entertainment, such as the Gemoy Dance, appeared capable of diverting people's attention from serious political issues.
However, political misinformation did not completely disappear in the lead-up to the election, and the large volume of misinformative content kept fact-checkers busy. What changed is that the narratives employed in the 2024 election no longer focused on slandering rival candidates.
Although many expected Prabowo's dark past and charges of horrific human rights violations to resurface as a central narrative in the 2024 presidential election, he now had a legion of endearing social media content to cover this up.
Furthermore, the "Gemoy Campaign" seemed to undermine campaigning focused on the presidential candidate's vision and mission, which should be taken seriously; Prabowo simply danced when asked about them.
Prabowo only had to remain in his own echo chamber; he did not need to answer questions from producers after a debate, or even bother attending his own campaign events, because the AI could do it all for him (Garnesia, 2024). On top of that, TikTok further created echo chambers that reinforced pre-existing beliefs and biases (Jalli, 2023). His campaign team curated TikTok videos for his supporters, removing any need to dispute Prabowo's political policies or question his past.
Researchers have attempted to coin the term "missedinformation" to describe this phenomenon. It reflects the absence of popular historical narratives, leaving youth who did not witness the events of 1998 firsthand without context (Garnesia, 2024). Others have emphasized the importance of understanding this new concept of disinformation, with some referring to Prabowo's strategy as toxic positivity, similar to the 2022 campaign of Ferdinand Marcos Jr. in the Philippines (Tapsell, 2024; Garnesia, 2024; Curato, 2022).
EngageMedia refers to this phenomenon as an information muddle arising from subtle, short-term social media manipulation (influence operations). Gregory Bateson (1987) introduced the term "muddle," while Bateman et al. (2021) established the concept of "influence operations."
According to Bateman et al., influence operations can be carried out to change people's beliefs, shift voters' behavior, or provoke political violence. Bateman explained that these tactics can be deployed quickly through social media.
This strategy has been successful in shifting political views and behaviors in society, as is evident in the growth of vaccine scepticism and of xenophobic sentiments, which spread rapidly and massively over social media. In less than a year, the "Gemoy Campaign" in Indonesia convinced 58.6 percent of voters to choose Prabowo-Gibran.
The Bureau of Investigative Journalism defines influence operations as the spread of counter-information over social media; this does not necessarily have to be misinformation.
The "Gemoy Campaign", which successfully overshadowed the narrative about Prabowo's past, demonstrates this. Wanless and Pamment (2019) defined influence operations as a series of coordinated activities. It should be highlighted that the 2024 presidential election involved not only a shift in Prabowo's persona through the use of AI, but also a series of unethical actions on Jokowi's part to support Prabowo's victory, using all the forces at his disposal.
EngageMedia refrains from naming this phenomenon propaganda because the term's broad definition makes it difficult to distinguish from advertising, marketing, and public relations (Wanless & Pamment, 2019). Some definitions of propaganda are being updated to reflect changes in technology use, particularly on social media.
This has given rise to concepts such as "networked propaganda" or "computational propaganda" (Benkler et al., 2018; Woolley & Howard, 2016). The term "participatory propaganda" is also used as a model to represent propagandists' ability to invite audiences to engage with, adapt, and spread propaganda through social media (Wanless & Berk, 2019). The concept of participatory propaganda is suitable for describing the "Gemoy Campaign," which succeeded in attracting the masses on TikTok.
However, this concept does not adequately accommodate the strategies and forces deployed by the incumbent, which go beyond the norms of statesmanship (Supriatma, 2024).
EngageMedia argues that fact-checkers and civil society organizations should prioritize influence operations, while acknowledging Bateman et al.'s (2021) admission that empirical research on the impact of influence operations on society is limited.
Policy limitations in evaluating the legitimacy of these subtle and sporadic political tactics, coupled with gaps in interpretation and implementation by law enforcement, may reinforce influence operations in the future.
This is an edited excerpt of the 2025 report Misinformation Dynamics and the Compatibility of Fact-Checking Practices Today, prepared by EngageMedia. The full report is available at https://engagemedia.org/politics-fact-checking-indonesia/.
For further information, please contact: EngageMedia, 8/225 Bourke Street, Melbourne, Victoria 3000, Australia; e-mail: contact@engagemedia.org; https://engagemedia.org/.
References
Appelman, N., Dreyer, S., Bidare, P. M., & Potthast, K. 2022, May 16. "Truth, Intention and Harm: Conceptual Challenges for Disinformation-Targeted Governance," Internet Policy Review, https://policyreview.info/articles/news/truth-intention-and-harm-conceptual-challenges-disinformation-targeted-governance/1668.
Bateman, J., Hickok, E., Courchesne, L., Thange, I., & Shapiro, J. N. 2021, June 28. Measuring the Effects of Influence Operations: Key Findings and Gaps from Empirical Research, Carnegie Endowment for International Peace, https://carnegieendowment.org/2021/06/28/measuring-effects-of-influence-operations-key-findings-and-gaps-from-empirical-research-pub-84824.
Benkler, Y., Faris, R., & Roberts, H. 2018. Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics , Oxford University Press.
Broda, E., & Strömbäck, J. 2024. "Misinformation, Disinformation, and Fake News: Lessons from an Interdisciplinary, Systematic Literature Review," Annals of the International Communication Association, 48(2), 139-166, https://doi.org/10.1080/23808985.2024.2323736.
Curato, N., Calimbahin, C., Romero, S. J. E., & Arugay, A. 2022, March 28. The Philippines: Erasing History Through Good Vibes and Toxic Positivity, Heinrich Böll Foundation Southeast Asia Regional Office, https://th.boell.org/en/2022/03/28/philippines-goodvibes-toxic-positivity.
DA, A. T. 2024. Dakwaan Jaksa Tak Terbukti, Haris-Fatia Divonis Bebas, hukumonline.com, www.hukumonline.com/berita/a/dakwaan-jaksa-tak-terbukti--haris-fatia-divonis-bebas-lt659bbd0877a37.
Garnesia, I. 2024, June 28. "Masa Lalu Adalah Masa Lalu": Kemenangan Prabowo & Toxic Positivity, Project Multatuli, https://projectmultatuli.org/masa-lalu-adalah-masa-lalu-kemenangan-prabowo-toxic-positivity/.
Han, K. 2019. Big Brother's Regional Ripple Effect: Singapore's Recent "Fake News" Law Which Gives Ministers the Right to Ban Content They Do Not Like, May Encourage Other Regimes in South-East Asia to Follow Suit. Index on Censorship, 48(2), 67-69. https://doi.org/10.1177/0306422019858296.
Jalli, N. 2024, January 9. "Navigating Algorithmic Bias Amid Rapid AI Development in Southeast Asia," The Conversation, https://theconversation.com/navigating-algorithmic-bias-amid-rapid-ai-development-in-southeast-asia.
Pattisina, E. C. 2023, March 19. Polarisasi yang Didorong Sentimen Agama Berpotensi Terulang di Pemilu 2024, Kompas.id, www.kompas.id/baca/polhuk/2023/03/19/polarisasi-masyarakat-berdasarkan-agama-berpotensi-terjadi-lagi-pemilu-2024.
Ronaldo, R., & Darmaiza, D. 2021. "Politisasi Agama dan Politik Kebencian pada Pemilu Presiden Indonesia 2019," Indonesian Journal of Religion and Society , 3(1), 33-48, https://doi.org/10.36256/ijrs.v3i1.150.
Santosa, L. W. 2024, January 8. Pakar Gestur Nilai Prabowo Tunjukkan Emosi dalam Beragam Cara, Antara News, www.antaranews.com/berita/3905736/pakar-gestur-nilai-prabowo-tunjukkan-emosi-dalam-beragam-cara.
Schultz, D. A. 2012. Politainment: The Ten Rules of Contemporary Politics: A Citizen's Guide to Understanding Campaigns and Elections.
Supriatma, M. 2024, February 20. Pilpres 2024: Kembalinya Pemilu Gaya Orde Baru, Project Multatuli, https://projectmultatuli.org/pilpres-2024-kembalinya-pemilu-gaya-orde-baru/.
Tapsell, R. 2024, March 7. It's Time to Reframe Disinformation: Indonesia's Elections Show Why, Center for International Governance Innovation, www.cigionline.org/articles/its-time-to-reframe-disinformation-indonesias-elections-show-why/.
Wanless, A., & Pamment, J. 2019. How Do You Define a Problem Like Influence?, Carnegie Endowment for International Peace, https://carnegieendowment.org/2019/12/30/how-do-you-define-problem-like-influence-pub-80716.
Woolley, S. C., & Howard, P. N. 2016. "Political Communication, Computational Propaganda, and Autonomous Agents," International Journal of Communication, 10.