A new report explains how messaging platforms can be used for political propaganda or “information calculated to manipulate public opinion.”
Popular messaging apps such as WhatsApp and Telegram are increasingly being used for political purposes, including spreading political propaganda, experts say in a new report.
Researchers at New York University (NYU) surveyed 4,500 messaging app users in nine countries and interviewed political strategists in 17 countries to learn how agents of influence use platforms like WhatsApp and Telegram to manipulate public opinion.
According to the report, 62% of users surveyed had received political content on these apps, and 55% of this information came from people they did not know.
This is because platforms such as WhatsApp, Viber and Telegram “do not have the traditional mechanisms” for content moderation that other social media platforms have, and because propagandists exploit features that amplify misinformation, the report continues.
“While presenting themselves as platforms designed for secure and private communications between loved ones, some messaging applications monetize their products through functions that allow large-scale distribution and the virality of messages,” the report reads.
Paid features increase the reach of misinformation
According to the report, agents of influence use paid features on these messaging apps to reach more people.
WhatsApp’s business platform offers subscribers a “green check mark” for verification, automated messaging and unlimited reach that amplifies their content, the report said.
WhatsApp users can also choose whether they want to receive messages from these business accounts, according to the app’s FAQ section.
WhatsApp’s messaging policy states that government agencies are allowed to use its platform, but political parties, politicians and political campaigns are not. The company also told researchers it was putting “additional resources” in place during elections to ensure its policies are not violated.
However, the report notes that some users found workarounds, such as posing as other actors or creating fake business names, getting verified on X, and then using that verification as proof to obtain WhatsApp’s commercial services.
Viber works in much the same way: users can activate a setting to stop receiving such messages, according to the app’s website.
The report found a workaround in Ukraine, where political consultants obtained verified Viber accounts through a “partner” or messaging vendor.
Ukrainian actors then launched social media campaigns asking users to sign up to their mailing lists using QR codes that, unbeknownst to those users, allowed the groups to send them communications, the report continues.
On Telegram, any user can pay less than €5 per month for a range of additional features, such as automated messages, quick replies, profile badges and chatbot support.
This allows political operatives to pose as “official” accounts on Telegram without needing to be verified, according to the NYU report.
Telegram also lets anyone buy advertising space on its most-subscribed channels, which the app says generate around 1 trillion views per month.
Rakuten, Viber’s parent company, said in a statement sent to Euronews Next that its policies and features “help [users] make informed decisions about what content to trust and engage with” on the app.
“We continue to develop our app and enforce our policies with our users in mind,” the statement read.
Euronews Next contacted Meta, the company behind WhatsApp, and Telegram for comment, but did not receive an immediate response.
Platforms also amplify misinformation
The paid features of these messaging apps amplify long-standing disinformation techniques, according to the report.
The first step in getting a message across is to create or infiltrate pre-existing groups on social media channels. Since Viber doesn’t limit the number of participants in a community or channel, this “plays right into” propagandists’ strategy, according to the report.
Even if the groups they infiltrate are considered apolitical, propagandists “exploit members’ stated interests to craft political messages that are likely to resonate,” according to the report.
Members of the groups can sometimes be “sock puppet” accounts, fake profiles made by bad actors to represent a person or company “with a particular point of view,” according to the report.
Although sock puppets are a common misinformation tactic on social media, the report says such accounts are “arguably more problematic” on messaging apps because they are “more obscure.”
Propagandists can also cross-post, that is, post the same message on multiple social media channels at once.
On Telegram, for example, users create bots that automate content sharing on X. An Indian app called ShareChat allows users to cross-post content from Telegram to WhatsApp and other Meta-owned platforms, such as Facebook and Instagram.
The combination of all these tactics creates what researchers call “feedback loops,” where the same content continues to “appear in different parts of the platform ecosystem,” according to the report.
Recommendations for messaging apps
One of the challenges for these applications, the researchers say, is that while encryption can be used by propagandists for concealment, it is also vital for activists “exposed to the risk of surveillance.”
With this in mind, the authors offer a long list of recommendations for messaging app companies, such as setting limits on account creation and more rigorous monitoring of business accounts.
For policymakers, the report suggests bringing encrypted messaging platforms within the scope of existing regulations, without weakening encryption itself.
“The value of encrypted messaging for human rights defenders and society at large exceeds the threat of disinformation on encrypted chat apps,” the report reads.
One way to do this is to require companies to disclose content-neutral information about how their policies and enforcement systems work to combat misinformation.