Disinformation for Hire, a Shadow Industry, Is Quietly Booming
In May, several French and German social media influencers received a strange proposal.

A London-based PR agency wanted to pay them to promote posts on behalf of a client. A neat three-page document detailed what to say and on which platforms to say it.

But the agency asked influencers to push not beauty products or vacation packages, as is typical, but lies tarnishing Pfizer-BioNTech’s Covid-19 vaccine. Stranger still, the agency, Fazze, claimed a London address where there is no evidence any such company exists.

Some recipients posted screenshots of the offer. Exposed, Fazze scrubbed its social media accounts. That same week, Brazilian and Indian influencers posted videos echoing Fazze’s talking points to hundreds of thousands of viewers.

The scheme appears to be part of a secretive industry that security analysts and U.S. officials say is exploding in scale: disinformation for hire.

Private companies, straddling traditional marketing and the shadow world of geopolitical influence operations, sell services once primarily provided by intelligence agencies.

They sow discord, meddle in elections, seed false narratives and push viral conspiracies, mostly on social media. And they offer clients something valuable: deniability.

“Disinformation-for-hire actors employed by governments or government-adjacent actors are getting more and more serious,” said Graham Brookie, director of the Atlantic Council’s Digital Forensic Research Lab, calling it a “booming” emerging industry.

Similar campaigns have recently been uncovered promoting the ruling party in India, Egyptian foreign policy goals, and political figures in Bolivia and Venezuela.

Mr. Brookie’s organization tracked one operation in the middle of a mayoral race in Serra, a small town in Brazil. An ideologically promiscuous Ukrainian firm boosted several competing political parties.

In the Central African Republic, two separate operations flooded social networks with dueling pro-French and pro-Russian disinformation. The two powers are competing for influence in the country.

A wave of seemingly organic anti-American posts in Iraq was traced to a public relations firm that was separately accused of faking anti-government sentiment in Israel.

Most trace back to back-alley firms whose legitimate services resemble those of a low-rent marketer or email spammer.

Job postings and the LinkedIn profiles of employees associated with Fazze describe it as a subsidiary of a Moscow-based company called Adnow. Some Fazze web domains are registered as owned by Adnow, as the German outlets Netzpolitik and ARD Kontraste first reported. Third-party reviews describe Adnow as a struggling advertising service provider.

European Union officials say they are investigating who hired Adnow. Sections of Fazze’s anti-Pfizer talking points resemble promotional material for Russia’s Sputnik V vaccine.

For-hire disinformation, though only sometimes effective, is growing more sophisticated as practitioners iterate and learn. Experts say it is becoming more common in every part of the world, outpacing operations conducted directly by governments.

The result is an accelerating rise in polarizing conspiracies, phony citizen groups and fabricated public sentiment, deteriorating our shared reality beyond even the lows of recent years.

The trend emerged after the Cambridge Analytica scandal of 2018, experts say. Cambridge, a political consulting firm linked to members of Donald J. Trump’s 2016 presidential campaign, harvested data on millions of Facebook users.

The controversy drew attention to methods common among social media marketers. Cambridge used its data to target hyper-specific audiences with tailored messages. It tested what resonated by tracking likes and shares.

The episode taught a generation of consultants and opportunists that there was a lot of money in social media marketing for political causes, all disguised as organic activity.

Some newcomers eventually came to the same conclusion that Russian agents did in 2016: disinformation works particularly well on social platforms.

At the same time, the backlash against Russia’s influence operations appears to have left governments wary of being caught meddling directly, even as it demonstrated the power of such operations.

“There is, unfortunately, a huge market demand for disinformation,” Mr. Brookie said, “and a lot of places across the ecosystem that are more than willing to fill that demand.”

Commercial firms carried out for-hire disinformation in at least 48 countries last year, nearly double the previous year, according to a study from the University of Oxford. The researchers identified 65 companies offering such services.

Last summer, Facebook removed a network of Bolivian citizen groups and journalistic fact-checking organizations. It said the pages, which had promoted falsehoods supporting the country’s right-wing government, were fake.

Researchers at Stanford University traced the content to CLS Strategies, a Washington-based communications company that had registered as a consultant with the Bolivian government. The company had carried out similar work in Venezuela and Mexico.

A spokesperson referred to the company’s statement last year saying that its regional head had been placed on leave, but disputed Facebook’s accusation that the work qualified as foreign interference.

New technologies enable nearly anyone to get involved. Off-the-shelf programs batch-generate fake accounts with hard-to-trace profile photos. Instant metrics help hone effective messaging. So does access to users’ personal data, which is easily purchased in bulk.

The campaigns are rarely as sophisticated as those of government hackers or specialist companies like the Kremlin-backed Internet Research Agency.

But they are cheap. In countries that mandate campaign finance transparency, firms report charging tens of thousands of dollars for campaigns that also include traditional consulting services.

The layer of deniability lets governments sow disinformation more aggressively, at home and abroad, than might otherwise be worth the risk. Some contractors, when caught, have claimed they acted without their client’s knowledge or only to win future business.

The platforms have stepped up their efforts to root out coordinated disinformation. Analysts notably credit Facebook, which publishes detailed reports on the campaigns it disrupts.

Yet some argue that social media companies also play a role in worsening the threat. Engagement-driven algorithms and design elements, they say, often privilege divisive and conspiratorial content.

Political norms have shifted, too. A generation of populist leaders, such as Rodrigo Duterte of the Philippines, rose in part through social media manipulation. Once in office, many institutionalized those methods as tools of governance and foreign relations.

In India, dozens of government-run Twitter accounts have shared posts on India Vs Disinformation, a website and set of social media feeds that claim to verify information about India.

India Vs Disinformation is, in fact, the product of a Canadian communications company called Press Monitor.

Almost all of the posts seek to discredit or scramble information unfavorable to the government of Prime Minister Narendra Modi, including on the heavy toll of Covid-19 in the country. An associated site promotes pro-Modi stories under the guise of press articles.

A Digital Forensic Research Lab report investigating the network called it an “important case study” in the rise of “disinformation campaigns in democracies.”

A representative for Press Monitor, who would identify himself only as Abhay, called the report completely false.

He specified only that the report had incorrectly identified his company as being based in Canada. Asked why the company lists a Toronto address and a Canadian tax registration, identifies itself as “part of Toronto’s thriving tech ecosystem,” and why he was reached at a Toronto phone number, he said that he had business in many countries. He did not respond to an email seeking clarification.

A LinkedIn profile for an Abhay Aggarwal identifies him as the Toronto-based chief executive of Press Monitor and says that the company’s services are used by the Indian government.

A set of pro-Beijing operations hints at the field’s capacity for rapid evolution.

Since 2019, Graphika, a digital research firm, has tracked a network it nicknamed “Spamouflage” for its early reliance on spamming social platforms with content echoing Beijing’s line on geopolitical issues. Most of the posts received little to no engagement.

In recent months, however, the network has deployed hundreds of accounts with elaborate personas. Each has its own profile and post history that can appear authentic. They seemed to come from many different countries and walks of life.

Graphika traced the accounts back to a Bangladeshi content farm that created them in bulk and probably sold them to a third party.

The network pushes strident criticism of Hong Kong pro-democracy activists and American foreign policy. By coordinating without appearing to, it created the semblance of organic shifts in public opinion, and often won attention.

The posts were amplified by a major media network in Panama, prominent politicians in Pakistan and Chile, Chinese-language YouTube pages, the left-wing British commentator George Galloway and a number of Chinese diplomatic accounts.

A separate pro-Beijing network, discovered by a Taiwanese investigative body called The Reporter, operated hundreds of Chinese-language websites and social media accounts.

Posing as news sites and citizen groups, they promoted Taiwanese reunification with mainland China and denigrated Hong Kong’s protesters. The report found links between the pages and a Malaysia-based startup that offered internet users Singapore dollars to promote the content.

But governments may find that outsourcing such shadowy work carries its own risks, Mr. Brookie said. For one, the firms are harder to control and may veer into undesired messages or tactics.

For another, firms organized around deception may be just as likely to turn those energies on their clients, inflating budgets and billing for work that never gets done.

“At the end of the day, crooks are going to scam online,” he said.