A consulting firm called Wikistrat, whose owner maintains an extremely low online profile, allegedly met with President Donald Trump’s campaign in 2016 to pitch a plan for cyber interference in America’s electoral process using fake social media accounts, a scheme similar to what Russian troll farms actually carried out that year.
The owner of that company has also reportedly met with investigators looking at Russia’s interference in the 2016 presidential campaign.
It’s unclear whether the Trump campaign ever acted on Wikistrat’s simulations. Also unclear is whether any individual or campaign approached Wikistrat to conduct the simulations in the first place. But a former employee at the firm told The Daily Beast, which published a report on these events on Tuesday morning, that he has his doubts.
“At the time we were discussing the subject of cyber-interference in democratic processes, it seemed and felt like just another idle intellectual exercise and scenario planning project for political scientists,” former Wikistrat analyst Peter Marino told the publication. “But retrospectively, it feels a bit too on-the-nose not to be disturbing.”
We got a ton of internal communications from Wikistrat, a company whose CEO is a Mueller witness and who talked w DJTJr about a social media influence plan https://t.co/WiYqGamrQP
— Betsy Woodruff (@woodruffbets) January 30, 2019
The simulations were conducted days after Trump announced his presidential run. Months later, in April 2016, the company’s founder, Joel Zamel, an Israeli-Australian social media expert with ties to Middle Eastern intelligence groups, met with Trump campaign member Rick Gates to discuss the idea of social media interference in political campaigns. Gates was once the deputy chairman of the Trump campaign, per reporting from the New York Times.
Zamel is a remarkably obscure figure — public images of him do not appear to exist online at all. However, his work positioned him to offer the idea to Trump officials through another company he owns, Psy Group. He has also worked with Middle Eastern countries, including the UAE, which in the past contracted him to conduct “war game” scenarios involving political movements in Yemen, according to reporting from Haaretz.
The idea Zamel pitched to the Trump campaign is eerily similar to what actually happened with Russian troll farms during the campaign. Zamel reportedly told the campaign that it could create a cyber-interference scheme in which thousands of fake social media accounts could be used to spread support online for Trump, while simultaneously blasting his political opponents.
Zamel reportedly met another time with the Trump campaign, this time with the candidate’s son Donald Trump Jr., after Trump Sr. had won the Republican Party’s nomination. By this time, Russian “bots” were already implementing an idea similar to Zamel’s.
There is no evidence yet that Russian troll farms were doing so at the direction of, or in conjunction with, the Trump campaign. Zamel was contacted by, and has spoken with, Russia investigation special counsel Robert Mueller’s legal team regarding his pitches to the Trump team. Opinions on the matter among those who worked for the company are split — some say the campaign never pursued the plan, while others say it was partially implemented.
But the internal communications between Wikistrat team members that The Daily Beast obtained are intriguing, to say the least. The ideas they discussed mirror, in many ways, the operations that Mueller’s team has alleged Russian troll farms engaged in during the campaign.
For example, one Wikistrat analyst pointed out that someone attempting to interfere with a campaign would prefer finding outside actors to do the work rather than getting directly involved.
“[U]sing cyber-mercenaries to enact these operations would create a degree of indirection and a veneer of plausible deniability that would make it harder to clearly separate propaganda from facts,” that analyst wrote.
Another analyst touted the ways in which such internet trolls, acting on behalf of a campaign, could successfully use human reactions to their posts online to their advantage.
Troll farms “are not afraid to use provocative and confrontational language, as it is to their advantage if it leads to an emotional rise in the reader because the reader is then more likely to engage in debate, which in turn, creates more buzz and attracts a greater audience, increasing the potential number of people exposed to this misinformation campaign,” that analyst opined.