‘Team Jorge’ unit exposed by undercover investigation
Group sells hacking services and access to vast army of fake social media profiles
Evidence suggests unit is behind disinformation campaigns across the world
Mastermind Tal Hanan claims covert involvement in 33 presidential elections
by Stephanie Kirchgaessner, Manisha Ganguly, David Pegg, Carole Cadwalladr and Jason Burke
Part 2 - The undercover footage
Given their expertise in subterfuge, it is perhaps surprising that Hanan and his colleagues allowed themselves to be exposed by undercover reporters. Journalists using conventional methods have struggled to shed light on the disinformation industry, which is at pains to avoid detection.
The secretly filmed meetings, which took place between July and December 2022, therefore provide a rare window into the mechanics of disinformation for hire.
Three journalists – from Radio France, Haaretz and TheMarker – approached Team Jorge pretending to be consultants working on behalf of a politically unstable African country that wanted help delaying an election.
The encounters with Hanan and his colleagues took place via video calls and an in-person meeting in Team Jorge’s base, an unmarked office in an industrial park in Modi’in, 20 miles outside Tel Aviv.
Hanan described his team as “graduates of government agencies”, with expertise in finance, social media and campaigns, as well as “psychological warfare”, operating from six offices around the world. Four of Hanan’s colleagues attended the meetings, including his brother, Zohar Hanan, who was described as the chief executive of the group.
In his initial pitch to the potential clients, Hanan claimed: “We are now involved in one election in Africa … We have a team in Greece and a team in [the] Emirates … You follow the leads. [We have completed] 33 presidential-level campaigns, 27 of which were successful.” Later, he said he was involved in two “major projects” in the US but claimed not to engage directly in US politics.
It was not possible to verify all of Team Jorge’s claims in the undercover meetings, and Hanan may have been embellishing them in order to secure a lucrative deal with prospective clients. For example, it appears Hanan may have inflated his fees when discussing the cost of his services.
Team Jorge told the reporters it would accept payment in a variety of currencies, including cryptocurrencies such as bitcoin, or cash. Hanan said he would charge between €6m and €15m for interference in elections.
However, emails leaked to the Guardian show Hanan quoting more modest fees. One suggests that in 2015 he asked for $160,000 from the now defunct British consultancy Cambridge Analytica for involvement in an eight-week campaign in a Latin American country.
In 2017 Hanan again pitched to work for Cambridge Analytica, this time in Kenya, but was rejected by the consultancy, which said “$400,000-$600,000 per month, and substantially more for crisis response” was more than its clients would pay.
There is no evidence that either of those campaigns went ahead. Other leaked documents, however, reveal that when Team Jorge worked covertly on the Nigerian presidential race in 2015 it did so alongside Cambridge Analytica.
Alexander Nix, who was the chief executive of Cambridge Analytica, declined to comment in detail but added: “Your purported understanding is disputed.”
Team Jorge also sent Nix’s political consultancy a video showcasing an early iteration of the social media disinformation software it now markets as Aims. Hanan said in an email that the tool, which enabled users to create up to 5,000 bots to deliver “mass messages” and “propaganda”, had been used in 17 elections.
“It’s our own developed Semi-Auto Avatar creation and network deployment system,” he said, adding that it could be used in any language and was being sold as a service, although the software could be bought “if the price is right”.
Team Jorge’s bot-management software appears to have grown significantly by 2022, according to what Hanan told the undercover reporters. He said it controlled a multinational army of more than 30,000 avatars, complete with digital backstories that stretch back years.
Demonstrating the Aims interface, Hanan scrolled through dozens of avatars, and showed how fake profiles could be created in an instant, using tabs to choose nationality and gender and then matching profile pictures to names.
“This is Spanish, Russian, you see Asians, Muslims. Let’s make a candidate together,” he told the undercover reporters, before settling on one image of a white woman. “Sophia Wilde, I like the name. British. Already she has email, date birth, everything.”
Hanan was coy when asked where the photos for his avatars came from. However, the Guardian and its partners have discovered several instances in which images have been harvested from the social media accounts of real people. The photo of “Sophia Wilde”, for instance, appears to have been stolen from a Russian social media account belonging to a woman who lives in Leeds.
The Guardian and its reporting partners tracked Aims-linked bot activity across the internet. It was behind fake social media campaigns, mostly involving commercial disputes, in about 20 countries including the UK, US, Canada, Germany, Switzerland, Mexico, Senegal, India and the United Arab Emirates.
This week Meta, the owner of Facebook, took down Aims-linked bots on its platform after reporters shared a sample of the fake accounts with the company. On Tuesday, a Meta spokesperson connected the Aims bots to others that were linked in 2019 to another, now-defunct Israeli firm which it banned from the platform.
“This latest activity is an attempt by some of the same individuals to come back and we removed them for violating our policies,” the spokesperson said. “The group’s latest activity appears to have centred around running fake petitions on the internet or seeding fabricated stories in mainstream media outlets.”
In addition to Aims, Hanan told reporters about his “blogger machine” – an automated system for creating websites that the Aims-controlled social media profiles could then use to spread fake news stories across the internet. “After you’ve created credibility, what do you do? Then you can manipulate,” he said.