The election campaign being waged on social networks gives politicians an opportunity to promote themselves, along with countless false reports, inciting posts and "treasonous" bots • In practice, Meta and YouTube do not invest enough in enforcement in Israel • "Fake news is like a product of the lack of governance in the south; we must establish a body that supervises this"
The digital space is especially vulnerable and threatened during an election period.
Bots, fake profiles and Facebook groups set up to spread ideas or discredit candidates are the main techniques used by various parties to influence voters.
Beyond these, there are also threats from enemy countries, and surprisingly, they are less interested in which candidate is elected; what matters more to them is undermining trust in various institutions.
“We assume that Iran is operating in this space. Iranian psychology experts and intelligence personnel are working to undermine the government,” says Lotem Finkelstein, director of Check Point’s research division.
"Iranian psychology experts are working to undermine the government."
Iranian President Raisi
"At the level of the networks, one can point to activity that does not correspond to the existing polls, such as a fringe party that we suddenly see very active on the networks. In terms of cyber, we are talking about disruption attacks, bringing down websites, for example the Election Commission's website that shows turnout percentages on election day. There are also image targets: 'We hacked a certain party's account, we hacked a politician's phone.' If someone was exposed to a cyber attack, this affects our consciousness. Iran's cyber militias, such as the 'Black Shadow' group, also act in accordance with the interests of the regime in Iran. The goal is a sense of insecurity in cyberspace, which will also make us question the credibility of the results. If the Election Commission's website is hit by a cyber attack, it may well lead residents to doubt the credibility of the results. It doesn't have to be right or left; the goal is to undermine credibility."
In the current election campaign, the Shin Bet warned of foreign interference in the elections. The Institute for National Security Studies warned of attempts to encourage demonstrations, amplified by Iranian bots, and of attempts to suppress voting in various sectors. The former head of the Shin Bet, Nadav Argaman, warned back in 2019 that "a foreign country will interfere in the elections.
I don't know for whose benefit or to whose detriment.
I don't know at this point how to identify the political interest, but it will intervene, and I know what I'm talking about." According to him, the foreign country will try to do this through hackers.
Has been investing in his social networks for years.
Lapid, photo: from Facebook
"Foreign intervention usually does not try to promote one candidate or slander another; the goal is to sow polarization and distrust in institutions, to inflame countries that are already internally polarized. Sometimes you see the same graphics that slander Netanyahu used against Bennett as well. The emphasis is to divide further and cause a loss of confidence in the democratic process. Highlighting police violence is another tactic, as is spreading fake news on election day, for example that there is an attack near polling stations, with the aim of suppressing voting. This is a main goal of foreign influence: spreading a fake that will disrupt things and make people not go out and vote," expands Dr. Assaf Viner, VP of Regulation at the Israel Internet Association.
Models "in the service of" Ben Gvir
In March 2021, US National Intelligence published a report accusing Russian President Vladimir Putin of meddling in the US election. "The Russian government acted to influence public perceptions in the US," the report said. In an article titled "Trolls for Trump: How Russia Is Trying to Destroy Our Democracy," researchers tracked down 7,000 pro-Trump accounts on social media. The study detailed how the accounts defamed critics of Russia's activities in Syria and spread lies about Hillary Clinton's health. The Washington Post reported that the Kremlin paid those accounts, trolls and bots, to defame Clinton.
“Russia tried to destroy democracy in the USA”: Putin and Trump, photo: AP
Even before Russia's alleged use of bots to bias the results of the US elections, the Philippines had already made use of the innovation, in 2016. The president of the Philippines, Rodrigo Duterte, admitted that he paid an army of trolls on social networks to help him during the election campaign. And back to the 2022 elections: the "FakeReporter" initiative reported on many fictitious Twitter accounts promoting MK Itamar Ben Gvir.
But this time the bots were "improved" and given the faces of different women, some of them models from outside Israel and some, as it turned out, porn actresses.
Following the publication, Twitter removed some of the profiles.
"Otzma Yehudit" claimed it was not familiar with such activity: "We understand that on the left you think all the comments are fake and that everyone is operating bots and artificial comments, but we are sorry to disappoint you."
"It's not by chance that pictures of young women with particular profiles are chosen for certain sectors; there is a lot of psychology shaping those campaigns," explains Finkelstein.
Meet “Bat Gvir” – “Karin”.
An example of a profile opened in 817 with a stolen photo of a German model.
Before the reconciliation with @bezalelsm last Friday, Karin began praising Ben Gvir and encouraging him to split from @zionutdatit and run for the Knesset on a separate list.
And she is not alone.
— FakeReporter (@FakeReporter) August 30, 2022
"No one exacts a price from those who publish fake news. There is no body that enforces it. Fake news is like a product of the lack of governance in the south; we must establish a body that oversees it," declares Dori Ben Israel, advertiser and founder of the 'Mazbala' website.
"The bots have gone out of proportion; any user who is not real is called a bot. If a person opens a profile on Twitter and tweets, there is still a person behind it, a 'sock puppet' account. 'Bot' is a buzzword. Bibi is the king of bots, but not the ones attributed to him; that's light years away. Bibi has now released a Telegram bot that answers you. I don't think anyone comes close to the budget the Likud invests. It may be that Netanyahu invests in one election round the budget that Gantz, Lapid and Shaked invest over five election rounds."
“King of the bots”.
Netanyahu, photo: from Facebook
Do you think that the networks in Israel have a big influence on the election results?
"For sure. This is the town square, a place for discourse and influence. Look at what Netanyahu has been running in recent months compared to other politicians, who push out individual messages. The networks are major platforms."
"You see a correlation between someone's popularity on social networks and what happens afterwards in practice. Bibi has a monstrous page and a presence on every platform. Shaked, on the other hand, barely passes the electoral threshold and is also very weak on social networks. Lapid is very dominant overall and very strong on the networks; he has been investing in his social networks for years. Gantz is an unusual phenomenon in that he is less influential on the networks, but he brings in mandates."
MKs' spending on public relations, 2021
In your opinion, can a presence on the networks influence young people's choice at the ballot box?
"It's not necessarily direct political influence; it's influence on the mindset of what's happening, and that is what influences them politically. Take young Arabs: it's known that YouTube and Facebook flood them with content that arouses all kinds of negative emotions about Israel or about Jews, and this lights the match for improper behavior. When you go to YouTube and it shows you videos, you don't know what the considerations behind them are," Dori concludes.
Why do the networks challenge us in the elections?
"Rapid, viral and non-transparent distribution of information. The whole phenomenon of fake news, without our knowing who is behind it. In the old world, if I read something in the newspaper or see something on TV, it is clear who is behind the expression. On the networks there is an arena of public discourse in which the voices heard can be biased, and we cannot always know who is behind them. There is also the ability to tailor persuasive messages on election matters to different types of people according to their psychological characteristics; consciously or unconsciously, we provide a lot of information on the networks," explains Assaf of the Internet Association.
"In Israel, they handle it less."
Meta CEO Mark Zuckerberg, Photo: AP
What’s the problem with bots?
"The bots are an opportunistic network economy: today they help one candidate and tomorrow they will help another. Because of this understanding that this is a double-edged sword, we have to condemn it. It distorts the election system, and it shifts according to the price. The more money a certain side has, the more reality on the networks is skewed toward that side," says Lotem from Check Point.
To deal with the dangers of the digital space during elections, three axes must do their work: the first, the enforcement authorities, the Election Commission and the police;
the second, the public, which must report;
and the last, the platforms themselves.
"There are a number of different bodies that are supposed to respond, depending on the type of matter. The security authorities are responsible for thwarting foreign influence. The Election Commission's main role is to enforce the labeling of propaganda broadcasts, such as finding out who is behind certain messages on the social network. And we depend on people's reports of illegal or misleading content on social networks," Assaf elaborates.
Who is responsible for handling and enforcement?
"Meta has community rules that it sets and enforces.
There are rules that prohibit fake accounts, and what we see, especially in Israel, is that the networks do not always enforce their community rules, and even make wrong decisions when it comes to non-English content.
Reports about accounts with fake names go unanswered, so we understand that there is a systematic failure to enforce these rules in Israel."
“An unusual phenomenon – less influential in the networks but brings mandates.”
Gantz, photo: from Gantz’s Twitter
"Facebook's content moderation systems, because they are built on artificial intelligence, are adapted to the Western languages they were trained on, and they are less successful with other languages. Therefore, it is clear to us that these things are not enforced. I don't know whether it is a lack of technological ability, a lack of resources, or priorities. Apparently the Israelis get a lesser product than their counterparts in Europe and the US.
For the past two years Meta has been preparing to maintain the integrity of elections in certain countries; it, too, understands the fear of abuse. It established teams in the US and Brazil to deal with fake news phenomena, and before that it did so in the Philippines and India. Our question: why not in Israel?"
Is the state doing enough?
"For the past two years there have been thought processes in Israel regarding the regulation of the networks, offensive content and fake news. A Ministry of Communications committee was established that is supposed to formulate an outline for regulating social networks, but there is still no proposal on the table from the government bodies. The state cannot produce a remedy until it understands where these failures come from. Meta does not provide information: how many reports are there in Israel? What is Meta's order of handling? Are these automated systems or personnel trained in handling?"
About a month ago, Meta admitted that it did not employ enough content reviewers to deal with the sharp increase in inflammatory posts during Operation Guardian of the Walls.
It is evident that the technology giant strives to learn lessons and improve, but it seems that the road is still long.
Meta's response: "Every day we block millions of fake accounts at registration, even before they are reported to us. We work with over 80 fact-checking organizations that check and rate content in more than 60 languages around the world, and in Israel we work with Reuters." Google did not respond to the claims about YouTube raised in this article.