It’s a fair bet that the New Zealand general election will not cause much of a ripple inside 55 Savushkina Street.
That is the St Petersburg address of the headquarters of the Russian Internet Research Agency (pictured above), which played havoc with the Brexit vote and the US Presidential election.
It may well be gearing up for another campaign among the all-too-susceptible voters of the United States, but we can be reasonably certain that the only danger we face is a bit of mischief during a troll’s lunch break.
New Zealand voters are far more likely to be targeted by home-grown trolls. It has already begun.
Over the weekend a billboard that looked like a Labour Party hoarding was posted on Facebook. The content was such that it obviously was not created by the party. The wording and the image were a slap in the face to Labour.
A quick Google image search threw up what I am fairly sure is the same image as on the poster – a photograph of Jacinda Ardern taken by a Stuff photographer. Easy enough to cut and paste from the web.
I am no expert in digital manipulation detection but, in order to check whether the text had been added to an image, I submitted the image to fotoforensics.com, which conducts a pixel-by-pixel analysis. Its Error Level Analysis (ELA) suggested to me that it was the product of a Photoshop session. Creating or altering an image in Photoshop leaves the affected area with a different compression level from the rest of the image, which shows up as a rainbow effect in the ELA output. Here is the image and the ELA analysis. Make up your own mind.
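For the technically curious, the idea behind Error Level Analysis is simple enough to sketch. What follows is a minimal illustration of the general technique – re-save the image as a JPEG and look at where the compression error differs – not fotoforensics.com's actual implementation, and the filenames are hypothetical:

```python
# A minimal sketch of Error Level Analysis (ELA).
# Requires the Pillow imaging library (pip install Pillow).
from PIL import Image, ImageChops, ImageEnhance
import io

def error_level_analysis(source, quality=90, scale=15):
    """Re-save the image as JPEG and diff it against the original.

    Areas edited after the photo's last save recompress differently,
    so they stand out as brighter regions in the difference image."""
    original = Image.open(source).convert("RGB")
    buf = io.BytesIO()
    original.save(buf, "JPEG", quality=quality)
    buf.seek(0)
    resaved = Image.open(buf)
    diff = ImageChops.difference(original, resaved)
    # The raw differences are usually faint; amplify them for viewing.
    return ImageEnhance.Brightness(diff).enhance(scale)

# Hypothetical usage:
# ela = error_level_analysis("billboard.jpg")
# ela.save("billboard_ela.png")
```

Uniformly dark ELA output suggests the whole image was saved in one pass; bright, sharply bounded patches suggest something was pasted in or added afterwards – though, as the forensics sites themselves caution, the result needs human interpretation.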
Why did I go to all that trouble over an image that was so obviously a fake, particularly when a version of the poster itself appeared on the Cameron Slater-linked website The BFD on Saturday? I did so to demonstrate what facilities are readily available to media (and the public) to detect fake images. Similar services such as Deepstar are available to detect fake videos, although that is proving to be a cat-and-mouse game as artificial intelligence improves the quality of deepfakes (most, nonetheless, remain reasonably easy to detect with the right software).
The detection software is available, and it is vital that its use becomes routine for news media that increasingly treat the Internet as a direct source.
Those direct sources are not, of course, limited to lifting images from social media. Far more routine is the use of social media as a source of information. Just think of the number of heart-rending tales you have read that, usually near the end, happen to mention the unfortunates have set up a Givealittle page for donations. The online fundraising platform is a rich source of pain and misery for click-hungry newsrooms.
Social media will be trawled throughout the election campaign by reporters looking for angles that take the political contest beyond dry policy and he-said-she-said. This time there are referenda on recreational marijuana and euthanasia that also provide rich veins for prospecting on social media.
Politics has long been stock-in-trade for local trolls, and journalists have become skilled at weeding them out from legitimate discourse. However, Russia’s Internet Research Agency (IRA) has become adept at hiding in plain sight, and its tactics will have spread to other sources of disinformation.
Two researchers in South Carolina, Darren Linvill and Patrick Warren, examined IRA tweets generated during the 2016 presidential election and found patterns emerging. Some were obvious, such as targeting people with particular world views and shaping messages accordingly. However, their findings on what happened to the messages over time are revealing.
Initially, trolls produce original material and replies to establish themselves in a ‘network’. As the ‘network’ matures, they shift to a mix of original material and retweets, and their direct engagement declines over time. Then, as things start to run out of steam, the IRA trolls switch to what the researchers called an ‘amplification’ phase. It is here that their activity becomes harder to identify as their work. They begin to retweet large volumes of messages from outside sources that have picked up on the themes the trolls created and amplified them in their own words. The researchers point to the importance of the part these amplifiers play in the spread of disinformation: the trolls’ purpose is not to make themselves prominent but to promote real voices articulating implanted information.
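The shift the researchers describe – from original posting to amplification – would show up as a rising share of retweets in an account's timeline over time. The sketch below is a hypothetical illustration of that signal, not Linvill and Warren's actual methodology:

```python
# Hypothetical illustration of the lifecycle pattern described above:
# as a troll account moves into its 'amplification' phase, the share
# of retweets in each successive time window rises.
from datetime import datetime, timedelta  # timestamps for the tweets

def retweet_share_by_window(tweets, window_days=30):
    """tweets: list of (timestamp, is_retweet) pairs, oldest first.

    Returns the fraction of retweets in each consecutive window,
    so a trend from low to high suggests a move into amplification."""
    if not tweets:
        return []
    start = tweets[0][0]
    buckets = {}  # window index -> (total tweets, retweets)
    for ts, is_rt in tweets:
        idx = (ts - start).days // window_days
        total, rt = buckets.get(idx, (0, 0))
        buckets[idx] = (total + 1, rt + int(is_rt))
    return [rt / total for _, (total, rt) in sorted(buckets.items())]
```

A timeline that returns, say, 0.1, 0.4, 0.9 across three months fits the pattern of an account winding down its own voice and boosting others'.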
Journalists may not detect the relationship with those original sources and may thereby legitimise what may be falsehoods. The amplifiers may be the disinformation equivalent of money launderers, and sometimes that legitimisation will be as unwitting as a reporter’s subsequent use of the material.
Amplification does not even require the messenger to be one of the troll’s fellow travellers. Retweeting or sharing by detractors may still achieve the original intention. Which brings us back to the ‘Labour’ billboard. When I saw it at the weekend, it had not been posted on my page by its creator. It had been shared by someone revealing what they believed to be dirty tricks. It is the nature of social media that the image will now be crawling its way around Facebook in a hit-or-miss journey that will find its bogus message resonating with a certain percentage of those who see it.
No doubt it has also been seen by reporters, who will dismiss it as satire or unsubtle mischief-making. It will have already been dismissed by Labour supporters and embraced by the party’s detractors.
Who is to say, however, that the next message will not be cloaked in greater subtlety and carry a more insidious message? We have seen the power of email messages alleging wrongdoing among our politicians. Hamish Walker, then Andrew Falloon, then Iain Lees-Galloway met their political end, which will have emboldened the dark side of social media.
Media lack the means – and the legal authority – to track external email trails but they do have the means to analyse social media. Software developed to allow commercial enterprises to track the success of social marketing campaigns can be repurposed. The South Carolina researchers, for example, used such a platform (Social Studio) to analyse the Russian IRA tweets.
The fact that New Zealand will not attract the undivided attention of the IRA (or of Chinese troll factories) is no cause for media complacency. Disinformation is an ever-present danger no matter where one lives in this post-truth world. And, because politics is the art of manipulation in pursuit of power, general elections are fertile ground.
The danger is that vigilant surveillance will be a task too far for newsrooms struggling to maintain output in the wake of Covid-driven redundancies. That makes them vulnerable. A collective effort and pooling of resources may be the only way to mount a defence.
Social media users themselves will divide into two groups: Critical thinkers who will not take what they see at face value but will apply reasoned judgment; and those who will believe what they want to believe – irrespective of whether it is fact or fiction – and reject anything that does not fit with their world view. Like Covid-19, there is no known vaccine to protect the latter group from being played for fools.
Not all images will be as obvious as the Labour billboard. Who is to say that someone was present when a photograph suggests they were not? Or that two people apparently in deep conversation never actually met on that occasion? Here are two examples. The first is a crude Soviet example of the former: Nikolai Yezhov, a secret police official, was purged by Stalin in more ways than one. The second demonstrates how to create an impromptu summit out of an empty chair.