The fake news pandemic

The Digital Alarmist

Roger Marshall is a computer scientist, a newly minted Luddite and a cynic

It may come as no surprise to the reader that anyone with rudimentary computing knowledge and a social media account can these days create and disseminate fake news. What should really surprise the reader, however, is that it can all be done in real time, the societal implications of which are still being analyzed by select academic institutions and some of the news media.

Of all the digital diseases to afflict a global society interconnected by a world of smart phones and assorted digital devices, the most cancerous is the one caused by totally believable fake news. Fake news created in real time differs from ‘ordinary’ fake news in that it is purely dynamic and enables hackers (State-sponsored or otherwise) to introduce or suppress digital content while an event is being live-streamed. It is a pandemic on the horizon and represents an extraordinary threat to democratic institutions, national and global financial stability, and communal harmony. Above all, it can start wars, intentionally or otherwise.

In the social media world of Facebook, WhatsApp, YouTube and Twitter, subscribers tend to look only at the positive aspects of staying ‘connected’ with their family and friends and conveniently ignore the negative ones. This Pollyanna-ish approach to the social web leaves users quite unprepared to deal with potential misuse of the minute details of their lives and persona which they have willingly posted through audio, video and text messages.

What does it take to create and propagate fake news in real time? The recipe is quite simple. It involves just three steps and is akin to starting a forest fire. To start a forest fire, all you need to do is gather some wood, light a match and fan the flames. Likewise, to start a fake news epidemic, you first pick some incident, then suitably modify the data associated with it and post the manufactured version on social media, and lastly enable trolls and chatbots to rapidly propagate the fictitious incident until a ‘trending’ or ‘recommended for you’ status is achieved on various social media, at which point the fake news attains a life of its own. The incident in question may involve a real person, a real place and a real event currently in progress or one which may have occurred in the past. The sustainability of the fake incident on the web is determined far more by how believable the fake version is than by how long it stays undetected.

Totally believable fake news can have lethal political consequences, as the following example shows. Assume candidate X is competing against incumbent Y in some election. If X’s remark “Y wants to raise taxes. Vote him out” shows up in an audio-visual segment on the web as “Y wants to raise taxes. Take him out”, two scenarios are likely. In the first, Y gets killed because some rabid supporter of X thought he was following X’s orders. In the second, Y concludes that X wants him killed and asks the authorities to prosecute X for advocating violence. Note that exactly one single-syllable word has been changed in the original remark to yield the disastrous modified version.

While simple alterations in a text message can drastically change its meaning, such alterations are easily detectable. The same cannot be said of altered sound and image patterns. Tampering with audio and video recordings requires accurate splicing techniques to match sound patterns for word insertion or deletion, while simultaneously coordinating these changes with facial movements involving the speaker’s mouth, jaw, lips and eyes. Making what a speaker is shown doing or saying on the web believable requires no more than 30 minutes of audio recordings of the speaker and several thousand images of the speaker taken from various angles. This data is readily available from social media postings, as is a variety of open-source AI-enabled facial recognition and speech recognition software.
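To see why textual tampering is the easy case, consider a minimal sketch in Python using the standard library’s difflib module. It compares the two hypothetical campaign remarks from the example above word by word; the single substituted word surfaces immediately, which is exactly what makes text alterations detectable in a way that spliced audio and video are not.

```python
# A word-level diff exposes the single substituted word in the
# (hypothetical) campaign remark from the example in the text.
import difflib

original = "Y wants to raise taxes. Vote him out".split()
tampered = "Y wants to raise taxes. Take him out".split()

matcher = difflib.SequenceMatcher(a=original, b=tampered)
for tag, i1, i2, j1, j2 in matcher.get_opcodes():
    if tag != "equal":
        # Report only the altered span.
        print(tag, original[i1:i2], "->", tampered[j1:j2])
# → replace ['Vote'] -> ['Take']
```

No comparable one-liner exists for audio or video: detecting a spliced recording means aligning waveforms and facial landmarks frame by frame, which is what makes believable audio-visual fakes so much harder to catch.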

In this era of fake news, the adage ‘Disbelieve everything you see or hear’ seems appropriate – it is not all that different from the code of conduct symbolized by the three wise monkeys (‘see no evil, hear no evil, speak no evil’). To which we might add a fourth: ‘do no evil’, an echo of Google’s one-time motto ‘Don’t be evil’, which the company quietly retired years ago.
