Posted on behalf of Ian Brown
In Thursday’s “Meet the Author” session, Dame Wendy Hall (Executive Director of Southampton’s Web Science Institute) spoke to Prof. Phil Howard, Director of the Oxford Internet Institute, about his work in Web Science and his latest book, “Lie Machines”.
Dame Wendy introduced the topic of our dependency on information sourced from the Web (particularly during the Covid-19 pandemic) and the extent to which this information can be trusted.
Phil introduced the work of his Computational Propaganda group, which employs more than a dozen people focussed on improving the quality of public life through social data science. Their work is designed to counter the effects of political misinformation, thereby strengthening our trust in vital institutions and, more broadly, supporting democracy.
Phil’s current work focuses on what have been called “Lie Machines”. He defines these as a dyad of a social (political) element and a technical (algorithmic) mechanism, which together put an untrue claim into the infosphere in the service of an ideology.
Recent examples of Lie Machines include disinformation campaigns to undermine the UK 5G roll-out, to accuse Bill Gates of orchestrating Coronavirus, and to allege that governments are planning to implant RFID chips in the human population for tracking/control purposes. Whilst some claims may seem easily recognisable as fraudulent, others can be highly complex, nuanced stories existing and developing across multiple social platforms.
The OII have studied the activities of more than 30 countries using mixed methods (both quantitative and qualitative approaches), and whilst it is freely acknowledged that many countries (including the US and UK) engage in propaganda through social media, the research has shown particularly high levels of activity in Russia and, more recently, China. It is particularly notable where foreign governments are extremely active on social media platforms that are unavailable to their own citizens within their own country. Phil’s team have tracked the growth of state-sponsored propaganda programmes, rising from 28 countries in 2017 to 70 countries in 2019.
Tracking and attributing disinformation sources, and establishing where fake accounts are managed, is a key activity. Phil summarised their analysis, dating back to 2017, of more than 3,500 fake accounts managed in St. Petersburg, which were presented as US citizens apparently talking about politics. Many social media platforms are affected by this practice, and substantial quantities of fake account data have been handed to US authorities by Google, Facebook, Twitter and Instagram as part of investigations into election tampering.
For example, it has been suggested that key US swing states were targeted by circulating disinformation on how and when to vote, and by attempting voter suppression: encouraging key demographics (e.g. women, African American voters, Hispanic voters) to stay away from the polls by spreading discontent and disinformation about racial and gender issues.
The challenge, Phil argues, is two-fold:
1. Modern lobbyists (both domestic and foreign) now have access to affordable, ethically dubious methods to substantially further their political goals, and the repercussions/penalties for doing so (even if caught) are moderate.
2. The best quality and most complete data about the public (voters) is not in public hands but is privately held by Silicon Valley companies, and is widely sold and leveraged with little oversight.
Whilst these tools are not (yet) powered by AI, as protagonists learn to combine sources of big social data with deep fakes and other sophisticated techniques, Phil predicts that political campaigns will spend considerable amounts of money to employ such techniques to influence voters and win elections.
There are significant risks in allowing disinformation campaigns to run unchallenged.
– Undermining scientific guidance through conspiracy theories
– Generating racial and cultural tensions leading to civil unrest
– Undermining the trust of populations in public institutions
– Tampering with the process of democratic elections
What can we do?
Phil encourages us to get smarter about the systems we use: understanding what our devices are doing and sharing, curating our information sources, checking third-party sources rigorously before re-posting, and trying to identify and eliminate bots as information sources wherever possible.
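To make the advice about spotting bots a little more concrete, here is a toy sketch (our own illustration, not a method from the talk or from the OII) that scores an account against red flags commonly cited by researchers: an implausibly high posting cadence, a very young account, and a default profile image. The `Account` fields and thresholds are entirely hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Account:
    """Minimal account summary; all field names are illustrative."""
    age_days: int         # days since the account was created
    posts: int            # total posts made by the account
    default_avatar: bool  # still using the platform's default image

def bot_score(acct: Account) -> int:
    """Crude 0-100 heuristic: each common red flag adds to the score.
    Real research combines many more signals (network structure,
    content similarity, co-ordinated timing) than this toy does."""
    rate = acct.posts / max(acct.age_days, 1)  # posts per day
    score = 0
    if rate > 50:             # sustained >50 posts/day is unusual for humans
        score += 50
    if acct.age_days < 30:    # newly created account
        score += 30
    if acct.default_avatar:   # never personalised the profile
        score += 20
    return min(score, 100)

# Example: a three-week-old account posting ~100 times a day
suspect = Account(age_days=20, posts=2000, default_avatar=True)
print(bot_score(suspect))  # 100
```

A single heuristic like this produces plenty of false positives (prolific humans exist); its value for an individual reader is simply as a checklist of things to glance at before trusting or re-sharing an account’s claims.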
As we prepare for the next US election, labs around the world are studying misinformation and fake news and highlighting co-ordinated attempts to influence outcomes, but Phil warns there is a wider challenge.
We express ourselves through our data flows, and the inferences drawn about us are wider than just political. The profiles being built about us are continually refined through all our actions, and thus getting control of our data, Phil argues, is both vital and urgent if we are to avoid cynical or unethical manipulation by third parties.
Dame Wendy asked Phil to characterise or give examples of the best case and worst case scenarios in this space. Phil responded:
– The Worst Case Future. Our device manufacturers merge with our social network data holders (e.g. FB/Samsung become a single controller of both the sensors capturing social data and the holder of detailed social data profiles).
– The Best Case Future. The widespread availability of devices (separate from social platforms) that make explicit where your data is going and how it is used, combined with the ability to add or remove groups who can access your data, supporting those you trust (e.g. the NHS, academic groups researching for public benefit) and restricting access for groups harvesting data for commercial profit and manipulation purposes.
Philip N. Howard is a professor and writer. He is the Director of the Oxford Internet Institute at Oxford University and is the author of several books including, most recently: Lie Machines: How to Save Democracy from Troll Armies, Deceitful Robots, Junk News Operations, and Political Operatives from Yale University Press.
There is a special discount available for Phil’s book: use the code YLIES at the checkout to get a 30% discount.