And now for the really fun part…ethics and the Web

Professor Dame Wendy Hall opening the event

This post was authored by Anni Rowland-Campbell and originally appeared on her blog, Intersticia

Last week the Web Science Institute hosted its first Ethics Symposium.  The event drew together a wonderfully diverse audience, who were treated to two excellent presentations from Woodrow Hartzog and Mireille Hildebrandt, plus a vigorous panel discussion moderated by Thanassis Tiropanis.

Over the past six months Thanassis, Caroline Wilson, Leanne Fry and I have been working on our Ethics, Law and the Web Observatory project, of which this symposium was a key component.  The project seeks to “develop legal and ethical best practice in the deposit, sharing and re-using of deposited data in Web Observatories” as an “exemplar Social Machine”.

Professor Dame Wendy Hall opened the seminar by describing Web Science as “Data Science on steroids”.  Why is this?  Because Web Science seeks not only to understand the technical aspects of the Web as it develops, but simultaneously to contextualise these within the dynamic social frameworks of humanity itself.

As the Web is a living entity which is constantly changing and evolving, so Web Science is ever-changing and interdisciplinary in its approach.  It is a boundary object: plastic enough to adapt to differing needs and constraints, yet robust enough to maintain a common identity.

So, what does this mean in practice?  And how does it help to understand what the Web might look like in the future?

I am not a technologist, but for me the Web is quite well defined by this entry on Wikipedia, which states that

(t)he World Wide Web (WWW) is an information space where documents and other web resources are identified by URLs, interlinked by hypertext links, and can be accessed via the Internet.
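
To make that definition concrete, here is a minimal sketch in Python (my own illustration, using only the standard library; the URL and user-agent string are just examples): it fetches a document identified by a URL over the Internet and lists the hypertext links that connect it to other resources.

    # Fetch a document by its URL and list the hypertext links it contains:
    # the three ingredients of the Wikipedia definition above.
    from html.parser import HTMLParser
    from urllib.request import Request, urlopen

    class LinkCollector(HTMLParser):
        """Collects the href of every <a> tag, i.e. the hypertext links."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    url = "https://en.wikipedia.org/wiki/World_Wide_Web"  # the resource's identifier
    request = Request(url, headers={"User-Agent": "web-example/0.1"})
    page = urlopen(request).read().decode("utf-8", errors="replace")  # accessed via the Internet

    collector = LinkCollector()
    collector.feed(page)
    print(f"{url} links to {len(collector.links)} other resources, for example:")
    for link in collector.links[:5]:
        print("   ", link)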

Originally the Web consisted of HTML pages rendered on computer screens, but as our “screens” change, so do the information spaces that feed them.  Nowhere is this more obvious than in the rapidly evolving areas of virtual, mixed and augmented reality, where the digital and physical worlds are becoming increasingly blended and symbiotic.

This is where the importance of design comes into play, and in his opening address Woodrow Hartzog gave an insightful and quite disturbing presentation on the co-evolution of design and information architecture, stressing that

  1. Design matters for privacy
  2. Privacy Law should take design seriously
  3. Design should be rooted in both Consumer Protection and Surveillance Law.

Why is this so?  Because, quite simply, Design is Everywhere.  Everything is Design.

This was music to my ears because it links to J.J. Gibson’s Theory of Affordances, which states that

the world is perceived not only in terms of object shapes and spatial relationships but also in terms of object possibilities for action (affordances) — perception drives action.

Building on Gibson, in The Design of Everyday Things Donald Norman examines the relationship between design and affordances, and concludes that, at least in theory, good design should make affordances explicit.
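
The same idea carries over to software design.  As a loose illustration of my own (this is my analogy, not Norman’s example), compare a Python function whose affordances are hidden behind cryptic flags with one whose signature makes the available actions explicit:

    from enum import Enum

    # Affordances left implicit: nothing in the interface tells the caller
    # what actions are invited, or what the magic values mean.
    def process(data, mode=0, flag=True):
        ...

    # Affordances made explicit: names and types advertise the available
    # actions and the consequences of each choice.
    class Compression(Enum):
        NONE = "none"
        GZIP = "gzip"

    def compress_and_save(records: list,
                          compression: Compression = Compression.NONE,
                          overwrite_existing: bool = False) -> None:
        """Save records to disk; the signature itself signals what the
        function affords and what each parameter controls."""
        ...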

So, what does this mean for our digital artefacts and the experiences they generate?  Norman examines this in The Design of Future Things as a way to begin to understand how our intelligent objects of the future may impact on our behaviours, but this is a space that we are only just beginning to appreciate and comprehend.

In my Digital Savvy workshops I focus on getting participants to become conscious of the link between experience and affordance, with particular reference to the differing affordances of everyday information tools in physical (analogue) versus electronic (digital) form.  This could be a press release, a book, a legal notice or a contract.

Hartzog’s key message is that

in each case there is a direct relationship between transaction costs and privacy.

So many times people have told me that they happily use social media apps because the apps are free; so many times I hear myself trying – mostly in vain – to explain that this is not the case: you are paying for them with your data.  The problem is that, as yet, most people don’t attribute value to their data, and it is difficult for people to imagine something which is intangible and of which they have no direct experience.

So, here are two examples of emerging digital currencies which I have found useful.

Firstly, I recently gave a presentation to the Board of the Money Advice Trust, where we talked about emerging forms of monetary value.  The example I gave was Pavegen, a start-up company that not only creates smart flooring tiles which generate electricity, but also has a mobile app in which each footstep collected is converted into a digital currency that can be used to reward loyalty or to donate to charitable causes.  In other words, they directly convert footsteps into electricity, and electricity into a monetary currency which can be transacted.
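
As a back-of-the-envelope sketch of that pipeline (the conversion rates below are invented purely for illustration and are not Pavegen’s actual figures):

    # A purely illustrative footsteps -> energy -> currency pipeline.
    # Both constants are hypothetical, not Pavegen's real numbers.
    JOULES_PER_STEP = 3.0        # assumed energy harvested per footstep
    CREDITS_PER_KILOJOULE = 0.5  # assumed exchange rate into app credits

    def steps_to_credits(steps: int) -> float:
        """Convert a count of footsteps into transactable digital credits."""
        energy_kilojoules = steps * JOULES_PER_STEP / 1000.0
        return energy_kilojoules * CREDITS_PER_KILOJOULE

    # A commuter's 10,000 steps become a small, donatable balance.
    print(f"10,000 steps -> {steps_to_credits(10_000):.2f} credits")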

Secondly, the current hype around Pokémon Go illustrates how Nintendo has created an addictive game that merges the physical and digital worlds, generating huge amounts of data to be mined and monetised.  Initially the game had a huge impact on Nintendo’s share price, but the more important point is the one raised by director Oliver Stone, who describes Pokémon Go as “a new level of invasion” demonstrating the commercial and invasive power of Surveillance Capitalism (echoing Zuboff’s latest ideas).  For an explicit example of this, just consider the reported statistic that some 85% of every new dollar spent on online advertising in the US goes to … you guessed it, Google.  And a huge part of that is due to the stickiness and ease of use of Google’s well-designed user interface.

What this demonstrates only too clearly is that design is all-important, because most people have no idea how their behaviours are being manipulated by the social machines with which they interact.

So, who makes the decisions about how the Web is designed and how users experience it?

During the panel discussion Caroline Wilson stressed that whilst the law is black and white – something is either legal or not – ethics is a moving target, linked to evolving and changing societal values and beliefs.  Just because something is acceptable one day does not mean it is acceptable the next, particularly as society becomes more informed and conscious of what is going on around it.

Nowhere is this more apparent than in the area of our learning social machines and artificial intelligence.  In her presentation Mireille Hildebrandt focused on data-driven agency: the emerging world in which things (as in machines) perceive their environment, receive feedback, learn, and then teach themselves.  Already these machines are learning to anticipate human behaviours, and to a very large extent we are becoming data bait – the cognitive resources for data-driven applications – with our behaviours being modified to better suit the machines.
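
To make “data-driven agency” a little more concrete, here is a toy sketch of my own (not Hildebrandt’s formalism): a perceive, act, feedback, learn loop in which a system tries different prompts on users, observes the clicks, and teaches itself which prompt best modifies behaviour.

    import random

    # A toy perceive -> act -> feedback -> learn loop: an epsilon-greedy
    # "machine" that learns which prompt a user is most likely to click.
    prompts = ["discount", "social proof", "urgency"]
    clicks = {p: 0 for p in prompts}  # observed feedback per prompt
    shows = {p: 0 for p in prompts}

    def choose_prompt(epsilon: float = 0.1) -> str:
        """Mostly exploit what the data says works; occasionally explore."""
        if random.random() < epsilon or not any(shows.values()):
            return random.choice(prompts)
        return max(prompts, key=lambda p: clicks[p] / max(shows[p], 1))

    def record_feedback(prompt: str, clicked: bool) -> None:
        """The environment's feedback, from which the machine learns."""
        shows[prompt] += 1
        clicks[prompt] += int(clicked)

    # Simulated users with fixed (hypothetical) click propensities: over
    # time the loop converges on the most behaviour-modifying prompt,
    # with the humans serving as the training signal.
    propensity = {"discount": 0.05, "social proof": 0.12, "urgency": 0.08}
    for _ in range(1000):
        p = choose_prompt()
        record_feedback(p, clicked=(random.random() < propensity[p]))
    print({p: round(clicks[p] / max(shows[p], 1), 3) for p in prompts})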

Samuel Arbesman states that we are entering the entanglement age, in which it is becoming increasingly difficult for humans to understand how the systems we have built actually work, due both to the incremental complication of legacy systems over time and to the increasing complexity of the machines themselves.

The truth is that we are now living in a human-constructed jungle, and it is crucial that our laws are used to protect us from our creations as much as from ourselves.  In her presentation Mireille emphasised that we need to create a level playing field, where both the companies who make and sell the social machines and the users who provide the data understand and appreciate the contractual relationship they are entering into, and where that relationship is made transparent through responsible, articulate and judicious design.

Is this a tall order?  I fervently hope not, but what is required is that the best minds on the planet now focus themselves on these questions and collectively educate those charged with making and enforcing public policy about the potential scenarios that lie ahead.  This includes not just the existential threats to humankind foreshadowed by some of our great scientists, but, more likely, the fact that

If we don’t begin reprogramming technologies with a different set of values, we may lose the opportunity to take back control — and maybe even lose our humanness itself.  (Douglas Rushkoff)

I am thrilled that the Web Science Institute has now entered into this conversation, and a huge congratulations to all who organised the morning, which generated a good deal of thoughtful conversation and debate.

But we have only just begun.  As Arbesman concludes

We must now walk humbly with our technology.
