#websci21: Review of Day 2

Data Literacy Workshop

A number of themes were raised in the Data Literacy workshop, which took place at WebSci21 on 22 June. These included questions about data literacy and its place in the classroom, how adults from disadvantaged backgrounds can access it, and the ethics of data literacy in fields such as data science. Evidence of the need for these discussions came from one academic who described teaching data literacy to younger children and being met with questions such as “where does data go?” and “how will automation change things?”

The workshop raised a number of questions about how to bridge the gap between working with data and understanding the power and meaning that data actually holds. A key question concerned the definition of data literacy itself: is it about digital skills, or does it extend to concepts such as justice and ethics? There was also an engaging discussion on rolling out a Europe-wide framework for data literacy, though it was clear that a key challenge lies in the different structures and purposes embodied by individual frameworks.

Web Science Research Infrastructure Workshop 

This workshop embodied the interdisciplinary discussions championed by the Web Science ethos. A digital humanities angle was present in the material and research, with a number of scholars undertaking work that required archival, historical, or musical knowledge. The presentations ranged from research on modelling infrastructures on the Web, proposing tools to help create diagrammatic forms of connection and interaction, to interdisciplinary approaches that use knowledge graphs to open new lines of thinking on creative sonification. There were also discussions on the importance of creating, storing, and maintaining online archives on the Web.

“Coded Bias” Spotlight Panel

The film “Coded Bias” reveals how facial recognition software and automated decision-making have unprecedented power to reproduce bias at scale. As companies and governments increasingly outsource their services to entities that rely on machines and machine learning, algorithms are being used to decide what information we see, who gets hired, who gets health care, and who gets undue police scrutiny. The panel discussed the steps that companies and individuals can take to change the way ordinary human bias and ignorance are encoded into our digitally driven world. It is critical that we help machines avoid the mistakes that people have historically made.

The BBC World Service Digital Planet programme featuring our panel was broadcast two hours later.
