<h1>#WebSci20 &#8211; Workshop &#8220;Explanations for AI: Computable or not?&#8221; by Robert Thorburn</h1>
<p><em>Posted on behalf of Robert Thorburn</em></p>
<p>Day two of the 2020 Web Science conference saw a series of workshops covering topics ranging from cyber crime to digital (in)equality. The fourth of these workshops, chaired by Prof Sophie Stalla-Bourdillon, investigated whether explanations for AI are computable. Focus areas included the participation of AI systems in socially sensitive decision-making and how to approach such systems when they function as black boxes.</p>
<p>The workshop took the form of four papers and associated discussions, with the University of Southampton&#8217;s Kieron O&#8217;Hara opening proceedings with his paper entitled &#8220;In No Circumstance Can or Should Explanations of AI Outputs in Sensitive Contexts Be Wholly Computable&#8221;. O&#8217;Hara advanced the position that computed accounts of the outputs of AI systems cannot, on their own, serve as explanations of the decisions those systems make. This position rests, in part, on an understanding of computation as an act of derivation that relates only to elements contained within the system under study. Legislative requirements, such as those of the GDPR, complicate the issue further.</p>
<p>Next, Jennifer Cobbe from the University of Cambridge presented her paper entitled &#8220;Reviewable Automated Decision-Making&#8221;. Reviewability was presented as a broader and more inclusive counterpart to traditional auditability.
Specifically, reviewability takes in deployment context, engineering decisions, and other factors to form a socio-technical understanding of automated decision-making.</p>
<p>Rounding out the paper presentations, Perry Keller and Gerard Canal, both from King&#8217;s College London, introduced their papers, respectively titled &#8220;Paternalism in the Public Governance of Explainable AI&#8221; and &#8220;Trust in Human-Machine Partnerships&#8221;. Both papers again explored the richer, but also more challenging, understanding of AI decision-making that results when a socio-technical approach is taken.</p>
<p>Directly following the paper presentations, Niko Tsakalakis from the University of Southampton introduced the Provenance-driven &amp; Legally-grounded Explanations for Automated Decisions (PLEAD) project. PLEAD is an interdisciplinary undertaking aimed at explaining &#8220;<em>the logic that underlies automated decision-making</em>&#8221;. Notably, the project aims to provide a provenance interface.</p>
<p>Finally, the workshop concluded with an open discussion in which the presenters engaged both with each other&#8217;s points and with questions raised by attendees. A first key point of discussion was the role of data provenance in understanding AI functionality, specifically with regard to data flows across an organisation. It was also noted that this process could be undermined by inconsistencies in data treatment and naming conventions. The discussion of data flows over time then led into a consideration of real-time AI, where it was proposed that explaining AI decision-making in real time is not truly feasible at present, owing to technical limitations.
This is, however, mitigated by the fact that such a request is highly unlikely: explanations are generally requested after the fact.</p>