TAS Governance Node

This Node explores how to devise frameworks for the regulation of autonomous systems, helping them respond to the complexities of the world in which they operate.

For autonomous systems to become trustworthy, frameworks of informal and formal governance must be co-designed with the systems themselves. Our Node explores the design of frameworks for responsibility and accountability, along with tools that help the ecosystem of developers and regulators embed these values in systems.

Our Node brings together computer science and AI specialists, legal scholars, and AI ethicists, as well as experts in science and technology studies and design ethnography. Together we are developing a novel software engineering and governance methodology that includes:

  • New frameworks that bridge the gaps between legal and ethical principles (including emerging questions around privacy, fairness, accountability and transparency) and an autonomous systems design process driven by rapid iterations of emerging technologies (including, e.g., machine-learning-in-the-loop decision-making systems)
  • New tools for an ecosystem of regulators, developers and trusted third parties that address not only functionality and correctness, but also how systems fail and how the associated evidence can be managed to facilitate better governance
  • An evidence base drawn from full-cycle case studies of taking autonomous systems through regulatory processes, as experienced by our partners, to inform policy discussion of reflexive regulation practices.