
Project Overview and Status

The Law Commission of Ontario’s (LCO) multi-year AI, ADM and the Justice System project brings together policymakers, legal professionals, technologists, NGOs and community members to discuss the development, deployment and regulation of artificial intelligence (AI), automated decision-making (ADM) and algorithms, and their impact on access to justice, human rights and due process.

The catalyst for this project is the extraordinary growth in the use of these technologies by governments and public agencies across the world. AI and ADM systems are increasingly being used to make decisions affecting personal liberty, government benefits, regulatory compliance and access to important government services. The growth of these technologies has been controversial: questions about racial bias, “data discrimination,” “black box” decision-making and public participation have surfaced quickly and repeatedly when governments use AI and ADM systems. These and other concerns raise new and complex law reform issues that have not yet been addressed in Canada.

The LCO has assembled an expert Advisory Group to provide input over the course of the project.

Core Projects

The first phase of this project addresses several foundational issues that arise when this technology is introduced by governments, including human rights, due process, legal accountability and public participation.  Major initiatives in the first phase of this project include:

AI and Automated Decision-Making in the Criminal Justice System

AI and algorithms are increasingly being used to support police and judicial decision-making in criminal justice systems across the world.  In October 2020, the LCO published The Rise and Fall of Algorithms in the American Justice System:  Lessons for Canada.  This Issue Paper provides Canadians with important insights and lessons about the use of AI and ADM in criminal justice.  This paper addresses issues such as data discrimination, the “metrics of fairness”, “scoring”, algorithmic accountability, best practices and public participation.

More on the LCO’s AI and Automated Decision-Making in the Criminal Justice System project is available here.

AI and Automated Decision-Making in the Civil/Administrative Justice System

This project is the civil and administrative law counterpart of the LCO’s criminal justice system project. It considers the use of AI, ADM and algorithms in regulatory investigations, government benefit determinations, and decision-making support in the civil and administrative justice systems.

The LCO’s Issue Paper, AI, ADM and Government Decision-making, is expected in spring 2021.

More on the LCO’s AI and Automated Decision-Making in the Civil/Administrative Justice System project is available here.

Regulating Government Use of AI and Automated Decision-Making

Governments across the world are increasingly using AI and automated decision-making (ADM) systems to determine government entitlements, prioritize public services, support predictive policing and inform decisions regarding bail and sentencing.

The LCO’s Regulating AI: Critical Issues and Choices report is a ground-breaking analysis of how to regulate AI and automated decision-making (ADM) systems used by governments and other public institutions.  The report discusses key choices and options, identifies regulatory gaps, and proposes a comprehensive framework to ensure governments using AI and ADM systems protect human rights, ensure due process and promote public participation.

More on the LCO’s Regulating Government Use of AI and Automated Decision-Making project is available here.

Multidisciplinary Workshops on AI, Automated Decision-making and the Law

The LCO is organizing a series of collaborative, multidisciplinary workshops to discuss AI, automated decision-making and the law. The workshops bring together a wide range of stakeholders to learn collectively about the legal, operational, technological and practical challenges of developing, deploying and regulating these technologies. LCO workshops include:

  • In November and December 2020, the LCO organized a workshop in partnership with the provincial government’s Ontario Digital Service addressing disclosure, bias, due process and public participation in government AI and ADM systems. The LCO’s report, Legal Issues and Government AI Development, identifies major themes and practical insights to assist governments considering this technology.
  • In December 2019, the LCO hosted an invitational forum on automated decision-making in the civil and administrative justice system. The event brought together more than 30 policy makers, lawyers, jurists, technologists, academics, and community organizers to share experiences, discuss issues, and consider law reform options.
  • In March 2019, the LCO partnered with The Citizen Lab, the International Human Rights Program at the University of Toronto Faculty of Law and the Criminal Lawyers Association to host Canada’s first multidisciplinary forum addressing predictive policing, citizen profiling, and automated bail and sentencing.

The LCO’s AI, ADM and the Justice System project is partially funded by a Law Foundation of Ontario Justice and Technology Grant.

Other LCO Digital Rights Initiatives

Artificial Intelligence and Human Rights

On Tuesday, February 4, 2020, the LCO participated in York University’s Annual Inclusion Day, an event that focuses on the theme of belonging and explores ways to promote a greater sense of belonging at York. The event was a joint partnership among the LCO, York University’s Centre for Human Rights, Equity & Inclusion (REI) and the President’s Advisory Committee on Human Rights Sub-Committees.

From 10:00 AM to 12:00 PM, the LCO hosted a panel at the Helliwell Centre in Osgoode Hall Law School titled Artificial Intelligence and Human Rights at YorkU: A Panel Discussion on Impacts and Opportunities.

Moderated by Ryan Fritsch, Counsel with the Law Commission of Ontario, the panel of speakers included:

  • Insiya Essajee, Counsel, Ontario Human Rights Commission
  • Professor Trevor Farrow, Osgoode Hall Law School
  • Professor Regina Rini, Department of Philosophy, York University
  • Professor Ruth Urner, Department of Engineering & Computer Science, Lassonde School of Engineering

AI, Access to Justice and Legal Aid

In June 2019, the LCO presented a paper, AI and Automated Decision-Making: Impact on Access to Justice and Legal Aid, to the 2019 International Legal Aid Group global conference. The paper discussed the impact of technology on access to justice for low-income communities and presented ideas for how legal aid plans can respond.

 

AI for Lawyers: A Primer on AI in Ontario’s Justice System