Project Overview

The Law Commission of Ontario Criminal AI Lifecycle Project is the first legal and policy reform project in Canada to review how artificial intelligence (AI) impacts each stage in a criminal justice matter.

The five stages are:

  • police investigations
  • Crown consideration of charges
  • the role of risk assessment in bail, sentencing, and community diversion
  • trial and appeal, and
  • systemic oversight.

The “lifecycle” approach has two goals. First, to help government and criminal justice sector institutions be proactive about the risks and benefits of AI for access to justice, due process, human rights, and civil liberties. Second, to help these institutions identify and adopt concrete, practical, and complementary approaches to the shared challenges of AI.

Why is this project timely?

Around the world, police, prosecutors, judges, corrections officers, and community safety services increasingly use AI systems to support their decision-making. Yet AI-informed choices raise profound questions about impacts on access to justice, due process, human rights, and civil liberties.

Early – and in some cases controversial – examples of AI in Canadian criminal justice include facial recognition, predictive policing, decryption of digital devices, and identification of high-risk inmates.

Jurisdictions outside Canada further employ AI to conduct sentencing and bail assessments, perform probabilistic DNA analysis, and profile public activity on social media.

Canada has yet to widely adopt these AI technologies, but experience from other jurisdictions suggests this is likely to change. That experience also demonstrates that AI is often implemented unevenly. In criminal justice, uneven implementation leads to conflicts between institutions that create barriers, delay proceedings, and may fail to protect rights.

The LCO believes a thoughtful and intentional approach is necessary to introduce AI into Canada’s criminal justice system. Crucially, the introduction of AI must also support established legal rights and principles. This requires a legal and policy framework that anticipates the risks and benefits of AI systems. It also requires a framework that is systemically aware, helping institutions adopt complementary responses to the shared challenges of AI.

To achieve these objectives, the LCO expects its five-part review to engage with a wide array of potential issues. For example, the papers may consider:

  • the unique characteristics of AI in supplementing or supplanting human decision-making in relation to Charter rights and due process
  • how to be proactive and manage the impacts of AI through instruments like directives, impact assessment tools, procurement policies, and policy frameworks, which may be specific to criminal law and distinct from similar instruments that guide government decision-making, administrative and civil law, and the private sector
  • the litigation resources necessary to ensure a full and fair defence
  • the use of AI for good, such as monitoring justice systems for bias and discrimination, or more quickly scheduling proceedings
  • requirements and standards of AI transparency, notice, disclosure, data bias, explainability, and performance validation
  • gaps in existing legislation such as the Criminal Code of Canada, Evidence Act, Provincial Offences Act, and others
  • the assertion of proprietary trade secrets
  • policies that guide the Crown
  • the role of public participation and other external technological oversight mechanisms
  • wrongful convictions, systemic human rights, and civil liberties impacts
  • the role of expert witnesses
  • the KYT (“know your tech”) training needs of judges, Crowns, defence counsel, and other court officers to ensure a robust role for human-in-the-loop oversight and safeguards against undue deference to AI recommendations


How will the project develop and who is participating in it?

The LCO has invited leading practitioners and experts to author reports on each stage considered in the lifecycle analysis. The reports will identify issues and challenges raised by AI and pose key questions and options for legal and policy reform. The authors will work on the reports in parallel to better identify and discuss themes and issues that cut across the different stages. Publication of the five reports will support broad and inclusive public consultations on the identified issues and options.

The authors of the five papers are:

Police Investigations
  • Lynda Morgan, Defence Counsel, Addario Law Group, and Co-Chair, Osgoode Hall Annual TechCrime Program
  • Dubi Kanengisser, Policy Advisor, Toronto Police Services Board

Crown Consideration of Charges
  • Alpha Chan, Detective and CISO, Toronto Police Service
  • Mabel Lai, Crown Law Office – Criminal, Ministry of the Attorney General

Risk Assessments in Bail, Sentencing, and Community Diversion
  • Gideon Christian, Professor of Law, University of Calgary
  • Dina Zalkind, Criminal Policy Counsel & Criminal Duty Counsel, Legal Aid Ontario

Trials and Appeals
  • Paula Thompson, Strategic Initiatives, Ministry of the Attorney General
  • Eric Neubauer, Defence Counsel, Neubauer Law, and Co-Chair, Criminal Lawyers Association Technology Committee

Systemic Oversight Mechanisms
  • Brenda McPhail, Director, Privacy, Technology & Surveillance Program, Canadian Civil Liberties Association
  • Jagtaran Singh, Legal Counsel, Ontario Human Rights Commission
  • Marcus Pratt, Director of Policy, Legal Aid Ontario

Advisory Committee

An external Advisory Committee also supports the project. Advisory Committee members meet from time to time to review the work of the project and offer advice and guidance. The members of the Advisory Committee are:

  • Marcus Pratt Legal Aid Ontario
  • Dina Zalkind Legal Aid Ontario
  • Rosemarie Juginovic Chief Justice, Ontario Superior Court of Justice
  • Gerald Chan Stockwoods LLP Barristers
  • Lynda Morgan Addario Law Group
  • Eric Neubauer Neubauer Law
  • Dubi Kanengisser Toronto Police Services Board
  • Alpha Chan Toronto Police Service
  • Brenda McPhail Canadian Civil Liberties Association
  • Jane Mallen Ministry of the Attorney General, LCO Board of Governors
  • Mabel Lai Ministry of the Attorney General
  • Paula Thompson Ministry of the Attorney General
  • Diana Grech Strategic Analytics Unit, Ministry of the Attorney General
  • Rosanna Giancristiano Director, Court Operations, Ministry of the Attorney General
  • Michelina Longo Director, External Relations, Ministry of the Solicitor General
  • Jessica Mahon Ministry of the Solicitor General
  • Michael Swinburne Senior Policy Advisor, Canadian Human Rights Commission
  • Jagtaran Singh Ontario Human Rights Commission
  • David Murakami Wood Incoming Professor, Department of Criminology, University of Ottawa
  • Gideon Christian Faculty of Law, University of Calgary
  • Daniel Konikoff PhD Candidate, Centre for Criminology and Sociolegal Studies, University of Toronto


What work is already complete?

The LCO has developed a series of papers examining the impact of AI in the criminal, civil, and administrative contexts. These papers also identify critical issues and choices for regulation and compare Canadian and EU approaches. The complete series is available at https://www.lco-cdo.org/ai.

Two papers are most relevant to the Criminal AI Lifecycle Project: