

Algorithms, automated decision-making (ADM) and artificial intelligence (AI) are a new frontier for access to justice, human rights, dispute resolution and due process. The first phase of the LCO's multiyear Digital Rights project considers two areas of technology law reform that potentially affect all Ontarians: the use of algorithms in the justice system and consumer protection in the "digital marketplace."


The LCO's strategic priorities for 2017–2020 mandate the organization to give priority consideration to technology law reform issues. The influence of technology is pervasive; it is spreading to new areas of the private and public sectors, shaping lives like never before, and raising serious questions about access to justice. Today there are computer algorithms that inform criminal sentencing and bail decisions; private "gig economy" platforms that decide who works and who doesn't; and "smart cities" built with sensors in public spaces. From these and other examples, the LCO has begun to identify specific areas where there is an evident need for sophisticated, thoughtful legal analysis in the Ontario context.

This project is partially funded by the Law Foundation of Ontario Justice and Technology Grant.

What kinds of “digital rights” are arising?

There is no single definition of "digital rights." Digital rights include topics as diverse as digital inclusion and access; a digital bill of rights; rights in a smart city; "digital due process"; regulatory sandboxing; social scoring and algorithmic black boxes; digital democracy; new frameworks for informed online consent; and more. These are important, timely, and controversial issues. Our goal is to better understand these issues and to help shape a digital rights agenda for Ontario and beyond. As with all LCO projects, we undertake research and conduct public consultations to identify and recommend law reforms that are concrete, precise, and responsive to the access to justice and equality rights of Ontarians. To accomplish this, we accept three important things as starting points:

  • Technology has created a significant new frontier for access to justice. This is not to say that access to justice has otherwise been achieved – far from it. But technology raises new and crucial questions rooted in fundamental due process: how do you know when an algorithm has been involved in determining your application for a job or social entitlements, or in adjudicating your legal claim? Can an algorithm explain the decision it has rendered, and should it be required to? How do you cross-examine an algorithmic decision maker?
  • Technology stakeholders are different. Technology platforms and products shape behaviour, filter information, automate activities, and set terms and conditions for use. They are uniquely able to build access to justice into the everyday contexts where it matters most to people, whether in the public or private sphere. Technology stakeholders may thus have a much different and much more active role to play in law reform initiatives.
  • New digital rights will find innovation in existing legal principles. There is considerable need for the legal imagination to connect new digital rights to the array of potential legal approaches. For example, what if the focus of privacy law shifted from individual consent to collective consumer rights? Could protections from discrimination guaranteed under Ontario's Human Rights Code be incorporated directly into an algorithm, and thus ensure structural compliance?



Core Projects


Artificial Intelligence in Ontario’s Civil and Administrative Justice System

This project is the civil justice system equivalent to the LCO's AI and Criminal Justice System project. It considers the use of AI and algorithms in regulatory investigations, government benefit determinations, and decision-making support in the civil and administrative justice systems. As part of this project, in December 2019 the LCO hosted an invitational forum on automated decision-making in the civil and administrative justice system. The forum was a cross-disciplinary event designed to promote innovative, collaborative and multi-disciplinary analysis and recommendations. The LCO invited more than 30 policy makers, lawyers, jurists, technologists, academics, and community organizers to share experiences, discuss issues, and consider law reform options. We also invited leading US advocates and researchers to help us learn from the American and international experience.

Presenters included:

  • Kevin DeLiban (Legal Aid Arkansas), Martha Owen (Deats, Durst, Owen & Levy P.L.L.C.), and Christiaan van Veen (Director, Digital Welfare State and Human Rights Project at NYU Law), who shared case studies of AI in administrative decision-making;
  • Professor Jennifer Raso (University of Alberta, Faculty of Law) and Raj Anand (Partner at WeirFoulds, former Chief Commissioner of the OHRC), who discussed legal issues that arise from the use of AI and automated decision-making in government decision-making; and
  • Nele Achten (Berkman Klein Center for Internet and Society, Harvard University), Benoit Heshaies (Treasury Board of Canada Secretariat, Government of Canada), Amy Bihari (Ontario Digital Service, Government of Ontario), and Professor Julia Stoyanovich (NYU), who discussed regulatory efforts underway in the area of AI and civil justice.

Forum materials included:

  • Event report (forthcoming)
  • December 10th Forum Background Materials




Consumer Protection in the Digital Marketplace

This project considers how to better protect consumers in Canada's "digital marketplace." The LCO is partnering with the Centre for Law, Technology and Society at the Faculty of Law, University of Ottawa to develop recommendations for reforming consumer protection legislation to address online "terms of service" and "click consent" contracts. More details are coming soon: check back for updates, or join our mailing list to be notified.