The LCO’s strategic priorities for 2017–2020 mandate the organization to give priority consideration to technology law reform issues. The influence of technology is pervasive; it is spreading to new areas of the private and public sectors, shaping lives like never before, and raising serious questions about access to justice. Today there are computer algorithms that adjudicate criminal sentencing and bail conditions; private “gig economy” platforms that decide who works and who doesn’t; and “smart cities” built with sensors in public spaces. From these and other examples the LCO has begun to identify specific areas where there is evident need for sophisticated, thoughtful legal analysis in the Ontario context.
What kinds of “digital rights” are arising?
There is no single definition of “digital rights.” Digital rights include topics as diverse as digital inclusion and access; a digital bill of rights; rights in a smart city; “digital due process”; regulatory sandboxing; social scoring and algorithmic black boxing; digital democracy; new frameworks for informed online consent; and more. These are important, timely, and controversial issues. Our goal is to better understand these issues and to help shape a digital rights agenda for Ontario and beyond. As with all LCO projects, we undertake research and conduct public consultations to better identify and recommend law reforms that are concrete, precise, and responsive to the access to justice and equality rights of Ontarians. To accomplish this, we accept three important things as starting points:
- Technology has created a significant new frontier for access to justice. This is not to say that access to justice challenges have otherwise been resolved – far from it. But technology raises new and crucial questions rooted in fundamental due process: how do you know when an algorithm has been involved in determining a job application, eligibility for social entitlements, or the adjudication of your legal claim? Can an algorithm explain the decision it has rendered, and should it be required to? How do you cross-examine an algorithmic decision maker?
- Technology stakeholders are different. Technology platforms and products shape behaviour, filter information, automate activities, and set terms and conditions for use. They are uniquely positioned to build access to justice into everyday interactions, where it matters most to people, whether in the public or private sphere. Technology stakeholders may therefore have a much different and much more active role to play in law reform initiatives.
- New digital rights will find innovation in existing legal principles. There is considerable need for legal imagination to connect new digital rights to the array of potential legal approaches. For example, what if the focus of privacy law shifted from individual consent to collective consumer rights? Could the protections from discrimination guaranteed under Ontario’s Human Rights Code be incorporated directly into an algorithm, and thus ensure structural compliance?
What we have done so far
- March 2019: The LCO, The Citizen Lab, the IHRP, and the Criminal Lawyers’ Association partnered to organize the first Canadian multidisciplinary forum on Algorithms, Artificial Intelligence and Automated Decision-Making in the Criminal Justice System. Topics discussed included predictive policing and the use of automated decision-making in bail and sentencing. Materials from this event are available online.
- November 2018: The LCO convened a roundtable to elicit the advice of over a dozen subject-matter experts on how best to scope and sequence a series of digital rights law reform projects spanning automated decision-making, consumer protections, and precarious work.
- May 2018: The LCO partnered with the Mozilla Foundation to host a Roundtable on Digital Rights and Digital Society that brought together a dozen key policy makers, legal scholars, lawyers, technologists, academics, and community organizers to discuss a digital rights agenda for Ontario and beyond.
- May 2018: The LCO hosted an International Conference on Defamation Law in the Age of the Internet, cited by one participant as among the most comprehensive analyses of a defamation law framework to date, anywhere in the world.
- May 2018: The LCO hosted a panel presentation at RightsCon, Canada’s largest conference on digital rights and digital society, discussing Reforming Intermediary Responsibility: Testing a Human Rights Centred Framework Beyond the Liability and Immunity Divide.
- September 2017: The LCO’s Class Actions Project received funding from the Department of Justice to establish a public, online, open data catalogue of class action cases and information. The catalogue is the first of its kind in Canada and will support ongoing access to justice and law reform research across the country.
- April 2016: The LCO partnered with Legal Aid Ontario to host an Open Data, Open Government Symposium, examining concrete issues of transparency, accountability, and “big data” for government, courts, and tribunals in Ontario.
To support a digital rights agenda for Ontario, the LCO is developing a series of research and law reform projects to commence in 2019, and anticipates releasing a project plan early that year. The LCO is also interested in identifying experts and potential partners. We will continue to build on the March 2019 discussion forum on algorithms, automated decision-making, and artificial intelligence in the criminal justice system, hosted with colleagues from The Citizen Lab, the Criminal Lawyers’ Association of Ontario, and the International Human Rights Program at the University of Toronto; on work under way in the Defamation Project; and on the report on the Digital Rights, Digital Society Roundtable. That roundtable report highlights several major themes where law reform questions are increasingly pressing:
- accountability and digital due process for algorithms that now automatically filter content, price goods, determine eligibility for bail, and make medical recommendations;
- platform accountability and transparency beyond the choice to either opt in or miss out;
- modernizing labour rights in a gig economy of precarious work; and
- better supporting digital equality and digital civil society in an era of smart cities and asymmetric platform power.