1 Introduction
In 1939, Dorothy, the Tin Man and the Scarecrow were walking down the Yellow Brick Road, fearing what was in the forest, chanting, ‘lions, and tigers, and bears, oh my!’ What they found out as their adventure unfolded was that the lion they feared was a pussycat and became a great friend, and that there were other things in the forest that should have concerned them.
By 2019, justice systems around the world were well on their way down their own Yellow Brick Road, integrating information and communication technology (ICT) into court systems and processes. Some assumed that technology would be a pussycat and become a great friend to the courts, but some echoed the fears of Dorothy and her companions: lions and tigers and bears became bits and bytes and apps. Early in 2020 the emergence of the COVID-19 pandemic served as a ‘big bang’ for online dispute resolution (ODR) development and use. We are, in 2021, well along the road towards full integration of technology into the courts and every other dispute resolution system, worldwide.
What lurks out there in the technology forest surrounding the courts? Are there pussycats and friends, or are there dangers? We suggest the answer is yes to both friends and dangers. It would be impossible to list, let alone discuss, all of the issues about which we should be aware as ICT moves increasingly into the courts. In this article we want to highlight three issues, or three areas, that we think hold the potential to be friends of the court – pussycats, if you will – but which may offer a bit of danger if not managed well.
We will start by looking at the impact ICT has on our definition of and conception of access to justice (A2J). Next, we will discuss the fact of and the perception of digital divides. Finally, we will discuss the role artificial intelligence (AI) and algorithms may play in the justice system. As what citizens of Louisiana in the United States would call lagniappe (a little extra), we will end with some thoughts about where ODR associated with justice systems may be headed.
As a final note before beginning, we should be clear about what we mean when we use the term ODR in relation to courts and formal justice systems. There are many acceptable definitions, depending on the venue in which one is working. For our discussion of ODR and the justice system, we can be guided, with caveats, by the definition suggested by the National Center for State Courts (NCSC) in the United States:

Court-related Online Dispute Resolution (ODR) is a public facing digital space in which parties can convene to resolve their dispute or case. Three essential components differentiate court-related ODR from other forms of technology-supported dispute resolution: The first is that the program operates exclusively online…. The second is that the program is explicitly designed to assist litigants in resolving their dispute or case, rather than a technology platform to support judicial or court staff decision-making…. Third, the program is hosted or supported by the judicial branch ….1x National Center for State Courts, “What is ODR?”, available at: www.ncsc.org/odr/guidance-and-tools (last accessed January 2021).
The caveat is that defining ODR, even ODR connected to the courts, as an ‘all online’ process with no face-to-face interaction is too limiting to be taken as the final word. It is the case that ICT has been integrated into a wide variety of dispute resolution systems, including courts, so that technology complements or is embedded in processes that include online work and face-to-face work. Another, more expansive definition of ODR is:
ODR is not just the development of automated systems for disputes handled entirely online. ODR, in the broader sense, is simply the intelligent application of information and communication technology to any conflict engagement process.2x Daniel Rainey, “Conflict Engagement and ICT: Evolution and Revolution,” The International Journal of Online Dispute Resolution, Vol. 3, No. 2, 2016, p. 81.
But, back to the NCSC definition: its first line stresses the idea of a ‘public facing’ digital space. That signals a major change in attitude and approach to court-related technology over the past few years. Beginning in the early 2000s, law firms and courts had access to technology that made life easier for lawyers, judges, and court officers. Case management software, document assembly software, legal databases, and eventually e-filing schemes were inward facing. They helped lawyers and the courts handle information and cases more efficiently, but were largely invisible to the clients and parties who interacted with the justice system. Beginning well before the pandemic and accelerating with the events of 2020, the focus of ODR technology related to the courts has become increasingly outward facing.3x Outward-facing commercially available ODR apps include platforms like Modria (www.tylertech.com/resources/videos-and-webinars/online-dispute-resolution-modria-in-action), Matterhorn (https://getmatterhorn.com/) and ImageSoft (https://imagesoftinc.com/courts/online-dispute-resolution/).
The shift in focus from inward facing to outward facing has significant implications for how the justice system interacts with citizens, and indeed affects the way we think about the very definition of A2J. Adding the phrase ‘intelligent application of information and communication technology’ to the essential definition of ODR raises the very real potential for AI to become a partner in resolving disputes online. These are the issues we will address later in this article.
2 Access to Justice
It has become axiomatic in the ODR community, and in the justice community at large, that increasing the use of technology in the justice system will greatly increase A2J.
The conception or definition of A2J that is most often used is tied directly to ease of access to formal justice systems – the courts – and the fairness of those formal justice systems once they are accessed. In its ‘Rule of Law Index’, the World Justice Project uses four factors to define adherence to the rule of law and, one could argue, access to justice: Constraints on Government Power, Absence of Corruption, Open Government and Fundamental Rights.4x World Justice Project, Rule of Law Index 2020, pp. 12-13, available at: https://worldjusticeproject.org/sites/default/files/documents/WJP-ROLI-2020-Online_0.pdf (last accessed January 2021). Another measure, used by the Legal Services Corporation (LSC) in the United States, relies on the number of potential cases that arise and how many of them are in some way served by the justice system.5x See Legal Services Corporation, 2019 Annual Report, available at: https://lsc-live.app.box.com/s/boo2b9zitjdmhmh964t25ne2540flg0r (last accessed January 2021).
In its comprehensive 2017 report, the LSC declared that, “… 86% of the civil legal problems reported by low-income Americans received inadequate or no legal help.”6x Legal Services Corporation, 2017 Justice Gap Report, available at: www.lsc.gov/media-center/publications/2017-justice-gap-report (last accessed January 2021). The World Justice Project’s 2020 Rule of Law Index listed the United States as 21st among 128 countries, still in the top tier, but well down the list in A2J. Clearly, there is work to be done on A2J, in the United States and around the world, but legal scholars are optimistic about the impact of ICT on A2J. Richard Susskind argues:

ODR offers the promise of robust and yet radically less expensive dispute resolution. And while it may still seem alien or outlandish for some lawyers, policymakers and opinion formers of today, few of the doubters belong to the Internet generation. Future generations, for whom working and socialising online is second nature, may regard ODR as an entirely natural facility, much more so than conventional courts.7x Richard Susskind, “Foreword”, in ODR Theory and Practice, 2nd Edition (Eleven International Publishers, 2021, forthcoming).
Embedded in Susskind’s comment are some keys to whether the use of ICT – ODR – in the justice system will in fact explosively increase A2J.
One key is whether the newly available avenues into the justice system will change the perception of the justice system among potential users who have traditionally been denied, or have chosen to avoid, access. Working with a musician in West Africa to explain how to protect intellectual property rights in an international marketplace, we were met with this declaration: “The courts are just what rich guys use to cheat poor guys.” If technology is going to increase A2J, courts are going to have to find a way to convince generations of potential parties who have felt excluded and unwelcome that technology offers a new day.
A colleague of ours grew up in a small town in the southern United States. As a young black man he walked by a downtown laundry with a sign in the window that declared, ‘Whites Only’. He left the small town, became a very successful businessman, and much later, when visiting his home town, he passed the same laundry and noted that the sign was gone. His comment was, “Taking the sign down doesn’t do much. I won’t go in there just because I can. They need to let me know I’m welcome to go in there.” We face much the same situation with ODR and the courts. Technology may mean that everyone can go in – but it does not mean that everyone will go in unless we take steps to change the reputation the justice system has rightfully earned over the decades.
Another key lies in the very definition of ‘justice’. As we have noted, traditional definitions of A2J revolve around access to the courts. As Susskind noted, in an ODR world, justice can and will mean more than ‘conventional courts’. At the World Justice Forum in 2019, it was argued that the fundamental definition of A2J should be broadened beyond access to the courts:

A basic reconsideration of the basic elements of access to justice … would add at least three basic human rights … The right to exist … The right to control one’s identity … [and] The right to access opportunity ….8x Daniel Rainey, Scott Cooper, Donald Rawlins, Kristina Yasuda, Tey Al-Rjula, and Manreet Nijjar, “Digital Identity for Refugees and Disenfranchised Populations,” The International Journal of Online Dispute Resolution, Vol. 6, No. 1, 2019, p. 22.
If Susskind and others are correct, A2J will come to mean access to fair treatment, access to problem-solving and access to information, in addition to access to the courts. An interesting finding in the review of the State of Utah’s ODR pilot project is that citizens who have a problem that may involve the law do not see the courts as a source of ‘dispute resolution’.9x Stacy Butler, Sarah Mauet, Christopher L. Griffin, Jr., and Mackenzie S. Pish, “The Utah Online Dispute Resolution Platform: A Usability Evaluation and Report”, 8 September 2020, available at: https://law.arizona.edu/utah-online-dispute-resolution-platform-usability-evaluation-and-report (last accessed January 2021). So, ODR, by its very name, is somewhat disconnected in the public mind from what happens in the court system. It is clear that in addition to the obvious work that needs to be done to integrate technology into the justice system, we are going to have to do a lot of education – making everyone feel welcome to walk through a newly opened door.
In the United States, starting in the 1990s, there began a push to use alternative dispute resolution (ADR) approaches (primarily mediation) instead of relying on courts and the formal justice system to handle every dispute. Beginning with the Administrative Dispute Resolution Act of 1990, through the Alternative Dispute Resolution Act of 1998, the federal government in the United States integrated ADR into the A2J ecosystem.10x For a review of relevant US federal statutes on ADR, see ADR.gov, Key ADR Statutes, available at: www.adr.gov/adrguide/04-statutes.html (last accessed January 2021). The 1998 statute required federal trial courts to make ADR programs available to litigants. State and local courts followed suit.
Initially, the use of ADR was post-filing – one was referred to a mediator or conciliator after filing with the court and being directed to ADR by a judge. Even before the explosion of ODR after 2019, it was becoming more common to see ADR used pre-filing, as a precursor to litigation or arbitration. The increased use of ODR platforms has made it easier to access ADR as well as the courts, and the current trend, exemplified in the United States by platforms like Modria11x www.tylertech.com/products/modria. and Matterhorn,12x https://getmatterhorn.com/. is to encourage potential litigants to enter ODR programs as a first step, with litigation or arbitration as a last resort. Colin Rule has suggested, when speaking of ODR and the future of justice, that a DNMEA approach to A2J should be the norm: Diagnosis-Negotiation-Mediation-Evaluation-Appeal.
Diagnosis occurs when parties interact privately with ODR systems driven by algorithms and AI; Negotiation is facilitated by online systems working with primary parties; Mediation brings either AI or human mediators into the picture; and Evaluation and Appeal move the parties into a traditional legal environment, albeit online.13x See Colin Rule, “Online Dispute Resolution and the Future of Justice,” available at: www.colinrule.com/writing/future.pdf (last accessed January 2021). As we move more and more towards the use of ODR to manage disputes, the future of A2J appears to rely less on judgements in a court or by an arbiter, and more on guided interaction among primary parties. In a world with ODR, A2J is no longer your grandfather’s, or even your father’s, A2J.
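To make the DNMEA idea concrete, the sketch below models Rule’s sequence as a simple escalation workflow in which a case moves to the next, more formal stage only when the previous one fails to resolve it. The sketch is purely illustrative: the stage names follow the Diagnosis-Negotiation-Mediation-Evaluation-Appeal ordering described above, but the data fields and the resolution check are hypothetical placeholders, not features of any existing court platform.

```python
from enum import Enum

class Stage(Enum):
    DIAGNOSIS = 1    # party interacts privately with the platform to classify the problem
    NEGOTIATION = 2  # platform-facilitated exchange between the primary parties
    MEDIATION = 3    # a human (or AI-assisted) mediator joins the exchange
    EVALUATION = 4   # neutral evaluation in a traditional, if online, legal setting
    APPEAL = 5       # review of the evaluation; the court as last resort

def advance(case, resolved_at_stage):
    """Escalate a case through the DNMEA stages until some stage resolves it.

    `case` is any dict-like record; `resolved_at_stage` is a hypothetical
    callback reporting whether the parties settled at the given stage.
    """
    for stage in Stage:
        case["history"].append(stage.name)
        if resolved_at_stage(case, stage):
            case["outcome"] = f"resolved at {stage.name}"
            return case
    case["outcome"] = "exhausted all stages"
    return case

# Example: a small-claims dispute that settles during negotiation.
case = {"id": "2021-000123", "history": []}
print(advance(case, lambda c, s: s is Stage.NEGOTIATION))
```

However a platform implements it, the essential design choice is the same: the cheaper, party-driven stages come first, and the court is reached only when everything before it has been exhausted.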
3 Digital Divides
Is there a ‘digital divide’ – a lack of access to the Internet that excludes some individuals from using resources available to those who have regular access to robust Internet connections? The bad news is that, yes, there is a digital divide, even in the most technologically advanced countries. It is, in fact, one of the lions, and tigers, and bears, out in the ODR forest. Indeed, the heavy reliance on Internet connections for online learning during the COVID pandemic has underscored the problem with the digital divide, even for those lower-income families who have tenuous access to the Internet through mobile phones. The good news is that although the digital divide exists, its impact on A2J may be exaggerated – or, more precisely, the perception of its impact may outstrip its actual effect.
The numbers give us a snapshot of one aspect of the digital divide. Worldwide, the share of individuals, of any age or location, who access the Internet, using any device, has increased from 40.7% in 2014 to a projected 53.7% in 2021.14x J. Clement, “Worldwide Internet User Penetration”, 23 July 2019, Statista, available at: www.statista.com/statistics/325706/global-internet-user-penetration/ (last accessed January 2021). This prediction suggests that, during 2021, more than half the individuals in the world will log on to the Internet at least once each month. 2014 was also a landmark year because, for the first time, more Americans used mobile devices than desktops or laptops to access the Internet.15x James O’Toole, “Mobile Apps Overtake PC Internet Usage in US,” CNN Business, 28 February 2014, available at: https://money.cnn.com/2014/02/28/technology/mobile/mobile-apps-internet/. Other countries in other parts of the world reached that milestone earlier.
The digital divide, defined by ability to access the Internet, obviously is narrower in developed nations. North America has the highest percentage of penetration (over 90%) and Asia has the lowest (almost 60%). So, one may rightly argue that technology access and the digital divide are real issues. But percentages do not tell the entire story.
Even in a country with 90% Internet penetration, many people find themselves on the wrong side of the divide. A significant percentage of individuals who live in remote areas, or who rely on potentially expensive connections through mobile phones (if they have a phone at all), or who do not have the financial resources to afford an Internet-capable device have difficulty accessing the newly opened doors to justice. Even those who can theoretically use publicly available Internet connections at libraries or other locations may live in areas where access to transportation or the cost of transportation keeps them from bridging the divide. The digital divide is, for these people, very real.
But the digital divide can loom larger in perception than in fact. For example, in Cameroon, Internet penetration in 2020 reached 30% – only 3 in 10 people were reportedly able to access the Internet.16x Digital 2020 Cameroon, available at: https://datareportal.com/reports/digital-2020-cameroon (last accessed January 2021). At an ODR brainstorming session, representatives from Cameroon argued that even though the 30% figure was probably true, they knew that virtually everyone in Cameroon had some access to an Internet-capable smartphone (even if it was a shared or borrowed device). They therefore designed an ODR project using smartphones and WhatsApp (which they said is commonly used to communicate in Cameroon) that they argued would be available to virtually everyone in the country.
The digital divide is real, and it is something that must be addressed as the justice system becomes more and more reliant on ODR technology. Intelligent design, focus on mobile technology, extending Internet access to lower-income communities and other design approaches for ODR platforms can reduce the impact of the digital divide. Is it a lion, tiger or bear? Yes, but we think in the long run it is a beast that can be at least partially tamed.
The other divides, however, may be more difficult to bridge. We have already discussed the divide that exists between those who trust and will approach the justice system and those who do not trust or do not understand the justice system. That divide is arguably far wider than the digital divide.
An even more insidious and hard to bridge divide involves prejudice and bias, the existence of which has been acknowledged widely. The NCSC devotes an entire section of its website to resources discussing how to understand and deal with bias of many kinds.17x NCSC Gender and Racial Fairness Resource Guide, available at: www.ncsc.org/topics/access-and-fairness/gender-and-racial-fairness/resource-guide. Last accessed January 2021.
Integrating ODR technology into the courts was commonly thought to be a way to reduce bias and close the gap between the way minority and majority populations have been treated in the justice system. There is some evidence that ODR can work to do just that.
The State of New Jersey developed technology to assess the risk of releasing accused individuals without bail instead of incarcerating them pre-trial. At the time the program was initiated, approximately 40% of the state’s prison population consisted of the accused who could not make bail. That number dropped significantly, as did the overall crime rate, after the system was in use.18x See Diana Debruzzo, “New Jersey Set Out to Reform Its Cash Bail System. Now the Results are In,” Arnold Ventures, available at: www.arnoldventures.org/stories/new-jersey-set-out-to-reform-its-cash-bail-system-now-the-results-are-in (last accessed January 2021); and David J. Reimel, III, “Algorithms and Instruments: The Effective Elimination of New Jersey’s Cash Bail System and Its Replacement,” Penn State Law Review, Vol. 124, No. 1, 12 November 2019, pp. 195-216, available at: www.pennstatelawreview.org/print-issues/algorithms-instruments-the-effective-elimination-of-new-jerseys-cash-bail-system-and-its-replacement/ (last accessed January 2021).
Other efforts to close the divide have not been as successful. In Kentucky, a similar technology approach to cash bail was implemented as the result of a 2011 law. The results did not mirror New Jersey’s.

Before the 2011 law took effect, there was little difference between the proportion of black and white defendants granted release to await trial at home without cash bail. After being mandated to consider a score predicting the risk a person would reoffend or skip court, the state’s judges began offering no-bail release to white defendants much more often than to blacks. The proportion of black defendants granted release without bail increased only slightly, to a little over 25 percent. The rate for whites jumped to more than 35 percent.19x Tom Simonite, “Algorithms Should’ve Made Courts More Fair. What Went Wrong?”, Wired, 5 September 2019, available at: www.wired.com/story/algorithms-shouldve-made-courts-more-fair-what-went-wrong/ (last accessed January 2021).
A recent review of a Virginia sentencing programme revealed problems with ‘baked-in’ bias, and an early review of the COMPAS programme, widely used in U.S. courts, argued that it had similar problems.20x See Megan T. Stevenson and Jennifer L. Doleac, “Algorithmic Risk Assessment in the Hands of Humans”, available through SSRN at: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3489440 (last accessed January 2021), and Alex Chohlas-Wood, “Understanding Risk Assessment Instruments in Criminal Justice”, available at Brookings: www.brookings.edu/research/understanding-risk-assessment-instruments-in-criminal-justice/ (last accessed January 2021).
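At bottom, findings like Kentucky’s rest on a simple comparison of group-level release rates. The sketch below shows what a minimal audit of that kind might look like; the field names are hypothetical, and the sample records are constructed only to echo the roughly 25% versus 35% gap reported above rather than drawn from the Kentucky data.

```python
def release_rate(cases, group):
    """Share of defendants in `group` released without cash bail."""
    subset = [c for c in cases if c["group"] == group]
    if not subset:
        return 0.0
    return sum(c["released_no_bail"] for c in subset) / len(subset)

def disparity_report(cases, groups=("black", "white")):
    """Report per-group release rates and the gap between them."""
    rates = {g: release_rate(cases, g) for g in groups}
    gap = max(rates.values()) - min(rates.values())
    return {"rates": rates, "gap": gap}

# Hypothetical records shaped to echo the reported post-2011 pattern.
sample = (
    [{"group": "black", "released_no_bail": i < 26} for i in range(100)]
    + [{"group": "white", "released_no_bail": i < 36} for i in range(100)]
)
print(disparity_report(sample))
# -> rates of 0.26 and 0.36, a ten-point gap worth flagging for review
```

The arithmetic is trivial; the point is that this kind of before-and-after, group-by-group comparison is what surfaced the problem in the first place, and it is the kind of check courts can run routinely once a tool is deployed.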
In summary, the digital divide is probably feared more than it need be, and can be overcome using a variety of approaches. The other divides are deeper and harder to bridge. What do we do about them? That is largely a question related to the use of algorithms and AI.
4 Artificial Intelligence
An unexpected consequence of the 2020 global pandemic has been the exponential increase of digital data generated daily by the explosion of remote worker reliance on technology tools hosted in the cloud.21x Tony Seba and James Arbib, “We are Approaching the Fastest, Deepest, Most Consequential Technological Disruption in History”, 5 October 2020. Available on the FastCompany website at: www.fastcompany.com/90559711/we-are-approaching-the-fastest-deepest-most-consequential-technological-disruption-in-history (last accessed January 2021). The legal sector is no different than any other in this respect.
As the growth of digital data transforms technology and its commercial uses, it will also serve to transform the economy, social norms and legal relationships:

Now, we are predicting the fastest, deepest, most consequential technological disruption in history and with it, a moment civilization has never encountered before. In the next 10 years, key technologies will converge to completely disrupt the five foundational sectors – information, energy, food, transportation, and materials – that underpin our global economy, and with them every major industry in the world today. Costs will fall by 10 times or more, while production processes become an order of magnitude (10×) more efficient, using 90% fewer natural resources and producing 10 times to 100 times less waste.
These technological disruptions are turning the prevailing extraction and exploitation, scarcity and central control model of production on its head, driving a new model of localized creation from limitless, ubiquitous building blocks – a world built not on coal, oil, steel, livestock, and concrete, but on photons, electrons, DNA, molecules and (q)bits.22x Zacks Equity Research, “Microsoft (MSFT) Keen on Building Subsea Data Center Network”, 15 September 2020. Available on the Nasdaq website at: www.nasdaq.com/articles/microsoft-msft-keen-on-building-subsea-data-center-network-2020-09-15 (last accessed January 2021).

The digital transformation will modify how most industry sectors evolve, including the practice of law and the education of lawyers. Although not without significant challenges, technology advances will alter the nature of justice and education and how they are delivered.23x Jean R. Sternlight, “Justice in a Brave New World?”, Connecticut Law Review, Vol. 52, 2020, p. 213 (last accessed January 2021).
The growth of digital data invites the advanced development of AI technology designed to convert huge troves of binary digits into economic value and efficiency. Digital data has been termed ‘the new oil’: deep underground and inaccessible to most.24x Suzanne Rob, “High Scrutiny of Hi-tech Data Practices”, Legal Week (online), 31 January 2020 (last accessed January 2021). The treasures to be found in this resource require refining and distribution to the consumer. The complex intersection of international laws is a key example of a use case for AI applications which would fundamentally disrupt traditional legal practices and the role of legal professionals.25x Ashley Deeks, “High-Tech International Law”, George Washington Law Review, Vol. 88, 2020, p. 574.
As the delivery of legal services radically evolves through digital transformation, legal professionals and academics must keep pace in terms of the nature, efficiency and cost-effectiveness of the services they provide. The next generation of communication technology has been unveiled by the pandemic and the rush to remote working.26x Ethan Murray, “The Next Generation of Office Communication Tech”, Harvard Business Review, 9 October 2020 (last accessed January 2021).
Our world is also one in desperate search of justice. Charles Dickens’ A Tale of Two Cities is the terrifying story of what happens to individuals, communities and nations when injustice reigns. It is a story of extremes and of the havoc wreaked by such extremes, as the famous opening lines suggest:

It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness, it was the epoch of belief, it was the epoch of incredulity, it was the season of Light, it was the season of Darkness, it was the spring of hope, it was the winter of despair, we had everything before us, we had nothing before us, we were all going direct to Heaven, we were all going direct the other way….27x Charles Dickens, A Tale of Two Cities (New York: Bantam Classic), 1989.
Many decades later, technology has accelerated commerce, social interaction and a perception of injustice that might exceed that of the era about which Dickens wrote: the French Revolution. Can technology advances also serve the interests of justice and help avoid civil unrest? Can AI-driven dispute resolution both manage conflict efficiently and address the deeper needs of justice in a technology-driven culture? These are essential questions for the legal community, mediators and other dispute resolution professionals, including ODR practitioners.
In order to preserve the course of legal frameworks without losing the high ground of principled rule-making, standard setting and practical application, prompt action is required. Legal and ODR practitioners must coalesce to present a united front to persuade their constituents (clients and others looking to them for guidance) that AI will serve to safeguard human legal rights, responsibilities and remedies.
4.1 The Technology-Enabled Collaboration Environment
The 2020 pandemic mobilized the primary client source for lawyers – the corporate in-house legal department. In the post-pandemic economy, more than 80 global in-house legal and compliance officers described their number one operational goal using the terms ‘digital transformation’, ‘technology strategy’ and ‘automation’.28x E Leigh Dance and Jon Pedersen, “In-House Leaders Prepare Midyear to Advance on Digital and Operational Goals”, Corporate Counsel (online), 9 July 2020 (last accessed January 2021). Specifically, through digital transformation these corporate legal leaders seek to achieve higher value work and increased efficiency by:
lessening time spent on lower-value tasks, allowing teams to focus on high-value-add missions;
streamlining organic productivity of what will be an increasingly remote workforce;
driving organizational efficiency and productivity; and
enabling better risk assessment and mitigation through IT/data analytics.29x Ibid.
In-house legal departments are far more focused on collaboration and the technology tools that facilitate it than their outside counsel counterparts.30x “In-house Counsel Go Collaboration Crazy”, Corporate Counsel Business Journal blog, 9 October 2020 (last accessed January 2021). The COVID-19 surge in the use of collaboration tools was 305% for in-house counsel, compared to 43% for outside counsel. In a buyers’ market, client expectations cannot be ignored. Legal Project Management (LPM) practitioners will be well served to incorporate the digital transformation goals of the corporate clients they seek to serve in order to maximize their value in the legal services ecosystem.
The Gartner Hype Cycle is an annual in-depth analysis of technology trends and their potential to impact the industries they serve. Legal technologies are analysed in terms of their present ability to impact the legal industry. In developing technology applications, hype usually precedes practicality in terms of consumer usefulness.
While many legal technology applications have yet to fulfil their promise, Gartner identifies four emerging trends that, in 2020, were beginning to meet the needs of the legal industry, located on the ‘slope of enlightenment’ and the ‘plateau of productivity’. They include:
enterprise legal management (various integrated legaltech applications chosen to strategically meet the needs of the business);
subject rights requests (user agreements which approve data usage);
predictive analysis (AI applications using business data to anticipate risks and opportunities); and
process automation (control dashboard for which scripts can be written to automate routine, repetitive, rule-based, predictable tasks).31x Rob van der Muelen, “4 Key Trends in the Gartner Hype Cycle for Legal and Compliance Technologies”, Smarter With Gartner Newsletter, 21 September 2020 (last accessed January 2021).
The practice of law and ODR can ill afford to be disengaged from the digital transformation of everything.
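Of the four trends above, process automation is the most concrete: routine, repetitive, rule-based decisions are captured as data and scripts rather than handled ad hoc. The fragment below is a purely illustrative sketch of that idea; the matter types, monetary threshold and routing targets are hypothetical and are not drawn from Gartner or from any particular product.

```python
from dataclasses import dataclass

@dataclass
class Matter:
    matter_type: str       # e.g. "nda_review", "vendor_contract", "litigation_hold"
    amount_at_stake: float
    jurisdiction: str

# Hypothetical routing rules: predictable, repetitive decisions captured as data.
ROUTING_RULES = [
    {"matter_type": "nda_review", "route_to": "self_service_template"},
    {"matter_type": "vendor_contract", "max_amount": 50_000, "route_to": "paralegal_queue"},
    {"matter_type": "vendor_contract", "route_to": "in_house_counsel"},
    {"matter_type": "litigation_hold", "route_to": "outside_counsel"},
]

def route(matter: Matter) -> str:
    """Return the first routing target whose conditions the matter satisfies."""
    for rule in ROUTING_RULES:
        if rule["matter_type"] != matter.matter_type:
            continue
        if "max_amount" in rule and matter.amount_at_stake > rule["max_amount"]:
            continue
        return rule["route_to"]
    return "manual_triage"  # anything the rules do not cover stays with a human

print(route(Matter("vendor_contract", 12_000, "US-TN")))   # -> paralegal_queue
print(route(Matter("vendor_contract", 250_000, "US-TN")))  # -> in_house_counsel
```

The value of expressing such rules as data is that they can be reviewed, audited and changed without rewriting the surrounding system – a discipline that matters even more once the same pattern is applied to decisions affecting parties rather than paperwork.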
4.2 The Practice of Law and ODR Are Being Transformed by AI
In order to gauge the degree to which legal and ODR professionals (both practitioners and academics) are prepared to assist in maintaining a balanced view and usage of AI, greater understanding and participation in the global debate about ‘ethical AI’ is essential. How knowledgeable are we? How engaged are we prepared to be?
As primary participants in the search for ‘liberty and justice for all’, we must increase our involvement in all things related to AI, which is becoming the pervasive ecosystem of human existence.
It is not surprising that the digital transformation being adopted by businesses is already impacting the practice of law. The pandemic in combination with emerging technology development has delivered a “one two punch to the profession that will inevitably transform and reshape it in ways that would not have been thought possible years ago.”32x Christopher Suarez, “Disruptive Legal Technology, COVID-19, and Resilience in the Profession”, South Carolina Law Review, Vol. 72, 2021, p. 393 (last accessed January 2021).
AI and its impact on the legal profession raise ethical concerns requiring lawyer knowledge, use and counsel to conform to the American Bar Association (ABA) Model Rules of Professional Conduct. Comment 8 to Rule 1.1, which was added in 2012, expands on the concept of competent representation in the light of technological advancements in the legal field:

To maintain the requisite knowledge and skill, a lawyer should keep abreast of changes in the law and its practice, including the benefits and risks associated with relevant technology, engage in continuing study and education and comply with all continuing legal education requirements to which the lawyer is subject.33x Current Developments 2019-2020, Nicole Yamane, “Artificial Intelligence in the Legal Field and the Indispensable Human Element Legal Ethics Demands”, The Georgetown Journal of Legal Ethics, Vol. 33, 2021, pp. 877, 883 (last accessed January 2021).
Globally, and no less in the United States, A2J is decreasing, rather than expanding.
The United Nations defines access to justice as ‘a basic principle of the rule of law’. The United Nations also explains what access to justice entails: ‘In the absence of access to justice, people are unable to have their voice heard, exercise their rights, challenge discrimination or hold decision-makers accountable’.34x Ibid., p. 885.
AI applications in the practice of law can 1) increase individual resort to self-help applications such as ODR and legal ‘bots’ that guide a person through legal processes, and 2) increase lawyers’ capacity ‘by allowing lawyers to work more efficiently, allowing them to serve more clients’ in less time and at less cost.35x Ibid., pp. 886, 887. The ethical paradox in the use of AI in legal services is that lawyers cannot accept AI legal constructs without oversight and validation. Nor can AI substitute for human legal counsel and judgement.36x Ibid., p. 889. Either outcome could arguably constitute the failure of a lawyer to properly supervise legal services provided under their professional responsibility or the unauthorized practice of law.37x Ibid.
Not all commentators on the state of the legal profession facing technological disruption are ‘the sky is falling’ alarmists. Increasingly, observers find more reason for lawyers to improve their value as legal advisors than to fear an autonomous machine takeover of their noble profession.

In a tech-driven world, lawyers must strive to stay relevant and technologically competent. Understanding even the rudimentary aspects of computers and smartphones allows lawyers to relate to their clients. More importantly, such knowledge allows a lawyer to respond to a client’s legal issues sensibly, imbued with a nuanced comprehension of how their problems arose. In order to uphold their obligation to clients, lawyers have to accept that traditional legal solutions may no longer cut it in today’s high-tech environment.38x Thomas R. Moore, “The Upgraded Lawyer: Modern Technology and Its Impact on the Legal Profession”, University of the District of Columbia Law Review, Vol. 21, 2021, pp. 27, 28 (last accessed January 2021).
The practical, ethical and economic issues raised by the growth of AI in the legal profession beg the question: where will lawyers learn to thrive in the digitally transformed world?
4.3 The Path Forward to Ensure Ethical AI
Lawyers, legal academics and ODR practitioners are engaged in educating the human population regarding the risks and opportunities of AI. Most of the academics focused on ‘ethical AI’ tend to be engineering or public policy professors. However, the more multidisciplinary these coalitions become, the more effective they can be. The pervasive impact of digital transformation necessitates a more broadly constructed approach to addressing these issues. Law, medicine, pharmacology, engineering, architecture, literature, philosophy, psychology and neuroscience are but a few of the areas of expertise needed to create collaborative initiatives to deal with the wide-ranging impact of AI on all the world’s citizens and organizations.
For example, the Institute of Electrical and Electronics Engineers (IEEE) has dedicated much of its resources to developing a foundation for establishing standards for ethical AI: The IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems.39x “The IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems” (last accessed January 2021). Its mission is:

To ensure every stakeholder involved in the design and development of autonomous and intelligent systems is educated, trained, and empowered to prioritize ethical considerations so that these technologies are advanced for the benefit of humanity.40x Ibid.
Among the committees created to further this mission on behalf of the IEEE and the global population is its Education Committee. The group is composed of educators and professors from multiple professional disciplines including engineering, education, law, philosophy and public policy.41x One of the authors is a member of this IEEE committee. Issues such as data privacy, data ownership, bias and verification are matters every profession, business and individual needs to know more about in order to avoid exploitation and ensure AI applications are as beneficial as possible. The means to audit and brand AI applications as trustworthy cannot be left to those whose economic benefit is derived from the data generated and its value on an open market. As with standards for unleaded gasoline and their regulatory enforcement, AI must be subject to constraints that further its value and limit its potential for harm.42x One of the authors has proposed a Call for an AI Constitution, presented at the International Conference on Artificial Intelligence and the Law, 2020 (last accessed January 2021).
In early 2021, the Council of Europe published the results of its research and recommendations for the development of ethical AI.43x Feasibility Study, Ad Hoc Committee on Artificial Intelligence (CAHAI), Council of Europe (last accessed January 2021). Nine principles were pronounced as the foundations of ethical AI:
Human Dignity: AI deployers should inform individuals that they are interacting with an AI system whenever confusion may arise, and individuals should be granted the right to refuse interaction with an AI system whenever this can adversely impact human dignity.
Prevention of Harm to Human Rights, Democracy and the Rule of Law: AI systems should be developed and used in a sustainable manner, and AI developers and deployers should take adequate measures to minimize any physical or mental harm to individuals, society and the environment.
Human Freedom and Human Autonomy: Individuals should have the right to effectively contest and challenge decisions informed or made by an AI system and the right to decide freely to be excluded from AI-enabled manipulation, individualized profiling and predictions.
Non-Discrimination, Gender Equality, Fairness and Diversity: Member States should impose requirements to effectively counter the potential discriminatory effects of AI systems deployed by both the public and private sectors, and to protect individuals from their negative consequences.
Principle of Transparency and Explainability of AI Systems: Individuals should have the right to a meaningful explanation of how an AI system functions, what optimization logic it follows, what type of data it uses, and how it affects one’s interests, whenever it generates legal effects or has similar impacts on individuals’ lives. The explanation should be tailored to the particular context, and should be provided in a manner that is useful and comprehensible for an individual.
Data Protection and the Right to Privacy: Member States should take particular measures to effectively protect individuals from AI-driven surveillance, including remote biometric recognition technology and AI-enabled tracking technology, as this is not compatible with the Council of Europe’s standards on human rights, democracy and the rule of law.
Accountability and Responsibility: Developers and deployers of AI should identify, document, and report on potential negative impacts of AI systems on human rights, democracy and the rule of law, and put in place adequate mitigation measures to ensure responsibility and accountability for any harm caused. Member States should ensure that public authorities are able to audit AI systems, including those used by private actors.
Democracy: Member States should take adequate measures to counter the use or misuse of AI systems for unlawful interference in electoral processes, for personalized political targeting without adequate transparency mechanisms, and more generally for shaping voters’ political behaviours and manipulating public opinion.
Rule of Law: Member States should ensure that AI systems used in justice and law enforcement are in line with the essential requirements of the right to a fair trial. They should pay due regard to the need to ensure the quality, explainability and security of judicial decisions and data, as well as the transparency, impartiality and fairness of data processing methods.44x Ibid.
Which of those principles is unaffected by a legal framework? Lawyers, legal academics and ODR specialists must participate in these groundbreaking initiatives. The legal academy must be the fertile ground in which research, writing, teaching and exploration of the parameters of AI take place. Practitioners must bring their practical expertise to bear and ODR technologists must share their unique perspectives. This work cannot be done within any one of these silos; it must be carried out across, and in collaboration with, multiple professional disciplines.
5 Where Justice Is Headed
So, as we travel down the road towards more integration of ICT into justice systems, both formal and informal, what is around the bend?
First, it seems safe to say that as time goes by we will see more and more integration of technology into the way we handle disputes. Some may still be troubled by this, but for the most part, our COVID experience has taught us that what might have seemed difficult or impossible is, in fact, quite possible. As a state Supreme Court judge said to doubters in a meeting discussing his court’s experience with ODR, ‘technology is here to stay – get over it’.45x Comment by Justice Constandinos Himonas at a roundtable discussion in 2019.
If we are correct about the increasing use of technology to access justice, it is imperative that all of the stakeholders in the justice system have a voice as platforms, algorithms and AI are integrated. And by all stakeholders we mean not just judges and lawyers and mediators, but representatives of all of the pursuers of justice. In an interview with the authors, the Co-Director of the National Center for Technology and Dispute Resolution (NCTDR) argued that, for example, individuals who have gone through a divorce (online or offline) and who may have had either good or bad experiences with mediation or the courts probably have a lot to tell developers about how to program ODR apps that work well for all parties.46x Leah Wing, Co-Director, NCTDR, online interview, 26 January 2021. The ability of stakeholders to influence the development of ODR apps and platforms presupposes two things, neither of which is a given.
First, it assumes that there is some way for lawyers, judges, mediators and users to actually give upfront guidance to developers and providers rather than waiting until an app is rolled out to comment on its utility. This is obviously difficult, but working with established organizations to develop standards and norms is one way to begin to influence developers – being active in organizations like the NCTDR, the International Council for Online Dispute Resolution (ICODR), the International Mediation Institute (IMI), the ABA Dispute Resolution Section, the Association for Conflict Resolution (ACR), IEEE, TechforJustice and others47x http://odr.info/, https://icodr.org/, https://imimediation.org/en/, www.americanbar.org/, https://acrnet.org/, https://techforjustice.org/ (last accessed January 2021). is a good beginning. Specifically, the ABA Dispute Resolution Section, in conjunction with TechforJustice, launched in early 2021 the Collaborative Coalition for Equal Justice initiative to purposely bring diverse teams together to address pressing issues facing justice and human rights.48x www.techforjustice.org/ibo-summit/ (last accessed January 2021). These problem-solving teams will be intentionally multidisciplinary, bringing relevant professionals together from legal, ODR, academic, engineering, human rights, data science, medicine and any other professional expertise required to exert influence on the co-creation of solutions to benefit stakeholders/constituents, especially among the marginalized in our world.
Second, it assumes that the development process is transparent. This, again, is not a given. There is some precedent for developers of algorithms and AI platforms to maintain the ‘black box’ without revealing the programming that takes in data and produces recommendations or decisions. For example, in the United States, the much discussed Eric Loomis case49x See Leagle, available at: www.courts.ca.gov/documents/BTB24-2L-3.pdf (last accessed January 2021). sets up questions about a ‘technical black box’ and a ‘legal black box’ in connection with the programming behind COMPAS and its impact on a particular litigant. Basically, the court ruled that the developers did not have to reveal the proprietary programming elements behind the AI. The stakeholders had the ability to argue that the decisions made by the programmers, and therefore the output of their AI system, result in bias or harm, but they could not examine the assumptions that were baked into the AI.50x See Megan Garber, “When Algorithms Take the Stand,” The Atlantic, 30 June 2016, available at: www.theatlantic.com/technology/archive/2016/06/when-algorithms-take-the-stand/489566/ (last accessed January 2021).
The lesson here is that input after the fact is difficult, if possible at all, and is taken only after development is complete and the technology is affecting litigants and parties. The ODR community needs to find ways to set boundaries and standards ahead of development.
Another safe prediction is that ADR through ODR will continue to grow as a separate approach to justice outside the courts. This is already happening. As discussed earlier in this article, DNMEA approaches to online negotiation and mediation are becoming the norm in some court systems, and the availability of ODR entry points into dispute resolution that are cheaper and faster than traditional litigation will, as Susskind noted, just make sense to the generations coming along who will be the disputants and the dispute resolvers of the future. In early 2021, the state courts of Tennessee launched a pilot ODR project working with Matterhorn to address medical debt, evictions and other ‘small claims’ disputes through an asynchronous ‘chat’ room for parties and their lawyers which incorporates mediators and real-time Zoom meetings to fully ‘virtualize’ these disputes before the courts are engaged through litigation.
Finally, the increasing use of technology will demand that lawyers, judges and others working in the field of dispute resolution develop new skills, and new ways of working. As the use of technology becomes more common, we will all learn what we know and what we do not know about how to be successful online. There are some things we can predict.

Dispute resolution practitioners will not need to become programmers, nor will they need to be expert in the various skills that go into the development of complex ODR apps and platforms. They will, however, need to be educated enough about the technology they use to understand the implications of its use, and be able to explain the advantages and the dangers of using the technology to their clients and parties.
Dispute resolution practitioners will also have to understand and learn how to compensate for the changes in the communication environment offered by work in video, audio and text. If one takes face-to-face communication as the ‘gold standard’, the changes in verbal, non-verbal and environmental elements of communication that come with working online offer new areas of study for practitioners of all kinds.
Dispute resolution practitioners will come to realize that successful work with parties online requires a higher level of preparation and monitoring than does working face-to-face. As one of our colleagues who now engages in online arbitration said, ‘this takes a lot more of my time than in-person arbitration’.
Finally, dispute resolution practitioners will need to understand that working online affects every one of the basic ethical principles connected to the law and ADR. How do we guarantee self-determination? How do we guarantee privacy? How do we guarantee or even understand power imbalances?
There are a lot of pussycats and friends out there in the ODR forest, but there are some things that could be harmful, and in fact some that have already been harmful. It is imperative that we not just embrace technology: we should be appropriately wary of technology. As with all powerful engines of change, caution and deliberation can give rise to a better world.
International Journal of Online Dispute Resolution
Article | Bits and Bytes and Apps – Oh My! Scary Things in the ODR Forest
Keywords | access to justice, digital divide, Artificial Intelligence, algorithms, Online Dispute Resolution
Authors | Daniel Rainey and Larry Bridgesmith
DOI | 10.5553/IJODR/235250022021008001001
Suggested citation
Daniel Rainey and Larry Bridgesmith, "Bits and Bytes and Apps – Oh My!", International Journal of Online Dispute Resolution, 1, (2021):3-19
Abstract | This article addresses three issues related to online dispute resolution (ODR) that offer promise, and may carry risks, for those who develop, provide and use technology to address disputes and conflicts. The authors offer some principles to guide the use of technology, and some predictions about the future of ODR.