DOI: 10.5553/IJODR/235250022017004002004

International Journal of Online Dispute Resolution

Conference Paper

Artificial Intelligence and Online Dispute Resolution Systems Design

Lack of/Access to Justice Magnified

Keywords: ODR, ethics, alternative dispute resolution, technology, dispute system design, artificial intelligence
Author: Leah Wing
Suggested citation: Leah Wing, ‘Artificial Intelligence and Online Dispute Resolution Systems Design’, International Journal of Online Dispute Resolution, Vol. 4, No. 2 (2017), pp. 16-20.

Abstract

    Recent scholarship and innovative applications of technology to dispute resolution highlight the promise of increasing access to justice via online dispute resolution (ODR) practices. Yet, technology can also magnify the risk of procedural and substantive injustice when artificial intelligence amplifies power imbalances, compounds inaccuracies and biases and reduces transparency in decision making. These risks raise important ethical questions for ODR systems design. Under what conditions should algorithms decide outcomes? Are software developers serving as gatekeepers to access to justice? Given competing interests among stakeholders, whose priorities should impact the incorporation of technology into courts and other methods of dispute resolution? Multidisciplinary collaboration and stakeholder engagement can contribute to the creation of ethical principles for ODR systems design and transparent monitoring and accountability mechanisms. Attention to their development is needed as technology becomes more heavily integrated into our legal system and forms of alternative dispute resolution.


      Tens of thousands of algorithms impact our daily lives, providing greater efficiency, protection and access as well as reducing our choices and placing us at greater risk of exploitation. The capacity of algorithms to powerfully influence us, and the lack of transparency about how they are being utilized, highlight the need for scrutiny as we seek to understand their impact on access to justice.[1] This is important given the explosion in the use of technology for handling e-commerce disputes in particular, in a landscape in which so many disputes remain beyond the reach of court redress.[2] While the application of technology to dispute resolution was introduced with hope and a promise to expand access to justice, we are finding that it can also magnify the risk of procedural and substantive injustice.[3] Briefly, I will discuss several ways in which artificial intelligence (AI)[4] and big data can exponentially increase efficiency, access to participation and even creativity; machine learning, for example, can be harnessed to generate innovative agreement options.[5] I will also explore how they can amplify power imbalances and reduce transparency in decision making, thereby decreasing access to justice[6] and raising serious ethical questions for online dispute resolution (ODR) systems design.[7]
      The ways in which we design ODR systems and manage data within them are central to whether they magnify the risks or the opportunities for access to justice. These new risks and opportunities in ODR are influenced by the exponential growth in data size and collection (big data), data processing capabilities and AI, and by their integration with mega systems;[8] they are also shaped by the lack of/potential for transparency and accountability. While acknowledging that technology poses dilemmas, it is vital to highlight that it also greatly enhances efficiency (time, costs and a reduced need for human intervention) and communication (both its speed and its types).[9] Additionally, big data and AI are already demonstrating the ability to contribute to dispute prevention and detection: data analysis and machine learning make possible ongoing systems re-design for early dispute detection and handling, which can reduce risk, liability, harm, inefficiency and injustice.[10] These attributes can not only increase access to justice but also make possible the handling of more complex disputes and facilitate effective management of exponentially more cases, issues and stakeholders than face-to-face alternative dispute resolution (ADR). This is particularly good news when we note that by 2019 it is anticipated that there will be more than one billion e-commerce disputes annually,[11] and clearly the overall number is much higher when disputes in other sectors such as labor, family and health care are included.
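      To make the dispute-detection point concrete, below is a minimal, hypothetical sketch of the kind of machine learning pipeline such a system might use: a classifier trained on historical case features flags live transactions that resemble ones that previously escalated into formal disputes. The sketch assumes scikit-learn, and every feature name, data value and the 0.7 threshold is invented for illustration; it does not describe the design of any actual ODR platform.

```python
# Hypothetical sketch: early dispute detection from historical case data.
# Feature names, training data and the 0.7 threshold are illustrative
# assumptions, not a description of any real ODR platform.
from sklearn.linear_model import LogisticRegression

# Each row: [days_since_delivery, refund_requests, unanswered_messages]
historical_features = [
    [2, 0, 0],
    [14, 1, 3],
    [30, 2, 5],
    [1, 0, 1],
]
# 1 = the transaction later escalated into a formal dispute, 0 = it did not.
escalated = [0, 1, 1, 0]

model = LogisticRegression().fit(historical_features, escalated)

def flag_for_early_intervention(transaction, threshold=0.7):
    """Flag a live transaction for proactive outreach before a dispute is filed."""
    dispute_risk = model.predict_proba([transaction])[0][1]
    return dispute_risk >= threshold

# A transaction resembling past escalations gets flagged for early handling.
print(flag_for_early_intervention([21, 1, 4]))
```

      As flagged cases are resolved, their outcomes can extend the training data, one plausible reading of the ongoing systems re-design for early dispute detection described above.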
      Turning our attention to the risks helps elucidate how the incorporation of AI within ODR systems design can reduce access to procedural and substantive justice. The unintentional use of incomplete and inaccurate data by AI can escalate the negative impact; for example, in one study of medical records, “the authors reported that some piece of inaccurate information was present in 81 per cent to 95 per cent of patient records.”[12] In such circumstances, inaccurate and incomplete data can exponentially increase health and/or legal risks, not only for the specific patients and medical professionals involved but for even greater numbers of people, when the data are incorporated into big data used for algorithmically structured machine learning, insurance policy development, adjudicatory decision making or guidance for negotiated agreements.[13] Another risk can arise when a system is designed in ways that benefit those in power or repeat players at the expense of others.[14] Let’s take an example in which inequality is purposefully built into the software, both procedurally and substantively. A system can be designed to harness the power of big data and machine learning to identify characteristics[15] of a complainant in order to reduce costs and risks for repeat players. The data can then be used to provide different processes and outcomes depending on the characteristics of the complainant. For example, a company could employ algorithms in its in-house ODR platform that determine what is offered to a specific customer[16] or business[17] complainant based on the complainant’s assessed characteristics and relationship with the company. The algorithms can be used to estimate the likelihood of increasing loyalty to the company if the complainant is given access to particular types of ODR or particular steps in an ODR process; or they can identify the least costly outcome for the company that the complainant is likely to settle for without escalating the complaint to social media or breaking off the business relationship with the company. While the use of AI to assess the specific characteristics of parties can reduce their access to justice, it can also, ironically, be the lack of attention to the actual life circumstances of some parties that results in unequal access to, and even exclusion from, ODR processes. By not sufficiently considering the impact of the digital divide and by not applying universal design principles to ODR platforms for those with disabilities,[18] ODR systems are often designed without adequately addressing the realities of differential technological access, needs and knowledge.
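      A deliberately simplified, hypothetical sketch may help make the triage example above concrete; it shows how a few lines of scoring logic can quietly route complainants with less leverage toward cheaper remedies. Every profile field, weight and remedy tier below is an invented assumption for illustration and does not describe any actual company's platform.

```python
# Hypothetical sketch of structurally unequal ODR triage.
# Profile fields, weights and remedy tiers are invented for illustration;
# no real company's platform is described here.
from dataclasses import dataclass

@dataclass
class ComplainantProfile:
    lifetime_spend: float     # estimated purchase power
    social_media_reach: int   # audience likely to see a public complaint
    prior_escalations: int    # history of pursuing complaints

def leverage_score(p: ComplainantProfile) -> float:
    # Note: the complainant's power to harm the company, not the merit of
    # the claim, drives the score. That is the inequity described above.
    return (
        0.5 * p.lifetime_spend / 1000
        + 0.3 * p.social_media_reach / 10000
        + 0.2 * p.prior_escalations
    )

def offer_remedy(p: ComplainantProfile) -> str:
    score = leverage_score(p)
    if score > 2.0:
        return "full refund plus access to a human mediator"
    if score > 0.5:
        return "partial refund via automated negotiation"
    return "store credit, with no appeal step"

# Two complainants with identical claims receive different processes and outcomes.
print(offer_remedy(ComplainantProfile(9000, 50000, 2)))  # high-leverage complainant
print(offer_remedy(ComplainantProfile(200, 80, 0)))      # low-leverage complainant
```

      Because the score reflects leverage over the company rather than the merits of the claim, two identical complaints receive different processes and different outcomes, which is precisely the procedural and substantive inequality described above.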
      The significant impact that big data and AI can have on access to justice requires us to face the ethical implications of the centrality of AI and other forms of technology to dispute handling. Under what conditions should algorithms decide outcomes?[19] Should big data specialists control access to justice? How do we regulate the interface between AI, big data and the impact of platform designs on the delivery of justice? Which priorities should shape the development of ODR systems, and who should decide? Who should be responsible for the creation and maintenance of regulation, monitoring and accountability? Based on which ethical standards, and developed by whom? And, quite importantly, while we are busy researching and analyzing these concerns, who benefits and who is most at risk in the absence of agreed and effective mechanisms? We have a long history of ethical concerns regarding access to justice in ADR, for example, in terms of power imbalances and disproportionate benefits for repeat players. When we add the impact of big data and AI, we can see not only that these specific ethical dilemmas are magnified but also that new ethical concerns emerge. Examining the risks highlights the importance of addressing the lack of transparency and monitoring needed to ensure ethically driven ODR systems designs that expand rather than reduce access to justice.
      Without international standards, monitoring and global, cross-jurisdictional regulation of ODR, is the software designer becoming a gatekeeper for access to justice?[20] Recognizing some of the potential and risks related to ODR, a growing number of governmental entities have created legislation or begun discussions about its regulation.[21] Wide stakeholder engagement in that process is crucial, and the ODR field has much to contribute, not only in influencing legislation and regulation but in leading the way in their creation.[22] In seeking to do so, we face the fact that not all stakeholders[23] share the same priorities, knowledge, responsibilities and power to influence the design, functioning and regulation of ODR systems. Given this, how can we best encourage thoughtful, inclusive and productive stakeholder engagement? And how can we ensure a focus on reducing barriers and enhancing access to justice when AI and big data are utilized in ODR systems design? Thus, there is a growing call for ethically driven ODR systems design and for the development of monitoring and accountability mechanisms[24] based on shared ethical principles for ODR.[25] I offer that Ethical Principles for ODR can serve as a GPS to help guide us on our journey towards ensuring that access to justice and fair resolution processes are enhanced, not restricted, through the application of technology. The National Center for Technology and Dispute Resolution’s Ethics Initiative has worked in international and interdisciplinary collaborations on Ethical Principles for ODR that are values, not rules,[26] in the hope that they can help in creating monitoring and accountability mechanisms for the ethical design and functioning of ODR processes. They are built on shared values to provide consistency across jurisdictions, to be responsive to context (i.e., technology, sector, jurisdiction and culture) and to serve as a guide in the creation of legislation, regulation and standards for ODR.
      Ongoing multidisciplinary collaboration and stakeholder engagement can further the implementation of Ethical Principles for ODR and, in particular, aid in their use in developing standards for ODR systems design and in creating transparent monitoring and accountability mechanisms. Recently, one initiative has seen the Ethical Principles for ODR translated into a set of Ethical Standards for ODR by the International Council for Online Dispute Resolution.[27] Further such work is needed as technology becomes more integrated into our legal systems and forms of ADR, offering both tremendous risk and tremendous potential. In such a context, it is ever more urgent to consider how to build ethics structurally into online dispute systems design in order to address long-standing and new barriers to access to justice.

    Notes

    • 1 J. de Werra, ‘ADR in Cyberspace: The Need to Adopt Global Alternative Dispute Resolution Mechanisms for Addressing the Challenges of Massive Online Micro-Justice’, Swiss Review of International & European Law, 2016, pp. 289-306; E. Katsh & O. Rabinovitch-Einy, Digital Justice: Technology and the Internet of Disputes, Oxford, Oxford University Press, 2017; S.J. Shackelford & A.H. Raymond, ‘Building the Virtual Courthouse: Ethical Considerations for Design, Implementation, and Regulation in the World of ODR’, Wisconsin Law Review, Vol. 3, 2014, pp. 614-657; and L. Wing, ‘Ethical Principles for Online Dispute Resolution: A GPS Device for the Field’, International Journal of Online Dispute Resolution, Vol. 3, No. 1, 2016, pp. 12-29.

    • 2 There is change afoot, though, as legal systems around the world begin to incorporate technology: for example, the development of online courts in England and Wales; the Hangzhou Internet Court in China; pilot projects in several state courts in the United States; and British Columbia’s small claims tribunal, which began using online dispute resolution in 2017 and handled 14,000 cases in its first 7 months. Available at: www.americanbar.org/news/abanews/aba-news-archives/2018/02/british_columbiaodr.html (accessed 6 June 2018).

    • 3 Katsh & Rabinovitch-Einy, 2017. See related discussions in C. Menkel-Meadow, ‘Is ODR ADR?’ International Journal of Online Dispute Resolution, Vol. 3, No. 1, 2016, pp. 4-7; N. Welsh, ‘ODR: A Time for Celebration and the Embrace of Procedural Safeguards’, 15th International Forum on Online Dispute Resolution, The Hague, May 2016, available at: www.adrhub.com/profiles/blogs/procedural-justice-in-odr (accessed 7 October 2016); L. Wing, ‘AI & ODR Systems Design: Access to Justice Ethical Challenges & Opportunities Magnified’, Online Dispute Resolution Forum, Paris, June 2017; and L. Wing, 2016, pp. 12-29.

    • 4 ‘AI’ is used here broadly, to cover whatever can fall under the category of artificial intelligence.

    • 5 Katsh & Rabinovitch-Einy, 2017; A.R. Lodder & J. Zeleznikow, Enhanced Dispute Resolution Through the Use of Information Technology, Cambridge, Cambridge University Press, 2010.

    • 6 See A. Barsky, ‘The Ethics of App-Assisted Family Mediation,’ Conflict Resolution Quarterly, Vol. 34, No. 1, Fall 2016, pp. 31-42; A.H. Raymond & S.J. Shackelford, ‘Technology, Ethics, and Access to Justice: Should an Algorithm Be Deciding Your Case?’, Michigan Journal of International Law, Vol. 35, Spring 2014, pp. 485-524; Welsh, May 2016, available at: www.adrhub.com/profiles/blogs/procedural-justice-in-odr (accessed 7 October 2016); and L. Wing, ‘AI & ODR Systems Design: Access to Justice Ethical Challenges & Opportunities Magnified,’ Online Dispute Resolution Forum, Paris, 2017.

    • 7 L. Wing, 2017; and L. Wing, C. Menkel-Meadow, & J. Martinez, ‘Ethics, Technology, and Dispute Resolution Systems Design’, American Bar Association Dispute Resolution Section Conference, Washington, DC, April 2018.

    • 8 For example, blockchains, national and private health systems, social media platforms, etc.

    • 9 For a rich discussion on this, see Katsh & Rabinovitch-Einy, 2017.

    • 10 Katsh & Rabinovitch-Einy, 2017.

    • 11 C. Rule, Workshop on Private International Online Dispute Resolution, Stanford University, April 2017.

    • 12 Chan et al. cited in Katsh & Rabinovitch-Einy, 2017, p. 94.

    • 13 For an excellent in-depth analysis, see Katsh & Rabinovitch-Einy, 2017.

    • 14 This can be the case whether or not it is intentionally designed for that outcome.

    • 15 These characteristics can be based on social group categories or personalized data, for example: location of purchaser in a high- or low-income neighbourhood, gender, race, age, country of purchase, purchase power based on credit card usage, the net worth of the complaining business and the likely impact of the complainant’s social media footprint on others who have significant purchasing power.

    • 16 In business-to-customer (B2C) disputes.

    • 17 In business-to-business (B2B) disputes.

    • 18 For a detailed analysis of the impact of online dispute systems design in constructing or reducing barriers for those with disabilities, see D. Larson & L. Feingold, ‘ODR for All: Digital Accessibility and Disability Accommodations in Online Dispute Resolution,’ Mediate.com, May 2018, available at: www.mediate.com/articles/larsond2.cfm (accessed 8 June 2018); and see also C. Menkel-Meadow, ‘Is ODR ADR?’ International Journal of Online Dispute Resolution, Vol. 3, No. 1, 2016, p. 5.

    • 19 See Raymond & Shackelford, Spring 2014, pp. 485-524.

    • 20 Thanks to Vikki Rogers for this powerful and apt metaphor.

    • 21 See A. Wiener, ‘Regulations and Standards for Online Dispute Resolution: A Primer for Policymakers and Stakeholders,’ 2001, available at: www.mediate.com/articles/awiener2.cfm (accessed 22 August 2016); and Wing, 2016, pp. 12-29.

    • 22 N. Ebner & J. Zeleznikow, ‘No Sheriff in Town: Governance for the ODR Field’, Negotiation Journal, Vol. 32, No. 4, 2016, pp. 297-323; D. Rainey, ‘Third-Party Ethics in the Age of the Fourth Party’, International Journal of Online Dispute Resolution, Vol. 1, No. 1, 2014, pp. 37-56; and Wing, 2016, pp. 12-29.

    • 23 Consider the diversity of stakeholders, who include, among others, advocates (consumer, etc.), businesses, courts, consumers, governments, in-house customer service dispute resolvers, ODR platform providers, ODR practitioners and software designers.

    • 24 Ebner & Zeleznikow, 2016, pp. 297-323; Shackelford & Raymond, 2014, pp. 614-657; A.J. Schmitz & C. Rule, The New Handshake: Online Dispute Resolution and the Future of Consumer Protection, Chicago, IL, ABA Book Publishing, 2017, p. 62; Welsh, May 2016, available at: www.adrhub.com/profiles/blogs/procedural-justice-in-odr (accessed 7 October 2016); Wing, June 2017; and Wing, et al., April 2018.

    • 25 L. Wing, ‘Lack of/Access to Justice Magnified: Ethics, AI and Online Dispute Resolution Systems Design,’ Law and Society Annual Conference, Toronto, June 2018; Wing, et al., April 2018; and Wing, June 2017.

    • 26 See Wing, 2016, pp. 12-29.

    • 27 Available at: http://icodr.org/index.php/standards/ (accessed 29 July 2018).

