The Future of Healthcare
-
1 Introduction
The healthcare sector is ripe for disruption by new technologies like generative AI. With the meteoric rise of ChatGPT and other natural language systems developed by OpenAI and made public in late 2022, I became fascinated by the potential applications of this technology to online dispute resolution (ODR) in healthcare. The capabilities of these systems have rapidly evolved, as evidenced by Claude 2,1x www.anthropic.com/product. a new large language model (LLM) created by Anthropic and backed by Google, which can ingest upwards of 70,000 words per prompt. Moreover, the beta testing of AI integration into platforms2x https://workspace.google.com/blog/product-announcements/generative-ai. like Gmail and Google Docs foreshadows its seamless infusion into our digital lives. In this article, I will argue that ODR enabled by advanced AI could provide extraordinary benefits for improving access, efficiency and outcomes in healthcare disputes globally. The recent explosion of generative AI capabilities opens new doors for automating and enhancing conflict resolution – doors which the healthcare sector would be remiss not to walk through. By embracing these tools, we can take a tremendous step towards patient-centred, data-driven solutions to healthcare disputes around the world.
-
2 Relevance to ODR
Ethan Katsh, a pioneer in ODR, notes that
The manner in which information is currently employed in healthcare is highly inefficient, which slows down communication and can, as a result, reduce the emergence and discovery of problems.3x Katsh, E., et al. (2011, Summer). Is there an app for that? Electronic health records (EHRs) and a new environment of conflict prevention and resolution. Law and Contemporary Problems, 74, 31-56.
The current inefficiencies in healthcare information systems that Katsh points out exacerbate communication breakdowns and delay problem discovery. However, the capabilities of advanced AI systems foreshadow seamless integration into healthcare’s digital infrastructure. This sets the stage for transformative gains through ODR. Online dispute resolution is a “mechanism for resolving disputes through the use of electronic communications and other information and communication technology”.4x UNCITRAL Technical Notes on Online Dispute Resolution – United Nations. (2016). https://uncitral.un.org/sites/uncitral.un.org/files/media-documents/uncitral/en/v1700382_english_technical_notes_on_odr.pdf. ODR offers a wide range of approaches to resolving conflicts without going to court. Key ODR processes include negotiation, mediation, arbitration and variations combining online and offline elements. Effective ODR shares core principles of fairness, transparency, independence, due process and accountability. ODR can provide faster, cheaper justice than traditional litigation. It expands access to dispute resolution for consumers and companies involved in e-commerce and other online transactions. Overall, ODR gives parties more choice over how to resolve disputes efficiently and equitably.
ODR platforms enabled by natural language processing can ingest volumes of data and rapidly extract insights to enhance communication between parties. The user-centred design principles integral to effective ODR systems will facilitate timely discovery of problems. Katsh stresses the importance of trying to anticipate what disputes and problems are likely to arise as the transition proceeds over the next several years, why they are occurring, and what might be done to prevent or respond to them.5x Katsh et al. (2011).
By anticipating which problems are likely to arise, we will be in a much better position to prevent or respond to them.
Interoperable health IT is the “ability of different information systems, devices and applications (systems) to access, exchange, integrate and cooperatively use data in a coordinated manner, within and across organizational, regional and national boundaries, to provide timely and seamless portability of information and optimize the health of individuals and populations globally”.6x Epalm. (2021, August 25). Interoperability in healthcare. HIMSS. https://www.himss.org/resources/interoperability-healthcare.
As noted by Katsh, interoperable health IT can improve individual patient care by providing complete, accurate and searchable health information at the point of care, allowing for more informed decision-making, earlier diagnosis, reduced adverse events, and more efficient delivery of care without unnecessary tests or delays. It also has the potential to enhance the quality and reliability of healthcare through a better understanding of each patient’s history, risks and likely response to treatments.7x Katsh et al. (2011). This “…reveals how extensively the goal of quality healthcare is dependent upon high-quality information and efficient communication”.8x Ibid. However, as electronic medical record (EMR) usage increases, issues around the accuracy of records will arise. Inevitably, EMR will present a common problem “…arising out of patients looking at their records: questions and disputes about the accuracy, meaning, and content of the record”.9x Ibid. Although HIPAA grants patients the right to request amendments, currently “there is a right but, to date at least, no efficient means to obtain a meaningful remedy” in the case of [EMRs].10x Washington, L., Katsh, E., & Sondheimer, N. (2009, November). Dispute resolution: Planning for disputed information in EHRS and PHRS. Journal of AHIMA, 80(11), 25-30. To avoid disputes escalating, systems need built-in functionality to help patients understand and submit amendment requests.
One of the oldest principles of law is that there is no right without a remedy. In the case of EMRs, it is clear there is a right (that is, to amend) but, to date, no efficient means to obtain a meaningful remedy.11x Ethan Katsh et al. (2011).
This is where ODR comes into play. With a rapidly evolving digital world,
ODR is the only approach to dispute resolution and prevention that can play a role not only in a highly complex future but one in which change is occurring at a rapid pace.12x Abdel Wahab, M., Rainey, D., & Katsh, E. (2012). Online dispute resolution: Theory and practice. Eleven International Publishing.
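To make the amendment right discussed above concrete, the sketch below illustrates one way an ODR-enabled patient portal might capture and track a record-amendment request, so that a denial flows directly into a resolution process rather than a dead end. It is a minimal illustration under stated assumptions: the class names, fields and status values are invented for this example and are not features of any existing EHR or ODR product.

```python
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum
from typing import List


class AmendmentStatus(Enum):
    SUBMITTED = "submitted"
    UNDER_REVIEW = "under_review"
    ACCEPTED = "accepted"
    DENIED = "denied"
    ESCALATED_TO_ODR = "escalated_to_odr"


@dataclass
class AmendmentRequest:
    """A patient's request to correct an entry in their electronic record."""
    patient_id: str
    record_entry_id: str       # the entry the patient disputes
    disputed_text: str         # what the record currently says
    proposed_correction: str   # what the patient believes it should say
    rationale: str             # why the patient believes the entry is wrong
    status: AmendmentStatus = AmendmentStatus.SUBMITTED
    history: List[str] = field(default_factory=list)

    def log(self, note: str) -> None:
        self.history.append(f"{datetime.utcnow().isoformat()} {note}")


def review(request: AmendmentRequest, provider_accepts: bool) -> AmendmentRequest:
    """Record the provider's decision on the requested correction."""
    request.status = AmendmentStatus.UNDER_REVIEW
    request.log("Routed to the treating provider for review.")
    if provider_accepts:
        request.status = AmendmentStatus.ACCEPTED
        request.log("Provider accepted the correction.")
    else:
        request.status = AmendmentStatus.DENIED
        request.log("Provider denied the correction; patient may escalate.")
    return request


def escalate(request: AmendmentRequest) -> AmendmentRequest:
    """Hand an unresolved amendment dispute to the ODR process."""
    request.status = AmendmentStatus.ESCALATED_TO_ODR
    request.log("Dispute escalated to online dispute resolution.")
    return request
```

In practice, the escalation step would hand the dispute to negotiation or facilitation stages of the kind discussed later in this article, keeping the patient's request, the provider's response and the reasoning on a single auditable record.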
-
3 Why Should You Care?
An efficient ODR system is imperative for a healthy civil society. “If you don’t give people an accessible way to resolve their disputes, you undermine civil society.”13x Davis, A., & Salter, S. (2021, October). The Shannon Salter Interview – ODR. Civil Resolution Tribunal, access to justice, innovation. Other. https://youtu.be/RNqtsn5-yPw?si=qEWw-yS1d9S5D_1j. eBay’s ODR system, which purportedly resolves over 60 million disputes annually, exemplifies a successful model. “Analysis of eBay’s data revealed that buyers preferred expedient case resolution rather than prolonged proceedings. This finding aligns with the maxim ‘justice delayed is justice denied.’”14x University of Missouri. (2020). Library guides: Online dispute resolution: Companies implementing ODR. Companies Implementing ODR – Online Dispute Resolution – Library Guides at University of Missouri Libraries. https://libraryguides.missouri.edu/c.php?g=557240&p=3832247.
By studying the data uncovered in the dispute resolution process, eBay has managed to uncover common sources of problems and to structure information and services on its site so that these problems do not recur.15x William B. Ury et al., Getting Disputes Resolved: Designing Systems to Cut the Costs of Conflict (1988).
ODR systems like eBay’s demonstrate that embedding conflict resolution mechanisms within an organization’s structure facilitates fair and timely dispute redress. This prevents trivial disagreements from escalating while promoting institutional accountability. Ultimately, robust ODR systems deliver efficient justice and strengthen communal bonds by prioritizing accessible dispute mitigation. An agile digital infrastructure for conflict resolution is thus imperative for an ethical and harmonious civil society. Furthermore, Ury, Brett, and Goldberg16x Rabinovich-Einy, O., & Katsh, E. (2012). Technology and the future of dispute systems design. Harvard Negotiation Law Review, 17(Spring), 151-199, 155. argued that patterns of disputes can be predicted in closed settings. Therefore, institutionalizing avenues for addressing disputes would allow conflicts to be handled more effectively than reactive measures (as summarized by Orna Rabinovich-Einy and Katsh).17x Ibid., p. 155.
As technology continues to rapidly advance, innovative online dispute resolution systems like eBay’s will only become more critical for fairly and promptly resolving the inevitable conflicts arising from these changes. Proactive development of nimble, user-centric ODR mechanisms to address new disputes will be imperative to maintain an ethical and just civil society in the face of on-going technological disruption. Additionally, embedding ODR within systems can help identify patterns in emerging types of disputes and allow for more effective communication and conflict resolution.
-
4 Dispute System Design
The challenges of implementing effective electronic health record (EHR) systems highlight the need for better communication and conflict resolution in healthcare. As a National Library of Medicine article emphasizes, flawed implementation planning, lack of clinician involvement, poor interface design, and data inaccuracies often undermine EHR success.18x Ozair, F. F., Jamshed, N., Sharma, A., & Aggarwal, P. (2015). Ethical issues in electronic health records: A general overview. Perspectives in Clinical Research, 6(2), 73-76. https://doi.org/10.4103/2229-3485.153997. This is where dispute system design (DSD) can play a vital role. DSD focuses on “communication,19x Rabinovich-Einy, O., & Katsh, E. (2012). Lessons from online dispute resolution for dispute systems design. In Katsh, et al. (Eds.), Online dispute resolution: Theory and practice (pp. 39-60). Eleven International Publishers. information processing, and management”20x Rabinovich-Einy and Katsh (2012). Technology and the future of dispute systems design, p. 153. which are critical for EHR adoption.21x Rabinovich-Einy and Katsh (2012), Technology and the future of dispute systems design, p. 156. As Rabinovich-Einy and Katsh22x Ibid. explain, DSD promotes ‘interest-based processes’ that ‘preserve relationships’ and encourage collaboration. This aligns well with the participatory, user-centred design approach needed for EHR systems. DSD also provides a fall-back to ‘rights- and power-based options’ when interest-based methods falter. Overall, DSD offers principles and structures to manage healthcare conflicts and prevent disputes from arising in the first place. DSD’s emphasis on communication, stakeholder engagement and layered dispute resolution mechanisms directly addresses the core challenges of EHR implementation highlighted by Ozair et al.23x Ozair et al. (2015). Integrating DSD concepts and frameworks into the EHR adoption process would support more successful and sustainable outcomes. By facilitating collaboration and communication and by minimizing conflict, DSD can help unlock the potential of health information technology.
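As a schematic illustration of the layered, interest-first structure that DSD favours, the short sketch below routes a dispute through interest-based, then rights-based, then power-based options, escalating only when the current tier fails. The tier names follow the DSD literature cited above, but the escalation function itself is a simplification invented for this example.

```python
from enum import IntEnum
from typing import Callable


class Tier(IntEnum):
    """DSD-style tiers, ordered from least to most adversarial."""
    INTEREST_BASED = 1   # negotiation, mediation: preserve relationships
    RIGHTS_BASED = 2     # arbitration, adjudication: apply rules and standards
    POWER_BASED = 3      # last resort: formal authority, litigation


def resolve(dispute: str, settles_at: Callable[[str, Tier], bool]) -> Tier:
    """Try each tier in order, escalating only when the current tier fails.

    `settles_at` reports whether the dispute settled at a given tier -
    in practice, the outcome of a mediation session or an accepted award.
    """
    for tier in Tier:
        if settles_at(dispute, tier):
            return tier
    return Tier.POWER_BASED


# Example: a dispute that settles only once a rights-based process is offered.
outcome = resolve("disputed EHR entry", lambda d, t: t >= Tier.RIGHTS_BASED)
print(outcome)  # Tier.RIGHTS_BASED
```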
-
5 Current Developments
The Civil Resolution Tribunal (CRT) in British Columbia, Canada, has implemented various forms of artificial intelligence (AI) to expand access to justice through its four-phase dispute resolution process.24x Schmitz, A. J., & Zeleznikow, J. (2021). View of intelligent legal tech to empower self-represented litigants: Science and Technology Law Review. View of Intelligent Legal Tech to Empower Self-Represented Litigants | Science and Technology Law Review. https://journals.library.columbia.edu/index.php/stlr/article/view/9391/4800. The first phase, Problem Diagnosis, involves an ‘expert system’ called the Solution Explorer, which uses rule-based AI to guide users through plain-language questions and provide customized legal information and self-help tools.25x Ibid. The process then moves to Negotiation between parties, then Facilitation, if needed, and, finally, Adjudication by an arbitrator if prior phases fail to reach resolution. Looking ahead, the CRT is experimenting with natural language processing in the Negotiation phase to analyse texts and flag potentially inflammatory language.26x Ibid. This tool could encourage parties to reframe responses in a more productive manner, potentially leading to more successful negotiations. Though still developing, the CRT’s use of AI aims to make dispute resolution more accessible, user-friendly and effective overall.
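A rule-based ‘expert system’ of the kind the Solution Explorer represents can be quite simple at its core: a decision tree of plain-language questions whose answers select tailored information and self-help tools. The sketch below is a toy version of that idea, not the CRT’s actual logic; the questions, answers and guidance text are invented for demonstration.

```python
# A toy rule-based triage tree in the spirit of guided problem diagnosis.
# Every node either asks a question (with answers leading to further nodes)
# or returns guidance. All of the content here is illustrative only.
TRIAGE_TREE = {
    "question": "What is your dispute about?",
    "answers": {
        "a billing or insurance charge": {
            "question": "Have you already contacted the biller?",
            "answers": {
                "yes": {"guidance": "Template letter to formally dispute the charge."},
                "no": {"guidance": "Self-help guide: how to request an itemized bill."},
            },
        },
        "the accuracy of my medical record": {
            "guidance": "Plain-language explainer on requesting a record amendment."
        },
    },
}


def diagnose(node: dict, ask) -> str:
    """Walk the tree, asking plain-language questions until guidance is reached."""
    while "guidance" not in node:
        answer = ask(node["question"], list(node["answers"]))
        node = node["answers"][answer]
    return node["guidance"]


# Example run with scripted answers standing in for interactive input.
scripted = iter(["a billing or insurance charge", "no"])
print(diagnose(TRIAGE_TREE, lambda question, options: next(scripted)))
```

Real systems layer legal content management, plain-language review and analytics on top of such a skeleton, but the branching structure is what lets non-lawyers reach tailored guidance without free-text legal research.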
The CRT demonstrates how AI can provide free, tailored information to help individuals resolve disputes. Similarly, Med-PaLM 2 uses AI to provide free, expert-level medical information to the public.27x Matias, Y., & Corrado, G. (2023, March 14). Our latest health AI research updates. Google. https://blog.google/technology/health/ai-llm-medpalm-research-thecheckup/. Med-PaLM 2 harnesses large language models aligned to the medical domain to accurately and safely answer health questions. It is the first AI system to achieve over 85% accuracy on US Medical Licensing Exam-style questions and over 70% on Indian medical licensing exams.28x Gupta, A., & Waldron, A. (2023, April 13). Sharing Google’s med-palm 2 medical large language model, or LLM, Google Cloud Blog. https://cloud.google.com/blog/topics/healthcare-life-sciences/sharing-google-med-palm-2-medical-large-language-model. Like the first stage of the CRT model, Med-PaLM 2 could serve as an accessible source of high-quality, tailored health information for the public. Systems like the CRT’s Solution Explorer and Med-PaLM 2 show the potential for AI to expand access to legal and medical expertise.
Google has created a new AI system, Claims Acceleration Suite, to help health insurers make faster decisions about approving care.29x Google Cloud Press Release. (2023, April 13). Google Cloud unveils new AI-enabled claims acceleration suite to streamline health insurance prior authorization and claims processing, helping experts make faster, more informed decisions. Google Cloud Press Corner. https://www.googlecloudpresscorner.com/2023-04-13-Google-Cloud-Unveils-New-AI-enabled-Claims-Acceleration-Suite-to-Streamline-Health-Insurance-Prior-Authorization-and-Claims-Processing,-Helping-Experts-Make-Faster,-More-Informed-Decisions. This AI system takes information submitted by doctors – like medical records and reason for recommended care – and automatically organizes it so insurers can review requests faster. This should cut down the current 10-day wait time for insurers to approve doctor-requested procedures or medicines.30x Ibid. The AI tools can read text in the submitted documents and find important details so human reviewers don’t have to read everything. This saves time and money spent on administration. The AI system also checks for errors and missing information in doctor’s requests to speed up the process. Two health insurance companies, Blue Shield of California and Bupa, will start using Google’s AI system. It helps them follow new rules about sharing health data while taking advantage of AI to make faster choices about covering care.31x Ibid.
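Much of the administrative gain described above comes from structuring unstructured submissions and flagging gaps before a human reviewer ever opens the file. A minimal version of that completeness check might look like the sketch below; the field names and rules are assumptions made for illustration and do not reflect Google’s Claims Acceleration Suite or any payer’s actual criteria.

```python
from typing import Dict, List

# Fields a prior-authorization reviewer typically needs before deciding.
# The checklist is illustrative, not any payer's real requirements.
REQUIRED_FIELDS = [
    "patient_id",
    "diagnosis_code",
    "requested_procedure",
    "clinical_justification",
    "ordering_physician",
]


def check_prior_auth(request: Dict[str, str]) -> List[str]:
    """Return human-readable issues that would delay review."""
    issues = []
    for name in REQUIRED_FIELDS:
        if not request.get(name, "").strip():
            issues.append(f"Missing or empty field: {name}")
    justification = request.get("clinical_justification", "")
    if justification and len(justification) < 40:
        issues.append("Clinical justification looks too brief to support review.")
    return issues


incomplete = {
    "patient_id": "P-102",
    "requested_procedure": "MRI lumbar spine",
    "clinical_justification": "Back pain.",
}
for issue in check_prior_auth(incomplete):
    print(issue)
```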
The research paper that Gulshan et al.32x Gulshan, V., Peng, L., Coram, M., et al. (2016). Development and validation of a deep learning algorithm for detection of diabetic retinopathy in retinal fundus photographs. JAMA, 316(22), 2402-2410. https://doi.org/10.1001/jama.2016.17216. published in the Journal of the American Medical Association (JAMA) reports that Google’s AI system ARDA33x https://health.google/caregivers/arda/. accurately interpreted retinal scans to detect diabetic retinopathy. This is just another example of technology in development that will necessitate an advance in how we address disputes caused by these technologies.
Heather Landi discusses how UNC Health and other major health systems are starting to pilot the use of generative AI tools like ChatGPT that are integrated into EHR systems. The goal is to use the AI to help draft responses to common patient messages, reducing the burden on physicians. UNC Health partnered with Epic and Microsoft to integrate the Azure OpenAI service into Epic’s EHR software.34x Landi, H. (2023, May 25). Epic is going all in on Generative AI in Healthcare. here’s why health systems are eager to test-drive it. Fierce Healthcare. https://www.fiercehealthcare.com/health-tech/epic-moves-forward-bring-generative-ai-healthcare-heres-why-handful-health-systems-are. Initial use cases will focus on drafting administrative responses to patient messages, with the AI suggesting text that physicians can review and edit. Landi notes this could save significant time for physicians. Beyond auto-drafting messages, Epic is also working on using generative AI to analyse patient records and surface insights. Landi also discusses how ambient voice recognition technology like Suki is being piloted to listen to doctor-patient conversations and generate note suggestions, which could reduce EHR documentation burdens. A key theme is that major players like Epic see huge potential in using generative AI to make clinicians more efficient and improve their workflow.35x Ibid.
-
6 Looking Forward
The innovations highlighted, from AI-powered diagnostic tools like ARDA to natural language processing in EMRs, showcase the transformative potential of AI in healthcare. However, as Katsh et al.36x Katsh et al. (2011) argued over a decade ago, “If technology supported healthcare is to improve the field of medicine, a similar effort at dispute prevention and resolution will be necessary.” The proliferation of AI systems makes disputes over medical recommendations, insurance coverage and liability inevitable. Just as the Civil Resolution Tribunal pioneered ODR to expand access to justice, integrated dispute resolution processes will be critical to ensure AI’s benefits are realized while protecting patients. As healthcare leverages revolutionary AI, flexible and accessible ODR mechanisms must advance in parallel to handle the new legal and ethical challenges posed. Ultimately, AI-driven medicine requires AI-enabled justice. By coupling healthcare AI with proactive dispute prevention and ODR systems, the field can progress responsibly and equitably.
-
7 Health Wearable Technology
Digging further into the practical application of ODR systems in healthcare, we find valid concerns about how emerging technologies like EHRs and wearable devices could exacerbate disputes. As Bellucci et al.37x Emilia Bellucci, Andrew Stranieri and Sitalakshmi Venkatraman. 2020. Towards Smart Online Dispute Resolution for Medical Disputes. In Proceedings of Australasian Computer Science Week (ACSW 2020), February 04-06, 2020, Melbourne, VIC, Australia. ACM, New York, NY, USA, 5 pages. https://doi.org/10.1145/3373017.3373059. explain, “Electronic health records emerging in many jurisdictions provide a new source of conflict typically involving privacy, confidentiality, security and accuracy.” Specifically, increased patient access to EHRs may lead to more complaints against providers over matters of privacy and accuracy. Moreover, Bellucci et al.38x Ibid. predict these issues “are expected to be exacerbated when patients interface their EHR with data from self-configuring smart IoT health devices which are used to monitor their individual health information.”
With the rise of wearable devices like Apple Watch and Oura Ring39x https://ouraring.com/oura-experience. that collect individual health data, interpreting these data in an accurate, fair manner is crucial yet challenging. As Canali et al.40x Canali, S., Schiaffonati, V., & Aliverti, A. (2022). Challenges and recommendations for wearable devices in digital health: Data quality, interoperability, health equity, fairness. PLOS Digital Health, 1(10), e0000104. https://doi.org/10.1371/journal.pdig.0000104. point out, “There are four main uses for data from health wearables: monitoring, screening, detecting, prediction….” With respect to data quality, Canali et al.41x Ibid. also note that
[T]he variability of sensors and lack of consistency of data collection in the wearable context make it difficult to coordinate and assess quality. In addition, the lack of contextual information on the ways in which wearable data are collected, classified, and interpreted raises concerns on the possibility of assessing quality.
If wearable sensors vary in quality and consistency, it becomes hard to properly evaluate the accuracy of the health data they provide. Furthermore, the lack of context around how these data get collected and classified raises concerns about proper interpretation.
Canali et al.42x Ibid. suggest one of the four main uses for data from health wearables is prediction. Health wearables and interoperable EMRs hold great promise for predicting health events like heart attacks. However, inaccurate alerts or misleading information could cause undue anxiety or harm. For example, if wearable data are inaccurate in measuring activity, sleep, heart rate or other biometrics, it could lead to false assumptions about an individual’s health. “Gregory M. Marcus, MD, a professor of medicine at the University of California, San Francisco, noted how concern over a high heart rate increases adrenaline, causing it to beat faster.”43x Robeznieks, A. (2019, March 22). 4 mistakes your patients should avoid with wearables. American Medical Association. https://www.ama-assn.org/practice-management/digital/4-mistakes-your-patients-should-avoid-wearables. This demonstrates the health risks of relying too heavily on wearable data without verifying its accuracy and context. Similarly, if Apple’s new depression prediction feature44x https://www.apple.com/newsroom/2023/06/ios-17-makes-iphone-more-personal-and-intuitive/. mistakenly flags someone as potentially depressed, it could lead to stress. To address these risks, an ODR system should enable wearable users to inquire about or contest concerning health alerts easily. The ODR system would provide transparent explanations for how predictions are made. It would also offer a formal process for correcting mistaken alerts or data. Additionally, mental health screening tools in wearables need thoughtful design to minimize false positives. ODR systems should provide users with direct access to their raw data and any analysis should be done in a transparent manner. With appropriate safeguards in place, people can benefit from actionable health insights without undue distress over algorithmic errors or misinterpretation of data by outside parties.
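One way to give wearable users the contest-and-explain pathway just described is to attach a machine-readable explanation and a dispute hook to every alert the device raises, as in the sketch below. The field names and the notion of a ‘dispute ticket’ are hypothetical design choices for illustration, not features of any shipping wearable platform.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class HealthAlert:
    """An alert raised from wearable data, packaged with its own explanation."""
    alert_id: str
    message: str                      # e.g. "Elevated resting heart rate"
    inputs: Dict[str, float]          # raw measurements behind the alert
    rule: str                         # plain-language description of the trigger
    disputes: List[str] = field(default_factory=list)

    def explain(self) -> str:
        """Transparent account of why the alert fired."""
        readings = ", ".join(f"{k}={v}" for k, v in self.inputs.items())
        return f"{self.message}: triggered because {self.rule} (readings: {readings})"

    def contest(self, reason: str) -> str:
        """Open a dispute ticket the user can track through an ODR process."""
        ticket = f"dispute-{self.alert_id}-{len(self.disputes) + 1}"
        self.disputes.append(f"{ticket}: {reason}")
        return ticket


alert = HealthAlert(
    alert_id="hr-0042",
    message="Elevated resting heart rate",
    inputs={"resting_hr_bpm": 96.0, "baseline_hr_bpm": 62.0},
    rule="resting heart rate exceeded the personal baseline by more than 30 bpm",
)
print(alert.explain())
print(alert.contest("Reading taken right after a flight; sensor was loose."))
```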
Furthermore,
An additional problem stems from the fact that data from these unreliable devices may not be submitted to the court in its original form, but as analyzed conclusions completed by a third-party analytics company. This presents its own problems based on the unknown and un-testable nature of the algorithms these companies use to interpret the data for use as evidence in litigation.45x Vinez, K. (2017). The admissibility of data collected from wearable devices. Stetson Journal of Advocacy and the Law, 4, 1. https://www2.stetson.edu/advocacy-journal/the-admissibility-of-data-collected-from-wearable-devices/.
While data privacy and algorithmic transparency pose concerns in the legal realm, consumer wearables also raise significant ethical dilemmas about autonomy and manipulation, stemming from companies’ commercial use of such intimate user data.
Hence, the choice architecture shapes our decisions, often without our recognition of the process, which, Susser suggests, compromises user autonomy since their decisions are reshaped without their knowledge. For example, Garmin watches – which function as pedometers, accelerometers and have other tracking functions that make them popular for runners – suggest articles to the wearer if they fail to meet certain metrics (e.g. activity goals) to ostensibly help improve their health and well-being, while tacitly promoting other Garmin services or products within these articles. This type of ‘adaptive choice architecture’, which uses the participants’ own data to influence their behaviour, is especially problematic in research settings. Invisible influence from the use of consumer wearables heralds the potential for corporate interest to directly shape research outcomes by influencing participant interactions with their wearables.46x Sui, A., Sui, W., Liu, S., & Rhodes, R. (2023). Ethical considerations for the use of consumer wearables in health research. Digital Health, 9, https://doi.org/10.1177/20552076231153740.
This ethical tension is further compounded by the extensive aggregation and commercial use of user data by companies who leverage intimate user information to analyse demographics, sell to third parties and train proprietary algorithms – all largely without transparency to the user.
Extensive data aggregation by companies serves several purposes. First, the data can be used to analyze the demographics of the wearable user base to identify who the consumers of the product are and what kind of behaviour(s) they perform, which by extension, can be sold to analytics and marketing firms. In this way, wearable companies act as ‘data brokers’, acquiring, merging, analyzing and sharing personal data with ‘countless recipients’. Second, data can be used to teach AI. More specifically, datasets of unprecedented proportions are used to train algorithms how to better predict, influence and adapt to consumers. The complexity, and often propriety, of these algorithms renders cross-examination of the inner workings and decisions made by these systems inaccessible.47x Ibid.
While emerging health technologies like wearables provide promising capabilities for predicting health risks, regulatory measures are needed to ensure consumer protections, verifiable system accuracy and data privacy. These policy priorities align with the European Union’s recent steps to enhance medical device cybersecurity and expand consumer legal recourse through initiatives like the General Data Protection Regulation (GDPR) and updated product liability laws – developments that also present key opportunities to integrate online dispute resolution systems.
In addition to enhanced cybersecurity requirements under the [Medical Devices Regulation (MDR)], the [GDPR] (EU) 2016/679 has also afforded a higher level of protection to health data for some time. As cyber threats, including data privacy threats, continue to evolve, so too must manufacturers’ abilities to prevent and contain them.48x Herron, M. (2022, January 27). Wearable medical devices: Current challenges and emerging issues. Lexology. https://www.lexology.com/library/detail.aspx?g=8d5d7e3f-366b-4c17-81b2-a8bd7381ff83.
As Herron49x Ibid. points out, regulations like the MDR and GDPR demonstrate the EU’s commitment to strengthening cybersecurity and data privacy for medical devices and health data. However, continued vigilance is needed as threats persistently evolve. This highlights the importance of on-going policy updates to keep pace with technological change. It is a perfect opportunity for ODR systems to facilitate compliance and resolve emerging disputes. “The coming years will see updated EU product liability laws to account for emerging consumer technologies, enabling more litigation and class action claims by EU consumers.”50x Ibid.
This in turn increases the likelihood of a whole new body of case law forming across the EU in relation to liability for defective wearable devices and the software connected to them.51x Ibid.
Further EU legislation “…[is] also expected to enhance the rights of consumers and their abilities to seek redress, especially in relation to goods sold online”.52x Ibid. The anticipated changes to EU liability laws show a recognition that existing frameworks need to be adapted for new technologies like wearables. Expanding consumer legal rights and access will spur more litigation that can clarify where responsibility lies across complex supply chains. As the EU bolsters consumer protections online, integrating ODR systems provides a nimble and scalable means for resolving disputes.
-
8 Dispute Resolution in Healthcare
Communication is at the heart of many disputes and complaints in healthcare, as demonstrated by “patient-reported complaints showing that most complaints are around communication and interaction with healthcare professionals”.53x Montini, T. , Noble, A. A. , & Stelfox, H. T. (2008). Content analysis of patient complaints. International Journal of Health Care Quality, 20(6), 412–420. https://doi.org/10.1093/intqhc/mzn041. Poor communication negatively impacts the patient-provider relationship,54x “This diminished confidence is affected by healthcare providers’ lack of supportive patient-oriented communication skills as well as by the fact that the patients and healthcare professionals have different goals, needs and expectations related to the healthcare encounters (Jangland, Gunningberg, & Carlsson, 2009)” (Skär, L., & Söderberg, S. (2018). Patients’ complaints regarding healthcare encounters and communication. Nursing Open, 5(2), 224-232. https://doi.org/10.1002/nop2.132). which requires “quality communication between the patients and the professionals” to build trust and care.55x Skär and Söderberg (2018). However, “lack of time in healthcare encounters can be an obstacle” to building these caring relationships.56x Ibid. Rushed visits leave patients feeling ‘lost and ignored’, generating ‘anxiety’ and diminishing trust, a key factor shaping ‘conflict dynamics’.57x Ebner, N. (2021). The human touch in ODR: Trust, empathy and social intuition in online negotiation and mediation. In Rainey, D., Katsh, E., & Abdel Wahab, M. (Eds.), Online dispute resolution: Theory and practice (pp. 73-136, 2nd ed.). Eleven International Publishing; Skär and Söderberg (2018). Patients want to feel ‘respected’, ‘understood’ and ‘welcomed’, but often complain of experiences where “healthcare professionals did not value the patient as a person”.58x Skär and Söderberg (2018). This “uncaring behaviour affects patients’ dignity and thereby their health and well-being (Eriksson, 2006).”59x Ibid. Patients desire involvement in their care, “a system that allows him or her to be involved in the decision-making process”, but inadequate communication hinders this.60x Ibid. Ultimately, patients want resolution not just for themselves but “to prevent it from happening again, either to themselves or to other patients”, yet poor communication continues fuelling disputes.61x Ibid. Improving communication requires valuing patients, spending adequate time, building trust, respecting dignity, involving patients and resolving root causes – not just resolving individual disputes. Patient-centred communication fostering caring relationships is key to preventing conflicts. As communication breakdowns often generate disputes, healing communication through trust and understanding is imperative.
Poor communication fuels patient dissatisfaction and disputes, yet
High-quality communication between patients and healthcare professionals is significant for increasing patients’ satisfaction and participation in decision-making (Kourkouta & Papathanasiou, 2014; Petronio, DiCorcia, & Duggan, 2012; Torke et al., 2012).62x Ibid.
To resolve communication issues, ‘healthcare organizations’ must implement “communication plans and strategies to handle patients’ complaints (Coombs, Frandsen, Holladay, & Johansen, 2010)”.63x Ibid. ODR presents an opportunity to improve communication and resolve disputes. ODR can provide “sufficient time for communication” to “create meaningful relationships”, overcoming the ‘obstacle’ of rushed healthcare encounters.64x Nygren Zotterman, A., Skär, L., Olsson, M., & Söderberg, S. (2015). District nurses’ views on quality of primary healthcare encounters. Scandinavian Journal of Caring Science, 29(3), 418. https://doi.org/10.1111/scs.12146; Skär and Söderberg (2018). By individualizing care, ODR gives patients the needed time and attention. This facilitates the mutual understanding patients value – “they were pleased that they had identified a solution together” when “healthcare professionals listened” to their experiences.65x Skär and Söderberg (2018). In summary, quality communication builds trust, satisfaction and participation while insufficient communication generates conflicts. Strategic communication plans using tools like ODR can give patients time to be heard, build mutual understanding and jointly identify solutions. By focusing on healing communication through listening and relationship building, ODR provides a path to resolve disputes stemming from poor communication in healthcare.
Another important aspect to note is that sincere apologies and transparency around adverse events can help resolve conflicts.
Research by Gallagher, Waterman, Ebers, Fraser, and Levinson (2003) has shown that following an adverse event, patients want an apology, an explanation of what happened and someone to take responsibility … (Robbennolt, 2009).66x Ibid.
This is further supported by Potter,67x Potter, J. W. (2008). "Implementation of dispute resolution in refractive surgery," ADR Bulletin: Vol. 10: No. 7, Article 4. Available at: http://epublications.bond.edu.au/adr/vol10/iss7/4 who asserts that “a sincere expression of regret and complete assumption of responsibility is the best policy in every instance.” Potter delineates clear steps for an effective apology, including recognition, regret, responsibility, remedy and realignment. He contends that by proactively offering an apology instead of deflecting blame, disputes will be resolved much more effectively. As Potter68x Ibid. states,
The implementation of a process for expressing regret quickly and sincerely with complete disclosure became the most important step in the beginning of our dispute resolution and conflict management effort.
Additionally, research by Skär and Söderberg69x Skär and Söderberg (2018). found that patients were often dissatisfied with impersonal responses that simply forwarded complaints onward without a direct apology from the provider. This aligns with Potter’s70x Potter (2008). observation that “patients wanted to resolve their disputes with doctors playing a significant role in dispute resolution.” This raises important questions about how ODR systems could thoughtfully incorporate the relational elements that patients value in apologies and dispute resolution, while still providing efficient and timely conflict resolution. Perhaps direct contact options to connect patients to their doctor or nurse could help address these concerns by ensuring a human connection remains available, even within primarily automated ODR processes. Overall, research clearly indicates that transparent communication and heartfelt apologies from doctors facilitate conflict resolution, posing design implications for how to meaningfully incorporate relational elements into streamlined online dispute systems in healthcare.
In order to improve patient satisfaction and prevent litigation, healthcare organizations must prioritize communication training for providers and implement clear processes for handling patient grievances with empathy. Establishing robust communication strategies will build trust, empower patients and foster constructive doctor-patient relationships. However, ODR systems will need to be improved in order to effectively address these issues. As Bellucci et al.71x Bellucci et al. (2020). explain,
relatively simple algorithms for ODR in commercial disputes will not be sufficient for medical disputes that are characteristically more complex and emotional. Further, the involvement of human mediators in ODR will exacerbate cost constraints. This paves the way for the development of smart ODR systems that integrate artificial intelligence (AI) into ODR systems.72x Ibid.
Organizations need to adopt a patient-centred approach, as research shows that
When a healthcare organization adopts a patient-centered approach to handling complaints and preventing litigation due to mishandled healthcare communication, the quality of care can improve (McCormack & McCane, 2010).73x Skär and Söderberg (2018).
To truly provide patient-centred ODR, systems must be designed with empathy and emotional intelligence in mind. Advanced natural language processing and sentiment analysis could help AI-enabled ODR systems recognize the complexity and emotionality of medical disputes, potentially limiting the need for two-way human interaction. While developing the technological sophistication of ODR is crucial, this must be accompanied by comprehensive communication skills training for providers. With a dual focus on improving doctor-patient communication and enhancing the relational capabilities of ODR systems, healthcare organizations can foster constructive relationships built on understanding and trust.
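As a rough illustration of how sentiment analysis might route emotionally charged messages to a human rather than an automated negotiation step, consider the following sketch. It uses a tiny hand-written word list purely for demonstration; a production system would rely on validated language models and clinically informed thresholds.

```python
# Illustrative only: a toy lexicon standing in for a real sentiment model.
DISTRESS_TERMS = {"angry", "furious", "scared", "terrified", "ignored",
                  "humiliated", "devastated", "negligent"}


def route_message(message: str, threshold: int = 2) -> str:
    """Send highly emotional complaints to a human mediator."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    distress_score = len(words & DISTRESS_TERMS)
    if distress_score >= threshold:
        return "human_mediator"      # complex/emotional: needs a person
    return "automated_negotiation"   # routine: automated flow may suffice


print(route_message("I am furious and feel completely ignored by the clinic."))
# -> human_mediator
print(route_message("The statement shows a duplicate charge for my visit."))
# -> automated_negotiation
```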
-
9 Reason for Concern
Artificial intelligence (AI) has emerged as a transformative force in healthcare, promising improved patient outcomes through enhanced diagnosis and improved communication and recordkeeping. However, these benefits do not come without risks. The same interconnected systems that enable advanced AI also introduce vulnerabilities. As healthcare embraces innovative technologies, it must also grapple with emerging threats like cyber-attacks, biases in the data and in AI, and data breaches. Medical records contain personal information that is susceptible to misuse, and healthcare databases have become appealing targets for malicious actors, a matter of increasing concern among patients as a whole. In insurance, insurers use data like age, lifestyle and health to assess risk and set premiums. This can lead to discrimination if people are denied coverage or charged more based on innate characteristics like genetics, gender, ethnicity or family history. Some countries have acted to limit access to sensitive information. For example, “[i]n 2008, Congress passed the Genetic Information Nondiscrimination Act (GINA), which bars covered health insurers and employers from collecting and using genetic information.”74x Prince A. E. R. (2017). Insurance risk classification in an era of genomics: Is a rational discrimination policy rational? Nebraska Law Review, 96(3), 624-687. However, regulations vary worldwide. Some countries rely on industry self-regulation, with inconsistent protections. Discriminatory practices in insurance remain concerning. More comprehensive and consistent regulations are needed to prevent exclusion or unfair treatment based on uncontrollable traits. Proactive oversight and enforcement will help ensure fair access to insurance for all.
Recent statistics on breach incidents and records exposed paint an alarming picture of eroding patient privacy.
1,213 patients were surveyed who had seen a physician at least once in the previous 12 months. 95% said they were concerned that their medical records would be stolen or leaked online, 70% of whom had extreme or moderate concerns about healthcare data breaches.75x Alder, S. (2023, August 2). 95% of patients are worried about medical record breaches. HIPAA Journal. https://www.hipaajournal.com/95pc-patients-worried-medical-record-breaches/.
These concerns seem warranted.
In the first half of 2023, 339 data breaches of 500 or more records had been reported to the HHS’ Office for Civil Rights, and while that represents a year-over-year decline in data breach incidents, more than 41,450,000 healthcare records have been reported as breached in the first 6 months of the year – 10 million less than the number of breached records in all of 2022.76x Ibid.
Additionally, “[a]s per the HIPAA reports, 255.18 million people were affected from 3051 healthcare data breach incidents from 2010 to 2019.”77x Seh, A. H., Zarour, M., Alenezi, M., Sarkar, A. K., Agrawal, A., Kumar, R., & Khan, R. A. (2020). Healthcare data breaches: Insights and implications. Healthcare (Basel, Switzerland), 8(2), 133. https://doi.org/10.3390/healthcare8020133. Furthermore, “43.38% of health data was compromised from 2005 to 2019, the highest among all sectors.”78x Ibid.
In addition to compromising privacy, data breaches also impose a financial burden.
In the healthcare industry at present, the average cost of data breach is $6.45 million, up from $3.92 million in 2019. The average cost of a breached record [over all industries] is $150. But in the healthcare industry, the cost of each breached record was $429 in 2019.79x Ibid.
Though providers are working to improve security, more is needed, considering the incentives and abilities of bad actors. As AI ushers in a new era of enhanced care, healthcare systems must prioritize patient data protection and accuracy equally. The coming years will test whether the sector can harness technology’s benefits while safeguarding individuals’ most intimate medical information.
The danger of exposure of private healthcare information is increased when online dispute resolution is added to the picture. This is especially true if it is implemented as ‘AI-assisted ODR’. Not only must the online portion of the ODR system itself be ‘hardened’ against breach, but so must any assistive AI tool. Potential breaches of the online system are fairly well-understood, though always evolving. Potential breach scenarios involving AI-assistive technologies are less clear.
-
10 Data Accuracy
Electronic health records hold great promise to facilitate [efficiency and effective communication in] both medical practice and research, but they have limitations, says Nicholas Reed, an epidemiologist at the Johns Hopkins Disability Health Research Center. Example: Many people with hearing loss don’t recognize it or acknowledge it. “We know people aren’t getting their hearing checked and many doctors don’t really code for it,” Reed says. So if he relied solely on health record data, Reed wouldn’t have included many struggling with hearing loss in a recent study. Instead, Reed and colleagues used a mathematical model for a 2019 JAMA Otolaryngology study, showing that untreated hearing loss increased the likelihood of hospitalization and caused $22,434 per person in extra health care costs. “Not everything is medical in this world, and not everything is diagnosed,” Reed says.80x Arnold, C. (2022, April 15). How biased data and algorithms can harm health. Hopkins Bloomberg Public Health Magazine. https://magazine.jhsph.edu/2022/how-biased-data-and-algorithms-can-harm-health.
By relying only on coded data, researchers fail to capture the full picture of health conditions, especially those underdiagnosed or stigmatized. The example of hearing loss demonstrates how sole reliance on EHRs skews prevalence rates and health outcomes. The researchers’ use of mathematical modelling to compensate for this limitation is clever but also highlights the need for better primary data collection and coding practices. AI-assistive technologies operating in this context (separate and apart from online ODR) could be very helpful in improving current data collection and coding practices. Looking ahead, this issue extends far beyond hearing loss. Many chronic conditions, mental health challenges and neurodiverse traits are likely underrepresented in EHRs. Patients may be unaware of or hide their symptoms, and overburdened physicians miss opportunities to probe and record them.
[T]here is a serious and increasing risk that naive use of Big Data analytical techniques without a full understanding of the complexities and limitations of EHR data is resulting in biased or incorrect medical findings.81x Agniel, D., Kohane, I. S., & Weber, G. M. (2018, April 30). Biases in electronic health record data due to processes within the healthcare system: Retrospective Observational Study. The BMJ. https://www.bmj.com/content/361/bmj.k1479.
Additionally, Agniel et al.82x Ibid. have stated that,
An easily overlooked aspect of EHRs is that they are observational databases – the data reflect not only the health of the patients, but also patients’ interactions with the healthcare system. For example, the date associated with a code for diabetes is when the physician made the diagnosis, not when the patient first developed the disease. Furthermore, the billing code used for that office visit might be influenced more by reimbursement policies than the original reason for the visit. Similarly, a patient might have an elevated white blood cell count; however, it will never be known unless a physician orders the laboratory test. Hripcsak and Albers describe this as a healthcare process model, where EHR data must be viewed as an indirect measure of a patient’s true state due to the recording process.83x Ibid.
Technical solutions like natural language processing of clinical notes may help retrieve more insights from EHR narrative data. However, the root problem is the need for comprehensive, patient-centred data collection and coding.
Distance matters too. Dozens of studies have shown that patients with cancer who live far from treatment centers are screened less frequently, more likely to receive surgery than chemotherapy, and have worse outcomes. Practical issues, such as how long it takes a clinician to enter a laboratory test order into an EHR, the availability of certain tests in evenings or on weekends, and the level of automation in laboratories, also affect the timing of EHR data.84x Ibid.
Medical research could be vastly improved by rethinking how patient histories, experiences and concerns are elicited and documented during clinical encounters. More widespread use of patient-reported outcome measures could also make EHR data more complete and research-ready. Furthermore, Agniel et al.85x Ibid. state,
The effects of healthcare processes on EHR data should not be viewed as data quality problems or noise. This incorrectly suggests that these effects have no information value. In fact, they generate a signal, which can be used to identify subpopulations of patients and improve predictive models. This is especially true for laboratory tests, since they provide insight into a clinician’s decision making process. For example, through analysis of EHR data, Hripcsak and Albers found the following: “[P]atients with kidney failure are more likely to have a creatinine measurement between 10 pm and 6 am than healthier patients; the timing of glucose measurements can be used to stratify patients into health states; and laboratory tests are ordered more frequently for sick patients.”88x Agniel et al. (2018).
These examples illustrate how analysing patterns in EHR data reveals different testing behaviours for sick versus healthy patients. This shows the power of mining EHR metadata – like timing or ordering of tests – to segment populations and build better predictive models. The effects of real-world healthcare processes on EHRs should be seen as a rich source of information, not just errors. With thoughtful analysis, signals emerge that offer new clinical insights. However, this requires recognizing that EHR data encapsulate biology and human behaviours/systems. By embracing this reality, researchers can unlock discoveries and improvements concealed within the data. This underscores the need to shift perspective and leverage healthcare processes encoded within EHR data to advance knowledge and practice.
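To make the ‘healthcare process as signal’ point concrete, the sketch below derives one such feature: the share of a patient’s laboratory draws that fall between 10 pm and 6 am, echoing the Hripcsak and Albers observation quoted above. The data and the feature itself are invented for illustration; the point is simply that timestamps, not just values, can feed a predictive model.

```python
from datetime import datetime
from typing import List


def overnight_fraction(timestamps: List[datetime]) -> float:
    """Fraction of lab draws taken between 22:00 and 06:00."""
    if not timestamps:
        return 0.0
    overnight = sum(1 for t in timestamps if t.hour >= 22 or t.hour < 6)
    return overnight / len(timestamps)


# Invented example: a patient with several overnight creatinine measurements.
draws = [
    datetime(2023, 5, 1, 23, 15),
    datetime(2023, 5, 2, 3, 40),
    datetime(2023, 5, 2, 14, 5),
    datetime(2023, 5, 3, 1, 20),
]
feature = overnight_fraction(draws)
print(f"overnight_fraction = {feature:.2f}")  # 0.75 - usable as a model input
```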
These questions on the accuracy and/or timeliness of the data are likely the root cause of many disputes and conflicts that arise in the context of healthcare. When those accuracy issues are addressed, healthcare conflicts are likely to decline. And when a conflict does arise and needs to be resolved, the ODR system employed must ‘be aware’ that a data accuracy or timeliness issue may be involved.
-
11 Bias in AI
Recent examples highlight how AI systems can perpetuate harmful biases if not carefully designed and evaluated. AI models in healthcare are especially concerning due to their inherent opacity. Unlike traditional statistical tools, complex neural networks make it difficult to understand how various inputs relate to outputs. This black box effect compounds the existing challenge of identifying and mitigating unintended biases. As powerful new AI tools are developed for healthcare, extra diligence is required to avoid amplifying inequities. Murdoch89x Murdoch, B. (2021, September 15). Privacy and artificial intelligence: Challenges for protecting health information in a new era – BMC medical ethics. BioMed Central. https://bmcmedethics.biomedcentral.com/articles/10.1186/s12910-021-00687-3. states,
AI [has] several unique characteristics compared with traditional health technologies. Notably, they can be prone to certain types of errors and biases, and sometimes cannot easily or even feasibly be supervised by human medical professionals. The latter is because of the ‘black box’ problem, whereby learning algorithms’ methods and ‘reasoning’ used for reaching their conclusions can be partially or entirely opaque to human observers. This opacity may also apply to how health and personal information is used and manipulated if appropriate safeguards are not in place. Notably, in response to this problem, many researchers have been developing interpretable forms of AI that will be easier to integrate into medical care. Because of the unique features of AI, the regulatory systems used for approval and ongoing oversight will also need to be unique.90x Ibid.
Well-intentioned scientists may incorporate problematic assumptions that systematically disadvantage certain groups. Without transparency into model calculations, unfair biases can go undetected. According to Arnold,91x Arnold (2022).
Decisions made by researchers can explain how an equation used to predict kidney function from common laboratory values led to unintentionally racist outcomes. Scientists believed that people of African descent had more muscle mass than those with European ancestry. Since muscle mass is a key variable in estimating kidney function from creatinine levels in the blood, scientists introduced a ‘race corrector’ into the equation. The calculations systematically overestimated renal performance in Black patients, leading to reduced access to lifesaving dialysis and kidney transplants. In a November 2021 New England Journal of Medicine study, a team of researchers from the Chronic Kidney Disease Epidemiology Consortium developed a newer, more accurate equation without the need to consider race.92x Ibid.
Thoughtful oversight throughout the development and deployment process can help promote more equitable outcomes. Still, inherent trade-offs remain between model interpretability, accuracy and fairness.
What could result is a new epidemic of misdiagnosis and missed treatments that could further widen health disparities around the world. Ferryman says that this study makes clear the brave new world of AI and big data hasn’t miraculously cured the problem of bias….93x Ibid.
On-going research to improve model transparency and mitigate bias continues to be critically important. Without careful attention, AI technologies may further reduce healthcare access and worsen quality disparities. This would, of course, lead to increased conflict in the healthcare context.
-
12 Access to Data
As noted above, the increasing reliance on AI and big data in healthcare raises significant privacy concerns. Large technology companies like Google, Microsoft and IBM now have access to vast amounts of sensitive patient data through partnerships with healthcare providers and research institutions. While proponents argue this data-sharing fuels innovation, critics point out that patients often lack meaningful control over how their personal information gets used.
For example, DeepMind, owned by Alphabet Inc. (hereinafter referred to as Google), partnered with the Royal Free London NHS Foundation Trust in 2016 to use machine learning to assist in the management of acute kidney injury. Critics noted that patients were not afforded agency over the use of their information, nor were privacy impacts adequately discussed. A senior advisor with England’s Department of Health said the patient info was obtained on an ‘inappropriate legal basis’. Further controversy arose after Google subsequently took direct control over DeepMind’s app, effectively transferring control over stored patient data from the United Kingdom to the United States. The ability to essentially ‘annex’ mass quantities of private patient data to another jurisdiction is a new reality of big data and one at more risk of occurring when implementing commercial healthcare AI. The concentration of technological innovation and knowledge in big tech companies creates a power imbalance where public institutions can become more dependent and less an equal and willing partner in health tech implementation.94x Murdoch (2021).
There are various methods healthcare organizations can use to protect sensitive patient data, including (1) encryption, (2) access controls, (3) data minimization, (4) segmentation and (5) auditing. First, encryption encodes data so only authorized parties can read it. This protects patient data in transit and storage. Second, access controls like passwords and multifactor authentication limit data access to appropriate users. Third, data minimization means collecting only the minimum amount of data necessary, which reduces the risk of exposure. Fourth, segmentation logically separates sensitive data from other data and applies stricter controls to it – for example, storing payment card data separately from other account information. Fifth, comprehensive activity auditing tracks access and changes to patient records. This enables monitoring for suspicious activity. While each method alone can help protect data, the core idea is to use a layered approach with multiple controls like the ones above to protect sensitive user data according to its level of confidentiality and risk. Without a layered approach, the risk of a data breach likely increases. For example, powerful algorithms threaten privacy by enabling re-identification of ‘anonymized’ data. Studies show that even scrubbed records can be matched to individuals with alarming accuracy, further emphasizing the need for multiple methods when creating a system to protect patient data.
A number of recent studies have highlighted how emerging computational strategies can be used to identify individuals in health data repositories managed by public or private institutions. And this is true even if the information has been anonymized and scrubbed of all identifiers. A study by Na et al., for example, found that an algorithm could be used to re-identify 85.6% of adults and 69.8% of children in a physical activity cohort study, “despite data aggregation and removal of protected health information.” A 2018 study concluded that data collected by ancestry companies could be used to identify approximately 60% of Americans of European ancestry and that, in the near future, the percentage is likely to increase substantially. Furthermore, a 2019 study successfully used a ‘linkage attack framework’ – that is, an algorithm aimed at re-identifying anonymous health information – that can link online health data to real world people, demonstrating “the vulnerability of existing online health data.” And these are just a few examples of the developing approaches that have raised questions about the security of health information framed as being confidential. Indeed, it has been suggested that today’s “techniques of re-identification effectively nullify scrubbing and compromise privacy”.95x Ibid.
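Returning to the layered approach outlined before the quotation above, the sketch below combines three of those layers – role-based access control, data minimization and audit logging – around a single record lookup. It is a schematic example with invented roles and fields; a real system would add encryption at rest and in transit using vetted cryptographic libraries, along with the segmentation and monitoring discussed earlier.

```python
from datetime import datetime
from typing import Dict, List

AUDIT_LOG: List[str] = []

# Data minimization: each role sees only the fields it needs (illustrative roles).
ROLE_FIELDS = {
    "clinician": {"name", "diagnoses", "medications"},
    "billing":   {"name", "insurance_id"},
}


def read_record(record: Dict[str, str], user: str, role: str) -> Dict[str, str]:
    """Return a minimized view of the record and log the access."""
    allowed = ROLE_FIELDS.get(role)
    if allowed is None:
        AUDIT_LOG.append(f"{datetime.utcnow().isoformat()} DENIED {user} role={role}")
        raise PermissionError(f"Role '{role}' may not read patient records")
    AUDIT_LOG.append(f"{datetime.utcnow().isoformat()} READ by {user} role={role} "
                     f"fields={sorted(allowed & record.keys())}")
    return {k: v for k, v in record.items() if k in allowed}


record = {"name": "Jane Doe", "insurance_id": "INS-881",
          "diagnoses": "type 2 diabetes", "medications": "metformin"}
print(read_record(record, user="dr_lee", role="clinician"))
print(read_record(record, user="acct_04", role="billing"))
print(AUDIT_LOG)
```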
As AI capabilities rapidly advance, regulation struggles to keep pace.
We are currently in a situation in which regulation and oversight risk falling behind the technologies they govern. Given we are now dealing with technologies that can improve themselves at a rapid pace, we risk falling very behind, very quickly.96x Ibid.
We must thoughtfully balance innovation against core ethical values like consent, transparency and accountability. Patients deserve a voice in determining what happens to their most intimate information.
-
13 Conclusion
With the rapid proliferation of health technologies and data-driven systems, disputes over privacy breaches, biased algorithms and inaccurate records are inevitable. As healthcare rapidly digitizes, the lack of accessible and responsive avenues for patients to resolve emerging disputes poses a significant obstacle to equitable access to justice. Traditional legal options prove ineffective for individuals to seek timely redress. Costly litigation relies on legal expertise many patients lack, while regulatory bodies focus on systemic rather than individual harms. This leaves patients largely powerless over rights violations that directly impact their lives. ODR platforms embedded within healthcare’s digital infrastructure may provide a nimble solution to close this justice gap. Thoughtfully designed ODR systems can give patients a direct voice to contest harm through transparent, explanatory processes adapted to medical contexts. Critical features like simplicity, transparency and explicability allow patients to clearly understand ODR stages and outcomes. By implementing ODR within patient portals and health apps, disputes can be efficiently addressed as they arise. Communication tools foster dialogue with providers to build mutual understanding, while AI assistance automates simple negotiations. With appropriate accountability safeguards and human oversight, optimized ODR systems can resolve countless minor infringements that evade regulatory purview. Most critically, meeting patients through interfaces they already use removes awareness, cost and convenience barriers that traditionally obstruct legal recourse. Integrating intuitive ODR mechanisms throughout healthcare’s digital platforms can provide patients unprecedented self-advocacy over their rights, data and care. This seamless accessibility realizes a patient-centred vision of justice. With appropriate safeguards, streamlined ODR systems may offer the speed, accessibility and empowerment patients need to resolve healthcare disputes.
Notes
- * This work is the result of a research project, “Implementing an Online Dispute Resolution System (ODR) for Healthcare Organizations”, under the auspices of Southern Methodist University (SMU) and the Open University of Catalonia. The article presents the core findings of a study conducted under the direction and supervision of Aura Esther Vilalta Nicuesa, Ph.D., Full Professor of Civil Law at the Open University of Catalonia, during July and August 2023.
-
2 https://workspace.google.com/blog/product-announcements/generative-ai.
-
3 Katsh, E., et al. (2011, Summer). Is there an app for that? Electronic health records (EHRs) and a new environment of conflict prevention and resolution. Law and Contemporary Problems, 74, 31-56.
-
4 UNCITRAL Technical Notes on Online Dispute Resolution – United Nations. (2016). https://uncitral.un.org/sites/uncitral.un.org/files/media-documents/uncitral/en/v1700382_english_technical_notes_on_odr.pdf.
-
5 Katsh et al. (2011).
-
6 HIMSS. (2021, August 25). Interoperability in healthcare. https://www.himss.org/resources/interoperability-healthcare.
-
7 Katsh et al. (2011).
-
8 Ibid.
-
9 Ibid.
-
10 Washington, L., Katsh, E., & Sondheimer, N. (2009, November). Dispute resolution: Planning for disputed information in EHRs and PHRs. Journal of AHIMA, 80(11), 25-30.
-
11 Katsh et al. (2011).
-
12 Abdel Wahab, M., Rainey, D., & Katsh, E. (2012). Online dispute resolution: Theory and practice. Eleven International Publishing.
-
13 Davis, A., & Salter, S. (2021, October). The Shannon Salter interview – ODR, Civil Resolution Tribunal, access to justice, innovation [Video]. https://youtu.be/RNqtsn5-yPw?si=qEWw-yS1d9S5D_1j.
-
14 University of Missouri Libraries. (2020). Library guides: Online dispute resolution: Companies implementing ODR. https://libraryguides.missouri.edu/c.php?g=557240&p=3832247.
-
15 Ury, W., et al. (1988). Getting disputes resolved: Designing systems to cut the costs of conflict.
-
16 Rabinovich-Einy, O., & Katsh, E. (2012). Technology and the future of dispute systems design. Harvard Negotiation Law Review, 17(Spring), 151-199, 155.
-
17 Ibid., p. 155.
-
18 Ozair, F. F., Jamshed, N., Sharma, A., & Aggarwal, P. (2015). Ethical issues in electronic health records: A general overview. Perspectives in Clinical Research, 6(2), 73-76. https://doi.org/10.4103/2229-3485.153997.
-
19 Rabinovich-Einy, O., & Katsh, E. (2012). Lessons from online dispute resolution for dispute systems design. In Katsh et al. (Eds.), Online dispute resolution: Theory and practice (pp. 39-60). Eleven International Publishing.
-
20 Rabinovich-Einy and Katsh (2012), Technology and the future of dispute systems design, p. 153.
-
21 Rabinovich-Einy and Katsh (2012), Technology and the future of dispute systems design, p. 156.
-
22 Ibid.
-
23 Ozair et al. (2015).
-
24 Schmitz, A. J., & Zeleznikow, J. (2021). Intelligent legal tech to empower self-represented litigants. Science and Technology Law Review. https://journals.library.columbia.edu/index.php/stlr/article/view/9391/4800.
-
25 Ibid.
-
26 Ibid.
-
27 Matias, Y., & Corrado, G. (2023, March 14). Our latest health AI research updates. Google. https://blog.google/technology/health/ai-llm-medpalm-research-thecheckup/.
-
28 Gupta, A., & Waldron, A. (2023, April 13). Sharing Google’s Med-PaLM 2 medical large language model, or LLM. Google Cloud Blog. https://cloud.google.com/blog/topics/healthcare-life-sciences/sharing-google-med-palm-2-medical-large-language-model.
-
29 Google Cloud Press Release. (2023, April 13). Google Cloud unveils new AI-enabled claims acceleration suite to streamline health insurance prior authorization and claims processing, helping experts make faster, more informed decisions. Google Cloud Press Corner. https://www.googlecloudpresscorner.com/2023-04-13-Google-Cloud-Unveils-New-AI-enabled-Claims-Acceleration-Suite-to-Streamline-Health-Insurance-Prior-Authorization-and-Claims-Processing,-Helping-Experts-Make-Faster,-More-Informed-Decisions.
-
30 Ibid.
-
31 Ibid.
-
32 Gulshan, V., Peng, L., Coram, M., et al. (2016). Development and validation of a deep learning algorithm for detection of diabetic retinopathy in retinal fundus photographs. JAMA, 316(22), 2402-2410. https://doi.org/10.1001/jama.2016.17216.
-
34 Landi, H. (2023, May 25). Epic is going all in on generative AI in healthcare. Here’s why health systems are eager to test-drive it. Fierce Healthcare. https://www.fiercehealthcare.com/health-tech/epic-moves-forward-bring-generative-ai-healthcare-heres-why-handful-health-systems-are.
-
35 Ibid.
-
36 Katsh et al. (2011).
-
37 Bellucci, E., Stranieri, A., & Venkatraman, S. (2020). Towards smart online dispute resolution for medical disputes. In Proceedings of Australasian Computer Science Week (ACSW 2020), February 4-6, 2020, Melbourne, VIC, Australia. ACM. https://doi.org/10.1145/3373017.3373059.
-
38 Ibid.
-
40 Canali, S., Schiaffonati, V., & Aliverti, A. (2022). Challenges and recommendations for wearable devices in digital health: Data quality, interoperability, health equity, fairness. PLOS Digital Health, 1(10), e0000104. https://doi.org/10.1371/journal.pdig.0000104.
-
41 Ibid.
-
42 Ibid.
-
43 Robeznieks, A. (2019, March 22). 4 mistakes your patients should avoid with wearables. American Medical Association. https://www.ama-assn.org/practice-management/digital/4-mistakes-your-patients-should-avoid-wearables.
-
44 https://www.apple.com/newsroom/2023/06/ios-17-makes-iphone-more-personal-and-intuitive/.
-
45 Vinez, K. (2017). The admissibility of data collected from wearable devices. Stetson Journal of Advocacy and the Law, 4, 1. https://www2.stetson.edu/advocacy-journal/the-admissibility-of-data-collected-from-wearable-devices/.
-
46 Sui, A., Sui, W., Liu, S., & Rhodes, R. (2023). Ethical considerations for the use of consumer wearables in health research. Digital Health, 9. https://doi.org/10.1177/20552076231153740.
-
47 Ibid.
-
48 Herron, M. (2022, January 27). Wearable medical devices: Current challenges and emerging issues. Lexology. https://www.lexology.com/library/detail.aspx?g=8d5d7e3f-366b-4c17-81b2-a8bd7381ff83.
-
49 Ibid.
-
50 Ibid.
-
51 Ibid.
-
52 Ibid.
-
53 Montini, T., Noble, A. A., & Stelfox, H. T. (2008). Content analysis of patient complaints. International Journal for Quality in Health Care, 20(6), 412-420. https://doi.org/10.1093/intqhc/mzn041.
-
54 “This diminished confidence is affected by healthcare providers’ lack of supportive patient-oriented communication skills as well as by the fact that the patients and healthcare professionals have different goals, needs and expectations related to the healthcare encounters (Jangland, Gunningberg, & Carlsson, 2009)” (Skär, L., & Söderberg, S. (2018). Patients’ complaints regarding healthcare encounters and communication. Nursing Open, 5(2), 224-232. https://doi.org/10.1002/nop2.132).
-
55 Skär and Söderberg (2018).
-
56 Ibid.
-
57 Ebner, N. (2021). The human touch in ODR: Trust, empathy and social intuition in online negotiation and mediation. In Rainey, D., Katsh, E., & Abdel Wahab, M. (Eds.), Online dispute resolution: Theory and practice (pp. 73-136, 2nd ed.). Eleven International Publishing; Skär and Söderberg (2018).
-
58 Skär and Söderberg (2018).
-
59 Ibid.
-
60 Ibid.
-
61 Ibid.
-
62 Ibid.
-
63 Ibid.
-
64 Nygren Zotterman, A., Skär, L., Olsson, M., & Söderberg, S. (2015). District nurses’ views on quality of primary healthcare encounters. Scandinavian Journal of Caring Sciences, 29(3), 418. https://doi.org/10.1111/scs.12146; Skär and Söderberg (2018).
-
65 Skär and Söderberg (2018).
-
66 Ibid.
-
67 Potter, J. W. (2008). Implementation of dispute resolution in refractive surgery. ADR Bulletin, 10(7), Article 4. http://epublications.bond.edu.au/adr/vol10/iss7/4.
-
68 Skär and Söderberg (2018).
-
69 Potter (2008).
-
70 Ibid.
-
71 Bellucci et al. (2020).
-
72 Ibid.
-
73 Skär and Söderberg (2018).
-
74 Prince, A. E. R. (2017). Insurance risk classification in an era of genomics: Is a rational discrimination policy rational? Nebraska Law Review, 96(3), 624-687.
-
75 Alder, S. (2023, August 2). 95% of patients are worried about medical record breaches. HIPAA Journal. https://www.hipaajournal.com/95pc-patients-worried-medical-record-breaches/.
-
76 Ibid.
-
77 Seh, A. H., Zarour, M., Alenezi, M., Sarkar, A. K., Agrawal, A., Kumar, R., & Khan, R. A. (2020). Healthcare data breaches: Insights and implications. Healthcare (Basel, Switzerland), 8(2), 133. https://doi.org/10.3390/healthcare8020133.
-
78 Ibid.
-
79 Ibid.
-
80 Arnold, C. (2022, April 15). How biased data and algorithms can harm health. Hopkins Bloomberg Public Health Magazine. https://magazine.jhsph.edu/2022/how-biased-data-and-algorithms-can-harm-health.
-
81 Agniel, D., Kohane, I. S., & Weber, G. M. (2018, April 30). Biases in electronic health record data due to processes within the healthcare system: Retrospective observational study. The BMJ. https://www.bmj.com/content/361/bmj.k1479.
-
82 Ibid.
-
83 Ibid.
-
84 Ibid.
-
85 Ibid.
-
86 10 p.m.
-
87 6 a.m.
-
88 Agniel et al. (2018).
-
89 Murdoch, B. (2021, September 15). Privacy and artificial intelligence: Challenges for protecting health information in a new era. BMC Medical Ethics. https://bmcmedethics.biomedcentral.com/articles/10.1186/s12910-021-00687-3.
-
90 Ibid.
-
91 Arnold (2022).
-
92 Ibid.
-
93 Ibid.
-
94 Murdoch (2021).
-
95 Ibid.
-
96 Ibid.