By Stacey Kusterbeck
Could an AI tool have prevented a misdiagnosis that harmed an ED patient? This question is likely to come up in future malpractice litigation, according to healthcare attorneys and patient safety experts interviewed by ED Management.
“As AI technology expands and improves, plaintiffs’ attorneys may argue that failure to use the technology is a deviation from the standard of care,” says Bill Bower, JD, leader of the Healthcare Practice at Gallagher Bassett.
In the courtroom, the admissibility of such arguments will be based on expert testimony and the determination of whether use of AI tools is generally accepted in the medical community. “AI, in clinical practice, has not quite reached the level of evolution where it is being used in the courthouse as setting the standard of care,” explains Bower.
For a plaintiff’s expert to argue that an AI tool is the standard of care in the ED, there are hurdles to overcome. The expert would have to lay a foundation for such testimony by demonstrating that the majority of institutions in the United States employ such a tool and that its use is generally recognized in the healthcare industry as standard practice. “The defense would be allowed to challenge such testimony prior to its introduction by establishing with the court that the use of the AI tool is not standard and has not been widely accepted by the medical community,” says Bower.
AI tools certainly can provide decision support and assistance to ED clinicians, but that does not mean they constitute the legal standard of care. “While AI tools are becoming more accepted in the administrative business of healthcare, AI is in its relative infancy in care delivery,” explains Bower.
In the ED setting, AI tools could be considered the standard of care for specific medical conditions (such as sepsis) only after it has been convincingly demonstrated that they outperform usual ED care. “While I expect they will eventually be better, the results so far have been mixed, and it is not clear when that will happen,” says David W. Bates, MD, MS, professor of medicine at Harvard Medical School and director of the Center for Patient Safety Research and Practice at Brigham and Women’s Hospital.
The legal standard of care an ED provider is held to is continuously evolving, as medical knowledge and practices change over time. “The use of AI tools in healthcare is rapidly advancing, with the potential to become a standard part of patient evaluation in the future,” says Hala Helm, JD, strategic healthcare risk advisor for Marsh, a global insurance broker and risk advisory firm.
The timeline for when this might happen is difficult to predict. “It will depend on several factors, including regulatory approval, clinical validation, and acceptance by healthcare professionals and patients,” says Helm.
For AI tools to be considered part of the standard of care, there must be substantial clinical trial and real-world evidence demonstrating efficacy, safety, and cost-effectiveness in improving patient outcomes, according to Jade Davis, JD, an attorney at Hall Booth Smith. Widespread adoption of AI as a standard of care also requires the necessary technological infrastructure, adds Davis: EDs must have the appropriate hardware, software, and support systems for data management and security. The AI tools also must be integrated into ED clinical practice guidelines and protocols. “This involves endorsement by professional societies and healthcare organizations, which play a crucial role in setting the standards of care by recommending practices based on evidence,” says Davis.