
Artificial Intelligence in Veterinary Medicine Raises Ethical Challenges

Tracey Peake
Jan 14, 2023

Photo by Lucie Helešicová on Unsplash 

Use of artificial intelligence (AI) is increasing in the field of veterinary medicine, but veterinary experts caution that the rush to embrace the technology raises some ethical considerations.

“A major difference between veterinary and human medicine is that veterinarians have the ability to euthanize patients – which could be for a variety of medical and financial reasons – so the stakes of diagnoses provided by AI algorithms are very high,” says Eli Cohen, associate clinical professor of radiology at NC State’s College of Veterinary Medicine. “Human AI products have to be validated prior to coming to market, but currently there is no regulatory oversight for veterinary AI products.”

In a review for Veterinary Radiology & Ultrasound, Cohen discusses the ethical and legal questions raised by veterinary AI products currently in use. He also highlights key differences between veterinary AI and AI used by human medical doctors.

AI is currently marketed to veterinarians for radiology and imaging, largely because there aren’t enough veterinary radiologists in practice to meet demand. However, Cohen points out that AI image analysis is not the same as a trained radiologist interpreting images in light of an animal’s medical history and unique situation. While AI may accurately identify some conditions on an X-ray, users need to understand its potential limitations. For example, the AI may not be able to identify every possible condition, and it may not accurately discriminate between conditions that look similar on X-rays but have different treatment courses.

Currently, the FDA does not regulate AI in veterinary products the way that it does in human medicine. Veterinary products can come to market with no oversight beyond that provided by the AI developer and/or company.

“AI and how it works is often a black box, meaning even the developer doesn’t know how it’s reaching decisions or diagnoses,” Cohen says. “Couple that with a lack of transparency from companies about how the AI was trained and validated, and you’re asking veterinarians to use a diagnostic tool with no way to appraise whether or not it is accurate.

“Since veterinarians often get a single visit to diagnose and treat a patient and don’t always get follow-up, AI could be providing faulty or incomplete diagnoses and a veterinarian would have limited ability to identify that, unless the case is reviewed or a severe outcome occurs,” Cohen says.

“AI is being marketed as a replacement for, or as having similar value to, a radiologist’s interpretation, because there is a market gap. The best use of AI going forward, and certainly in this initial phase of deployment, is with what is called a radiologist in the loop, where AI is used in conjunction with a radiologist, not in lieu of one,” Cohen says. “This is the most ethical and defensible way to employ this emerging technology: leveraging it to get more veterinarians and pets access to radiologist consults, but most importantly to have domain experts troubleshooting the AI and preventing adverse outcomes and patient harm.”
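In software terms, a “radiologist in the loop” arrangement can be pictured as a triage step: the AI’s output only prioritizes a radiologist’s worklist and never triggers clinical action on its own. The short sketch below is purely illustrative and assumes hypothetical names (the AIFinding class, the confidence threshold, and the triage function are not part of any real veterinary AI product).

from dataclasses import dataclass
from typing import List

@dataclass
class AIFinding:
    label: str          # e.g., "possible pulmonary nodule"
    confidence: float   # model-reported confidence, 0.0 to 1.0

def triage(findings: List[AIFinding]) -> str:
    """Route every AI-assisted case to a radiologist before any clinical action.

    The AI read is treated as a preliminary screen that orders the worklist;
    it never substitutes for the radiologist's interpretation.
    """
    if any(f.confidence >= 0.8 for f in findings):
        return "expedited radiologist review"
    return "standard radiologist review"

# Example: the AI flags a finding, so the case is queued for expedited review
# rather than acted on directly.
print(triage([AIFinding("possible pulmonary nodule", 0.86)]))

The key property of this sketch is that every branch ends in radiologist review; there is no code path in which the AI’s output alone determines treatment.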

Cohen recommends that veterinary experts partner with AI developers to ensure the quality of the data sets used to train the algorithm, and that third-party validation testing be done before AI tools are released to the public.

“Nearly everything a veterinarian could diagnose on radiographs has the potential to be medium-to-high risk, meaning it can lead to changes in medical treatment, surgery, or euthanasia, either due to the clinical diagnosis or client financial constraints,” Cohen says. “That risk level is the threshold the FDA uses in human medicine to determine whether there should be a radiologist in the loop. We would be wise as a profession to adopt a similar model.

“AI is a powerful tool and will change how medicine is practiced, but the best practice going forward will be using it alongside radiologists to improve access to and quality of patient care, as opposed to using it to replace those consultations.”

Publication: Eli B. Cohen, et al., First, do no harm. Ethical and legal issues of artificial intelligence and machine learning in veterinary radiology and radiation oncology, Veterinary Radiology & Ultrasound (2022). DOI: 10.1111/vru.13171.

Original Story Source: North Carolina State University

