We Will Seek Justice For Your Injuries

What is “black box medicine?”

On Behalf of | Oct 6, 2020 | Medical Malpractice

Advancing technology has made a huge impact on the field of medicine. In addition to robotic surgical tools, artificial intelligence now plays a role in diagnosing ailments through what is known as “black box medicine.” Patient data is run through a computer algorithm that sorts through symptoms and medical history to determine the best course of treatment.

While this is thought to be a more efficient method of diagnosis, it is not impervious to mistakes. There are also concerns about the legal implications of such mistakes when they cause injuries. If you are misdiagnosed by AI, who is actually to blame?

Legal liability in AI misdiagnosis

It might seem like the company that created the AI responsible for the misdiagnosis would bear the most legal liability. However, most patients cannot sue a business for medical malpractice, since businesses do not actually practice medicine. Instead, the patient could file a product liability claim against the business for the harm caused.

Standard of care still plays a role

While AI helps health care workers lower the risk of human error, humans are still responsible for interpreting the data provided by the algorithm. If health care staff misread the results of a test and misdiagnose a patient, the medical worker can be held accountable for any damages if it is proven the worker deviated from the standard of care. The standard of care compares the conduct of one medical professional to that of a reasonable medical professional to determine whether their actions were sound. If it is determined that another reasonable doctor would have read the results differently, a medical malpractice claim may be possible.

Privacy is another concern

When building a medical AI, patient information can be fed to the algorithm to train its decision-making process. This has raised concerns about privacy and patient confidentiality, both of which are covered under HIPAA. If information used in AI programming still has patient identifiers attached, you could claim your rights have been violated. However, when patient identifiers have been removed from data on medical conditions or symptoms, the data can be lawfully used.
