The Dangers of Artificial Intelligence

Hidden in the United States' 2023 federal spending bill, among major changes to Medicare payments to doctors and post-pandemic Medicaid, lies a little-noticed change with big implications: a mandate to protect internet-connected medical devices from hacks and ransomware attacks. The law, which went into effect earlier this year, explicitly states that companies cannot sell their connected medical devices without first showing the Food and Drug Administration a solid cybersecurity plan. It also gives the FDA $5 million to see a higher security standard through. Historically, the agency has lacked both the resources to keep up with rapidly evolving security threats and the authority to force device makers to comply with its draft guidelines.

There are also legal dangers for healthcare providers. If a clinician relies on an AI-assisted support prompt without taking all other clinical information into consideration, that could pose a risk. Physicians, clinicians and nurses need to understand how artificial intelligence works and, most importantly, understand that they are still accountable for making the final decision. Harvey Castro, MD and ChatGPT expert, says, “Acting on ChatGPT’s response without vetting the information places doctors at serious risk of a lawsuit.” He continues, “Sometimes the response is half true and half false… Physicians really have to make sure they are vetting the information provided.”

One thing is certain: the media will provide extensive coverage of AI-related medical adverse events. This will be similar to the media coverage of Tesla’s self-driving cars when systems fail and accidents happen.

The flip side, however, is that in the future, healthcare providers may face risks if they have not consulted AI modalities. “AI could even, in some modalities, become the standard of care,” says Saurabh Jha, MD, a radiologist at the University of Pennsylvania.

There is also the risk, and cost, of an AI-generated response recommending unnecessary procedures, or failing to recommend one that is normally part of the standard of care.

While no AI lawsuits have been filed to date, experts believe they will appear soon. “At some point, a provider will make a decision that is contrary to what the AI recommended. The AI may be wrong, or the provider may be wrong. Either way, the provider will neglect to document their clinical reasoning, a patient will be harmed, and we will have the first AI claims,” said Sue Boisvert, senior patient safety risk manager at The Doctors Company, an American medical liability insurer.

Then there is the issue of ChatGPT hallucinations: responses generated by AI that sound valid but are actually incorrect, or even unrelated to the correct context.

For more articles in the series on Artificial Intelligence, go to:

2Ascribe Inc. is a medical and dental transcription services agency located in Toronto, Ontario Canada, providing medical transcription services to physicians, specialists (including psychiatry, pain and IMEs), dentists, dental specialties, clinics and other healthcare providers across Canada. Our medical and dental transcriptionists take pride in the quality of your transcribed documents. WEBshuttle is our client interface portal for document management. 2Ascribe continues to implement and develop technology to assist and improve the transcription process for physicians, dentists and other healthcare providers, including AUTOfax. AUTOfax works within WEBshuttle to automatically send faxes to referring physicians and dentists when a document is e-signed by the healthcare professional. As a service to our clients and the healthcare industry, 2Ascribe offers articles of interest to physicians, dentists and other healthcare professionals, medical transcriptionists, dental transcriptionists and office staff, as well as of general interest. Additional articles may be found at  For more information on Canadian transcription services, dental transcription, medical transcription work or dictation options, please contact us at
