ChatGPT (or any Artificial Intelligence bot) and Your Medical Office

If you have computers in your office, you’re already encountering, and maybe unknowingly embracing, artificial intelligence (AI). For example, as this article is being written, MS Word is actively suggesting what the next words will be. Most search engines have now incorporated some version of AI to improve the quality of their search results.

Most of us have heard of ChatGPT (GPT stands for Generative Pre-trained Transformer), and many have tried it out. At 2Ascribe, we’ve already had it write several articles for our website that will be posted soon (with some editing, of course). One of the challenges with using these large language models (LLMs) in the workplace is knowing what information your employees are sharing with these AI tools.

Case in point: one executive uploaded his company’s strategic plan for the next five years and asked ChatGPT to create a PowerPoint presentation for him. Needless to say, all that information is now in the public domain and available to his competitors.

If employees are asking it to research something for them, a medical condition or treatment for example, are they sharing any information that is patient-identifiable, personal or proprietary to your medical practice? Are you asking ChatGPT to write letters to insurance companies about specific patients and their medical conditions? To create sick notes or employee physical reports for employers?

Because once that information is out there, it’s in the public domain. Most jurisdictions have privacy legislation protecting patient-identifiable information, and breaches of those laws (federal, provincial and state) come with consequences.

It’s important to understand that AI is not always right. Answers are created to “feel correct” to us, as if we’re talking to another person. AI can sound very sure and authoritative, but that does not mean the answers it provides are right, or even the best answers to your questions. If you’re not sure about an answer, do some research to make sure it’s at least reasonable, and check some of your own references.

Educate your employees about the challenges, and even dangers, of providing any patient information or proprietary information to an AI bot. For example, if you write a scientific paper to be published and ask ChatGPT to create the abstract, your paper may be found online before it’s published.

Many prominent figures in AI today are calling for a moratorium, and for a careful rethinking of how we use AI in our society, in our lives and in our human interactions. More on that in another post.


For more articles in the series on Artificial Intelligence, go to:

2Ascribe Inc. is a medical and dental transcription services agency located in Toronto, Ontario Canada, providing medical transcription services to physicians, specialists (including psychiatry, pain and IMEs), dentists, dental specialties, clinics and other healthcare providers across Canada. Our medical and dental transcriptionists take pride in the quality of your transcribed documents. WEBshuttle is our client interface portal for document management. 2Ascribe continues to implement and develop technology to assist and improve the transcription process for physicians, dentists and other healthcare providers, including AUTOfax. AUTOfax works within WEBshuttle to automatically send faxes to referring physicians and dentists when a document is e-signed by the healthcare professional. As a service to our clients and the healthcare industry, 2Ascribe offers articles of interest to physicians, dentists and other healthcare professionals, medical transcriptionists, dental transcriptionists and office staff, as well as of general interest. Additional articles may be found at  For more information on Canadian transcription services, dental transcription, medical transcription work or dictation options, please contact us at

