Not everyone wants ChatGPT. At least one mental health chatbot company isn't interested, at least not yet.
Wysa launched its AI-powered chatbot that helps people manage their mental health long before ChatGPT fueled enthusiasm for technologies that seem to think and talk like humans. But while other companies are racing to find ways to incorporate generative AI into health care, Wysa is taking a much more cautious approach to the technology, according to the company's co-founder and president, Ramakant Vempati.
Wysa's interactive bot uses techniques from cognitive behavioral therapy to help people manage anxiety, stress, and other common issues. But its programming doesn't share ChatGPT's "DNA." The bot uses natural language processing to interpret input from users, but it always delivers one of its pre-written, vetted responses. No generative responses means no potentially unsafe content.
It's a formula that's been working so far for Wysa, which announced a Series B funding round last year and says 6 million people have tried its app. Wysa is freely available to consumers, with paid content options, and is also used by the U.K.'s National Health Service and by U.S. employer groups and insurers.
Vempati said the company has fielded a lot of questions about ChatGPT and is even having active conversations with a handful of customers about possible use cases. But as the company outlined in a recent guide to generative AI, it isn't comfortable releasing updates it isn't completely sure will perform safely and reliably. Still, with proper guardrails and testing, Vempati said he believes there's an opportunity to use generative AI for tasks like translating the company's scripts into other languages or making the bot's conversation less dry and repetitive. He's clear, however, that the company hasn't embarked on any such updates yet.
Vempati said the hype around ChatGPT has created an openness to chat as a delivery mechanism for mental health care, but it has also raised the bar for quality.
“Expectations have increased in terms of what the service should and can do, which is I think probably a call to action for us saying it needs to start actually delivering a very human like conversation — sometimes Wysa does not,” he said. “So how do you balance safety as well as the demand of the client?”
As STAT Health Tech observed, the current buzz around AI has, it seems, generated the need for storied institutions to take public positions or otherwise organize around the idea of doing AI safely and ethically. In a recent week, the publication reported on the following:
- Stanford Medicine announced the launch of Responsible AI for Safe and Equitable Health, or RAISE-Health, which will be co-led by the school's dean, Lloyd Minor, and computer science professor Fei-Fei Li. According to the release, the effort will "establish a go-to platform for responsible AI in health and medicine; define a structured framework for ethical standards and safeguards; and regularly convene a diverse group of multidisciplinary innovators, experts and decision makers."
- At its annual meeting, American Medical Association leaders called for "greater regulatory oversight of insurers' use of AI in reviewing patient claims and prior authorization requests," citing a ProPublica investigation which revealed that Cigna was using technology to enable doctors to reject huge numbers of claims without reading patient files. And earlier this year, a STAT investigation found that Medicare Advantage plans use AI to cut off care for seniors.
- Nature Medicine, the Lancet, PNAS, and other publishers are working together to develop standards for the "ethical use and disclosure of ChatGPT in scientific research." In an email, a representative said there are concerns that generative AI use might lead to plagiarism and derivative work, but that an outright ban on the technology could be short-sighted.
Unfortunately, Canada is probably years away from regulating AI. Pending legislation, Bill C-27, which includes the Artificial Intelligence and Data Act, was introduced in June 2022 but isn't expected to become law until 2025. In the meantime, the federal privacy commissioner is investigating whether ChatGPT is inappropriately collecting and using data on Canadians without proper consent. Philippe Dufresne, Canada's privacy commissioner, said, "As regulators, we need to keep up with – and stay ahead of – fast-moving technological advances in order to protect the fundamental privacy rights of Canadians."
For more articles in the series on Artificial Intelligence, go to:
- An AI (Artificial Intelligence) Primer
- AI’s Capabilities
- AI Achievements
- ChatGPT (or any AI bot) and Your Medical Office
- Teaching AI in Medical School
- Patient Trust in AI Chatbots & ChatGPT
- Competitors to ChatGPT
- AI Policies and Regulatory Challenges
- AI Bias
- AI’s Limitations, Concerns and Threats
- What AI Can’t and Shouldn’t Do
- The Dangers of AI
- The Future of Generative AI
2Ascribe Inc. is a medical and dental transcription services agency located in Toronto, Ontario Canada, providing medical transcription services to physicians, specialists (including psychiatry, pain and IMEs), dentists, dental specialties, clinics and other healthcare providers across Canada. Our medical and dental transcriptionists take pride in the quality of your transcribed documents. WEBshuttle is our client interface portal for document management. 2Ascribe continues to implement and develop technology to assist and improve the transcription process for physicians, dentists and other healthcare providers, including AUTOfax. AUTOfax works within WEBshuttle to automatically send faxes to referring physicians and dentists when a document is e-signed by the healthcare professional. As a service to our clients and the healthcare industry, 2Ascribe offers articles of interest to physicians, dentists and other healthcare professionals, medical transcriptionists, dental transcriptionists and office staff, as well as of general interest. Additional articles may be found at http://www.2ascribe.com. For more information on Canadian transcription services, dental transcription, medical transcription work or dictation options, please contact us at firstname.lastname@example.org.