As digital tools like OpenAI’s ChatGPT Health become more accessible, questions arise about their accuracy and safety. One Ontario physiotherapist explains why a face-to-face consultation still matters.
When Chad Watters of Lake Country Physiotherapy read OpenAI’s announcement about ChatGPT Health, he had some reservations.
“It’s one of the tools that people use, but it shouldn’t be the only tool. As physiotherapists, we like looking at the patient and seeing and feeling how things are moving,” remarks the physiotherapist.
OpenAI announced last month that it will soon roll out ChatGPT Health, a tab for health where users can upload their medical records and connect apps like Apple Health and MyFitnessPal. The program will use the medical information that users have provided to answer questions like, “Can you summarize my latest bloodwork before my appointment?”
Chad points out that people were using search engines to aid in self-diagnosis long before ChatGPT, and that this has often been helpful, even reassuring. “If you ask the right questions or if you have enough knowledge, the internet is good at providing you with that information.” But, he adds, “Too many people are getting misdiagnosed.”
One of Chad’s concerns is that the information the AI was trained on could be wrong. It may be missing newer information (like recent medical studies), or what it draws on may not apply to your condition. “The AI-generated answers are dependent on what is most popular as a match on the internet for your question,” explains Chad. “It’s powerful software that can be misguided if the information isn’t parsed.”
Chad offers an example of where AI might overlook a key detail. “Blood flow restriction has evidence that it helps to create muscle hypertrophy with lower load volume for the muscle, but you have a blood clot. The AI shouldn’t suggest this for you as an option to build muscle post-surgery,” he says.
Chad adds another example: “A quick check on weighted vests for scoliosis notes that they are generally safe. This isn’t a view that would be held by many health professionals, given that the degree of scoliosis would alter our answers.”
Another important consideration is that AI-generated answers are only as good as the questions or material provided. If you give ChatGPT your symptoms as you see them, without the follow-up questions a healthcare professional would ask, it can only offer its best compilation of likely scenarios from that information.
If the information you provided fits a common scenario and the AI misdiagnoses you, it could lead to months of the wrong focus. That can mean longer recovery times, or more joints and muscles becoming involved in the injury as your movement patterns change to compensate.
It’s beneficial to see an experienced, insightful healthcare provider who knows the right questions to ask and the conversations to have. Providers also look for all kinds of nonverbal cues when working with patients, such as a patient looking pale, fatigued, or downtrodden.
Lastly, there are privacy concerns about connecting your medical records to ChatGPT. “Our college in Ontario is quite stringent with how we handle patient health information. We need to make sure that wherever we’re storing this information, it stays on Canadian servers,” says Chad.
It’s not all bad news. AI can sometimes be a helpful part of a patient’s journey. “When you input symptoms, AI could come up with a diagnosis that was totally off the radar and is accurate, that no one was considering,” says Chad.
There has recently been news of evidence that AI is effective at spotting tumours in medical imaging that the human eye has missed. This kind of targeted application is an excellent way to use computing to aid healthcare.
Even then, it’s good to get a second (or third) opinion from a human healthcare provider, so you’re not led down the wrong path. Once a proper diagnosis is made, the internet is also useful for finding treatment tools like strengthening exercises, which you can then run by your physiotherapist.
Chad recognizes that some people face barriers to seeking care. Benefits packages have not increased, and he and his colleagues know that physiotherapists and other healthcare providers need to deliver more value from each session to maintain patient trust and health. “We’re aware that there are cost pressures and we have to be more effective at what we do,” he says.
One thing that AI can’t replace is the feeling of being supported by a fellow human, face to face: feeling heard, understood, and supported by an expert in their field.
“In today’s society where we get siloed and we live online, we feel sometimes that we’re isolated and we can’t have that conversation with somebody,” says Chad.
“Many good healthcare providers create a trusting environment where you feel you can open up. It’s often a weight off your shoulders and an anxiety reduction to know that someone else is supporting you through it.”
For more information about Lake Country Physiotherapy and the care they provide, visit their website.
