Eric was diagnosed with low bone density. AI made his GP wonder if there was more to it
Victorian doctors are harnessing artificial intelligence to help diagnose health conditions, write referrals and compile medical records.
But there are concerns the rapidly evolving technology, which signals a new era in medicine and is also being used to interpret chest X-rays and detect diseases in blood tests, could pose a risk to patient care if not used correctly.
Dr Umair Masood and his patient Eric Collicoat. Credit: Jason South
Dr Umair Masood and Dr Chris Irwin have created what they believe is the world’s first commercially available AI tool to streamline GP consultations with patients.
The pair, who work in Gisborne and Ivanhoe, have trialled the technology with more than 200 patients at their clinics over the past two months, and are now hoping to sell the software to other GPs.
“It reduces the administrative burden for doctors so they can spend more time with their patients,” Masood said, pointing out that he is now using his computer 90 per cent less.
“It also gives us diagnostic guidance. But it does not replace a doctor’s clinical judgement.”
Patient Eric Collicoat allowed this masthead to observe his recent consultation with Masood, which was recorded by the ConsultNote.ai software.
The technology compiled medical notes – omitting information that was not relevant such as banter about the cricket – and wrote up a referral to a specialist. It also provided Masood with a list of other potential diagnoses for his long-term patient.
The Gisborne GP diagnosed Collicoat with osteopenia, or low bone density, but the software also prompted him to consider an overactive parathyroid, vitamin D deficiency and chronic kidney disease.
“I knew the diagnosis was osteopenia but the technology gave me some other things to think about,” he said.
Collicoat said he enjoyed being able to maintain eye contact with Masood throughout the consultation, instead of watching him type up notes.
“It’s the future,” Collicoat said. “It’s made the consultation feel more thorough.”
Patients must consent to the technology being used during their consultation and Masood has put a sign on his wall that states: “there is an AI microphone in use”.
Bill O’Shea, a member of the Law Institute of Victoria’s health law committee, said he was concerned that some GPs may rely too heavily on artificial intelligence and that it could lead to an incorrect diagnosis.
“The GP really needs to impose their own knowledge and experience on top of any information they have received from artificial intelligence,” the former president of the Law Institute of Victoria said.
“It doesn’t relieve a medical practitioner from the liability for what it is that is coming out of the process – the diagnosis or the referrals.”
The federal government is weighing up whether further regulation is needed to mitigate the potential risks posed by artificial intelligence and has released a discussion paper on the safe and responsible use of AI across various industries.
The Therapeutic Goods Administration regulates software-based medical devices that incorporate artificial intelligence when they are used for medical purposes.
Masood said his technology complied with all relevant laws, did not store any personal information about patients and was exempt from TGA regulation because it was “intended only for the purpose of providing or supporting a recommendation to a health professional about prevention, diagnosis, curing or alleviating a disease, ailment … [and] not intended to replace the clinical judgement of a health care professional”.
A Medical Board of Australia spokeswoman said doctors, not technology, were ultimately responsible for diagnosing patients.
“Using technology to make routine administrative tasks more efficient makes perfect sense,” she said.
Rob Hosking, the chair of the Royal Australian College of General Practitioners’ expert committee on practice technology, said artificial intelligence could significantly reduce the workload for GPs.
But he said healthcare workers required training to use the technology, and patients’ records needed to be kept secure and stored onshore, as required under Australian law.
Professor Farah Magrabi from the Australian Institute of Health Innovation at Macquarie University recently published a review of 266 safety incidents involving AI-assisted technology that had been reported to the US medicine watchdog. These included an ultrasound failing to pick up a heart condition that turned out to be fatal and a computer-assisted needle puncturing a spine instead of its target.
She said there was a huge amount of excitement about AI in healthcare, but it was important that the workforce was educated on how to use the technology safely and ethically.
Magrabi said artificial intelligence was already being used in Australia to interpret chest X-rays and to help nurses provide wound care.
“Humans can get tired but the AI doesn’t get tired,” she explained. “The AI is being used as a second pair of eyes.”