When it comes to the relationship between artificial intelligence and health, it is always wise to proceed with extreme caution. ChatGPT and other tools based on generative models cannot and must not replace real professionals, especially when lives are at stake, as in medicine and the application of specific therapeutic indications.
We saw in the article on how to program with ChatGPT that OpenAI's chatbot is often called a "stochastic parrot". Services like ChatGPT, and generative models in general, are based on a probabilistic, certainly not deterministic, process. The AI "predicts" the most likely next words when composing an answer, drawing on the data collected during the training phase. Obviously, the more extensive and, above all, the higher the quality of this data, the more reliable and relevant the answers ChatGPT and its "peers" can produce. Although we talk about "knowledge" in the case of modern artificial intelligences, the term is a bit of a stretch because, as we have explained, the answers derive from a probabilistic analysis carried out on the input, or prompt, provided.
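To make the idea concrete, here is a minimal toy sketch of that probabilistic process: a table assigning probabilities to candidate next words, from which a word is sampled. The vocabulary and probabilities are invented for illustration and have nothing to do with how a real model like ChatGPT stores its parameters.

```python
import random

# Toy model (illustrative only): for a given context, a probability
# is assigned to each candidate next word. Real LLMs compute these
# probabilities with a neural network over tens of thousands of tokens.
next_word_probs = {
    "the patient": {"reports": 0.5, "denies": 0.3, "sings": 0.2},
}

def predict_next(context: str, rng: random.Random) -> str:
    """Sample the next word according to the model's probabilities."""
    candidates = next_word_probs[context]
    words = list(candidates)
    weights = [candidates[w] for w in words]
    # Sampling, not a lookup: the same prompt can yield different answers.
    return rng.choices(words, weights=weights, k=1)[0]

rng = random.Random()
print(predict_next("the patient", rng))  # usually "reports", but not always
```

Run it a few times and the output varies: that variability is exactly why the process is called probabilistic rather than deterministic, and why the quality of the training data (here, the probability table) bounds the quality of the answers.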
Letting ChatGPT and other artificial intelligences give advice and make medical diagnoses, however, is a hotly debated topic. What is certain is that artificial intelligence can be useful as a support, complementing the skills of the professional.
On the other hand, a chatbot using Med-PaLM 2, a model developed from health data and based on Google's PaLM 2 Large Language Model (LLM), announced in May 2023, is already used in many medical facilities and is continuously subjected to tests and checks. Artificial intelligence cannot replace the experience and knowledge of doctors, but it can be used as an auxiliary tool.
ChatGPT nails the medical diagnosis, solving the problem afflicting a child
When it comes to diagnoses concerning the health of minors, an even greater dose of caution is warranted. However, a story coming from the USA deserves to be told.
A mother says she subjected her son Alex, then 4 years old, to a series of medical checks for three years in a row, in an attempt to resolve a problem that prevented him from playing with other children. Alex complained of pain, but after 17 medical opinions it was still not possible to understand its causes.
Practically until the age of 7, therefore, Alex took painkillers on a daily basis. Starting with checks for possible oral problems, the child endured the application of a dental appliance to stop him grinding his teeth and help him breathe better, then underwent physical therapy to try to resolve an imbalance between his lower limbs. The mother then consulted a series of specialists to understand the reason for her son's severe migraines and tiredness.
The chatbot nevertheless invited the user to contact specialists
Desperate, Alex's mother entered into ChatGPT all the details of the symptoms the child complained of, information on the checks carried out, their results, and the opinions already received. Drawing on its training data, ChatGPT's generative model issued its diagnosis: Tethered Spinal Cord Syndrome, a medical condition in which the spinal cord is held, or "tethered", to the base of the spine. Normal movement of the spinal cord within the spinal canal is thus greatly impeded.
Acting on the tip received from ChatGPT, the woman turned to a pediatric neurosurgeon at the University of Michigan who specializes in treating the syndrome in question. The doctor confirmed the diagnosis suggested by the artificial intelligence and started a specific treatment; the child is now on the mend.
"Diagnosing the syndrome can be difficult in young children because they are unable to accurately describe their symptoms", commented Dr. Holly Gilmer, who treated Alex. "Children who undergo surgery for this syndrome can recover relatively quickly".
In offering its "hypothesis" (ChatGPT, correctly, does not even call it a diagnosis), the chatbot had nevertheless invited the user to contact medical specialists for further evaluation of the case.