Artificial intelligence may contain the answer to one of humanity’s biggest challenges… but what is it? And how can AI help?
Estimates suggest the global population will swell to almost 10 billion by 2050 – which raises the question: how will healthcare providers cope? Artificial intelligence may hold the key. But it’s not all clear-cut: while some harbor apprehension about our health becoming heavily reliant on machines, many scientists and innovators are working round the clock to prove that AI can revolutionize healthcare.
It’s one of the world’s biggest issues: the healthcare industry is crucial to the well-being of everyone on the planet. So how can AI help? Put simply: by taking on the more time-intensive work, AI will free up resources for patient care and research – and more than likely give healthcare professionals a few more precious hours’ sleep at night. But how? These pioneering uses of AI in healthcare today will blow your mind.
Protein folding
In a nutshell: protein folding is the process by which a protein takes on its 3D shape – and the long-standing challenge has been predicting that shape. Why is that important? Proteins are a key part of the structure and function of body tissue and organs. When you understand the details of a protein’s structure, you get insights into how it functions and how you can change those functions.
Predicting how proteins fold helps scientists understand a protein’s role within the body, as well as diagnose and treat diseases believed to be caused by misfolded proteins, such as Alzheimer’s, Parkinson’s, Huntington’s and cystic fibrosis. In the past, scientists relied on manual techniques that involved hours of trial and error and cost thousands of dollars per experiment. Now, biologists are turning to AI as an accurate, time-saving alternative.
Introducing DeepMind’s AlphaFold, an AI system built on many years of research that predicts a protein’s 3D structure. With this new tech, scientists can design new, effective cures for diseases more efficiently while reducing the costs associated with experimentation. Win-win.
Robot-assisted eye surgery
12 patients, one local hospital, and a very positive outcome. Researchers from the University of Oxford have completed the first successful trial of robot-assisted retinal surgery, using technology developed by Dutch firm Preceyes to assist eye surgeons. Each of the 12 patients in the trial had a membrane removed from the back of the eye: six were operated on conventionally, the other six with the Preceyes robot. OK, so the robot-assisted method took longer (around four minutes compared to one), but the surgery report noted that the robot performed the procedure with equal or better efficacy than the traditional manual approach. The next step? Robotic surgical devices will be used for the precise and minimally traumatic delivery of gene therapy to the retina – a first for humankind.
3D image scanning
Comparing 3D scans is a crucial part of a diagnosis, but as it can take two hours or more, there’s been a desperate need to make the process quicker. Those calls have been answered: Massachusetts Institute of Technology researchers have created a machine-learning algorithm that compares scans up to 1,000 times faster than the human eye.
The algorithm ‘learns’ as it processes thousands of pairs of images: it teaches itself how to align each pair perfectly, one atop the other, before mapping all the pixels together at once. Alignment now takes anywhere from one second to two minutes, depending on the computer. Which means, in theory, surgeons could find out how successful a completed surgery was right there and then, in the operating theater.
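The MIT system itself is a learned model, but the basic idea of aligning one scan onto another can be sketched with a much simpler classical technique: phase correlation, which estimates the translation between two images via the Fourier transform. The code below is a minimal illustration, not the MIT algorithm.

```python
import numpy as np

def estimate_shift(fixed, moving):
    """Estimate the (row, col) shift to apply to `moving` (via np.roll)
    so that it lines up with `fixed`, using FFT phase correlation."""
    f = np.fft.fft2(fixed)
    g = np.fft.fft2(moving)
    cross = f * np.conj(g)
    cross /= np.abs(cross) + 1e-12          # keep only the phase
    corr = np.fft.ifft2(cross).real         # correlation surface
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape))
    shape = np.array(corr.shape)
    peak[peak > shape // 2] -= shape[peak > shape // 2]  # wrap to signed shifts
    return (int(peak[0]), int(peak[1]))

# Toy example: a bright square, and a copy shifted 5 rows down, 3 cols right
fixed = np.zeros((64, 64))
fixed[20:30, 20:30] = 1.0
moving = np.roll(fixed, shift=(5, 3), axis=(0, 1))
print(estimate_shift(fixed, moving))  # → (-5, -3): rolling moving by this realigns it
```

Real medical scans need deformable (non-rigid) alignment rather than a single global shift, which is exactly where learned approaches like MIT’s come in.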
Speeding up diagnosis with IBM Watson
IBM Watson, created in 2010, is an AI-powered database that helps healthcare professionals make medical decisions more easily and quickly by drawing on past research.
So far, Watson’s uptake has been positive. Especially when you consider it’s now in use at three of the top cancer hospitals in the US, where it’s speeding up DNA analysis in cancer patients and helping to make their treatment more effective. It’s not just a one-trick pony, mind. Watson also uses visual recognition to highlight points of interest in X-ray and MRI scans, speeding up analysis for doctors and their patients.
Natural language processing (NLP) for clinicians
Natural language processing is a branch of AI that lets computers make sense of human language – in this case, turning a clinician’s speech into written text. Take Dragon Medical One, for example: a cloud-based speech recognition system that allows clinicians to log a patient’s diagnosis without a pen in sight, giving them more time to engage with the patient instead of writing notes.
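Dragon Medical One’s internals are proprietary, but one piece of what such systems do – turning free-form dictation into structured fields – can be illustrated with a toy sketch. The phrasing, field names and patterns below are invented for the example; real clinical NLP uses far more robust models than regular expressions.

```python
import re

def parse_dictation(text):
    """Pull a few structured fields out of a dictated note.
    The patterns are illustrative only, not a real clinical grammar."""
    note = {}
    m = re.search(r"patient (?:is )?([A-Z][a-z]+ [A-Z][a-z]+)", text, re.I)
    if m:
        note["patient"] = m.group(1)
    m = re.search(r"blood pressure (\d{2,3}) over (\d{2,3})", text, re.I)
    if m:
        note["bp"] = f"{m.group(1)}/{m.group(2)}"
    m = re.search(r"prescribed? (\w+) (\d+) ?(mg|ml)", text, re.I)
    if m:
        note["rx"] = f"{m.group(1)} {m.group(2)}{m.group(3)}"
    return note

transcript = ("Patient Jane Doe, blood pressure 120 over 80, "
              "prescribed amoxicillin 500 mg three times daily.")
print(parse_dictation(transcript))
# → {'patient': 'Jane Doe', 'bp': '120/80', 'rx': 'amoxicillin 500mg'}
```

The point isn’t the regexes themselves but the workflow: speech comes in, structured, searchable documentation comes out, and the clinician never touches a pen.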
It’s having quite an effect. Nuance Communications, creator of Dragon Medical One, claims to have helped Allina Health, a US not-for-profit healthcare system, boost its clinicians’ productivity: after adopting the system, Allina Health reportedly automated 70 percent of its voice-based documentation and saw a 167 percent increase in the overall amount of documentation captured.
What does this mean for cybersecurity?
Yep, you guessed it: relying on more cloud- and software-based solutions means a higher risk of cyberattack. Granted, humans might be slower and less exciting, but they can’t be hacked – unless, of course, they have augmented bodies. That said, it’s a small price to pay for a healthcare revolution.