Your Amazon or Google smart speaker might one day be able to spot the signs of you having a heart attack and call for help.
This is the prediction of a group of researchers from the University of Washington, who have created a piece of artificial intelligence which listens out for the sounds associated with someone having a heart attack.
A person in cardiac arrest often gasps for air, a process known as agonal breathing. Because the majority of cardiac arrests happen away from the hospital, and many occur in the patient's bedroom away from any witnesses, a smart speaker trained to recognize the sound of agonal breathing could sound the alarm and call for help.
The AI was designed with smart speakers like the Amazon Echo and Google Home in mind, as they feature always-listening microphones which are usually set up to respond to 'Alexa' or 'Hey Google', but could potentially be modified to pick up on agonal breathing, too. Because these devices are connected to the internet and to the owner's contacts, they could then alert a friend or family member, or call the emergency services.
The smart speaker could also be set up to give CPR instructions to anyone who is present on the scene, before the emergency services arrive.
On average, the proof-of-concept tool developed by the researchers was able to detect agonal breathing events 97 percent of the time from up to 20 feet away. It was taught and tested using recordings of agonal breathing from real 911 calls.
Training used audio from 162 emergency calls placed between 2009 and 2017. During these calls, the dispatcher asks the caller to hold the phone close to the patient, so they can identify agonal breathing. Clips 2.5 seconds long were extracted from the start of each agonal breath, creating a total of 236 audio files. The researchers then used machine learning techniques to expand these into a positive dataset of 7,316 clips.
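As an illustration of how a small set of recordings can be expanded this way, here is a minimal sketch of clip extraction and audio augmentation. The sample rate, shift range, and noise level are assumptions for the example, not values from the study; the researchers' actual augmentation methods may differ.

```python
import numpy as np

SAMPLE_RATE = 16_000          # assumed sample rate; the study's value may differ
CLIP_SECONDS = 2.5            # clip length reported in the article
CLIP_SAMPLES = int(SAMPLE_RATE * CLIP_SECONDS)

def extract_clip(audio: np.ndarray, breath_start: int) -> np.ndarray:
    """Cut a fixed-length clip starting at the onset of an agonal breath."""
    clip = audio[breath_start : breath_start + CLIP_SAMPLES]
    # Zero-pad if the breath falls near the end of the recording.
    return np.pad(clip, (0, CLIP_SAMPLES - len(clip)))

def augment(clip: np.ndarray, rng: np.random.Generator, n_copies: int) -> list:
    """Create extra positive examples via small time shifts plus added noise."""
    copies = []
    for _ in range(n_copies):
        shift = int(rng.integers(-SAMPLE_RATE // 10, SAMPLE_RATE // 10))  # ±0.1 s
        shifted = np.roll(clip, shift)
        noisy = shifted + rng.normal(0, 0.005, CLIP_SAMPLES)  # low-level noise
        copies.append(noisy.astype(np.float32))
    return copies

# Example: one 2.5 s clip expanded into 31 variants (236 clips * 31 = 7,316).
rng = np.random.default_rng(0)
recording = rng.normal(0, 0.1, SAMPLE_RATE * 10)  # placeholder 10 s recording
clip = extract_clip(recording, breath_start=3 * SAMPLE_RATE)
variants = augment(clip, rng, n_copies=31)
```

Note that 236 source clips times 31 variants each yields exactly the 7,316 positive examples the article reports, which is why that multiplier is used in the example.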
For the negative dataset (recordings not featuring agonal breathing), the researchers used 83 hours of audio collected during sleep studies, producing 7,305 sound samples. These clips contained snoring and other sleep-related sounds, but no agonal breaths. The AI was then taught the difference between these two sets of data.
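Teaching a model the difference between two labeled sets of audio is, at its core, binary classification. The following toy sketch uses simple spectral features and plain logistic regression on synthetic stand-in clips; the researchers' actual features and classifier are not described in the article and were almost certainly more sophisticated.

```python
import numpy as np

def spectral_features(clip: np.ndarray, n_bands: int = 16) -> np.ndarray:
    """Summarize a clip as the average FFT magnitude in a few frequency bands."""
    mags = np.abs(np.fft.rfft(clip))
    return np.array([band.mean() for band in np.array_split(mags, n_bands)])

def train_logistic(X, y, lr=0.1, steps=500):
    """Gradient-descent logistic regression: learn weights separating the classes."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(steps):
        p = 1 / (1 + np.exp(-(X @ w + b)))   # predicted probability of class 1
        grad = p - y
        w -= lr * X.T @ grad / len(y)
        b -= lr * grad.mean()
    return w, b

# Synthetic stand-ins: slow rhythmic "gasping" vs. broadband "snoring" noise.
rng = np.random.default_rng(1)
t = np.linspace(0, 2.5, 4000)
pos = [np.sin(2 * np.pi * 3 * t) + 0.1 * rng.normal(size=t.size) for _ in range(50)]
neg = [0.5 * rng.normal(size=t.size) for _ in range(50)]
X = np.array([spectral_features(c) for c in pos + neg])
X = (X - X.mean(0)) / (X.std(0) + 1e-9)      # normalize each feature
y = np.array([1] * 50 + [0] * 50)
w, b = train_logistic(X, y)
accuracy = (((X @ w + b) > 0) == y).mean()
```

On these artificially separable classes the training accuracy is near perfect; the hard part of the real system is that genuine agonal breaths and ordinary sleep sounds overlap far more than these toy signals do.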
Shyam Gollakota, an associate professor at the University of Washington's school of computer science and engineering, said: "We envision a contactless system that works by continuously and passively monitoring the bedroom for an agonal breathing event, and alerts anyone nearby to come provide CPR. And then if there's no response, the device can automatically call 911."
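The escalation flow Gollakota describes (monitor continuously, alert anyone nearby, then call 911 if no one responds) can be sketched as a small control loop. All four callables below are hypothetical hooks, and the threshold and timeout values are assumptions; the article does not describe the real device integration.

```python
import time

DETECTION_THRESHOLD = 2   # consecutive positive windows before alerting (assumed)
RESPONSE_TIMEOUT_S = 30   # how long to wait for a bystander response (assumed)

def monitor(classify_window, play_alert, heard_response, call_911,
            timeout_s=RESPONSE_TIMEOUT_S):
    """Escalation loop: detect -> alert the room -> call 911 if nobody responds."""
    consecutive = 0
    while True:
        if classify_window():          # classify one 2.5 s audio window
            consecutive += 1
        else:
            consecutive = 0            # reset on any non-agonal window
        if consecutive >= DETECTION_THRESHOLD:
            play_alert()               # ask anyone nearby to start CPR
            deadline = time.monotonic() + timeout_s
            while time.monotonic() < deadline:
                if heard_response():   # e.g. a voice or movement nearby
                    return "bystander_responding"
                time.sleep(1)
            call_911()                 # no response: escalate automatically
            return "emergency_called"
```

Requiring several consecutive positive windows before alerting is one simple way to cut down on false alarms from isolated misclassifications.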
According to 911 call data cited by the University of Washington, agonal breathing occurs in around 50 percent of cardiac arrests, and patients who take these breaths often have a higher chance of survival.
Co-corresponding author Dr. Jacob Sunshine said: "This kind of breathing happens when a patient experiences really low oxygen levels. It's sort of a guttural gasping noise, and its uniqueness makes it a good audio biomarker to use to identify if someone is experiencing a cardiac arrest."

This research comes after a Danish startup called Corti announced in January 2018 that it had developed an AI which could spot the acoustic signs of a heart attack in the background of a 911 call with 95 percent accuracy.
Check out The GearBrain, our smart home compatibility checker, to find other compatible products that work with Amazon Alexa-enabled devices.