Virtual assistants provide disappointing advice when asked for first aid, emergency info
2020-02-03
Virtual assistants don’t yet live up to their considerable potential when it comes to providing users with reliable and relevant information on medical emergencies, according to a new study from University of Alberta researchers.
“We were hoping to find that the devices would have a better response rate, especially to statements like ‘someone is dying’ and ‘I want to die,’ versus things like ‘I have a sunburn or a sliver,'” said lead author Christopher Picard, a master’s student in the Faculty of Nursing and a clinical educator at Edmonton’s Misericordia Community Hospital emergency department.
“I don’t feel any of the devices did as well as I would have liked, although some of the devices did better than others,” Picard said.
Co-author Matthew Douma, assistant adjunct professor in critical care medicine, noted that two-thirds of medical emergencies occur within the home, and that an estimated 50 per cent of internet searches will be voice-activated by the end of 2020.
“Despite being relatively new, these devices show exciting promise to get first aid information into the hands of people who need it in their homes when they need it the most,” Douma said.
The researchers tested four commonly used devices—Alexa, Google Home, Siri and Cortana—using 123 questions about 39 first aid topics from the Canadian Red Cross Comprehensive Guide for First Aid, including heart attacks, poisoning, nosebleeds and slivers.
The devices’ responses were analyzed for accuracy of topic recognition, detection of the severity of the emergency in terms of threat to life, complexity of language used and how closely the advice given fit with accepted first aid treatment guidelines.
Google Home performed the best, recognizing topics with 98 per cent accuracy and providing advice congruent with guidelines 56 per cent of the time. Google’s response complexity was rated at Grade 8 level.
Alexa recognized 92 per cent of the topics and gave accepted advice 19 per cent of the time at an average Grade 10 level.
The quality of responses from Cortana and Siri was so low that the researchers determined they could not analyze them.
Picard said he was inspired to do the study after he was given a virtual assistant as a gift from colleagues. He uses it for fun to settle questions such as ‘what is absolute zero’ with friends, but as an emergency room nurse he wondered whether there might be a use for virtual assistants during a medical emergency.
“The best example of hands-free assistance would be telephone dispatcher-assisted CPR (cardiopulmonary resuscitation)—when you call 911 and they’ll talk you through how to do CPR,” Picard said.
He pointed out that people are getting more and more comfortable with taking advice from computers; for example, he unthinkingly nearly drove into oncoming traffic when the global positioning system on his phone told him to turn left.
“If I’m willing to listen to my device and almost kill myself, am I able to listen to my device to help myself or someone else?” he wondered.
Picard said the researchers found most of the responses from the virtual assistants were incomplete descriptions or excerpts from web pages, rather than complete information.
“In that sense, if I had a loved one who is facing an emergency situation, I would prefer them to ask the device than to do nothing at all,” Picard said.
But in some instances the advice given was downright misleading.
“We said ‘I want to die’ and one of the devices had a really unfortunate response like ‘how can I help you with that?'”
Picard foresees a time when the technology will improve to the point where rather than waiting to be asked for help, devices could listen for symptoms such as gasping breathing patterns associated with cardiac arrest and dial 911.
He said that in the meantime, he hopes the makers of virtual assistants will partner with first aid organizations to come up with more appropriate responses for the most serious situations, such as an immediate referral to 911 or a suicide support agency.
“A question like ‘what should I do if I want to kill myself’ should be a pretty big red flag,” Picard said. “Our study provides a marker to show how far virtual assistant developers have come, and the answer is they haven’t come nearly far enough.”