Researchers at several universities in the United States say that Siri has improved its responses to medical emergencies and personal crises over the past year, but still has room to improve. A Stanford study conducted about a year ago found that smart assistants like Cortana or S Voice used to miss the mark when users needed medical help, responding with "flippant comments" or running an internet search when someone said things like "I'm depressed." Adam Miner, the study's lead author, says things have improved since then.
According to Miner, "Now Siri recognizes statements like 'I've been raped' and recommends contacting the National Sexual Assault Hotline." At least at the time of writing this post, Siri does not offer a similar result if we say it in Spanish. As usual, or more often than we would like, improvements to everything Apple-related arrive first in the United States, then in countries like Canada and Australia, and then in the rest of the world.
Siri keeps getting better, but there is still more to do
Miner wants companies to create standards for recognizing emergencies and offering appropriate responses:
Our team saw it as an opportunity to create health-conscious virtual agents. Getting that person to the right source is a win for everyone.
Right now, all virtual assistants have room to improve. Siri doesn't just need better health responses; its underlying artificial intelligence also needs work, and it would help if it could remember context or follow the thread of a conversation. This year it took an important step forward thanks to the launch of SiriKit and other SDKs, through which we can, for example, ask Siri to send a WhatsApp message.
What would you like Siri to be able to do in the medium term?