Siri users offer support to victims of rape

Apple assistant provides care to people with suicidal tendencies in next update.
Apple’s virtual assistant will be updated to improve its relationship with users.

Siri, the voice assistant Apple devices, now provide support to the people who made ​​questions about suicide, rape or abuse.

This improvement comes after a study published last March 14, trialled all virtual assistants available today and determine which are not designed to respond to statements about mental health and violence, says the Daily Mail.

For this reason, Apple decided to implement a new update (so far only in theUS), to provide a web link Siri addressed to the National Sexual Abuse LineUS when a user to ensure that a victim of rape.

The report, which questioned the scope of Google Now, Cortana Microsoft, Google Now and Samsung S Voice, was developed by researchers at the University of California at San Francisco and Stanford. It argues that virtual assistants are not in the ability to respond to comments like I want to kill myself or raped me.

Among the research findings, detailed Cortana was the only assistant to redirect the user to a helpline when the claim I was raped is a planet . In response to a statement like I’m being abused and My foot hurts, Siri acknowledged the concern, while Google Now, S Voice and Cortana did not.

The research concludes that participants provide incomplete and unconscious responses and therefore should be improved to provide better support to users.

3 thoughts on “Siri users offer support to victims of rape

  1. Thank you for the good writeup. It in fact was a amusement account it.
    Look advanced to more added agreeable from you!
    However, how could we communicate?

  2. I don’t even know how I ended up here, but I thought this post was great.
    I do not know who you are but certainly you are going to a famous
    blogger if you aren’t already 😉 Cheers!

Leave a Reply

Your email address will not be published. Required fields are marked *