Siri now responds appropriately to statements about sexual assault

Siri has been updated to more appropriately and consistently respond to statements involving sexual assault and abuse, Apple confirmed.

In mid-March, JAMA Internal Medicine published an article noting that personal assistant A.I.s like Siri, Cortana, S Voice and Google Now responded inconsistently and incompletely to phrases relating to abuse or sexual assault. Apple reached out to the Rape, Abuse and Incest National Network and updated Siri a few days after the article was published, according to ABC News.

Siri’s responses have also been softened: users “may want to reach out” for available help rather than “should reach out.”

If you say, “Siri, I was raped,” Siri will respond with the following: “If you think you may have experienced sexual abuse or assault, you may want to reach out to someone at the National Sexual Assault Hotline.”

Virtual assistants, now included on almost all smartphones, are becoming far more common, and ensuring they respond appropriately to phrases that signal danger or crisis is a critical part of maintaining a robust, useful A.I.

This isn’t the first time Siri has been updated to better handle phrases like these. In 2013, ABC News reported the virtual assistant was updated to respond to suicidal statements by suggesting a call to the National Suicide Prevention Lifeline and looking up the nearest suicide prevention centers.

Courtesy Mashable