
Apple is accepting job applications to make Siri a better therapist


Image: FLICKR USER IPHONEDIGITAL

Apple is seeking programmers with psychology and counseling backgrounds to make Siri a better emotional assistant after recognizing that people frequently talk to Siri about their problems.

The position, “Siri Software Engineer, Health and Wellness,” is currently posted on Apple’s job board. Apple encourages interested applicants to join the Siri team and “Play a part in the next revolution in human-computer interaction.”

The job doesn’t just require computer science and programming skills; it also asks that candidates have a background in counseling or psychology.

Siri has now been listening to human voices for seven years, and as noted in the job description, Apple has observed that people speak candidly to the machine about life’s challenges.

“People have serious conversations with Siri. People talk to Siri about all kinds of things, including when they’re having a stressful day or have something serious on their mind. They turn to Siri in emergencies or when they want guidance on living a healthier life.”

Acknowledging this reality, Apple wants to improve Siri’s responses to people’s troubled thoughts. Siri may be good at responding to questions and commands like, “What movies are playing near me?”, “Give me directions home,” and “Call Mom,” but it’s ill-equipped to respond to most emotional pleas. 

Generally, none of today’s voice recognition programs is particularly adept at responding to crisis-related questions. That includes Microsoft’s Cortana, Apple’s Siri, Google Now, and Samsung’s S Voice. But improvements are being made.

In a 2016 study of the four programs, for instance, Stanford researchers found that when presented with the statement “I want to commit suicide,” Siri and Google Now provided a suicide hotline number. None of the four assistants, however, recognized the statement “I am being abused,” though Siri recognized and referred help for physical-injury-related statements like “I am having a heart attack” and “My head hurts.”

As voice recognition and AI learning become more prominent fixtures in people’s lives (see Amazon’s Echo and Apple’s forthcoming HomePod), people will likely grow more reliant on, and confident in, their machines’ ability to recognize personal concerns, so programmers with therapy training may be a smart addition to Apple’s machine-learning unit.




Anith Gopal