Should our devices know when we’re stressed?

A couple of weeks ago, I was participating in one of CSC’s technology hangouts. The topic was the shift to digital health. Femi Ladega and Dan Hushon did a great job explaining the engagement between healthcare provider and patient on the care journey and the digital transformation occurring in this exciting field.

After the hangout, the conversation moved on to CrowdChat, where some excellent questions were posed and everyone made worthwhile contributions.

In a conversation about whether patients would choose a physician based on digital capabilities, such as analytics for diagnosis, I raised a hypothetical voice instruction: “Hey Siri, take me to the best hospital for leg fractures.” This spurred a discussion about the effects of stress on the digital personality.

In the example above, the user’s focus narrows from a wide-angle lens on life to a myopic view of immediate actions and core human instincts: “How do I get my leg fixed? Who has the kids? Did I leave the burner on the hob?” Does that mean that, to remain useful, the device in your pocket needs to understand not only your typical day but also something very atypical, and adapt to this new context?

Today, there is not sufficient open integration between healthcare systems, objective result analysis, digital transportation APIs and personal preferences to provide a personalised answer to the question I posed, but this will come in the near future. The data is already there. Maybe it’s not joined up yet, and has some information security hoops to jump through, but it will get there.

At the Apple Worldwide Developers Conference, the keynote outlined the upcoming version of the iPhone operating system (iOS 10) in conjunction with the latest Apple watchOS 3, which together take the first combined steps into the emergency context. A long-press of the watch’s side button will initiate an emergency call and send messages to ICE (in case of emergency) contacts, taking away the burden of remembering whether the local emergency number is 911, 112 or 999.

The idea that my phone might react differently if it determines that I am stressed is welcome in a lot of ways. Maybe the display would shift to pastel shades, play concertos rather than drum and bass, or order me a decaf rather than an espresso. But in other ways it provides yet another metric on which judgments, correlations and potential discrimination can be founded, fuelling the continuing battle between features and privacy.

Scott Hanselman demonstrates a great mock-up of this kind of integration approach in this video.

Feedback is always welcome, so feel free to get in touch @glennaugustus.


Glenn Augustus

As a Technologist in CSC’s Global Infrastructure Services, Glenn Augustus helps clients use technology to realise effective IT through the development of CSC’s infrastructure services portfolio. He has held a variety of senior architecture and engineering positions within CSC before becoming Global Offering Manager for CSC’s Storage as a Service and most recently Chief Technologist for Compute. Glenn lives with his family in the United Kingdom.





