The letterbox problem with voicebox assistants

There are lots of voice-activated tools and services now available, from software on your PC and in your car to physical hardware you can place around your home. Requests to these devices are becoming everyday occurrences, from “Alexa, what’s the weather?” to “Siri, recipe for chocolate cake” (too many to list).

There are two main ways to control them: pressing a button and then speaking (as in my car) to put the device into a listening state, or having the device always listening for a specific wake word, such as the assistant’s name. Thankfully, there is at least a ‘turn off listening’ mode.

However, across all these devices and software there is a distinct lack of security around voice recognition and interaction. Take, for instance, the recent incident where a TV show caused a number of dolls’ houses to be purchased.

We are busy connecting these devices to all sorts of home automation to make things easier, but how many of us stop to think about what I term “The Letterbox Problem”? This arises when you have automated your home to a level that includes things like your lights, powered appliances and your house alarm. As you walk into your house, you can use voice commands to turn on the lights, put the kettle on and turn off the alarm. The Letterbox Problem occurs when someone can literally shout through your letterbox and activate or deactivate items in your house. A would-be thief could turn the lights on and off to check whether anyone is home before going for the alarm.

The security challenge here is to ensure that voice recognition and appropriate security controls are in place. Voice recognition by itself is not good enough; I’m sure you’ve heard an impressionist mimic a celebrity on a TV or radio show.

I would like to see a form of two-factor authentication on voice systems, so the system can be sure it’s me before it carries out a task. Voice may be one factor, but something else, such as a token code or an app on the phone, could provide the second.
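To make the idea concrete, here is a minimal sketch in Python of how an assistant might gate sensitive commands behind a time-based one-time password (TOTP, RFC 6238), the same mechanism phone authenticator apps use. Everything here is hypothetical illustration, not any real assistant’s API: the `handle_command` hook, the `SENSITIVE_COMMANDS` set and the idea of speaking the code aloud are all assumptions made for the example.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Compute an RFC 6238 time-based one-time password."""
    key = base64.b32decode(secret_b32)
    counter = int(time.time()) // interval          # 30-second time step
    msg = struct.pack(">Q", counter)                # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# Hypothetical assistant hook: only carry out sensitive commands once the
# speaker supplies the current code from their phone authenticator app.
SENSITIVE_COMMANDS = {"disarm alarm", "unlock door", "confirm purchase"}

def handle_command(command: str, spoken_code: str, secret_b32: str) -> str:
    if command not in SENSITIVE_COMMANDS:
        return f"OK, doing: {command}"
    if hmac.compare_digest(spoken_code, totp(secret_b32)):
        return f"Code accepted, doing: {command}"
    return "Sorry, that code didn't match. Command refused."
```

Of course, reading a code aloud defeats the purpose if an attacker is within earshot, so in practice a tap-to-approve prompt on the phone app would make a better second factor than a spoken code.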

There are a number of basic steps you can take now to help protect yourself, such as:

  • Think about the systems you are connecting the voice device to. Can it compromise your security if anyone else uses it?
  • Use the mute button on devices or turn off listening mode when not in use.
  • Keep the devices updated with the latest patches and firmware.
  • Use good password-security practices on any sensitive systems you use (e.g. bank accounts, PayPal).
  • Use strong passwords on any accounts associated with the voice assistant (e.g. Amazon, Google, Apple).
  • If your system allows it, clear out its cache and old activities on a regular basis so they can’t be replayed against you.
  • Don’t leave a system listening when the TV or radio is on, especially when you’re out of the room. You may end up with a new dolls’ house.

This entry was originally posted on Max’s blog.


Max Hemingway — Senior Architect

Max is a senior architect for DXC in the United Kingdom. With more than 25 years of experience, he has a broad and deep range of technical knowledge and is able to translate business needs into IT-based solutions. Currently the chief architect of the BAE Systems account in the UK, Max has a proven track record acquired through continual client engagement and the delivery of leading-edge infrastructures, all of which have delivered positive results for end clients, including IT cost reduction, expansion of service capability and increased revenues.

Comments

  1. Yasin Kara says:

    Excellent read. What about, before it finalises any online purchase, the voice assistant asks one of your verification questions, e.g. “What was the name of your first pet?” That could stop any would-be letterbox shouters from causing mayhem.


    • Thank you. Keywords are a good suggestion; however, the challenge with adding extra levels of questions is that it makes the system less intuitive. Common verification questions are also always based around the same types of things (pet name, first car, street you were born on etc.), so answering them aloud with others in earshot could make your other security less secure. A move towards personalising how the conversation is initiated (changing the device’s name from Siri, Alexa etc.) should help, along with the potential for voice recognition.


