What happens in Vegas . . . now goes to lots of other places
Critical data privacy and security issues surround Echo and always-on mic devices
Sales of the Amazon Echo and other voice-enabled home devices have grown exponentially.
What happens in Vegas won’t be staying in Vegas if you book a room at the glitzy Wynn resort and casino.
Last month, Wynn announced that all of its thousands of swank rooms on the Strip would soon come complete with their very own always-on microphone. As a feature.
Data from that microphone will head northwest to Seattle, and will also be harvested by Wynn, a spokesman has indicated. (Vegas pioneered the gathering and use of customer data; see Adam Tanner’s What Stays in Vegas for a history.)
But wait! No need to be alarmed that your aural equivalent of sexy selfies could be exposed to the world. That microphone is simply the benign one inside Amazon’s Echo, a small ‘home hub’ cylinder that can sync with other home devices, or operate solo from your Vegas room.
The mic has to be always on to deliver the full Echo experience, in which you interact by voice with it, or perhaps ‘her’, as the Echo’s artificial intelligence is a female voice called Alexa.
On hearing her name, Alexa’s ready to wake and respond to requests for information or actions such as streaming music or making online purchases. Or whatever you might suddenly want to know about while in Vegas. Logarithms, perhaps, or Wagner’s operatic use of musical motifs.
But Alexa can’t do her thing unless the mic is on, listening for your summons. The data from your interactions is sent, encrypted, to Amazon. Amazon has not made clear what data might be sent in the waiting-but-listening mode. Users have also found Alexa can mishear her name, waking up and recording when unwanted.
Such ambiguities have made the Echo of interest in a US murder trial, as the primary suspect owns one. Prosecutors served a warrant on Amazon for whatever data the device might have transmitted on the night of the alleged murder. So far, Amazon has ignored the warrant.
But given the routine requests for data from mobile phones, there’s no reason to suspect sound data, rightly or wrongly, won’t join the roster of seizable digital data.
All of which raises critical privacy and security questions, especially as sales of the Echo exploded in December. And it’s why, when I noted pressing technology issues and trends likely to overflow into 2017 in last week’s column, I held over microphone-enabled devices for a broader consideration this week.
Analytics company VoiceLabs says 2016 sales of the Echo, and Google’s similar Home device, grew exponentially, with Echo up at least 400 per cent over 2015. Google Home users quadrupled in just two days over Christmas. Plus, 2016 saw 1,500 per cent growth in third-party developers for both ‘virtual assistants’.
Those figures indicate consumers have reached a tipping point for such technology. No longer creepy, but cool.
How benign is that microphone, then? Given obvious (but always downplayed) security risks, how benign can any listening technology ever be? What is it listening to, and who hears those interactions (intended or unintended)? How secure are the growing range of mic-activated devices, including children’s toys (Mattel this week announced a children’s version of an Echo-like device), TVs, gaming boxes, home hubs, vehicles and our mobile and computing devices?
Well, not very benign, actually. Yes, you get to be your own personal Captain Kirk talking to a seemingly semi-sentient gizmo. But frankly, you have no idea at all what is being heard and transmitted to any of a number of separate entities: the maker of the device, the provider of a service, a rogue employee, the swelling universe of third-party developers (think app privacy and security issues, but with added live recording).
Not to mention hackers who might figure, why bother with PC ransomware when a hacked mic could be much more … compelling?
“Anything can be hacked,” says security expert and Europol advisor Brian Honan of BH Consulting in Dublin, of the devices. “It’s a matter of time and motive.” Plus, only some devices even send voice data securely, he notes.
And then there’s data gathering and surveillance by state agencies. The NSA, for example, can secretly control the mic of a mobile, even when the user has turned the device off.
Right now, there’s scant recognition of potential problems, no coordinated effort to securely manage either the devices or the incoming tsunami of third-party applications, and minimal industry or regulatory response beyond stating (as Amazon has) that devices have privacy and security built in by design (again, see Honan above).
The only (brief) guidance document around seems to be the Voice Privacy Guiding Principles published last year by ALTA Associates’ Executive Women’s Forum.
The Future of Privacy Forum also has produced a useful overview. So, prepare for the inevitable high profile security and privacy breaches.