How would you feel about having a device in your home that’s always listening to what’s going on, standing ready to record, process and store any information it receives? That might be a somewhat alarmist way of putting it, but it’s essentially what smart home speakers do.
Smart speakers offer audio playback but also feature internet connectivity and often a digital assistant, which dramatically expands their functionality.
With today’s smart speakers, you can search the internet, control other home automation devices, shop online, send text messages, schedule alarms and more.
This connectivity and functionality may offer us convenience, but as with any new connected technology, these speakers also come with security concerns. Any time you add a node to your network, you open yourself up to more potential vulnerabilities. Since smart home tech is still relatively new, it’s also bound to have bugs.
Although smart home companies work to fix these flaws as quickly as possible and want to ensure their devices are secure, there’s still always a chance you’ll run into security issues. Here are six potential risks you should be aware of.
Smart speakers’ microphones are continuously on, but the devices don’t record or process anything they hear unless they detect their activation phrase first. For Google Home, this phrase is “OK, Google”; for an Amazon speaker, it’s “Alexa.”
There are several problems with this approach. The first is that the technology isn’t perfect yet, and it’s entirely possible that the device will mishear another phrase as its wake-up phrase. For example, an Oregon couple recently discovered that their Amazon Echo speaker had been recording them without their knowledge. Amazon blamed the mistake on the device mishearing something in a background conversation as “Alexa.”
Unfortunately, these misunderstandings can extend beyond activation. After the Oregon family’s Echo recorded their conversation, it sent the recording to a random person on their contact list. The couple only learned of the incident when the recipient contacted them about it.
Amazon offered the same explanation for this part of the event. According to the company, the speaker misheard the background conversation as a whole string of commands, which resulted in it sending the discussion to the couple’s acquaintance. This situation suggests that these speakers’ listening skills might not be as advanced as they need to be to function properly.
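The mishearing problem above comes down to thresholds: the device triggers whenever the audio scores “close enough” to the wake word. The sketch below illustrates the failure mode with a toy text-similarity score standing in for the acoustic model a real keyword spotter uses; the wake word and threshold values are assumptions for illustration, not Amazon’s actual parameters.

```python
# Toy sketch of threshold-based wake-word matching. Real keyword spotters
# score acoustic features with a trained model; here, character bigrams
# stand in for that score, purely to show how a false accept happens.

def bigrams(text):
    """Set of adjacent character pairs, ignoring case and spaces."""
    s = text.lower().replace(" ", "")
    return {s[i:i + 2] for i in range(len(s) - 1)}

def similarity(a, b):
    """Dice coefficient over bigrams: 1.0 = identical, 0.0 = no overlap."""
    ba, bb = bigrams(a), bigrams(b)
    return 2 * len(ba & bb) / (len(ba) + len(bb))

WAKE_WORD = "Alexa"
THRESHOLD = 0.5  # hypothetical trigger threshold

def is_wake_word(heard):
    return similarity(heard, WAKE_WORD) >= THRESHOLD

print(is_wake_word("Alexa"))   # exact match triggers: True
print(is_wake_word("Alexis"))  # similar-sounding name also triggers: True
print(is_wake_word("hello"))   # unrelated word does not: False
```

Lowering the threshold makes the device more responsive but also more prone to waking on similar-sounding speech, which is the trade-off behind incidents like the one above.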
Smart speakers may misunderstand cues and unexpectedly wake up, but people could also purposefully wake them without your permission. Once they do so, they could potentially gain access to some of your information.
Burger King demonstrated this vulnerability when it ran an ad that purposely activated Google Home speakers and prompted them to read off a description of the Whopper burger. Google reacted quickly and prevented the devices from responding. Burger King fired back by altering the ad so that it triggered the speakers again.
While this prank might be relatively harmless, people could also potentially activate your speaker without your permission, even by yelling through your front door or an open window. Because of this vulnerability, you should avoid using a smart speaker for things like unlocking your front door. You can also change the wake word and set up pins for specific features.
Thus far, most of the reports of problems with smart speakers have revolved around unauthorized access or faulty functionality. The devices are also vulnerable to malicious hacks, however.
Security researchers have already discovered various vulnerabilities, enabling companies to fix them. Hackers may, at some point, find such flaws first, however. If they do, they may be able to access sensitive personal information.
To protect yourself from becoming a victim of a hacking incident, use hardware and software only from companies you trust. Also, use secure passwords, and change them often wherever you can.
Using smart speakers could also increase your vulnerability to voice hacks, a subset of identity theft in which someone obtains an audio recording of your voice and uses it to trick authentication systems into thinking they’re you. This hack is a potential way to get around smart speakers’ voice recognition capabilities.
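Why does a replayed recording fool voice recognition? Naive systems compare a voiceprint of the incoming audio against an enrolled one and accept anything that scores above a threshold. The sketch below uses made-up embedding vectors and a made-up threshold to show that a replayed recording scores exactly like the genuine utterance it copies; real systems use learned speaker embeddings, but the comparison step looks similar unless liveness detection is added.

```python
# Minimal sketch of why replayed audio defeats naive voice matching.
# The vectors and threshold here are hypothetical illustrations, not
# values from any real speaker-verification system.
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

ENROLLED = [0.9, 0.1, 0.4]   # hypothetical voiceprint of the real user
MATCH_THRESHOLD = 0.95       # hypothetical acceptance threshold

def verify(sample):
    """Accept the speaker if their embedding is close to the enrolled one."""
    return cosine(sample, ENROLLED) >= MATCH_THRESHOLD

live = [0.88, 0.12, 0.41]    # a genuine utterance scores close and passes
replay = list(live)          # a stolen recording reproduces the same signal
impostor = [0.1, 0.9, 0.2]   # a different voice falls below the threshold
```

Because `replay` is bit-for-bit the same signal as `live`, no similarity threshold can separate them; defeating replay attacks requires checks beyond voice matching, such as liveness detection or secondary authentication.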
Smart home speakers provide a potential goldmine of audio recordings that someone could use for voice hacking. If a bad actor manages to hack into the speaker or the cloud service where your recordings are stored, they could use them to break into various accounts of yours.
The fact that some cloud service is storing these recordings may make users uncomfortable. These recordings may be used to personalize your experience, improve the smart assistant’s effectiveness, serve you ads or do a range of other things.
Luckily, you can delete these recordings if you’d like through your account settings. In addition, advocates have called for more transparency about how these companies use customer data.
Be Smart About Using Smart Speakers
All smart technology comes with security risks. That doesn’t necessarily mean we shouldn’t use it, but it does mean we should be careful about how we use it and take appropriate security measures.
If you choose to get a smart speaker, take the time to set up your security settings, and allow access only to people and companies you trust.
(Security Affairs – Smart Speakers, IoT)