If you’ve ever been worried about the security or privacy credentials of your Alexa-connected devices, there’s some new data that may have you unplugging them immediately. Researchers from Germany’s Ruhr-Universität Bochum, North Carolina State University, and Google are putting forth new vulnerability findings based on an analysis of 90,194 unique Alexa Skills from seven Skill stores across the globe. The issues stem from Amazon’s minimal safeguards, which are fairly easy to circumvent for anyone so inclined.
Amazon Skills: Into the Skillverse — One of the issues researchers came across involves hearing double. When you enable Skills by voice, Amazon often prefers native or first-party Skills, but there’s nothing stopping other developers from using the same or similar invocation phrases.
You can accidentally enable direct duplicates or near-homophones, the latter of which is known as skill squatting and has concerned security researchers for years. The duplicates pose the more imminent threat: while the researchers confirmed skill squatting is possible, they didn’t uncover any instances of it in the wild.
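To see why similar-sounding invocation names are a problem, here’s a toy sketch (not the researchers’ actual tooling) that flags invocation phrases likely to collide when spoken aloud. It uses Python’s standard `difflib` string similarity as a crude stand-in for real phonetic matching; the phrases and threshold are hypothetical:

```python
from difflib import SequenceMatcher

def likely_collision(phrase_a: str, phrase_b: str, threshold: float = 0.7) -> bool:
    """Crude check: could two Skill invocation phrases be confused
    when spoken? Real skill-squatting analyses use phonetic models;
    plain string similarity is only a rough illustration."""
    a, b = phrase_a.lower().strip(), phrase_b.lower().strip()
    if a == b:  # exact duplicate invocation name: always a collision
        return True
    return SequenceMatcher(None, a, b).ratio() >= threshold

# Hypothetical invocation names, not real Skills:
print(likely_collision("cat facts", "cat facts"))     # exact duplicate
print(likely_collision("cat facts", "cat fax"))       # squatting-style near-homophone
print(likely_collision("cat facts", "stock prices"))  # unrelated phrase
```

An exact duplicate trips the check immediately, which mirrors the researchers’ point that duplicates are the more practical attack: no phonetic trickery is needed at all.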
Skill squatting relies on words being mispronounced or misheard, but malicious developers have another duplication trick up their sleeves: it’s easy to publish Skills under a trusted developer’s name. Even though such a Skill won’t link to anything official and encrypted data isn’t vulnerable to this exploit, the familiar name can be enough to ensnare users into revealing data like their location.
Hidden intentions — Amazon does run a certification process for new Skills, but researchers found that simple changes to a Skill’s backend code after approval can expose sensitive data. A Skill might look above board at review time, yet later be altered to coax users into supplying personal data like phone numbers.
Privacy? Never heard of her — Of all the Skills assessed, only 24.2 percent had a privacy policy; in the U.S. Skill store, that figure rises only to 28.5 percent. The U.S. store was especially short on privacy policies for Skills in the “kids” category, putting Amazon at odds with children’s online privacy laws.
Even among the Skills that do have privacy policies, fewer than a quarter explain what data their permissions cover. And the permissions themselves offer little protection: researchers found that developers can bypass them entirely, simply asking users to say aloud the very data the permission system is meant to guard.
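The bypass is conceptually simple: instead of requesting, say, a phone number through the permission API (which forces a consent prompt), a Skill can just ask the user to speak it. A toy sketch of the two paths, with hypothetical function and field names standing in for the real Alexa APIs:

```python
import re

def request_via_permission_api(granted: set, field: str) -> str:
    """Sanctioned path (toy model): data is released only if the user
    explicitly granted the matching permission in the Alexa app."""
    if field not in granted:
        raise PermissionError(f"user never granted '{field}'")
    return "<value from the customer profile>"

def request_via_voice(user_reply: str):
    """Bypass path: the Skill simply asks the user to say the number
    aloud and parses it from the utterance -- no permission involved."""
    match = re.search(r"\d[\d\- ]{6,}\d", user_reply)
    return match.group(0) if match else None

# The sanctioned path fails without consent...
try:
    request_via_permission_api(granted=set(), field="mobile_number")
except PermissionError as err:
    print(err)

# ...but the verbal path needs no consent at all.
number = request_via_voice("sure, it's 919-555-0143")
print(number)
```

The permission prompt only guards the profile lookup; nothing stops a Skill from collecting the same data conversationally, which is the loophole the researchers flagged.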
Amazon has reportedly confirmed some of these issues and is working on resolving them. For now, though, it may pay to audit which Skills you’ve enabled.