The biggest tech news last week wasn’t what it should have been. Equifax got hacked. Ho hum. Another hack of tens of millions of customer or user details. Yawn. Only 143 million people? Yahoo! had breaches affecting about 1 billion and another 500 million accounts, an adult website saw 400 million accounts compromised, and eBay lost sensitive data for 145 million users.
The breach Equifax reported Thursday, however, may well be the most severe of all, for a simple reason: the breathtaking amount of highly sensitive data it handed over to criminals. By providing full names, Social Security numbers, birth dates, addresses, and, in some cases, driver’s license numbers, it provided most of the information banks, insurance companies, and other businesses use to confirm consumers are who they claim to be. The theft, by criminals who exploited a security flaw on the Equifax website, opens the troubling prospect that the data is now in the hands of hostile governments, criminal gangs, or both, and will remain so indefinitely.
Hacking accounts has become so common that news outlets didn’t bother to report the differences between Yahoo!’s many breaches and the Equifax breach. Are we becoming immune to such intrusions?
Wait. It gets worse. Siri got hacked.
Mark Wilson explains:
Chinese researchers have discovered a terrifying vulnerability in voice assistants from Apple, Google, Amazon, Microsoft, Samsung, and Huawei. It affects every iPhone and MacBook running Siri, any Galaxy phone, any PC running Windows 10, and even Amazon’s Alexa assistant.
What? How can you hack Siri? I mean, Siri just sits there waiting for a verbal command, audio from your mouth, before jumping into action. Well, it seems Siri can hear sounds we cannot, and that makes it easy for someone to compromise Siri’s functionality without us hearing a word.
A team from Zhejiang University translated typical vocal commands into ultrasonic frequencies that are too high for the human ear to hear, but perfectly decipherable by the microphones and software powering our always-on voice assistants. This relatively simple translation process lets them take control of gadgets with just a few words uttered in frequencies none of us can hear.
Uh oh. That means sounds we cannot hear can be heard by the iPhone’s microphone, by Siri, and that can make the device do whatever hackers and criminals and government spooks want Siri to do.
The microphones and software that power voice assistants like Siri, Alexa, and Google Home can pick up inaudible frequencies, specifically frequencies above the 20 kHz limit of human ears.
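The modulation trick behind the attack is easy to sketch. The snippet below is a hypothetical illustration (not the researchers’ actual code): it amplitude-modulates a stand-in “voice command” tone onto a 25 kHz carrier, so every part of the resulting signal sits above the range of human hearing, while a microphone’s nonlinearity can demodulate the envelope back into an audible command.

```python
import math

SAMPLE_RATE = 192_000  # Hz; ultrasonic signals need a high sample rate
CARRIER_HZ = 25_000    # above the ~20 kHz ceiling of human hearing
COMMAND_HZ = 400       # stand-in for a recorded voice command

# Amplitude-modulate the "command" onto the ultrasonic carrier for one second.
# The envelope (1 + 0.8 * command) stays positive, so the signal's sign, and
# therefore its zero crossings, follow the inaudible carrier alone.
samples = [
    (1 + 0.8 * math.sin(2 * math.pi * COMMAND_HZ * n / SAMPLE_RATE))
    * math.sin(2 * math.pi * CARRIER_HZ * n / SAMPLE_RATE)
    for n in range(SAMPLE_RATE)
]

# Estimate the dominant frequency by counting positive-going zero crossings.
crossings = sum(1 for a, b in zip(samples, samples[1:]) if a < 0 <= b)
print(crossings)  # ≈ 25,000 per second, i.e. all energy near the carrier
```

A real speaker playing this waveform would sound silent to a bystander, yet the assistant’s microphone recovers the 400 Hz envelope, which is why the attack works without anyone hearing a word.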
The good news is that this so-called DolphinAttack only works at close range: the inaudible commands must reach the device’s microphone loudly enough to be picked up (the exploit works on many other devices, too; nearly any popular device with voice command capability). In other words, the hacker needs to be sitting next to you.
It’s one thing to have our personal information stolen online, but it’s something else again to have our robotic overlords obey commands we cannot even hear. It seems to me this would be an easy fix: Apple could have Siri learn our particular voice and speech patterns, which would keep outside, inaudible commands from interfering with Siri’s desire to serve mankind rather than hackers.