Many people use smart devices in their homes without realizing that those devices can listen all the time, even when we don’t want them to. To reduce that risk, scientists have been developing ways to make everyday device use safer. In particular, they have focused on thwarting automated speech recognition (ASR), creating a way to mask your voice so that it is harder for a device to make out what you say as you go about your daily life.
Automated speech recognition poses a threat because it can transcribe what you say, exposing your personal information and private conversations. To address this, Mia Chiquier, who studies computer science at Columbia University, created a program that confuses ASR by scrambling how your speech sounds to a device, making it struggle to pick out your words. The program works by playing quiet sound waves in the background: it predicts what you might say next and transmits quiet sounds tuned to those predicted words, disrupting the device’s interpretation of your speech.
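To picture the predict-then-mask idea, here is a toy sketch in Python. The function names and the simple “prediction” are invented placeholders for illustration only; this is not the team’s actual code, just the general loop described above.

```python
# Toy sketch (not the researchers' code) of the predict-then-mask idea:
# listen to a short chunk of speech, guess what comes next, and add a quiet
# interfering sound timed to overlap with that guess.

import numpy as np

SAMPLE_RATE = 16_000          # samples per second, a common rate for speech
CHUNK = SAMPLE_RATE // 2      # work in half-second chunks

def predict_next_chunk(recent_audio: np.ndarray) -> np.ndarray:
    """Placeholder 'prediction': assume the next half second sounds roughly
    like the last one. The real system uses a learned model instead."""
    return recent_audio[-CHUNK:]

def make_quiet_masker(predicted: np.ndarray, level: float = 0.05) -> np.ndarray:
    """Build a low-volume interfering signal matched to the predicted speech.
    Here it is just quiet noise scaled to the prediction's loudness."""
    loudness = np.sqrt(np.mean(predicted ** 2)) + 1e-8
    return level * loudness * np.random.randn(len(predicted))

def camouflage_stream(speech: np.ndarray) -> np.ndarray:
    """Step through the speech chunk by chunk, adding a quiet masker that was
    generated from a prediction of each upcoming chunk."""
    out = speech.copy()
    for start in range(CHUNK, len(speech) - CHUNK, CHUNK):
        predicted = predict_next_chunk(out[:start])
        out[start:start + CHUNK] += make_quiet_masker(predicted)
    return out

# Example: run the loop on two seconds of fake "speech" (random samples).
fake_speech = np.random.randn(2 * SAMPLE_RATE)
masked = camouflage_stream(fake_speech)
print(masked.shape)  # same length as the input, now with quiet interference
```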
Scientists want to protect your speech because smart-device manufacturers run algorithms that analyze what your conversations and messages are about. They can sell this information to merchants, who then advertise products you will probably like. Even scarier, smart-device companies could sell confidential information, such as bank account numbers, medical records, and passwords. Before Chiquier developed this method of masking conversations, people would simply play loud white noise over their conversations. That trick was largely ineffective, because smart devices can still pick up as much as 80 percent of the words you say. With the new voice-camouflage algorithm, smart-device users can feel much more secure.
To check that the program would work in the real world, Chiquier’s team tested it in a realistic setup. In the experiment, someone spoke in a room with some background noise while a device tried to recognize the words. The team then repeated the conversation with high levels of white noise playing over it, and again with the voice-masking algorithm running. The algorithm hid far more of the speech than the white-noise method did.
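One simple way to imagine scoring such a test is to compare what was actually said against what the device transcribed under each condition and count how many words got through. The sketch below is my own illustration, with made-up transcripts; it is not the team’s evaluation code and the numbers it prints are not the study’s results.

```python
# Illustration only: score how many spoken words a device still "caught"
# under different masking conditions, using invented example transcripts.

def words_caught(spoken: str, transcribed: str) -> float:
    """Fraction of spoken words that appear in the device's transcript."""
    spoken_words = spoken.lower().split()
    heard = set(transcribed.lower().split())
    caught = sum(1 for word in spoken_words if word in heard)
    return caught / len(spoken_words)

# Hypothetical transcripts of the same sentence under each condition.
spoken = "please transfer money to my savings account tomorrow morning"
conditions = {
    "no protection": "please transfer money to my savings account tomorrow morning",
    "loud white noise": "please transfer money to savings account morning",
    "voice camouflage": "peas for money to my saving",
}

for name, transcript in conditions.items():
    print(f"{name}: {words_caught(spoken, transcript):.0%} of words caught")
```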
Overall, you should be careful about what you discuss around a smart device, because it might just be listening in. Researchers are helping people find safer ways to go about their day without worrying that their privacy is being invaded. The data suggest these projects are going well, and a safer future for smart devices is near.