AI assistants can be furtively oversmart

R. Venkatesh, a roving consultant from Delhi, has suddenly become acutely aware of the “listening powers” of his smartphone. Earlier this month, he got together with a couple of college friends over drinks. Among other things, they reminisced about their large cassette collections languishing at the back of some forgotten cupboard.
One of his friends mentioned hearing about a device that converts tapes to MP3 format. The next day, Venkatesh was “spooked out” when advertisements for tape-to-MP3 conversion software began appearing in his social media feed. He distinctly recalls not searching for any such product online.
What Venkatesh did not know is that his phone can eavesdrop on his conversations. Virtual assistants such as Apple’s Siri, Google Assistant, Microsoft’s Cortana and Samsung’s Bixby reside in smartphones, tablets and laptops, and even in stationary devices such as Amazon Echo and Google Home smart speakers.
People typically use artificial intelligence (AI)-powered virtual assistants and voice AI-enabled devices, including smart speakers, to turn music on and off, check weather forecasts, adjust the room temperature and order goods online, among other things.
These AI assistants convert voice commands to text, much as a computer does when you type keywords or sentences, and then interpret what the user means using natural language processing technology. They get better as you provide more voice input (read: more data on which the algorithm trains itself), which also helps them understand your accent.
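In miniature, the pipeline works in two stages: speech is transcribed to text, and the text is mapped to an intent the device can act on. The toy sketch below assumes the transcription step has already happened and uses simple keyword rules in place of a real natural language model; the intent names and rules are illustrative, not any vendor’s actual implementation.

```python
# Toy sketch of the "transcribe, then interpret" pipeline.
# Real assistants use large speech-to-text and NLP models; here the
# transcript is given and intent matching is a handful of keyword rules.

def interpret(transcript: str) -> dict:
    """Map a transcribed command to an intent and slots (illustrative only)."""
    text = transcript.lower()
    if "weather" in text:
        return {"intent": "get_weather"}
    if "play" in text:
        # Everything after "play" is treated as the song/artist slot.
        return {"intent": "play_music", "query": text.split("play", 1)[1].strip()}
    if "temperature" in text or "thermostat" in text:
        return {"intent": "set_temperature"}
    return {"intent": "unknown"}

print(interpret("Play some jazz"))       # {'intent': 'play_music', 'query': 'some jazz'}
print(interpret("What's the weather?"))  # {'intent': 'get_weather'}
```

More voice data improves both stages in a real system: the speech model adapts to accents, and the language model learns to resolve more phrasings to the right intent.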
However, these smart assistants can do much more. According to a 10 May report in The New York Times (NYT), researchers can now send Siri, Alexa and Assistant secret audio instructions that humans cannot hear. According to the report, researchers at the University of California, Berkeley said they could embed commands directly into recordings of music or spoken text. “So while a human listener hears someone talking or an orchestra playing, Amazon’s Echo speaker might hear an instruction to add something to your shopping list,” said the NYT report.
Further, Amazon.com Inc. and Google Inc. have filed separate patent applications for a number of technologies that could potentially jeopardize the privacy of consumers.
These include a system for deriving sentiments and behaviours from ambient speech, even when a user has not addressed the device with its “wake word”; multiple systems to identify speakers in a conversation and build profiles from these; a system to recommend products based on furnishings observed by a smart home security camera; a methodology for “inferring child mischief” using audio and movement sensors; and systems to insert paid content, according to a December 2017 report by Consumer Watchdog.
“While Amazon and Google may be focused on the commercial implications of their inventions, these technologies also have troubling legal and ethical implications,” notes the report.
Moreover, researchers caution that even headphones and earphones, which are physically similar to microphones, can be repurposed by hackers to record audio. Further, software can turn a PC, even one without microphones, into an eavesdropping device, say researchers from the Ben-Gurion University of the Negev Cyber Security Research Centre.
Besides using antivirus programmes to reduce the risk of security breaches and data leaks, online security firm Kaspersky recommends that users also turn off the microphone on Amazon Echo and Google Home speakers. “There’s a button. It’s not a particularly convenient way to ensure privacy—you will always have to remember to neutralize the assistant—but at least it’s something,” says a 28 February 2017 blog post by the firm.
Kaspersky further suggests the use of Echo’s account settings to prohibit or password-protect purchasing. While these may not be foolproof measures, they may provide a start to users like Venkatesh.
