According to experts, the ability of Alexa to imitate human voices is a potential security risk.


At a conference Amazon held on Wednesday in Las Vegas, Senior Vice President Rohit Prasad said the feature would enable Alexa to imitate any voice after hearing less than one minute of audio.

Prasad said the objective is to “make the memories last,” noting that “so many of us have lost someone we love” as a result of the pandemic.

Despite Amazon’s marketing of the technology as a sentimental tool, users on Twitter described it as “creepy,” and the security implications are harder to dismiss.

According to Jake Moore, Global Cybersecurity Advisor at ESET, “our voices are often used as a password to authenticate certain accounts.”

The ability to mimic a particular voice from just 60 seconds of audio could therefore have serious security implications.
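To see why, consider how voice authentication is commonly implemented: the caller’s audio is reduced to a numeric voiceprint and compared against the one enrolled for the account. The sketch below is purely illustrative, with made-up embeddings and a hypothetical 0.85 threshold rather than any vendor’s real system, but it shows why a convincing clone of someone’s voice could pass the same check as the real speaker.

```python
# Hypothetical sketch of how a voice-authentication check might work:
# compare an embedding of the caller's audio to the customer's enrolled
# voiceprint and accept the caller if the similarity clears a threshold.
# The embeddings and the 0.85 threshold are illustrative assumptions,
# not any vendor's real system.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two voice-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify_caller(enrolled_voiceprint: np.ndarray,
                  caller_embedding: np.ndarray,
                  threshold: float = 0.85) -> bool:
    """Accept the caller if their voice embedding is close enough to the
    enrolled voiceprint. A good enough clone of the customer's voice
    produces a similar embedding and passes the same check."""
    return cosine_similarity(enrolled_voiceprint, caller_embedding) >= threshold

# Toy demonstration with random vectors standing in for real embeddings.
rng = np.random.default_rng(0)
enrolled = rng.normal(size=192)
cloned = enrolled + rng.normal(scale=0.1, size=192)  # a close imitation
print(verify_caller(enrolled, cloned))  # likely True: the clone passes
```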

Amazon is not the first company to experiment with voice synthesis and artificial intelligence. Microsoft recently restricted which businesses could use its software to parrot voices, citing the dangers posed by deepfakes.

“Deep fake audio attacks are already happening against businesses, but they are typically created by powerful computers utilizing a lot of data input,” Moore warned.

Moore explained that when tech giants add gimmicky features for the masses, it raises the threat level for many more people.

Giving AI assistants the voices of real people could be a case of technology advancing ahead of security measures, putting people in danger.

If individual voices really can be imitated this quickly and easily, some serious incidents may be on the horizon.

Moore believes businesses should ask why a technology is needed before building it, rather than developing it simply for its own sake.

HSBC reported last year that fraudulent activity in its telephone banking operations fell by fifty percent after it introduced a biometric security system that authenticates customers by their voices.

Now imagine Alexa being able to imitate those voices; the potential security nightmare is not difficult to picture.

If this technology becomes widely available, experts believe it would be prudent to move away from authenticating your bank accounts with your voice and use another verification method instead, such as online banking via your smartphone.
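For illustration only, here is a minimal sketch of the kind of app-based one-time-code check that can replace a voiceprint, using the pyotp library; the flow is simplified and not any bank’s actual implementation.

```python
# Illustrative sketch of an app-based alternative to voice authentication:
# a time-based one-time password (TOTP) checked on the bank's side.
# Uses the pyotp library; this is a simplified example, not any bank's
# actual implementation.
import pyotp

# At enrolment, the bank generates a secret and provisions it into the
# customer's authenticator app (typically via a QR code).
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# Later, the customer reads the current 6-digit code from their app...
code_from_customer = totp.now()

# ...and the bank verifies it. Unlike a voiceprint, the code rotates every
# 30 seconds and cannot be reproduced by imitating someone's voice.
print("Verified:", totp.verify(code_from_customer))
```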
