AI Voice Cloning New Weapon of Cyber Criminals; 83% Indians Report Financial Loss: McAfee Report
Over 80% of Indians share their voice online at least once a week through social media, voice notes and other recordings, so it is clear where cybercriminals can source the data needed to create a fake voice

Until now, Indian telecom customers mainly faced spam calls and messages; now, AI-powered voice-cloning scams are emerging in the digital era.

The Telecom Regulatory Authority of India (TRAI) implemented new guidelines on May 1 requiring operators to use AI in their call and message services to curb unsolicited spam calls and messages.

But the latest AI scam poses an even greater danger, as researchers say cloning someone’s voice is now a major tool in the armoury of cybercriminals.

According to a report by global cybersecurity company McAfee Corp., nearly half of Indian adults (47%) have experienced or know someone who has experienced an AI voice scam, which is nearly double the global average.

The report, titled ‘The Artificial Imposter’, stated that 83% of Indian victims reported a financial loss, with 48% losing more than Rs 50,000.

This report was released after conducting a worldwide survey involving 7,054 people from seven countries, including India. The survey demonstrated how AI technology is fuelling an increase in online voice scams, with only three seconds of audio required to clone a person’s voice.

As per the findings, 69% of Indians believe they cannot tell, or are unsure how to tell, the difference between an AI voice and a human voice. The same security company reported something similar earlier: Indians were using ChatGPT to write love letters, and 78% of Indians found it difficult to tell whether a love letter was written by a human or by an AI tool.

Among other key findings, the recent report highlighted that over half of Indian respondents said they would respond to a phone call or voice note claiming to be from a friend or loved one in need of money, especially if they believed the request came from their parent (46%), partner or spouse (34%), or child (12%).

It was also found that messages alleging that the sender had been robbed (70%), been in a car accident (69%), misplaced their phone or wallet (65%), or needed assistance while travelling abroad (62%) were the most likely to generate a response.

Modus Operandi

Over 80% of Indians share their voice online at least once a week through social media, voice notes and other recordings. It is therefore quite evident where cybercriminals will get the data to create a fake voice.

According to McAfee’s research, scammers are employing AI technology to clone voices and then send a bogus voicemail or voice note, or even phone the victim’s contacts directly, pretending to be in danger.

The team spent weeks investigating the matter and found more than a dozen freely available AI voice-cloning tools. Even News18, with a simple Google search, found demo clips of cloned voices of Twitter boss Elon Musk, former US president Barack Obama, actor Tom Hanks and even late US president John F. Kennedy.

McAfee also said there are both free and paid tools available online. The team noted: “In one instance, just three seconds of audio was enough to produce an 85% match (Voice match accuracy levels indicated are based on the benchmarking and assessment of McAfee security researchers), but with more investment and effort it’s possible to increase the accuracy.”

According to the team, by further training the data models, the researchers were able to achieve a 95% voice match based on just a small number of video files.

The team also noticed that the tools had no issue mimicking accents from around the world, whether from the US, UK, India, or Australia, but that more distinctive voices were more difficult to replicate.

For example, cloning a person’s voice with an uncommon cadence, rhythm, or style requires more work, and such people are less likely to be targeted as a result. But the more accurate the clone becomes, the better the chances of scamming someone for money. With such technology, a scammer may net thousands of dollars in a matter of hours.

However, the research team’s overarching conclusion was that artificial intelligence has already changed the game for cybercriminals. The barrier to entry has never been lower, making it easier to perpetrate cybercrime.

