Imagine getting a call, email, or SMS from the government demanding money immediately. You have no reason to doubt the request: the wording is precise and professional, and it includes details specific to you. Most consumers recognize this scam because it has become so common.
Now imagine answering the phone and hearing a loved one’s distinctive voice urgently asking for money or your account information. It sounds like science fiction, but thanks to the rapid advancement of AI tools, it is fast becoming reality.
Impersonation attacks are on the rise
According to the Southern African Fraud Prevention Service (SAFPS), impersonation attacks climbed by 264% in the first five months of the year compared with 2021. And just this week, South Africans learned that Dr. Mmereka Patience Martha Ntshani was seeking legal advice over the alleged theft of her identity by Dr. Nandipha Magudumana in the infamous “Facebook rapist” case.
Gur Geva, the founder and CEO of iiDENTIFii (www.iiDENTIFii.com), says the technology needed to impersonate a person has become more affordable, user-friendly, and accessible. As a result, it is now easier than ever for a criminal to pretend to be someone else.
The threat of voice-cloning attacks is rising
The Federal Trade Commission of the United States warned consumers this week to be on the lookout for calls from con artists who closely mimic their loved ones’ voices. To stage such an attack, all a criminal needs is a brief audio clip of a family member’s voice, frequently taken from social media.
This technology has enormous potential. Microsoft, for instance, has tested an AI tool that can produce audio in a variety of languages from a brief sample of a person’s speech. Although it has not been released for general use, it demonstrates how convincingly a voice can now be synthesized.
Highlighting the flaws in voice biometrics
“Voice has traditionally been regarded as an essential and unfailing component of a person’s identity. Because of this, numerous companies and financial institutions included it in their arsenal of identity verification tools,” says Geva.
Voice-based account access, which enables users to give account instructions by voice command, is an appealing security option for financial services firms around the world. Real-time authentication with voice biometrics eliminates the need for security questions or even PINs. For example, Barclays incorporated Siri to enable mobile banking payments without opening or logging into a banking app, and Visa worked with Abu Dhabi Islamic Bank to develop voice-based biometric verification for e-commerce using standard smartphone sensors.
“Financial institutions need to be mindful of the prospect of widespread fraud via voice-based interfaces as voice-cloning emerges as a real concern. For instance, a con artist could mimic a customer’s voice and conduct business on their behalf,” says Geva.
The popularity of voice cloning serves as a prime example of the value of complex, multi-layered biometric authentication procedures. Geva continues, “At iiDENTIFii, our experience, research, and global perspective have inspired us to develop a remote biometric digital verification technology that can authenticate a person in less than 30 seconds but, more importantly, triangulates the person’s identity with their verified documentation and their liveness.”
To defend against deepfake attacks and impersonation, iiDENTIFii combines biometrics with liveness detection. Even voice recognition paired with motion detection is no longer sufficient to verify that you are speaking with a real person. If liveness detection is weak, synthetic fraudsters can spoof the authentication process by combining voice cloning with images or videos.
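The principle behind such a multi-layered check can be illustrated with a short sketch. The class names, factors, and thresholds below are invented for illustration and do not describe iiDENTIFii’s actual system; the point is only that the factors are combined with a logical AND, so spoofing any single layer (for example, a cloned voice) is not enough to pass.

```python
# Hypothetical sketch of multi-layered identity verification.
# All names and threshold values are illustrative assumptions,
# not a description of any vendor's real product.
from dataclasses import dataclass


@dataclass
class VerificationResult:
    document_match: float  # similarity of selfie to ID-document photo, 0..1
    liveness_score: float  # confidence the subject is a live person, 0..1
    voice_match: float     # voice-biometric similarity, 0..1


def authenticate(result: VerificationResult,
                 doc_threshold: float = 0.9,
                 liveness_threshold: float = 0.95,
                 voice_threshold: float = 0.9) -> bool:
    """Accept only if every independent factor clears its threshold.

    Because the factors are ANDed together, a single strong factor
    (e.g. a near-perfect cloned voice) cannot defeat the check on
    its own: document and liveness evidence must also pass.
    """
    return (result.document_match >= doc_threshold
            and result.liveness_score >= liveness_threshold
            and result.voice_match >= voice_threshold)


# A cloned voice alone fails: liveness and documentation do not hold up.
spoof = VerificationResult(document_match=0.3, liveness_score=0.1, voice_match=0.99)
# A genuine user passes all three layers.
genuine = VerificationResult(document_match=0.97, liveness_score=0.99, voice_match=0.95)
```

In this toy model, `authenticate(spoof)` is rejected while `authenticate(genuine)` is accepted, which is the essence of triangulating identity across independent signals rather than trusting any one of them.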
Geva concludes, “While identity theft is expanding in scope and sophistication, the technologies we have available to stop fraud are smart, scalable, and capable of meeting the challenge.”