The Menace of AI Voice Scams: How Can You Protect Yourself?
For years, scammers have run a widespread phone scheme in which they call individuals while posing as authority figures.
For example, people have fallen victim to swindlers posing as law enforcement officers who urgently demand payment to secure the release of a friend or family member.
Unfortunately, such scammers are only getting smarter. Federal regulators now warn that scammers can place calls that imitate the voice of a friend or family member, using artificial intelligence (AI) tools and a cloned version of the voice to manipulate unsuspecting victims.
As a result, the Federal Trade Commission recently released a consumer advisory urging individuals to be on the lookout for phone calls that employ artificially generated voice clones. AI voice clones represent one of the criminals' latest techniques to defraud and extort money from unsuspecting people.
These scams have proliferated because artificial intelligence is no longer science fiction; it is a tangible part of everyday life. Scammers now exploit the technology to duplicate the voices of your loved ones.
So, where are they getting the audio snippets? A short clip of your family member's voice, easily obtainable from social media platforms, can be fed into voice-cloning software to replicate the voice convincingly. Consequently, when the scammer reaches out to you, their voice will be virtually indistinguishable from that of your family member.
Experts such as Steve Weisman, a scam specialist and law lecturer at Bentley University, confirm that scammers can extract audio recordings of individuals from videos shared on widely used online platforms. In addition, Pennsylvania’s Haverford Township Police Department issued a press release stating that scammers who employ voice cloning also fabricate counterfeit voicemails from trusted sources.
AI voice clone scammers may even spoof the caller ID so the call appears to originate from a trustworthy entity, further strengthening the deception.
AI voice clones are on the rise
Imposter scams have existed for decades, but federal officials warn they are growing more sophisticated. Cheap, widely available AI programs have let scammers run voice scams at greater scale and speed.
Criminals use AI programs costing as little as $5 a month to clone voices with near-perfect accuracy. The low cost, wide availability, and the fact that scammers need less than thirty seconds of audio to clone a voice have caused AI voice scams to increase sharply.
A 2023 McAfee study revealed that scammers are using artificial intelligence to clone voices and then send fabricated voicemails or make deceptive calls to their victims' contacts. The study found that 70% of adults are not confident they can tell a cloned voice from an authentic one, which has contributed to the technique's growing prevalence.
Worryingly, the survey also found that almost half (45%) of individuals would respond to a voicemail or voice note that appears to be from a friend or loved one seeking financial assistance. The inclination is particularly strong when the request purportedly comes from a spouse (40%), parent (31%), or child (20%).
“The threat is not hypothetical — we are seeing scammers weaponize these tools,” notes Hany Farid, a computer science professor at the University of California, Berkeley, and a member of the Berkeley Artificial Intelligence Lab. “The trend over the past few years has been that less and less data is needed to make a compelling fake,” he adds.
ElevenLabs, a company specializing in AI speech software, acknowledges that its voice replication technology can imitate and duplicate an individual's vocal patterns. The company says replicated voices are similar to, though not an exact match for, the source, and it has not explicitly addressed telephone scams. However, in a January 2023 tweet, the company noted a rising trend in the misuse of voice cloning.
According to the Federal Trade Commission, scammers can use voice cloning programs and short audio snippets to replicate the voice of a person's loved one. Con artists can source those snippets from publicly available content on platforms such as YouTube, TikTok, Instagram, or Facebook. Shockingly, as little as 30 seconds of publicly shared audio is enough to generate an AI-driven call that convincingly mimics the targeted family member's voice.
In a letter to the FTC, Sen. Mike Braun, a member of the Senate Special Committee on Aging, expressed concern that AI voice scams increasingly target older people. The letter warned that voice cloning software enables scammers to trick elderly individuals, exposing them to financial losses and identity theft. In an interview with Fox News Digital, Braun noted that such scams cause losses of $2.6 billion annually and said scammers abuse voice cloning technologies to target vulnerable older adults.
What are the dangers of AI voice clones?
Artificial intelligence voice clones pose significant dangers. As technology advances, scammers and malicious actors exploit AI voice cloning to deceive and defraud unsuspecting individuals. The risks and implications associated with AI voice clones include the following:
1. Manipulating voices for malicious reasons
One of the primary dangers lies in scammers' ability to mimic the voices of others with astonishing accuracy. Using AI-powered voice cloning tools, they can replicate the voices of friends, family members, or authority figures.
This enables them to perpetrate convincing impersonation scams. For instance, criminals can trick victims into believing they are communicating with someone they trust, causing substantial financial losses. The emotional manipulation involved can also push victims to act hastily, with damaging consequences.
2. Psychological impacts
The psychological impact of falling victim to an AI voice clone scam can be severe. Victims may experience feelings of betrayal, violated trust, and vulnerability. Moreover, the repercussions extend beyond individual victims, as the scams can cause rifts within relationships and erode overall trust in digital communications.
3. Increased disinformation
The misuse of AI voice clones also fuels disinformation and deepfake content. With the ability to replicate voices convincingly, malicious actors can create audio recordings that sound authentic but are entirely fabricated. This threatens the integrity of audio evidence, potentially undermining legal proceedings and public trust in recordings as reliable sources of information.
4. Weaponization for social engineering
AI voice clones can also be weaponized for political manipulation and social engineering. Attackers can impersonate public figures, politicians, or influential individuals to disseminate false information, manipulate public opinion, and sow discord. Combining voice cloning with other AI-generated media, such as deepfake videos, amplifies both the potential harm and the difficulty of discerning truth from falsehood.
5. Ethical and privacy concerns
The ethical and privacy concerns surrounding AI voice clones are significant. The fact that scammers can harvest audio samples from publicly available sources raises questions about consent, data privacy, and the need for robust regulation of the technology. Unauthorized use of someone's voice for fraud violates their privacy and erodes trust in digital platforms and communication channels.
How can you protect yourself from AI voice scams?
First and foremost, exercise heightened caution and adopt a skeptical mindset when you receive unexpected requests for money or sensitive information. Scammers using AI voice cloning are skilled at creating elaborate narratives to manipulate their victims, so independently verifying the caller's identity is a powerful defense: hang up and call back on a known, trusted phone number to confirm the request's legitimacy.
Guarding your personal information is also crucial. In particular, be mindful of sharing personal or financial details over the phone without proper verification. Scammers can mine vast amounts of information from online sources, including social media platforms and public databases, and use it to piece together fragments of your life and craft an AI voice clone that sounds strikingly like a loved one or a trusted authority figure. Avoid divulging sensitive information over the phone unless you have verified the authenticity of the request.
One effective strategy for fortifying your defense against AI voice scams is enabling biometric authentication where possible. Biometric authentication methods, such as voice recognition or fingerprint scans, provide additional security to your personal accounts and sensitive information. These technologies leverage unique physical characteristics that are challenging to replicate, offering higher confidence in verifying your identity. As such, biometric authentication can significantly reduce the risk of unauthorized access to your accounts, thwarting potential AI voice cloning attacks.
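For application developers, one concrete way to require that kind of check on the web is the standard WebAuthn browser API. The sketch below is a minimal, hypothetical TypeScript example, not a complete implementation: in a real deployment the challenge would be issued by your server, and the server would verify the returned signature against the user's registered public key.

// Minimal WebAuthn sketch: ask the browser to verify the user with a
// platform authenticator (Touch ID, Face ID, Windows Hello) at sign-in.
async function verifyWithBiometrics(): Promise<boolean> {
  // Placeholder challenge; in practice this must be random bytes issued
  // by the server and checked again when the assertion comes back.
  const challenge = crypto.getRandomValues(new Uint8Array(32));

  const assertion = await navigator.credentials.get({
    publicKey: {
      challenge,
      userVerification: "required", // force a biometric or PIN check, not just device possession
      timeout: 60_000,
    },
  });

  // A real flow sends the assertion to the server, which validates the
  // signature before granting access; the browser prompt alone proves nothing.
  return assertion !== null;
}

The value of setting userVerification to "required" is that merely possessing the phone or laptop is not enough; the authenticator must confirm the person, which is precisely the property a voice-clone caller cannot fake.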
In conclusion
In the ever-evolving digital technology landscape, the emergence of AI voice scams is a stark reminder of the vulnerabilities we face in our interconnected world. The manipulation and replication of human voices using artificial intelligence present a significant threat, as scammers employ increasingly sophisticated techniques to deceive and defraud unsuspecting individuals. These scams jeopardize our financial security and erode trust in our digital interactions.
Therefore, to combat this menace, we must continually educate ourselves about the risks and adopt proactive measures to protect our identities and personal information. Scammers continuously refine their tactics, so staying current on emerging scams and preventative measures is essential to maintaining your security.
Regularly consulting reliable sources, such as cybersecurity organizations, government agencies, and news outlets, can also provide valuable insight into evolving scams and practical guidance on protecting yourself.
About Pulsar Security
Pulsar Security is a team of highly trained and qualified ethical hackers whose job is to leverage cybersecurity experience and proprietary tools to help businesses defend against malicious attacks. Pulsar is a veteran-owned, private business built on vision and trust, whose leadership has extensive military experience, enabling it to think strategically and plan beyond the problems at hand. The team leverages its offensive experience to offer solutions designed to help analyze and secure businesses of all sizes. Our engineers hold some of the industry's most esteemed cybersecurity credentials, backed by advanced on-the-ground experience.
Corey Belanger
Corey is a Security Consultant who also leads QA for product development, drawing on these dual roles to test and secure applications more effectively, whether building enterprise applications or performing penetration tests and vulnerability assessments for customers. An Army veteran with a tour of duty in Afghanistan, Corey has built a post-military career in security while earning Network+, Security+, GIAC Certified Incident Handler, GIAC Python Coder, GIAC Web App Penetration Testing, and GIAC Penetration Tester certifications. Corey is also a BsidesNH organizer and a founding member of TechRamp, avenues he uses to help others build skills for careers in security and technology. Fun Fact: When not manning a terminal or watching the Bruins, Corey can often be found snowboarding or riding his motorcycle.