Voice Deepfake Scams Are Coming for Your Money

September 7, 2023 - With around three seconds of recorded audio, scam artists can now produce a reasonably believable deepfake of just about anyone's voice. At least that's true if you are an American (more on this later). And with 15 seconds of recorded audio, the deepfake is even better. Scammers are doing this with a number of readily available, open-source AI platforms, and it is leading to some real problems with fraud.


Scams using an imposter's voice have been around for years. In the past, they largely targeted people known to have money, and they involved plots such as mimicking the voice of a child or grandchild calling to say they are in trouble and need money right away. Victims of calls like this often wire money immediately, only to find out later that they've been duped. But for the scam to work, the people behind it had to do some research to make sure it would be worth the effort.

They needed to know that the potential victim had the money they were demanding. They had to learn about the victim's family members and then find one they could actually mimic. These scams weren't scalable, meaning the crooks couldn't easily move from one potential victim to another. But AI is changing that. To see what we mean, all you have to do is take a look at someone's Facebook (or other social media) page.

If you go to Facebook and open some random person's page, there's a pretty good chance you'll find a video of them speaking somewhere in their timeline. Then you can take a look at their friends. If you can find a parent, child, cousin, etc., you've got everything you need to pull off a deepfake scam. You can run the video you found through an AI platform to generate the voice you need. Then you call the relatives you've targeted and deliver your message. "Help! I've been arrested and I need bail money!" Or, "Help! My car broke down and I'm going to lose my job if I can't get it fixed right away! Can you please send money?" You get the idea.

The introduction of AI has changed everything, and most people in the United States who speak English without a foreign accent are vulnerable. That's because most of us have placed some form of recording of our voice online. AI does still have limitations, though. One of them has to do with accents: it still has trouble generating voices with an accent. But that is likely to change in the near future as the technology evolves.

All of these changes are already having an impact. There are now known cases of deepfake voices being used to call bankers in attempts to get them to transfer funds. The same thing is likely to start happening with stockbrokers. Anyone who speaks with a banker or a broker on a regular basis really needs to have a conversation with them about this issue and come up with a plan to prevent victimization, such as a prearranged code word or a mandatory callback to a known phone number before any transfer goes through.

As a consumer, you need to be aware of these scams and double-check any call you receive from family or friends asking for money. The person on the other end of the line may sound familiar but actually be a complete stranger. Don't get caught up in the moment. In what you think is an emergency, that may be easier said than done, but you really do need to verify the caller's story, for instance by hanging up and calling the person back at a number you already have, before you wire money to anyone.

And before you post that next video online, just remember that it could be used to mimic your voice in a scam. Then ask yourself if it really needs to be accessible to the entire world.   
