This Wednesday, April 14, 2016, file photo shows a push-button landline phone in Whitefield, Maine.
Robert F. Bukaty/AP
For years, a common scam has involved a call from someone claiming to be an authority figure, such as a police officer, urgently asking you to pay money to help a friend or family member get out of trouble.
Now, federal regulators are warning that such a call could come from someone who sounds exactly like that friend or family member – but is actually a scammer using a clone of their voice.
The Federal Trade Commission issued a consumer alert this week urging people to be vigilant for calls using artificial intelligence-generated voice clones, one of the latest techniques used by criminals hoping to defraud people out of money.

"All (the scammer) needs is a short audio clip of your family member's voice – which he could get from content posted online – and a voice-cloning program," the commission said. "When the scammer calls you, he'll sound just like your loved one."
If you're not sure it's really a friend or relative, hang up and call them.
The FTC suggests that if someone who sounds like a friend or relative asks for money – particularly if they want to be paid by wire transfer, cryptocurrency or gift card – you should hang up and call the person directly to verify their story.
An FTC spokesperson said the agency was unable to provide an estimate of the number of reports of people being defrauded by thieves using voice cloning technology.
But what sounds like a plot from a science fiction story is hardly made up.
In 2019, scammers impersonating the boss of the CEO of a U.K.-based energy firm demanded $243,000. A bank manager in Hong Kong was fooled by someone using voice-cloning technology into making large transfers in early 2020. And at least eight senior citizens in Canada lost a combined $200,000 earlier this year in an apparent voice-cloning scam.

"Deepfake" videos purporting to show celebrities doing and saying things they never did are getting more sophisticated, and experts say voice-cloning technology is advancing as well.
Subbarao Kambhampati, a computer science professor at Arizona State University, told NPR that the cost of voice cloning is also coming down, making it more accessible to scammers.
“Before, it required a sophisticated operation,” Kambhampati said. “Now petty crooks can use it.”