Author: WeChat article
Fraudsters in China are using artificial intelligence (AI) to clone people’s voices and scam their relatives, with elderly grandparents the primary targets. In a recent case in Hubei province, an elderly woman surnamed Liu was tricked by an AI-generated version of her grandson’s voice.
On April 28, Liu received a call from her grandson’s home landline. The caller, impersonating her grandson, claimed he had injured someone in a supermarket and needed 20,000 yuan (US$2,800) for compensation. He sobbed and begged her not to tell his mother, threatening to "jump from a building" if she did. Liu, believing the call was genuine, borrowed money from relatives and friends.
The scammer, using the cloned voice, instructed Liu to hand over the money to a man surnamed Zhang, who collected it on his behalf. Zhang later told police he thought he was working a profitable part-time job, earning 1,000 yuan (US$140) a day. He was unaware he was part of a scam.
Police revealed that the scammers used AI to clone the grandson’s voice from previous crank calls. They deliberately targeted landlines in elderly people’s homes, making it difficult for victims to verify the calls. AI voice cloning apps, which cost only a few dozen yuan, enable scammers to copy voices easily.
The incident highlights the urgent need for regulations to address the misuse of AI technology. Online observers expressed outrage, with one commenting, "Shame on these scammers, profiting from elderly people’s love for their grandchildren." Another suggested using AI assistants to counter such scams: "I am beating magic with magic."
Source :
Editor: Crystal H