The technology is getting so advanced that deepfakes may soon become undetectable by a human verifier, said Jimmy Su, Binance’s chief security officer.
Deepfake technology used by crypto fraudsters to bypass Know Your Customer (KYC) verification on crypto exchanges such as Binance is only going to get more advanced, Binance’s chief security officer has warned.
Deepfakes are made using artificial intelligence tools that use machine learning to create convincing audio, images or videos featuring a person’s likeness. While there are legitimate use cases for the technology, it can also be used for scams and hoaxes.
Deep fake AI poses a serious threat to humankind, and it's no longer just a far-fetched idea. I recently came across a video featuring a deep fake of @cz_binance , and it's scarily convincing. pic.twitter.com/BRCN7KaDgq
— DigitalMicropreneur.eth (@rbkasr) February 24, 2023
Speaking to Cointelegraph, Binance Chief Security Officer Jimmy Su said there had been a rise in fraudsters using the tech to try to get past the exchange’s customer verification processes.
“The hacker will look for a normal picture of the victim online somewhere. Based on that, using deepfake tools, they’re able to produce videos to do the bypass.”
Su said the tools have become so advanced that they can even respond correctly, in real time, to audio instructions designed to check whether the applicant is a human.
“Some of the verification requires the user, for example, to blink their left eye or look to the left or to the right, look up or look down. The deepfakes are advanced enough today that they can actually execute those commands,” he explained.
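To make that kind of liveness check concrete, here is a minimal sketch of a randomized challenge loop of the sort Su describes. The challenge names, the `detect_action` camera hook, and the timing are illustrative assumptions for this article, not Binance’s actual verification code.

```python
# Illustrative sketch of a randomized liveness check: prompt a random gesture
# (blink, look left/right/up/down) and require the matching action on camera
# within a short window. Names and structure are assumptions, not any
# exchange's real implementation.
import random
import time

CHALLENGES = ["blink_left_eye", "look_left", "look_right", "look_up", "look_down"]

def run_liveness_check(detect_action, rounds=3, timeout_s=5.0):
    """Issue random prompts; fail if the observed action is wrong or too slow.

    `detect_action` is a placeholder callback that would wrap a real
    face/landmark-tracking model and return the action it saw on camera.
    """
    for _ in range(rounds):
        prompt = random.choice(CHALLENGES)
        deadline = time.monotonic() + timeout_s
        print(f"Please {prompt.replace('_', ' ')}")
        observed = detect_action(prompt, deadline)  # hypothetical camera hook
        if observed != prompt or time.monotonic() > deadline:
            return False  # wrong gesture or answered too late
    return True

if __name__ == "__main__":
    # Trivial stand-in for the camera hook: always reports the prompted action.
    print("liveness check passed:", run_liveness_check(lambda prompt, deadline: prompt))
```

Because the prompts are random and time-limited, a pre-recorded deepfake clip cannot pass such a check; Su’s warning is that current deepfake tools can now generate the requested gesture on the fly, in real time.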
However, Su believes the faked videos are not yet at the level where they can fool a human operator.
“When we look at those videos, there are certain parts of it we can detect with the human eye,” for example, when the user is required to turn their head to the side, said Su.
“AI will overcome [them] over time. So it's not something that we can always rely on.”
In August 2022, Binance’s chief communications officer, Patrick Hillmann, warned that a “sophisticated hacking team” was using his previous news interviews and TV appearances to create a “deepfake” version of him.
The deepfake version of Hillmann was then deployed to conduct Zoom meetings with various crypto project teams, promising an opportunity to list their assets on Binance — for a price, of course.
Hackers created a "deep fake" of me and managed to fool a number of unsuspecting crypto projects. Crypto projects are virtually under constant attack from cybercriminals. This is why we ask most @binance employees to remain anonymous on LinkedIn. https://t.co/tScNg4Qpkx
— Patrick Hillmann (@PRHillmann) August 17, 2022
“That’s a very difficult problem to solve,” said Su, when asked about how to combat such attacks.
“Even if we can control our own videos, there are videos out there that are not owned by us. So one thing, again, is user education.”
Related: Binance off the hook from $8M Tinder ‘pig butchering’ lawsuit
Binance is planning to release a blog post series aimed at educating users about risk management.
In an early version of the blog post featuring a section on cybersecurity, Binance said that it uses AI and machine learning algorithms for its own purposes, including detecting unusual login and transaction patterns and other “abnormal activity on the platform.”
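As a rough illustration of what unusual-login detection can look like, the sketch below runs an off-the-shelf anomaly detector over simple login features. The feature set (login hour, distance from the user’s usual location, new-device flag) and the model choice are assumptions for illustration; the article does not describe Binance’s actual models or features.

```python
# Minimal sketch of anomaly detection on login patterns using scikit-learn's
# IsolationForest. Features and thresholds are illustrative assumptions only.
import numpy as np
from sklearn.ensemble import IsolationForest

# Each row: [login_hour, km_from_usual_location, is_new_device]
history = np.array([
    [9, 2.0, 0], [10, 1.5, 0], [21, 3.0, 0], [8, 0.5, 0], [22, 2.5, 0],
    [9, 1.0, 0], [20, 4.0, 0], [11, 2.2, 0], [19, 1.8, 0], [10, 0.9, 0],
])

model = IsolationForest(contamination=0.1, random_state=0).fit(history)

# A 3 a.m. login from thousands of kilometres away on an unseen device
# scores as anomalous (-1 = flagged, 1 = normal).
suspicious = np.array([[3, 8500.0, 1]])
print(model.predict(suspicious))
```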
AI Eye: ‘Biggest ever’ leap in AI, cool new tools, AIs are the real DAOs