Earlier this month, Google added deepfake training to the list of projects forbidden on its Colaboratory service. The change was first spotted by a DFL developer who goes by 'chervonij' on Discord. When he tried to train his deepfake models on the platform, he received an error message:
“You may be executing code that is disallowed, and this may restrict your ability to use Colab in the future. Please note the prohibited actions specified in our FAQ.”
Google appears to have made the change under the radar, and has since remained quiet on the matter. While an ethical motive is the first theory that comes to mind, the actual reason may be more pragmatic.
Abusing the free resource
Deepfakes are "photoshopped" videos: fake footage of people saying things they never actually said. Their creators use artificial intelligence (AI) and machine learning (ML) to produce highly convincing videos, which are becoming increasingly difficult to distinguish from legitimate content.
Making them convincing, though, requires significant computing power, not unlike what Colab offers. The Google project lets users run Python in their browser on free computing resources.
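Because Colab sessions execute ordinary Python, checking what hardware the free tier has handed you takes only a few lines. A minimal sketch (the `gpu_available` helper is our own illustration, not a Colab API; it simply probes for the `nvidia-smi` tool that GPU runtimes expose):

```python
import shutil
import subprocess

def gpu_available() -> bool:
    """Return True if an NVIDIA GPU is visible to this session.

    On a Colab GPU runtime, `nvidia-smi` is on the PATH and exits
    with status 0; on a CPU-only runtime (or any machine without
    NVIDIA drivers) it is missing or fails.
    """
    smi = shutil.which("nvidia-smi")
    if smi is None:
        return False
    return subprocess.run([smi], capture_output=True).returncode == 0

print("GPU runtime:", gpu_available())
```

Run outside Colab, the same check degrades gracefully and reports `False` on machines without NVIDIA drivers, which is what makes the free GPU access attractive for compute-hungry jobs like deepfake training.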
> A deepfake of Elon Musk is trying to scam people out of crypto again
> Deepfakes could be the next big security threat to businesses
> How concerned should you be about deepfake fraud?
As deepfakes are usually used to crack jokes, spread fake news, or create revenge porn, it's easy to assume ethics are behind Google's decision. However, it may also be that too many people were using Colab to create fun little deepfake videos, crowding out researchers doing more "serious" work. After all, the computing resources are free to use.
Besides deepfake creation, Google doesn't allow Colab to be used for projects such as mining cryptocurrency, running denial-of-service attacks, cracking passwords, using multiple accounts to work around access or resource-usage restrictions, using a remote desktop or SSH, or connecting to remote proxies.
Via: BleepingComputer