rshi's comments | Hacker News

On a related note, I can definitely foresee a lot of voice actors having their voices cloned for uses they never consented to. It seems like a big legal grey area, since many countries recognize personality rights.


I wonder what the legal implications of this, alongside similar developments like deepfakes, are going to be in the next couple of years. Fraudsters are already impersonating CEOs using deep-learning-aided voice generation [1], simply because the barrier to entry is now so low. There's already a public implementation of the paper out [2]!

[1]: https://www.wsj.com/articles/fraudsters-use-ai-to-mimic-ceos...

[2]: https://github.com/CorentinJ/Real-Time-Voice-Cloning
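
To give a sense of how low the barrier really is: [2] ships pretrained models and a three-stage pipeline (speaker encoder, Tacotron-style synthesizer, WaveRNN vocoder). Below is a rough sketch of how it's typically wired up, loosely following the repo's demo script; the model paths are placeholders and the exact module/function names may differ between versions of the repo, so treat it as illustrative rather than definitive.

    # Sketch of the Real-Time-Voice-Cloning pipeline (adapted from the repo's demo script).
    # Model paths below are placeholders; names and paths vary across repo versions.
    from pathlib import Path
    import soundfile as sf

    from encoder import inference as encoder
    from synthesizer.inference import Synthesizer
    from vocoder import inference as vocoder

    # Load the three pretrained models.
    encoder.load_model(Path("encoder/saved_models/pretrained.pt"))
    synthesizer = Synthesizer(Path("synthesizer/saved_models/pretrained/pretrained.pt"))
    vocoder.load_model(Path("vocoder/saved_models/pretrained/pretrained.pt"))

    # 1) Embed a few seconds of reference speech into a fixed-size speaker embedding.
    ref_wav = encoder.preprocess_wav(Path("reference.wav"))
    embed = encoder.embed_utterance(ref_wav)

    # 2) Synthesize a mel spectrogram for arbitrary text, conditioned on that embedding.
    specs = synthesizer.synthesize_spectrograms(
        ["This sentence was never spoken by the reference speaker."], [embed])

    # 3) Vocode the spectrogram into a waveform and save it.
    wav = vocoder.infer_waveform(specs[0])
    sf.write("cloned.wav", wav, synthesizer.sample_rate)

A few seconds of reference audio is enough to get a recognizable (if imperfect) clone, which is exactly what makes the fraud scenario in [1] plausible.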


CorentinJ's implementation isn't quite as good as Google's. With some of Google's samples I couldn't tell they weren't real, especially over the phone, but I could easily tell with CorentinJ's.

That seems to be common with open implementations of Google's voice synthesis and speech recognition work. I guess they hold back some of the secret sauce, or can afford to train it more.


Currently watching UK: https://mobile.twitter.com/FutureAdvocacy/status/11942824810...

Sorry for the Twitter link, but Future Advocacy's website seems to be down.


The latest episode of The Blacklist had a dark plot based on deepfakes.


Didn't know the new season was out, thanks!



Yes, it is this one, thanks.

