
You're exactly proving the point GP makes, good job.


No, they're not, and this flippant dismissal is intellectually lazy.

The ability to scam people exists and isn't going away. People went from town to town in covered wagons selling bullshit. People scammed others out of money centuries before that.

The answer isn't pearl-clutching nonsense about how this technology is so different and so morally reprehensible compared to everything else. The answer is education. If someone shows up at your door, sells you a bottle of water that will cure all your ills for $100, and then skips town, most people would see it as your fault for being so gullible. Someday there will be some technological or social way to easily differentiate deepfakes from real video, the same way you can with photoshopped images today.


We're steadily breaking down all the mechanisms that ordinary people can use to trust the information they receive, such as getting a video call from a member of their family and recognising the face and voice.

All previous scams relied on con artists using various means to pretend either to be trustworthy in their own right despite being strangers, or to be representing some trustworthy institution. But having people able to impersonate your family members is a whole other issue. You can't claim that this is no different from other scams.


In Argentina they’ve been scamming people by pretending to be a relative for years now! No AI required. Somebody calls you in the middle of the night, distraught, and, crying, says “mom?” or “dad?”. Then someone interrupts, claims your son or daughter has been abducted (by now you have probably given them the name when responding to the initial plea), and demands that money be dropped at some location where a motorcycle picks it up. They make you stay on the line so you can’t call the cops or the allegedly abducted person.

These calls are usually made from prison, with an outside accomplice. The scammers are rarely caught.

People are sleepy and worried, and they swear the voice they heard was that of their child.

Another popular no-AI scam works by stealing a WhatsApp account (e.g. by cloning the SIM) and then contacting a friend or relative to ask for a quick cash transfer for something urgent, to be returned the next day.

Deepfakes might make these scams more believable, but the core causes of the issue and the solutions have not changed.


That sort of scam has been possible, sure, but it depends both on the person being phoned being startled and half asleep, and on them being unable to reach the supposedly abducted person, or someone else near them, with a follow-up call. You're right that I shouldn't have been so absolutist in saying 'all' previous scams, but this is an edge case that doesn't equate to what's becoming possible with real-time deepfakes.

What this sort of tech enables is scamming where even someone who is fully awake, quite alert, and not normally prone to being scammed can nonetheless be tricked by a video call.

It's taking us to a point where the only safe way to be sure that the person you're speaking to is definitely who they say they are, in all circumstances, is to do it in person, something that isn't possible for people living far from the rest of their family.

Do you not see how fundamentally this breaks the trust models we have built over the past few decades?


Scamming an individual is already a ton of work. The amount of effort this shaves off the process is small.


>>The ability to scam people exists and isn't going away.

True, but giving the scammers orders-of-magnitude better tools can have serious consequences. If you already have a plague of robbers in your town, handing out free automatic handguns and ammo to anyone is beyond stupid. Yet that is pretty much what these AI tools do for scammers.

Arguing that people will eventually figure it out is no justification for allowing it. It ignores all the casualties in the meantime. This is especially bad because the result of weaponized technology at this scale may well be a complete destruction of trust in society, or in technology in general. Either outcome would be catastrophic for everyone in society and for the economy.


>The answer is education.

This is the stock answer used for deflection in way too many scenarios. We've seen that it doesn't work.

The complete lack of an enforced ethics code in our field is the biggest blight on our combined contribution to society.



