
> The secret surveillance can't be used in a normal court of law. Especially here in Europe as we don't have secret courts.

You don't have secret courts... yet. Anyway, it doesn't actually matter whether you have secret courts, at least for high-profile targets. The US has them, and the US also has extradition treaties. And even if you have legislation against spying on your own citizens, your allies don't; they can do it for you and then share the intelligence.

This whole effort is to short-circuit all of that and streamline an existing process.



I doubt it's really the high-profile targets they're after, though. It sounds more like they want to make this the bread and butter of policing: an AI looking over our shoulders to see if we're up to anything bad. And not just the content of our conversations, but GPS locations etc. Basically what Apple was proposing, except here they are already targeting a much wider range than just CSAM.

Because if it were only about the high-profile cases, the intelligence services already have sweeping powers for hacking, infiltration, etc.


I agree with you about the general direction, but you're hinting at preventive policing, which I don't believe is the case here. I mean, just look at the US and the amount of surveillance since 9/11. I think the goal is to monitor dissidents or to help post factum in investigations.


Not really preventative.

It's just that if you start collecting conversational data from everyone, you need AI to go through it. It's simply not feasible to do manually.

And once you get AI involved, scope creep is guaranteed, given the huge advancements being made in that area.

There's no way we can stop AI from being built but we can still stop some of our data from going into it.



