> Grayshift has been shopping its iPhone cracking technology to police forces. The firm, which includes an ex-Apple security engineer on its staff, provided demonstrations to potential customers, according to one email.
It kind of just reinforces the idea that security by obscurity isn't sufficient; ideally, even despite deep knowledge of the internals, it should still be secure.
What would be sleazy, though, is if he discovered a bug while working there, deliberately withheld it from Apple, and then took it with him to this job.
But if he simply made use of his general knowledge of the system for more effective pentesting, I think it's pretty much fair, and expected.
It seems crazy that Apple wouldn't have some kind of ironclad confidentiality agreement for any developers who work in security.
Obviously, you can't prohibit ex-employees from getting a job elsewhere, but you surely can stop them from peddling company secrets to the highest bidder.
That being said, the only way I can imagine that Apple isn't already prepping a massive lawsuit is that the method they are using did not necessarily require insider knowledge.
Worst case, you have to pay (a lot of) fines; at least in my jurisdiction, jail time is not an option for stuff like this. That just raises the price by however much the fines are and doesn't solve anything. In the end, people are free to do what they want.
Is it? Maybe I'm morally bankrupt, but if you think that's sleazy the world is a literal (figurative) ball of mud.
A thing that can be cracked will be cracked. If someone's good at something, they will probably look to sell their skills. If it is within the law for police to perform this kind of thing (it probably definitely shouldn't be), they're gonna use whatever contractor can do it best (if they can't do it themselves).
It just seems like the natural progression -- outside of the oddity of police being allowed to "break into" (in a sense) private property.
I think this argument devolves into the does-tech-hurt-people (guns, security research) argument, and maybe I'm just way off base, but I don't knock the iPhone cracking people in this instance... I mean, I wouldn't necessarily do it, but it isn't bad enough to qualify as sleaze in my book.
This “if I don’t do it, someone else will do it” attitude really just highlights how much more ethics training we need as part of becoming software engineers.
Try that attitude in the medical community, or civil engineering, or aviation engineering, or physics or chemistry, and see where that will get you.
Yeah but that wasn't my argument... I didn't say that "someone else would do it so they should be that person", I'm saying that if you're good at taking apart washing machines, and someone's personal washing machine happens to have names and addresses of people who are considered government dissenters on it, the question of should you sell that knowledge to the government is up to you.
It varies by case, but if you try to delineate where it's OK and where it's not, it will be endlessly subjective -- in the face of that, I lean in the "do what you want" direction.
I don't think there is a single discipline that's unscathed in this discussion, it's just the level to which the morals are shared in each. In the medical community, "do whatever you can to save a life" is a pretty strong propellant, but also the medical system in the US (for one) is really fucked, and if the medical community is as ethically pure as you propose them to be, I get the feeling that wouldn't have happened.
[EDIT] - Also, correct me if I'm wrong, but when I think of "ethics" I just think of agreed-upon shared moral values. That is inherently up to someone or some group to decide, at some point, is it not?
>... the question of should you sell that knowledge to the government is up to you.
True. But that doesn't really free you from moral judgement.
> It varies by case, but if you try to delineate where it's OK and where it's not, it will be endlessly subjective -- in the face of that, I lean in the "do what you want" direction.
This really depends on your value system. Of course, you can do what you want (perhaps as long as it's legal), but again, moral judgement is a separate issue. It's totally okay to be immoral, and ethics itself doesn't really have any limitations on what you would do.
From a utilitarian perspective, yes, perhaps everything is endlessly subjective, but I think for most people their ethics system lies somewhere in between deontological and utilitarian, that is, there are certain moral guidelines that are more applicable even in subjective situations.
Again, I'm not saying you shouldn't do anything immoral. There are many other factors for one to consider: life, liberty, the pursuit of happiness, etc. Most of these have nothing to do with ethics at all, but none of them gives anyone a free pass from moral judgement.
> the medical system in the US (for one) is really fucked
That may be true, but we're not really talking about the system here. We're talking about the individual physicians.
> "ethics" I just think of agreed-upon shared moral values
Ethics is a system, sort of the "theory of morals". Ethics training is more about the logical study of these theories, and how one would apply a certain ethics system to evaluate a certain situation, rather than about "agreed-upon shared moral values". Countless engineering fields before us have faced these problems, and the principles apply just fine in Computer Science.
> That may be true, but we're not really talking about the system here. We're talking about the individual physicians.
I thought of this in the context of "if ethics were worth a lot to everyone involved, the system would never have gotten this bad". Clearly someone missed the forest for the trees if every doctor is considered ethical but can be part of such a huge system that's failing patients (and causing the outcomes they seek to prevent).
> Ethics is a system, sort of the "theory of morals". Ethics training is more about the logical study of these theories, and how one would apply a certain ethics system to evaluate a certain situation, rather than about "agreed-upon shared moral values". Countless engineering fields before us have faced these problems, and the principles apply just fine in Computer Science.
I clearly haven't read/don't know enough about it, could you recommend some good introductory texts?
I find things in this area always devolve into the basic personal/private/corporate (in the sense of a group of people) freedom arguments. Would be great to read something that proposes something new.
Personally, I think ethics mostly boils down to "I treat others like I want them to treat me." If you would not be okay with your washing machine repairman selling out you and your dissenting friends to the government, then you shouldn't be selling the knowledge.
Yes, it is your decision, but it is not simply a decision of "do whichever I want in the moment." It is ("do I want to do this" & "would I be okay with someone doing this to me"). (That's a bitwise AND there.)
If they're like wayyyy out of what I deem to be understandable, I'll be outraged, but I don't know if I consider this company to be that far over my line (for the reasons I stated). If I considered this company to be over the line, what would I think about the leagues of developers who write essentially surveillance software (and sometimes actually surveillance software) for the internet?
Also, I think the way of thinking you're referring to is called universalism (roughly stated, "do a thing only if you're OK with everyone doing that thing"), but I think it has its problems.
> Yes, it is your decision, but it is not simply a decision of "do whichever I want in the moment." It is ("do I want to do this" & "would I be okay with someone doing this to me"). (That's a bitwise AND there.)
Yeah that's basically integrity (at least the consistency part) though -- I don't agree with the example I gave (it's pretty devil's advocate-y and obviously against the mores of the HN crowd), but I don't know that I can call someone who draws the line a few steps away from where I might have drawn it "sleazy", especially when there are people out there that do way worse things in hidden places.
> medical community, or civil engineering, or aviation engineering, or physics or chemistry
Honestly, I think this conflates two very different problems.
If we're talking about Chris Wylie? Yeah, he built a thing that did real harm and said "I didn't think about the consequences, I just wanted to write the code." If people are indifferent to the harm they cause, or hiding behind "someone else would have", that's time for an ethics class.
But GrayKey? Or Prism, XKeyscore, Echelon, or whatever else? There's no shortage of people who believe they are doing the moral thing with projects like this. GrayKey (supposedly) doesn't sell to everyone - they request data to approve buyers first. It's not hard to imagine the people behind it think the government should have this access and believe they're the ones doing the right thing.
(And on the flipside, I'll bet some of the engineers on Greyball thought it was justified to fight harmful government regulation.)
I don't think this is an isolated example. Psychology has a strong code of ethics and associated training, but the American Psychological Association helped the CIA design a torture program. British aviation engineers design fighters that end up sold to Saudi Arabia. Hell, civil engineers design prisons to support widespread solitary confinement and other treatment many people consider unethical. Either their ethics training isn't working, or the people doing this stuff consider their behavior ethical.
There are programmers doing unethical things because they never thought about the consequences, or because they just don't care. But the focus on unethical software is rarely "somebody wrote ransomware" - it's usually about controversial political applications. Calls for licensing and ethics training in the industry all too often look like an end run around "what's moral?", attempting to settle moral disputes by treating disagreement as ignorance.
If they need to break the law to collect the data (the far over-reaching CFAA), what should they need? A search warrant? (Honest question about civil liberties; I really am curious.)
I'm definitely not the person to ask -- I don't particularly live in this space, and I'm fairly sure other people here do and could answer more completely, but:
If you see modern society in a democratic environment with the (somewhat naive) outlook that a large enough group of people have consented to be governed by the laws of the land, then it's really up to the law on when this is allowed/disallowed. It's more of a reflection of the people (or.. who's making the laws), and what they've decided -- if people decide that personal privacy is protected under law then so be it. If people decide police are allowed to break the law (at all) then so be it. If people decide that police are somehow above what would normally be a law when they have certain piece of paper (search warrant) signed by the local wiseman (judge) who is likely looking out for the best interests of the community and country at large, so be it.
Things get blurry really quickly, and this is a gross oversimplification of how any of these systems work, but it's how I tend to think about it. I sometimes think that it can't be any other way -- once a bunch of humans attempt to work together, there are some fundamental problems that just end up best solved this way (in terms of efficiency and other factors).
If it was marketed to repair shops and such, would you think it's "sleazy"? What if he released it as a way to jailbreak? Data recovery is the other use which comes to mind.
Quite frankly, I think it's a relief to see a few "cracks in the wall" --- that not everyone working for Apple agrees completely with their authoritarian view of security.
Sure, "I forgot my password and want to recover my own data" is a legitimate use of a security flaw, and you might be happy in the short term if you need to use it like that.
That doesn't make it a good thing overall that the flaw exists.
I'm sorry, but it's no longer the 1980s. In plain English, hacking now means "gain unauthorized access to a computer." You may not like it, but English is defined by usage, not whatever culture the one-time jargon was inherited from.
Regardless, the point is that the parent's satirical accusation of hypocrisy is invalid because it's using the same word "hack" for two very different meanings.
Hint: I've been here a while, and do very much get what HN is about. I simplified my argument to make a point: the hypocrisy of Hacker News "users" (including myself) about hacking (both in the broad sense and as specific techniques) when it comes to law enforcement.
Yeah, it's a bit like FOSS people not liking for-profit companies using open source to make money, or like people saying you have freedom of speech, but what you say offends them.
He got paid by Apple to secure their devices. Now he gets paid to do the opposite. After a couple of beers he will tell you the truth ($$), but for now he probably rationalizes it as "helping cops to catch bad guys."
That $15k (or $30k) box looks only slightly more polished than an Arduino case straight from the likes of DigiKey.
It wouldn't look out of place in the 80's. The LEDs in particular would fit right in.
I would not be surprised to find an actual $1 microcontroller driving this. Or to find out that the box wasn't really required at all – and that during development the software ran on a normal laptop, but they needed a physical product to charge the big bucks...
GPS, 2-factor auth, and probably additional tamper-proofing are packaged in the enclosure. GrayKey's value drops to zero the moment the exploit is unearthed; I suspect the black box mostly provides safeguarding.
Though, one wonders whether a simple tap on the lightning cable couldn't spill the device's secrets.
I would not be surprised if this was mainly developed by the Chinese, who have a history of making things like the *Box series, primarily for repair purposes.
If anyone can extract keys from the actual secure enclave processor, it would be them.
> FBI Director Christopher Wray recently said that law enforcement agencies are “increasingly unable to access” evidence stored on encrypted devices.
> Wray is not telling the whole truth.
I wish there was some punishment for Government officials for lying to the public. You can be prosecuted for lying to the FBI, so why shouldn't they be prosecuted for lying to you (the voter, who is supposed to have the power in a democracy)?
(1) Whether LE agencies are actually finding it more and more difficult to access devices that employ encryption. I think this is plausibly true, simply because more such devices are being sold than ever before (from the perspective of senior LEOs).
(2) Which LE agencies is he talking about? If he is referring to all agencies, then he might be right. Many LE agencies have very limited budgets. However, if he is talking about the more well-funded and competent agencies, then he is probably wrong.
>I wish there was some punishment for Government officials for lying to the public. You can be prosecuted for lying to the FBI, so why shouldn't they be prosecuted for lying to you
Get rid of the punishment for lying to LEOs and it is all fair and equal. They lie to us and we lie to them, both without punishment. It would still be illegal to lie to judges.
>(the voter, who is supposed to have the power in a democracy)
The purpose of democracy is to use elections to legitimize a limited group of persons to use the power of the State. The power of voters is limited to selecting who will be in that group. The rest of the power of the State is in the hands of those who were selected.
I would guess it's to reduce the amount of frivolous accusations by people who only think they were lied to or simply disagree. I would imagine that most government officials would spend their day fighting off these accusations, with no time for their official duties.
It looks like it runs third party code on the device. Only needs to be connected to the black box for two minutes and then unplugged for the remainder of the process.
I've always wondered about the legalities of such things. How is it okay for a company to legally sell a hack of another company's technology? Is it because they only sell to the police? If this is okay, then where is the actual limit? Can they sell hacked access to a company's servers for example?
It is a hack. It shouldn't be possible to test PINs via a Lightning connection. If it weren't exploiting a bug, you would have to enter the PIN via the screen, and then you only get a few attempts.
It seems likely that Apple could just buy one of these, find the bug and fix it. Unless it is using some very low level exploit which I suppose is possible and might explain why it is hardware rather than software (though that might also be to justify its cost and prevent piracy).
From the article, it seems to be a passcode brute-forcing tool. They state in the article 3 days or longer for a 6-digit passcode, which I assume means 3 days for a 4-digit code. That's about 26 seconds per guess.
So the message, I guess, is: if you care about securing against this, use a longer (and alphanumeric) passcode.
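To put rough numbers on that (back-of-envelope only, and carrying over the ~26 seconds per guess assumed above rather than any confirmed GrayKey figure):

```python
# Worst-case brute-force time by passcode length, assuming a fixed
# ~26 s per guess (an assumption carried over from the estimate above).
SECONDS_PER_GUESS = 3 * 24 * 3600 / 10_000  # ~26 s, if 3 days covers all 4-digit codes

for digits in (4, 6, 8):
    keyspace = 10 ** digits
    days = keyspace * SECONDS_PER_GUESS / 86_400
    print(f"{digits}-digit PIN: up to {days:,.0f} days")
# 4-digit: 3 days; 6-digit: 300 days; 8-digit: 30,000 days (~82 years)
```

Each extra digit multiplies the worst case by 10, and going alphanumeric grows the keyspace much faster still.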
This device tricks the hardware into allowing unlimited attempts at guessing the password.
I don’t know the details: is it as simple as grounding the “password entered incorrectly” pin? Or is it about injecting so much noise on a signal line that the message to increment the PIN attempt count never gets through? I don’t know.
Last year someone demonstrated the possibility of dumping and restoring the state of the security hardware in between entry attempts, so that the phone always thought you were on your first try. I assume this is the technique being used by GrayKey.
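For what it's worth, here's a toy simulation of that dump-and-restore idea. Whether GrayKey actually works this way is pure speculation, and `SimulatedPhone` is obviously a stand-in for real hardware:

```python
import itertools

class SimulatedPhone:
    MAX_ATTEMPTS = 4  # guesses allowed before the (simulated) lockout

    def __init__(self, secret: str):
        self._secret = secret
        self.attempt_counter = 0  # lives in flash on the real hardware

    def snapshot(self):
        return self.attempt_counter  # "dump the NAND"

    def restore(self, state):
        self.attempt_counter = state  # "rewrite the NAND"

    def try_passcode(self, code: str) -> bool:
        if self.attempt_counter >= self.MAX_ATTEMPTS:
            raise RuntimeError("locked out")
        self.attempt_counter += 1
        return code == self._secret

def brute_force(phone: SimulatedPhone) -> str:
    clean = phone.snapshot()  # copy the state while the counter is zero
    for combo in itertools.product("0123456789", repeat=4):
        if phone.attempt_counter >= phone.MAX_ATTEMPTS - 1:
            phone.restore(clean)  # phone now believes this is our first try
        code = "".join(combo)
        if phone.try_passcode(code):
            return code

print(brute_force(SimulatedPhone("7319")))  # finds the code despite the limit
```

The point is just that if the attempt counter lives somewhere an attacker can copy and rewrite, the retry limit stops meaning anything.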
With all the work that I've heard went into the Secure Enclave (IIRC, that's Apple's latest-and-greatest hardware+software security), does that mean someone (or an entire team) at Apple is absolutely sweating bullets/thinking frantically right now?
Security and a focus on user privacy protection (from other entities, at least) is definitely a differentiator for Apple devices.
Maybe... but I think the system is designed so that isn't possible. There is a secret device key inside the Secure Enclave used to salt your passcode... I bet they're bypassing the limit on the number of password guesses.
On the other hand, is it possible to programmatically enter the passcode? From the picture, it looks like it's unlocked via the Lightning cable, not by emulating unlocking via the touchscreen. I wonder how much security it would add to require input via the screen.
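On the salting point above: a minimal sketch of why that device key matters, assuming a PBKDF2-style construction purely for illustration (the real Secure Enclave uses its own key-tangling scheme, and every constant here is made up). Because the salt is a secret fused into the chip, each guess has to run on the phone itself:

```python
import hashlib

# Illustration only: stand-in for the per-device secret fused into the chip.
DEVICE_UID = bytes.fromhex("00" * 32)

def derive_key(passcode: str) -> bytes:
    # Salting with the device key binds guesses to this one phone: without
    # DEVICE_UID, an attacker can't test passcodes offline on a GPU farm.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID,
                               iterations=100_000)  # made-up cost parameter

print(derive_key("123456").hex())
```

Which is why bypassing the guess limit, rather than extracting the key, is the plausible attack here.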
Both Touch ID and Face ID are ephemeral and expire after 24 hours (configurable, I think). Unless the device was obtained from the person immediately, any hardware hack to bypass them won't work.
It is still a 24-hour window (let's use the default) where the device is vulnerable. In many countries there are no protections against pressing your finger onto the phone, or worse yet, turning it to face you.
It works fine for everyday use, and if I ever find myself in a situation with a higher-than-normal risk of getting arrested, I would reboot the device so the password is required to unlock it. This can be done with one hand in the pocket in a matter of seconds.
Tap the power button 5 times to activate the emergency controls. It offers to power off, display Medical ID information, or trigger an Emergency SOS, where your GPS location is sent in an SMS to a designated emergency contact and a call to the emergency services is initiated.
It also disables biometric unlocking until the password is entered.
I wonder if they have any process in place to prevent Apple buying one of these and figuring out how it works.
I would guess Apple already has one. But if they’ve tried to get one, and been foiled somehow, there must be a fascinating cloak-and-dagger story there that we’ll probably never hear...
Well, I am curious what the break time on longer passwords is. They made claims about four- and six-digit passcodes, but my employer already requires longer if we are to receive corporate mail.
I have never worked for a company that requires less than 8 characters for a password. Most have required 10 characters (or more), with at least one numeral, one special character, one uppercase letter, and one lowercase letter.
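Tangentially, a policy like that is easy to state precisely. A hypothetical checker for the rules described above:

```python
import re

def meets_policy(password: str) -> bool:
    """Hypothetical corporate policy: 10+ characters, with at least one
    numeral, one special character, one uppercase, and one lowercase."""
    return (
        len(password) >= 10
        and re.search(r"\d", password) is not None
        and re.search(r"[^A-Za-z0-9]", password) is not None
        and re.search(r"[A-Z]", password) is not None
        and re.search(r"[a-z]", password) is not None
    )

print(meets_policy("Tr0ub4dor&3x"))  # True
print(meets_policy("password"))      # False
```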
Absent from this whole article is the fact that there are good reasons for criminals to want to crack your phone. These developments just make it more likely that your personal information and, frankly, cash can be stolen by anyone who swipes your phone.
This is one of the best points I have read. Without the community knowing how he is doing it, it can be used maliciously.
There are extremely good reasons for police officers to want access to an iPhone (which could be time dependent). At the same time there’s a lot of potential for misuse of the ability to gain access by agencies or malicious actors.
It’s a trade-off. I err on the privacy side of the issue because things can be misconstrued in a legal setting. I can see why some people don’t have the same viewpoint and lean the other way.
I, for one, am glad I started using 25-character passwords 3-4 years ago. I just wonder how long it will be before that is not good enough either. Surely within my lifetime. And what's next? 50-character passwords? One hundred characters? 10-factor authentication?
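For scale: 25 random printable characters is roughly 164 bits of entropy, which no amount of guessing touches. A rough calculation, with the (generous, made-up) attacker speed as the only assumption:

```python
import math

ALPHABET = 95             # printable ASCII characters
GUESSES_PER_SECOND = 1e9  # generous offline-attacker assumption

for length in (6, 10, 25):
    keyspace = ALPHABET ** length
    years = keyspace / GUESSES_PER_SECOND / (3600 * 24 * 365)
    bits = length * math.log2(ALPHABET)
    print(f"{length} chars: ~{bits:.0f} bits, up to {years:.2g} years")
```

So 25 random characters already has astronomical headroom; the pressure for something better comes from humans not picking random ones.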
If we ever get to 50- or 100-character passwords, I don't think passwords will be the norm anymore. Perhaps facial recognition, but I think a better option is on the horizon.
It's probably in Apple's best interest to let this firm operate. It might even be a long-term strategy for them. It relieves pressure from them and keeps law enforcement happy.
Yeah, I'm curious about how this would work with a longer PIN. I have one that's over 6 digits, and you need to press OK after punching in the PIN, making it much harder to crack, I'd imagine.
And couldn't Apple defend against this by using an exponentially increasing wait period between guesses? Probably after some number of guesses with no delay, say 10 guesses.
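iOS already does something along these lines after repeated failures, as I understand it. A sketch of the escalating-delay idea, with made-up thresholds:

```python
FREE_GUESSES = 10  # made-up threshold before delays kick in

def delay_before_attempt(failed_attempts: int) -> int:
    """Seconds to wait before the next guess is allowed."""
    if failed_attempts < FREE_GUESSES:
        return 0
    return 2 ** (failed_attempts - FREE_GUESSES)  # 1 s, 2 s, 4 s, ...

for n in (9, 15, 25, 40):
    print(n, delay_before_attempt(n))
```

By 40 failures the wait is 2**30 seconds (about 34 years), which is why attacks like this one have to bypass or rewind the counter rather than guess patiently.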
I thought the iPhone has a delay after a few attempts, at the Secure Enclave level?
I wonder if this is doing some sort of timing- or voltage-related validation of the code without needing to actually submit it. I.e., the equivalent of 1234, backspace, 5, backspace, 6, etc., without sending whatever is the equivalent of 'submit'.
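As a toy illustration of that kind of side channel (whether GrayKey exploits anything remotely like this is pure speculation): if the check bails out at the first wrong digit, timing alone recovers the code digit by digit.

```python
import time

SECRET = "731964"

def naive_check(guess: str) -> bool:
    # Early-exit comparison: each matching leading digit costs ~1 ms,
    # so the response time leaks how far the guess got.
    for a, b in zip(guess, SECRET):
        if a != b:
            return False
        time.sleep(0.001)
    return len(guess) == len(SECRET)

def recover(length: int) -> str:
    known = ""
    for _ in range(length):
        timings = {}
        for d in "0123456789":
            guess = known + d + "0" * (length - len(known) - 1)
            start = time.perf_counter()
            naive_check(guess)
            timings[d] = time.perf_counter() - start
        known += max(timings, key=timings.get)  # slowest digit matched deepest
    return known

print(recover(6))  # prints 731964 after 60 probes instead of up to 10**6
```

That turns a million-guess search into sixty timed probes, which is the general shape of what the parent is describing.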
Most key bits are below; there's much more in the article and in the article's links.
> GrayKey can unlock an iPhone in around two hours, or three days or longer for 6 digit passcodes
> 'GrayKey' ... can break into iPhones, including the iPhone X running the latest operating system iOS 11.
> The device comes in two versions: a $15,000 one which requires online connectivity and allows 300 unlocks (or $50 per phone), and an offline, $30,000 version which can crack as many iPhones as the customer wants.
Wow. That's very sleazy.