Hacker News
Adding a security key to Gmail (techsolidarity.org)
213 points by idlewords on April 13, 2017 | hide | past | favorite | 123 comments


I would advise against Google Authenticator as a backup as it really defeats the point of a hardware token.

Google Authenticator stores the TOTP secret in plaintext on your device where the potential exists for it to be stolen. An adversary that exploits your phone can generate TOTP tokens as they like and ignore the fact you have a hardware token. If you are going to use Google Authenticator it is your weakest link and a security token buys you no added security, only ease of use.

The typical goal of a security token is to be able to assert: "No one can log into my account without this physical device or an offline backup token from my safe"

To achieve this, consider a device with built-in TOTP support in addition to U2F. All current Yubikeys fit the bill here, as do some Nitrokey models. Desktop or Android users can use either USB or NFC devices, but it is worth noting that iOS lacks support for either, which means you would need a desktop or Android device to fetch TOTP tokens for an iPhone.

You can use the open source "Yubico Authenticator" apps to store your TOTP secrets in your key alongside your U2F secret. Now both methods use the same hardware, and your phone/computer only gets handed OTP codes from the key when it is present, but can't generate them itself.
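For a sense of the workflow, loading a secret onto the key is a one-time CLI (or app) step; an illustrative transcript using Yubico's `ykman` tool (command names are from memory and vary between ykman releases, and the account name and secret here are made up):

```shell
# Store a TOTP secret on the key itself instead of the phone.
# After this, the raw secret never leaves the key's memory.
ykman oath accounts add gmail JBSWY3DPEHPK3PXP

# Ask the key to compute the current code; the host only ever
# sees the resulting 6 digits, never the secret.
ykman oath accounts code gmail
```

The same pairing works over NFC with the Android app: tap the key, read the codes.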

An added bonus: you can now use security-token-backed login even on the majority of sites/browsers today that lack U2F support.

Extra bonus is these keys can be used for ssh without any server changes. Security token all the things :)


The attacker who gets access to the filesystem of your phone can almost certainly defeat any "encryption" a TOTP authenticator would use to protect secrets, so the premise of phone-based authenticators is that your phone isn't going to get compromised.

This is reasonable when you consider that if your computer --- the least secure device you own --- is compromised, your attacker is virtually certain to get your email account with it, because you'll have (at some near point) a logged-in session.

Meanwhile: the #1 concern that laypeople have about security tokens is that they'll be locked out of their account when they lose the token. Authenticator (or Duo) is a perfectly sane answer to that concern.

Finally, it's worth adding that at least the last time I helped someone set this up, you can't remove your phone number as a factor from your Google account until you have TOTP set up. Your phone number is an extremely insecure login factor.

We use, and recommend, Google Authenticator as a backup login factor.

We do not recommend Yubikey 4 keys for normal users. Nerds on HN might get a kick out of them; I say, go ahead and enjoy yourself. We're trying to solve problems for people who aren't computer experts.


Yubico Authenticator is a fork of Google Authenticator and is a drop-in replacement.

I have never had any problem helping someone that has used google authenticator set this up. Scan barcode and tap.

Also users have a much easier time when they get a new phone. Just tap to new phone and get codes. There is no data to transfer.

As for people getting locked out, that is what the printable backup codes are for, or a secondary key, depending on your threat profile.

In a corporate setting this is a non-issue as an admin can bail you out.


That's a desktop TOTP application. Now not only do they have to have their computer with them to log into their Google account from their phone, but they have to have 2 security keys on the account to remove their phone number from it, and all their backups are physically separated from them, so unless they bring their backup codes with them when they travel, if they lose their key, they're boned.

And all this for what real additional security?

If you want to nerd out and get your security key to do pet tricks like handling your TOTP secrets, I do not have a problem with that. But please don't tell ordinary users they're wrong when they don't do that.


I have Yubico Authenticator installed on my phone and my desktops/laptops.

The android app is a direct fork of google authenticator and has nearly identical UX.

I tap/plug my key to either of them to get a token.

If you want to make the argument TOTP via hardware token is overkill for most users, that is totally fair. On that note though, there is no point in having hardware token via U2F.

Security is about the weakest link. All I am saying is that anyone going through the trouble to set up U2F as this guide suggests might as well spend the extra 10 seconds to store their TOTP secret on the key as well, rather than exposing it on the phone.

I assume someone that has a hardware token is getting it for a reason: To have assurances an attacker can't log in as them without that token.


You keep saying that, and it keeps being false. The point of U2F isn't to put your secrets in super-secure hardware so you can walk around feeling like you have an HSM hanging from your keychain like a bad-ass. The point of U2F is to defeat phishing attacks, which is how people actually get compromised.

It would not be much of a stretch to say that this U2F guide was written deliberately as a corrective to this mindset.


I disagree with regards to your risk analysis.

Your cost/benefit considerations prioritize relatively minuscule security improvements without considering usability costs or diminishing returns. While we're at it, why don't we just use one-time pads? After all, those are impervious to any form of cryptanalysis.

The risk profile for most users does not require a hardware-based auth factor if it results in real world usability sacrifices that end in either 1) accidental misuse or 2) gradual disuse.

You're optimizing for someone compromising the device, great. But the point is that if that risk is on the table, all of this work is essentially meaningless anyway.


"if that risk is on the table, all of this work is essentially meaningless anyway"

I can't agree with this strongly enough. If someone's willing and able to hack your iPhone, then you need more help than a random art major writing a yubikey howto can give you.


Just an offshoot thought, but some random art major writing one of these guides giving a bird's eye overview of the entire process is something most businesses should be able to do themselves. But they don't.


I am only focusing on this bit because it is a few extra seconds of work if you are going to have a hardware token anyway. Why expose your TOTP secret if you don't have to?

I like knowing that if the phones and laptops of someone on my team were compromised, we have some damage control.

With the approach Yubico Authenticator takes an attacker with remote access to your Android Phone and a keylogger on your laptop still can't log in as you remotely.

Granted with more effort that combo can burn you in other ways, but TOTP on a hardware token still gives you some very real reduction in attack surface with no real added user burden. Why not?


Three people have answered that for you, and you keep ignoring them. We're protecting Google accounts, not TOTP secrets. We all understand that you can keep a TOTP secret more secure than a Google account. But since nobody cares about the security of a TOTP secret for a compromised Google account, I'm not sure why we're still talking about this.


Yes, but you didn't address the main rebuttal, which is that encrypting or not encrypting the TOTP store is a red herring. If someone has access to the filesystem they can likely work around the issue of encryption, just as they would on a desktop computer. And if they control execution, encrypting the data becomes utterly moot (and I'd argue most cases of someone gaining filesystem access, where the individual cares enough to have 2FA, are going to be a jailbroken device, which results in complete debugging and reversing capability, which makes this redundant).

This is not the thing to optimize for. Yes, optimize for ease of use, because take PGP as an example of great security vs horrid usability and look where that's got us. If you do the security improvement analysis from a cost/benefit perspective, you do not win by using a hardware key over regular 2fa apps. Users will shoot themselves in the foot, or simply not use it.


I guess I didn't understand that bit. It sounds like there is some misunderstanding about how Yubico Authenticator works?

It does not store the secrets on the disk/memory of the phone/laptop at all. It just sends over a code. The device only sees one code and nothing else.

This is also why a user can get a new phone and just tap the key to the new phone and truck on. Magic.

When a user drops their Google Authenticator phone in the toilet however... bad day.


No, I'm pretty clear about how Yubikey TOTP works. The point is that the threat model doesn't make sense. Any device you can use Yubikey TOTP on is significantly less secure than your iPhone.

Yes, I'm clear that the attacker in this scenario doesn't get your TOTP secrets. If your primary goal is to protect your TOTP secret, I see your point. My problem is, my goal is to protect my actual account. I kind of don't give a shit about my TOTP secret, because Google will give me as many new TOTP secrets as I ask it for, but I only have the one account. If the device I'm securely generating a TOTP secret for is compromised, I'm going to feel pretty silly doing a security theater dance with my Yubikey as my attacker steals my cookie and locks me out of my Google account.


Cookie theft is for sure a real issue. In the case of Google, if your cookie suddenly pops up in another country, it will often be quickly terminated.

Not all services do this, and Google does not do it all the time either.

Still, I see no reason not to take the super easy low-hanging fruit to reduce attack surface when you can.


Don't be evasive. It doesn't matter how you secure the channel, with cookies or magic beans: if the attacker controls the device you're using Google Mail from, they've compromised the account. What's the point of having a super secure key to a house with a wide-open hole in the side of it? There is no point, is the answer.


They may have compromised Gmail, because Gmail is logged in. They won't, however, get to the other 20 services that I have not recently logged into on the device that also use hardware tokens. They don't get the TOTP secret for my AWS account, which I only log into from that device in emergencies. Etc.

If someone gets remote access to your device it is a very bad day, but you -can- have damage control and a clear picture of what they had access to and what they did not.

If the attacker roots your phone and it has your unlocked password manager on it and google authenticator with all the 2fa secrets... well now they get the entire farm, including for services you don't have active cookies for.

Hardware tokens are not magic, but they are a very useful tool and if we combine enough tools we make the life of an attacker that much harder.


And then there is the recent story of someone getting private emails/password resets/paypal info sent because Gmail ignores the dot in their email address... https://news.ycombinator.com/item?id=14140569 - which makes one wonder if that's already an attack angle being used.


No, I understand that.

Let me simplify my point: Whether or not the TOTP secret is on the smartphone, encrypted or not, or sent to it from another device, is the wrong attack vector to optimize for.

Getting mainstream users en masse to consistently and correctly use any 2fa is a win.

Furthermore, you're moving the goalposts a bit by using the Yubikey in this scenario. So sure, if someone compromises your phone they don't compromise the Yubikey, but 1) how certain are you that your Yubikey is safer than a modern iPhone or Android model with the crypto and security engineering that entails and 2) how certain are you that accessing your iPhone's filesystem or execution state does not bypass this whole dance entirely?

For you, the minimal security gains might outweigh the usability costs if you know what you're doing. But a hardware token for most people, as the technology currently stands?


The device keeps the secret in self-contained memory and never exposes it. It is on the other side of a USB bus, or in some cases NFC. There have been local attacks against old designs via side-channel attacks etc., but never once a remote attack; the model makes that pretty hard. The phone is essentially zero-knowledge. Android and iOS allow arbitrary execution of user-installed code, are a massive attack surface, and have piles of published vulnerabilities.

By moving secrets to very simple easy to reason about devices we get substantial reduction in attack surface.

Also I have helped deploy these to several dozen people, taught workshops etc. It is no harder than teaching people to use Google Authenticator, but lower attack surface.

Use U2F where you can, and when you must fall back to TOTP, at least you can promise an attacker does not get a free pass to generate codes whenever they want, which is something.


Getting mainstream users to use 2FA? Is that like getting them not to use overly simple passwords? Or not use open wifi? Those people? Not to sound like an a-hole, but... how's that working out for ya?

I think ya might be able to argue that if you're going to add friction (e.g., a Yubikey) you're also creating a great sense of seriousness. That sense of seriousness is seriously lacking.

All that said, the UN + PW idea is too weak. We need something that's up to the threat AND is also appropriate to the risk of loss. Best I can tell, as a general mainstream rule, we're not even close to that. It's 2017? Really?


  The attacker who gets access to the filesystem of your
  phone can almost certainly defeat any "encryption" a TOTP
  authenticator would use to protect secrets
Is Android's hardware-backed keystore no good? The documentation makes it sound like keys can't be extracted or used without user authentication.


It doesn't matter how good it is, because once the application is used, the (now-resident) attacker gets the (now-unlocked) secrets.


Who are the we in your post?


Clicking on his name [1] you can see he works at Latacora [2] now. Hopefully that helps answer your question.

[1] https://news.ycombinator.com/user?id=tptacek [2] https://latacora.com/


Do you have a citation for the fact that google authenticator stores the keys in plain text? Furthermore, for the case of google authenticator on an iPhone, any files on the user partition are encrypted anyway, and I know from experience that the google authenticator app does not back up keys to either iCloud or iTunes backup. This should mean you are safe on iOS


If you are rooted, you can easily copy the sqlite database that secrets are stored in. I've done this a few times to migrate secrets to a new phone.
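For the curious, the migration amounts to reading one table. A sketch, assuming the layout Google Authenticator has historically used on Android (the `databases` file path and the `accounts` table with `email`/`secret` columns are from memory and may differ across app versions):

```python
import sqlite3

# Path on a rooted Android device (assumed; varies by app version):
DB_PATH = "/data/data/com.google.android.apps.authenticator2/databases/databases"

def dump_secrets(db_path=DB_PATH):
    """Return (account, base32 TOTP secret) rows from the app's unencrypted store."""
    with sqlite3.connect(db_path) as conn:
        return conn.execute("SELECT email, secret FROM accounts").fetchall()
```

With root, copying that one file to a new phone (or anywhere else) carries every secret with it.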


tokenizerrr | https://news.ycombinator.com/item?id=14105616

Even without root. Just run a backup and extract it from that. You can do it with just adb or helium.

--

I can't quickly find any examples online that don't specifically mention requiring rooting the phone. Just this anecdote: https://community.spiceworks.com/topic/465582-google-authent...

If you do a backup (even if not rooted you can use ADB to backup your apps and data,) then you can simply restore the app and data to your new phone and the codes for the device come with it.

This article appears to contain the most detailed instructions, stating that both Titanium Backup and manual extraction require root: https://www.howtogeek.com/130755/how-to-move-your-google-aut...

If your Android is rooted, you can use Titanium Backup, which we’ve written about before, to take a backup of your Google Authenticator app data. [...] If you have root access to your device, you can actually extract the credentials manually

It was at one point possible to extract from iPhone backups: https://dpron.com/recovering-google-authenticator-keys-from-...


ADB backups won't export the secrets from Google Authenticator. The app is configured to disallow that.

Source: I've tried.


Well, the scenario we want to defend against is an attacker that can remotely (or even locally) exploit/root the phone (see the long list of vulns for iOS and Android that have allowed exactly this). How many of these still exist, not yet patched?

Depending on who you work for, someone might just burn a 0-day on you. It all depends on your threat profile.

Putting the secret in a hardware token gives you easy to reason about assurances a mobile phone OS vendor can't ever offer.

Also this means when you get a new phone, you just install app and tap key. No setup required.

Easier for you, and far more secure. Win/win.


No, that's a scenario you want to defend against, and I'll remind you again that if you're dealing with attackers that can exploit your computing devices directly, the tokens are pretty much cosmetic. If you have an insecure phone and you actually use it like a smartphone, you're boned no matter how many security tokens you've got attached to your key ring.

When we work with lawyers, reporters, and NGOs, what we find are people with much more urgent security problems. They're one carefully worded email away from giving their entire email account away to a 25 year old in Estonia. They aren't worried that their phone is about to get owned up --- mostly because that isn't going to happen, but for other reasons too.

Real targets are going to be compromised for 3 reasons:

1. They're going to be phished out of losing their credentials.

2. They're going to share credentials between sites and lose them in a breach of one of those sites.

3. They're going to click on an attachment and lose their whole computer to an attacker.

The U2F/TOTP stack this post recommends nicely addresses (1) and (2), and nothing anyone on this thread is talking about addresses (3). I'm not sure why we're spending so much time considering (13).


I think the other part of this is that for these people (and probably most people) losing access to your gmail account is a catastrophic event.

That access can be lost because of an attack or by losing the keys. The former is actually much less likely than the latter so mitigating in favor of it instead doesn't make sense in this threat model.


Any device that generates TOTP tokens needs the secret key available by design. You could read the source code or spec sheets, but an easy way to prove this is by backing up Google Authenticator via Titanium Backup and restoring it to a new device. Now both devices generate the same codes.
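That "by design" claim follows directly from the algorithm (RFC 6238); a minimal sketch in Python using only the standard library, with the RFC's published SHA-1 test secret:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, step=30):
    """RFC 6238 TOTP: HMAC-SHA1 over the time-step counter, keyed by the raw secret."""
    key = base64.b32decode(secret_b32.upper() + "=" * (-len(secret_b32) % 8))
    counter = struct.pack(">Q", int(time.time() if t is None else t) // step)
    mac = hmac.new(key, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# `key` must be present in raw form for the HMAC: whoever holds the secret
# (phone app, backup file, or attacker) can mint valid codes forever.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", t=59))  # → 287082
```

The whole point of the hardware token is that this HMAC runs on the key, so `key` never has to exist on the phone.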

There have been plenty of iOS exploits as well as Android and everything else. Phones have a lot of attack surface and are not a reasonable place to store 2FA of any kind, IMO.

The separate hardware TOTP device never exposes the private key to system memory or disk at all. Even if your phone was rooted by a remote attacker, they could not generate tokens.

Likewise even if someone physically stole your unlocked phone and your pin-protected key... you are still in pretty good shape.

With a hardware TOTP device, the phone is just a "viewer" for one code at a time, as generated by the token.


For virtually all users, their iPhone is in fact the most secure computing device they own. It's meaningfully more secure than a computer running a desktop operating system. If we're talking about protecting applications running on a desktop OS, the idea of keeping things off the phone because "phones have exploits" is pretty silly; in that threat model, the desktop is also owned up, and with it the email account --- it's now secured solely by a cookie in your Chrome cookie store on your compromised desktop!

The threat modeling here just doesn't make sense.


Yes, the desktop is just as bad as the phone. This is why I don't allow any of those devices to have secrets that could be used without me being physically present, whether to access data of users I am responsible for or my own.

You are coming at this from a threat profile of joe individual user. Okay, point taken.

I am talking about the perspective of trying to take every reasonable step to remain secure while being targeted by skilled adversaries who have a lot to gain if they succeed. It is not that much extra work to reduce attack surface so much further than a TOTP generator on a phone offers, so why not teach best practices to anyone who will listen?

Say I have 30+ TOTP secrets in my mobile phone app, and also my password manager. Everything from Gmail to my AWS root account.

If the TOTP secrets are on my phone and an attacker compromises my phone... they get -everything-. If I am using a hardware token for TOTP and I quickly expire the sessions of all really important things I don't log into often, like AWS... then an attacker only gets a slice of the farm instead of the whole thing.

What this buys us is damage control and a much clearer picture of what an attacker could have accessed, and what they probably could not have because no cookies or secrets were available in memory or on disk at that time. I can assert that maybe this one token was phished, but that none of the others were at risk.

If the attacker is on a phone with Google Authenticator, they can just generate all the codes they want for every service. We lose the whole farm.


You keep doing this. I didn't say "desktops are as bad as phones". I said "phones are far better than desktops".

I promise you, my security requirements are as stringent as yours are. My 2FA stack is Hardware U2F, Software TOTP, and physically secured backup codes. That's what I recommend. You keep suggesting that this stack is inferior to yours, and I keep explaining why it isn't and why the threat model suggesting to you that it is is incoherent.


You didn't address my primary point.

Say we each have TOTP for say 20 accounts and our password managers on our phones with the credentials for them as well. We are system administrators with access to piles of PII. Account password resets require 2FA so email alone is not enough to spider to other accounts.

Both our phones have been rooted and are accessible by a remote attacker because some "coworker" sent us a new beta app that was in fact malware.

In both cases the attacker has all our passwords to all apps via our password managers. That is lost.

We each are logged into 5 of these services and the attacker steals the cookies. Those are lost.

Now what about the remaining 15 services we are not logged into? Things we don't log into super often, but some of which are quite important, like AWS root credentials.

In your case, the attacker goes and opens the Google Authenticator sqlite database and gets every TOTP secret you have in plain text. You just lost all 15 remaining accounts.

In my case those secrets exist on a hardware token and can't be accessed at all. If I catch my intruder at this point I can be reasonably sure those remaining accounts were not impacted.

Hopefully this clarifies the wider model I am working from.


If only there were a similar guide to getting gpg-agent working with Yubikey-stored GPG keys and ssh. I've done it, but for the life of me I couldn't tell you how, as it was mostly just trying magic incantations of things until it started working.


Here you go:

Simple GPG setup: https://github.com/lrvick/security-token-docs/blob/master/Us...

Advanced GPG setup with backups: https://github.com/lrvick/security-token-docs/blob/master/Us...

SSH Setup: https://github.com/lrvick/security-token-docs/blob/master/Us...

I also will be adding an alternate "quick ssh setup" guide via PKCS#11 flows to just store an existing ssh private key. Still I think GPG is the way to go in general given all the other use cases it opens up.
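For reference, the core of the gpg-agent/SSH wiring is only a few lines; a sketch of a typical setup (the file locations are GnuPG's defaults, and where you put the exports is a matter of taste):

```shell
# ~/.gnupg/gpg-agent.conf -- let gpg-agent also speak the ssh-agent protocol
echo "enable-ssh-support" >> ~/.gnupg/gpg-agent.conf

# ~/.bashrc (or equivalent) -- point SSH at gpg-agent's socket
export SSH_AUTH_SOCK="$(gpgconf --list-dirs agent-ssh-socket)"
gpgconf --launch gpg-agent

# With an authentication-capable GPG key on the token plugged in, this
# prints an ssh public key you can drop into authorized_keys:
ssh-add -L
```

After that, plain `ssh` uses the key on the token and prompts for the card PIN (and touch, if configured) instead of a passphrase.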

Please file issues with anything you want to see! I have a lot of unpublished content I can get polished/up if people care.


"You should disable any other keys that aren't backed by a security token" ... why? You don't need a security token to physically secure a backup key; just put it on a USB drive and stick it in a safe (or a sock drawer).

Security tokens are a nice little bonus for security, and they're a major corrective for the kinds of real-world attacks that screw real people over, like phishing (and dumb passwords). But they're pretty marginal against the kinds of attackers who will target SSH keys. Don't get fetishistic about them; at bottom, for serious systems security, they're mostly cosmetic. It feels good to say that all your SSH keys are held in secure devices, but it doesn't mean much.


It means a lot because I have to physically touch the device every single time I ssh.

I can even do agent forwarding taboos and know an attacker can't go creating new connections on that agent without a physical action from me each and every time.

Compare this to how ssh keys are normally used. You use it once, type in a keyloggable passphrase, and the key is unpacked plaintext into system memory for, in most cases, the rest of the time the system is booted.

You could invalidate the passphrase after every connection but this puts an unreasonable amount of work on the developer.

Simply tapping once for each connection and having no way for an attacker to avoid that is a great middle ground, imo. Particularly for high level production keys.


You're making a comparison to an example I didn't cite. I'm not saying that you should have software-only SSH keys on your computer alongside your Y4 SSH pubkey.


I was mostly responding to: "But they're pretty marginal against the kinds of attackers who will target SSH keys."

I am arguing it is not all that marginal. If the only non-0-day way into production is via ssh to a bastion host with a touch-based hardware token, then their lives are more than marginally harder than with an on-disk key.


Yubikey documentation for setting up ssh gpg and system login is ... Well, I couldn't find it when I looked circa 18 months ago. Thank you for this.


It's not as good as having a hardware key, but they still need your username and password, and to pwn your phone. That's a lot of trouble. So for most people, TOTP software is good enough security.


I agree mobile app 2FA is probably good enough for most people. This article, however, is about using a hardware token for login.

If you have a need for hardware tokens, use them end to end. Using a hardware token and having a less secure backup method means you are only as secure as that less secure backup method.


That simply isn't true, because the hardware token defends against phishing attacks --- in fact, that is the entire reason why U2F tokens exist in the first place. It's literally the motivating use case for the standard: experts with code generators were still getting phished.

So, when you have the token handy, you use it, and you're not exposed to phishing. When you don't, you use the mobile app, and you're exposed to phishing (but not to weak passwords and breaches in sites). It's not complicated, unless you think the token does more than it really does for your overall security.


I will grant you the phishing use case, and that one is relevant to average users. I admit I mostly work with infra folks that would not easily be phished, but might have one of their devices compromised unknowingly rendering phishing moot.

TOTP is a mess in regard to phishing but if we have tools to avoid some of the problems while we are stuck with it, I feel they are worth mentioning.

Particularly for people savvy enough to purchase hardware tokens for personal use.


U2F happened because Google developers were being phished. We are all phishable, probably especially those among us who believe we're not.


Are there any safe software alternatives to Google Authenticator? Duo? LastPass?


I use Authy: https://www.authy.com/ Now I'm wondering are there similar security concerns here as with Google Authenticator?


Authy has the same core issues, because the problem is they have to store the secret key in plain text somewhere for TOTP to work. Worse: it is closed source and does not allow itself to be easily audited, so we don't even get to know for sure where the key is stored and how, beyond what the docs promise.

Security is hard enough when everything is open source. Closing foundational security tools so only a select few biased individuals can deem them secure on a deadline is never a good plan.


Yes. But those concerns don't really matter. Just use whatever TOTP application you're most comfortable with, and, because even experts can be phished, try to use the security key as much as you can.


You can only use U2F for what, github and google? I am a big fan, but the 30 other services I need to access to do my job use TOTP.

We need all the help we can get on that until TOTP is finally phased out.


Yubico Authenticator and cryptostick.oauth are the only open solutions I am aware of that one can easily verify don't ever expose your secret key.

Both of course assume you have the secret key on a Yubikey, Nitrokey or similar.

Any app-only TOTP solution has to expose your private key somewhere by design and thus are best avoided in favor of hardware-backed solutions when possible.


U2F has some interesting properties. It cannot be phished (the browser sends the origin to the token), it binds credentials to the account (you can register one token with multiple services), it can be attested (e.g. a server can trust tokens only from manufacturer X), and it uses asymmetric crypto (P-256) instead of shared secrets.
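The origin binding can be made concrete: the token signs over a hash of the client data the browser assembled, which includes the origin the browser actually loaded, so a lookalike phishing domain produces a different signed message. A Python sketch of just the hashing step (the field names follow the U2F client data format; the origins and challenge are made-up examples):

```python
import hashlib
import json

def challenge_param(origin, challenge):
    """The browser builds client data containing its own view of the origin;
    the token signs SHA-256 of it (together with the appId hash and a counter),
    so the server can reject assertions minted against any other origin."""
    client_data = json.dumps(
        {"typ": "navigator.id.getAssertion", "challenge": challenge, "origin": origin},
        separators=(",", ":"),
    ).encode()
    return hashlib.sha256(client_data).digest()

real = challenge_param("https://accounts.google.com", "abc123")
phish = challenge_param("https://accounts-google.evil.example", "abc123")
# Same challenge, different origin: different signed message, so a relayed
# response fails signature verification at the real site.
assert real != phish
```

TOTP has no equivalent of this check: a valid code is a valid code no matter which page the user typed it into.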


Thanks for writing this!

One nitpick: the guide says "If you're curious why it's important to not have a phone number on your account, see the security key FAQ", but the linked security FAQ doesn't actually appear to say why it's important.


Sorry about that, I'm updating that FAQ next.

The answer is that SMS is not a secure second factor (it's easy to hijack and eavesdrop on), and in some cases when you give a service a phone number, it becomes possible to take over the account with just control of the phone number.


There was an article floating around on HN a while ago about someone stating that an attacker was able to call their cellphone provider and use social engineering tactics to transfer their number to the attacker's own account, ultimately circumventing second-factor auth via SMS and gaining access to their Google account. I believe it was an actual Google employee, so they were able to recover the account quickly.


And some services that do phone calls can be tricked to save to voicemail (which was the case for Google, Facebook and various others previously): https://shubs.io/how-i-bypassed-2-factor-authentication-on-g...


Google now offers "Google prompt" which sends a push notification to your phone through the google app. How secure is this method?


Much better than SMS, but not as good as a security key, because if you can fool someone into logging in to an impostor site, you can get their email account.

It would be a reasonable backup in place of (or in addition to) Google Authenticator.


Which is why password+SMS is sometimes called "1.5 factor auth"


Because it can be a way to compromise your account[0]. HN discussion[1]:

[0]: https://blog.coinbase.com/on-phone-numbers-and-identity-423d...

[1]: https://news.ycombinator.com/item?id=12597609


> We were also able to get in contact with an outstanding Verizon employee who understood the urgency and impact of our situation and shepherded our case through the byzantine halls of inter-carrier communications. We had control of the phone number back by 2 PM (which, if you’ve ever tried to get two phone companies to talk to each other, is a significant achievement. We were initially assuming we wouldn’t be able to regain control until the following week).

How in the world is a random guy supposed to do this sort of thing? Anyone have any tips for people who get into these situations and who aren't already BFFs with C-level executives of phone companies or something like that?


I have answers/arguments along those lines here: https://github.com/lrvick/security-token-docs/blob/master/FA...

Would be interested in seeing contrasting views though!


These are indeed answers, but they aren't the real answers.

The real answer for "why not a smartphone app" is "because code generators are just as phishable as passwords". In the real world, that's how people are being compromised, not by elaborate phone exploit pivots but by phishing pages. It also speaks to why phone authenticators are acceptable backups to tokens.

The real answer for "why not SMS" is "because both teenagers and intelligence services can get a phone number redirected; your phone number is not your phone."

Obviously, you don't PIN-lock a U2F token; the answer to "what if it's stolen" is "whoever stole it probably doesn't have your password, which they'll need in order to use the token, so if your token is stolen remove it from your account and then fish $17.99 out of your couch cushions and buy a new one".


Code generators are super phishable, and that is the whole reason to abandon them in the medium term. In the short term, however, they are all we have for most websites, so protecting the secret in a hardware token is as good as we can get.

No matter how much you protect the secret though, not getting phished is left to the hopefully paranoid user, which is for sure not ideal, but we are probably years out from TOTP being replaced with U2F for most sites.

TOTP via hardware tokens is a stopgap.

Great comments though. Will update to reflect them.


Code generators are indeed phishable, which is why your primary login factor is a U2F token. Meanwhile, because of the way phishing works, if you go log in of your own volition to your Google Mail account, the TOTP code provides about as much security as the U2F key does.

The idea behind the U2F/TOTP stack is to minimize your exposure to phishing attacks and at the same time minimize (to practically zero) the odds of you being locked out of your account. It accomplishes that nicely, which is why most of the other experts we talk to have U2F/TOTP/backup-codes as their Google 2FA stack.


The article says that any key will do. Is there any concern with buying a less expensive security key from a less established company, or even a third party seller on a site like Amazon? Could a malicious entity make an intentionally weak security key and sell it? How would such an attack be detectable?


I would stick to things like nitrokeys/yubikeys that have gone through rounds of side-channel attacks, research, and upgrades.

The only one I can generally suggest for most people right now, in spite of it being closed, is the Yubikey 4, mostly because it can be configured to require a physical touch for each operation, something a remote attacker can't do.

I started putting some comparisons down here: https://github.com/lrvick/security-token-docs/blob/master/De...


U2F Yubikeys are so cheap and available (Amazon will ship them Prime) that I'm not sure why you'd waste time looking for alternatives.


I'd like some advice about safely accessing gmail from your phone.

In particular an android phone that might not have the latest version of android on it.

Also for situations where not only do you access your gmail from your phone but also your google authenticator app is installed on it.


The recommended way is app passwords. You basically generate a password for each app that needs to access your mail account. You can easily revoke access for a single app in case something goes wrong. Also, nobody gets the chance to read your actual password.


Unfortunately, you can't access GMail over its "native" protocol using app-specific passwords: it'll only work for IMAP. And the GMail client is a terrible IMAP client. My inbox and folders / labels would constantly desync. I'd moved to FastMail a while back and the problem persisted, so I'm reasonably sure it's the client.

I'm actually using Outlook as my e-mail client now. It's surprisingly snappy for my minimal needs. Maybe I should switch to iCloud for e-mail, and aim for the trifecta...


Did you try K-9 Mail?


Warning: Outlook for Android stores your mails and all other data in the cloud! It is the former Acompli app.


Upvoted because I would love an answer to this as well. An associate in the phone-cracking hobby has mentioned to me that _no_ Android phone is malware-free. They have cracks and methods for phones and OS versions that have yet to hit the market. I've understood from him that the only "secure" device is the iPhone, and even that is not secure from targeted (government) attacks.

That said, I personally would never own an iPhone and I'm blissfully happy on my Note 3 with Android 4.4. But I don't access my mail, bank, or anything else sensitive on the device. For me it is no more than a phone, camera, and Anki interface!


We recommend using the authenticator app to log in on your phone. It's not as secure as the key (since you can still get phished), but one way to mitigate that is to type in the gmail URL by hand.

An iPhone is so much more secure than any laptop that it more than makes up for the small drop in second-factor security.


Encrypt + add PIN + disable SMS/Phone authentication? Is this insufficient?


Thanks, but I don't know if it is or not. The steps you mention, apart from the last, are all about protection in case a phone is stolen. What really concerns me is someone hacking my phone through some kind of malware. If Android is as insecure as some say it is, then is it risky to log in to gmail on any Android device ever? What about the Google Pixel range of phones? How do they compare to the iPhone? I don't want an iPhone if I can help it; I prefer Android to iOS. In fact, a lot of people I know who had iPhones have been switching over to higher-end Android devices. But security trumps everything else if it's a question of keeping email secure.


I'm not sure how you can protect against malware aside from not visiting sketchy sites or downloading shady apps. Pretty much by definition they use security holes, which if people were aware of wouldn't exist in the first place. Only attack vector I can think of is some kind of SMS attack, to which the "solution" would be to disable SMS, if you're really worried about that. Personally I'm not.


The HyperFIDO Mini (U2F Security Key) is the cheapest and smallest key I've found so far for $10. (Amazon)

The Yubico keys are probably the best keychain candidates. No one wants to trust their key to a weak nylon thread.

You can also set up a Google account to use more than one U2F key.

As for Google 2FA, I think Google caused a lot of confusion by how they set up the Google Authenticator app. Always opt for the text generator code instead of the barcode. You can then enter that code into a second Google Authenticator app on another device. Google at one time stated that you could only set up 2FA on a single device, which makes most users leery, as one could lose his or her phone.


Remember to add at least two U2F keys. It's easy to lock yourself out if your only one is lost or broken.


The HyperFIDO Mini (U2F Security Key)

clickable link: https://amzn.com/dp/B00WIX4JMC


So by switching to a security key that nobody else uses, you've saved $7.99. With that money, you could buy a cup of coffee at Starbucks and have some money left to donate to the change jar.


There's nothing stopping you from scanning the barcode multiple times.


Didn't it change the web page on your computer browser after you successfully added it into Google Authenticator?

I suppose you could always take a photo of the QR code and then rescan that. Text seems simpler.

edit: Anyone else remember this behavior? Old version? Browser specific?


It changes when you input the current code. You can scan it multiple times, print it, and then input the code from one of your devices.
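Worth noting why multiple devices can coexist at all: TOTP is fully deterministic, so the same shared secret plus the same clock produces the same six digits everywhere. A rough stdlib-only sketch of the RFC 6238 algorithm (the base32 secret below is the RFC's published test-vector key, not anything real):

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32, t=None, digits=6, step=30):
    """RFC 6238 TOTP: HMAC-SHA1 over the 30-second time counter."""
    pad = "=" * (-len(secret_b32) % 8)
    key = base64.b32decode(secret_b32.upper() + pad)
    counter = int((time.time() if t is None else t) // step)
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                                # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test-vector key ("12345678901234567890" in base32):
SECRET = "GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"
print(totp(SECRET, t=59))  # any "device" holding SECRET prints the same code
```

So scanning the QR (or typing the text code) on a second phone just copies the key; there is no per-device state for Google to enforce.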


Also, if you have a rooted device, you can get the original secret from the SQLite database of the authenticator app.


"can get the original secret" is a phrase which should worry a security-conscious person


rooting their phone is not something a security-conscious person would do, either.

Edit: maybe I should have explained my position. There are a few security issues with rooting a phone, e.g.:

- rooting usually requires unlocking the bootloader. Once it's unlocked, anyone can flash or boot a custom recovery and modify your system partition. Enrolling your own keys in the recovery and re-locking the bootloader, while possible, is an undocumented and complex process that just about nobody uses, see https://mjg59.dreamwidth.org/31765.html . You're also screwed if a system update replaces the recovery. Once the bootloader is unlocked, anyone with physical access to your phone can mess with your system in malicious ways.

- it circumvents the system's permission model. A malicious app that tricks the user into granting it root rights (maybe for a legitimate reason) could access information it shouldn't have, install a keylogger, etc.


Even without root. Just run a backup and extract it from that. You can do it with just adb or helium.
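For what it's worth, the unencrypted `.ab` container that `adb backup` produces is simple: a four-line text header followed by a (usually zlib-compressed) tar stream. A rough Python sketch of unwrapping one, assuming an unencrypted backup and an app that allows backups at all:

```python
import io, zlib

def unpack_ab(ab_bytes):
    """Turn an unencrypted Android .ab backup into raw tar bytes (sketch)."""
    stream = io.BytesIO(ab_bytes)
    assert stream.readline() == b"ANDROID BACKUP\n"        # magic line
    stream.readline()                                      # format version
    compressed = stream.readline().strip() == b"1"         # compression flag
    assert stream.readline().strip() == b"none", \
        "encrypted backups need the password-derived key"
    payload = stream.read()                                # tar archive body
    return zlib.decompress(payload) if compressed else payload
```

From there the standard `tarfile` module gets you at the app's files and databases.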


That doesn't work for Google Authenticator. Apps can opt-out of being able to be backed up, which even prevents adb/helium backups (unless you're rooted).


The article mentions Yubikey at $18. As an alternative, the Nitrokey U2F is only €9 (€11 including delivery)

https://shop.nitrokey.com/shop/product/nitrokey-u2f-5


Plus, unlike with Yubikey, Nitrokey has open-sourced both hardware and firmware [1].

[1] https://github.com/nitrokey


True for their storage and encryption products. Unfortunately not for their U2F product. "Nitrokey U2F is a relabeled 3rd party product and hence not open source."[1]

[1] https://shop.nitrokey.com/shop/product/nitrokey-u2f-5


Yubikey U2F is much higher quality physically than Nitrokey U2F. I wouldn't trust the Nitrokey to last very long. I may be wrong, but that's just my impression from comparing them in my hands.


The article states:

>If you're curious why it's important to not have a phone number on your account, see the security key FAQ.

but this is not explained in the FAQ. I've never heard about this before, why is this important?


I guess because it's relatively easy to redirect/capture an sms/phone call. Trick the phone company into moving your number to a new sim for example


Yes. Also, governments can see SMS in transit (a concern in many places), and SMSes can show up on a lock screen. I'm updating the FAQ next; sorry for this dangling reference!


If you want to use your Google account with your iPhone's built-in Mail, Calendar, or Contacts apps: the security key doesn't work with the apps that come on your iPhone, but you can use Google's apps instead.[0] I'm using Google Contacts on my iPhone. It seems the security key is not for me. :(

[0]: https://support.google.com/accounts/answer/6103523


Have you tried using app specific passwords?


Is there any point in doing this if you do not use Chrome?


Only Chrome supports U2F. Firefox has experimental support if you enable special flags in about:config, but I never got it to work.

U2F will be superseded by Web Authentication [0], which subsumes U2F and will be supported by all major browsers. Edge includes a draft-spec API that uses the TPM to store keys.

[0]: https://w3c.github.io/webauthn/


You can use U2F in Firefox with an extension. Last time I tried, it worked. However, I use Chrome most of the time, so I am not sure if it still does.

https://addons.mozilla.org/en-Us/firefox/addon/u2f-support-a...


You should not use Firefox if you want the protection of a U2F key.


It works! Thanks.


Opera also has native U2F support, although some services (I'm looking at you, Dropbox) refuse to show the token UI to anything but Chrome via browser UA checking.


Bought a U2F Yubikey more than a year ago. It is pretty sturdy. Better buy two and use one as a backup. U2F is really convenient to use. Compare that to all the OTP apps out there.


My Yubikey recently died, so I'd +1 the approach of having another as a backup. They offered to replace it under warranty, though.


I am very curious what you did to kill it. I have been unable to with anything short of a hammer or soldering iron.


Can attest to sturdiness. I have washed and decased 2 of mine with acetone and they still work as expected.

As an alternative to a second key as a backup, you can always keep a set of printable backup OTP tokens offline in a safe etc, ideally encrypted.


How does the communication between the USB key and Google in a browser work? Will it work on all operating systems and browsers?


As for the operating systems: in order to use it on your phone, you'll need to be careful to use a YubiKey with NFC support and have a phone that supports NFC. You won't have any trouble using it on Windows/Ubuntu/macOS.

As for the browsers: Chrome/Chromium works fine. Firefox has an addon that adds security key support[0]. Unfortunately, this addon has a bug which causes high CPU load[1]. Firefox is also working on adding native support, and I think their ETA for that is somewhere in the second half of this year. Unfortunately, even with the U2F addon on Firefox, you still won't be able to use it for Gmail, since Google hardcoded Chrome as the only browser with U2F support, so you're going to have to change your user agent, or fall back to your second method of 2FA (Google Authenticator or something of the sort). I have no knowledge about the U2F status on Safari and Edge.

As far as I was able to discover, the only service that didn't hardcode Chrome as being the only browser at the moment with U2F support is GitHub, and using YubiKeys with Firefox + U2F addon on GitHub works without any issues (other than the occasional high CPU load that I've already mentioned).

I wrote a short article on my blog about Yubikey usefulness for my usual setup about a year ago[2]. Things changed slightly to the better since then, but not by a lot.

[0] https://addons.mozilla.org/en-US/firefox/addon/u2f-support-a...

[1] https://github.com/prefiks/u2f4moz/issues/51

[2] https://blog.r3bl.me/en/yubikey-review/


U2F itself as a browser API is a dead-end and will be replaced with newer FIDO 2 WebAuthentication API, but fear not - it's supposed to be compatible with U2F tokens in use [0].

WebAuthentication will be supported in all major modern browsers [1], it just takes some time to implement.

[0]: https://bugzilla.mozilla.org/show_bug.cgi?id=1065729#c254

[1]: https://www.chromestatus.com/feature/5669923372138496


I would add that Yubikeys and a number of other tokens work just fine over USB OTG, with TOTP if nothing else.

If you have a phone with USB Type-C and a Yubikey 4c you can just plug it in direct.


You can also use U2F on Firefox with FastMail.


The best part of the article:

"If you're on a newer mac, you may have to use a USB adapter, like an animal"


"We'll remove the phone number later"

Too late, Google now has it and can correlate my profile with other sources. There's literally no other reason why Google doesn't let you enable 2FA without a phone number.

I wish I didn't have to choose between security and privacy.


There is a more benign reason Google requires a phone number—they're worried you'll lock yourself out of your account.

I'll take the Pepsi challenge with anyone on Google bashing, but I think this one is not fair.

It does hold for Facebook, though :-)


Get a throwaway or a burner phone? If you're that concerned, pick up a $20 prepaid and quit your whining.


Relevant username, I see.



