I'm a big fan of what Keybase is doing, but I've avoided installing it on my newest computer because I'm unhappy with their client.
To list a few points:
* No application preferences, so no obvious place from which to disable KBFS. Even though I never use it, it somehow idles at 160MB of memory, and after each restart it adds itself to the top of my Favorites in Finder!
* Between the Electron app and Keybase services, another 360MB of memory gets used. In total that's 520MB, which is a fairly large amount of memory for a laptop with only 8GB.
* Quirky menu bar behavior which doesn't match macOS design guidelines.
* No option to disable automatic updates, or even reduce the check's frequency. Currently, it runs every hour, even though they can go weeks without a release!
They're trying to do way too much with a single app; desktop apps are usually more granular. You could break it up into three or four apps:
* Core. Includes account, device manager, identity proofs, and CLI tools.
* Chat. Includes contacts and people search. Heck, you could even offer to scan the user's contact list or keyring for matches on Keybase.
* KBFS. A menu bar app with a preferences view. Functionality should probably be implemented as a Finder Sync Extension [0].
> No option to disable automatic updates, or even reduce the check's frequency. Currently, it runs every hour, even though they can go weeks without a release!
Why is this a problem? Is the check heavy somehow? An hour between quick checks sounds like a reasonable interval for a security-sensitive app.
What bugs me about the direction Keybase is going is that they still have not implemented a way of disabling the ability for users to send me encrypted messages.
I do not want Keybase to hoard encrypted messages I will never be able to read because I do not want to install their application on my computer. My Github issue for this has gone largely ignored:
I am thinking I am long overdue to placeholder my account until this is solved. I already have 10 encrypted messages I will never be able to read. I joined Keybase as a public key repository with external verification support, not for them to store private conversations -- encrypted or not.
> I joined Keybase as a public key repository with external verification support, not for them to store private conversations -- encrypted or not.
While I agree with your comments on feature-creep, in order for you to worry about someone having a copy of your encrypted communications you must assume that the encryption scheme is completely broken. This raises the question: why are you using PGP at all if you think the cryptography is broken?
Keybase has created an inbox in your name which in turn creates a social contract on your behalf to check it. Existing users signed up for something different, so no wonder some of them want to disable that feature.
Again, I agree with the feature-creep point. What I was asking is why the comments about private messages seem to imply that encryption isn't considered sufficient protection when a third party holds a copy of a message it will never be able to read.
Not really. PFS is about protecting a long-term key from being broken and then historical communications being uncovered. If you receive a one-off message then it's not materially different to being PFS with just a single message.
I use PGP every day. Who messages me, how often, and at what times, is still private information and I should have a say in where and how that happens. My PGP-encrypted conversations tend to be much more sensitive than any other medium I use.
The cryptography is almost certainly not broken. That does not mean it won't be broken in the future. I would have the same concern if my TLS-encrypted traffic was being saved. If my ISP was saving TLS traffic or my XMPP provider (the one that I don't host, anyway) was saving OTR conversations, I would be equally concerned.
Even worse, actually. TLS (usually, nowadays) and OTR both employ forward secrecy. PGP does not, at least traditionally.
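A toy hash ratchet illustrates what forward secrecy adds here (a sketch only; real protocols such as OTR and Signal also mix fresh Diffie-Hellman exchanges into each step):

```python
import hashlib

def ratchet(state: bytes):
    # Derive a one-off key for this message, then advance the chain state.
    msg_key = hashlib.sha256(b"msg" + state).digest()
    next_state = hashlib.sha256(b"chain" + state).digest()
    return msg_key, next_state

# Placeholder for a secret agreed in an initial handshake.
state = b"initial shared secret"
message_keys = []
for _ in range(3):
    key, state = ratchet(key := None) if False else ratchet(state)
    message_keys.append(key)

# Only the latest `state` is kept; old states and message keys are deleted.
# SHA-256 is one-way, so stealing today's state reveals nothing about
# yesterday's keys -- that's forward secrecy. A long-term PGP key, by
# contrast, decrypts every message ever encrypted to it.
assert len(set(message_keys)) == 3
```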
People could just post these encrypted messages on pastebin, Dropbox, whatever. It's someone else's choice to send you the message and paste it somewhere. You can choose to ignore it, but it's not really your right to tell someone else not to do it.
As a casual user who has occasionally used Keybase for sharing work-related sensitive things, it will be interesting to see how KBFS vs. upspin plays out. As far as E2E encrypted file storage goes, the two seem to offer very similar abilities. Obviously, Keybase's other services offer more in the way of messaging and user social proof. I also like the way you can privately share things with individuals using the directory names, vs. the ACL file in upspin, though I guess that allows read-only sharing and a stronger concept of ownership... you can't do that for shared private files in KBFS, right?
The other cool thing about upspin is the ability for anyone to spin up their own server infrastructure for storage. I can imagine a service offering an on-premise secure Dropbox to paranoid big companies. I'm not sure if you can deploy a private Keybase, though their client code at least is open source.
Yeah it's not new. The chat feature has existed for a short while, too, which deserves more fame as being an e2e encrypted multi-device IM system that isn't tied to your phone number.
> The chat feature has existed for a short while, too, which deserves more fame as being an e2e encrypted multi-device IM system that isn't tied to your phone number.
On the other hand, it doesn't feature forward secrecy.
You can sorta kinda almost bootstrap that by putting your 'signal public key' in your keybase public folder
(by signal public key I mean that your safety number with any given person is always a concatenation of a string that identifies you and a string that identifies them and you'll find that same string across your safety numbers with all your contacts)
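To make that parenthetical concrete: the displayed number is just two per-party halves concatenated in sorted order, so your half is stable across all your contacts. A toy model (not Signal's real derivation, which iterates SHA-512 many times over the identity key and identifier):

```python
import hashlib

def half(identity_key: bytes, user_id: bytes) -> str:
    # Toy stand-in: derive 30 decimal digits from one hash of the
    # identity key and identifier (Signal uses far more iterations).
    digest = hashlib.sha512(user_id + identity_key).digest()
    return ''.join(str(b % 10) for b in digest[:30])

def safety_number(me, peer) -> str:
    # The displayed number is the two per-user halves, sorted -- so
    # *your* half appears unchanged in every safety number you share.
    return ''.join(sorted([half(*me), half(*peer)]))

alice = (b"alice-identity-key", b"+15550000001")
bob = (b"bob-identity-key", b"+15550000002")
carol = (b"carol-identity-key", b"+15550000003")

# Alice's half is a substring of both of her safety numbers:
assert half(*alice) in safety_number(alice, bob)
assert half(*alice) in safety_number(alice, carol)
```

Publishing your half in a signed, Keybase-verified place would let a contact check it against the safety number their Signal client shows.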
I realize keybase has raised a lot of money but can someone clarify how they plan to make money other than make "cool" stuff that only highly technical engineers understand?
It is a good question, but no, I don't think anybody will be able to tell you more than what's in the "Business Model?" section of the article:
> But, as stated above, there is currently no pay model, and we're not trying to make money. We're testing a product right now, and we'd like to bring public keys to the masses.
They say they won't sell ads or data; whatever revenue they make will come from organizational users, with the goal of providing a useful free tier that every person in the world can use. I assume that means that in addition to staying free, it will get easier to understand over time.
It's pretty nebulous, but at least they are saying the right things while saying nothing... :)
keybase uses your gpg key more as a _bootstrap_ to prove your identity, and to then sign your _keybase key_ (which isn't gpg as far as I could tell from the spec).
e.g. you can see I used my GPG key to sign all of my proofs (proving identity), and to also sign my keybase keys which then have signed other keybase keys:
Aaah, okay, I actually do remember about that worked-key now. That makes sense.
But then the problem that I have with that is this worked-key is a lot less secure than my PGP key on a hardware token. What I'd like would be for keybase to make those keys depend on my PGP key, for instance by decrypting them at the beginning of each session.
I'm not sure I get the point of these device keys to be honest. Why not simply generate a new key every time one is needed, and then sign and encrypt it with my PGP key?
After all, that's basically how basic PGP encryption works: the message is encrypted with some symmetric cipher using a random key, and then this key is encrypted with the asymmetric cipher (several times if there are several recipients). Nobody has to worry about those "intermediate" throw-away keys; they're just stored alongside the ciphertext.
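That hybrid pattern can be sketched with the `cryptography` package (an illustration of the general scheme, not PGP's actual wire format; RSA-OAEP and AES-GCM stand in for whatever ciphers a real key specifies):

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# The recipient's long-term asymmetric key pair (a PGP key, conceptually).
recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# 1. Encrypt the message with a random one-off symmetric session key.
session_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, b"the actual message", None)

# 2. Wrap the session key with the recipient's public key -- repeated
#    once per recipient when there are several.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = recipient_key.public_key().encrypt(session_key, oaep)

# The recipient unwraps the session key, then decrypts the message.
recovered = recipient_key.decrypt(wrapped_key, oaep)
plaintext = AESGCM(recovered).decrypt(nonce, ciphertext, None)
assert plaintext == b"the actual message"
```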
Most people don't do what you are doing with PGP. In other words you are not in the demographic they are targeting. For their target demographic (people who don't use crypto, or use it poorly), what they are doing is much better than what the people have now.
How so? I don't really get the point of these device keys to be honest.
If anything it seems more complicated than what I'm proposing. People who don't use crypto will probably let keybase manage their private keys (at least at first) so this could be handled transparently.
I mean, you could turn it the other way around. If this system is confusing and unintuitive for somebody like me who is familiar with the details of asymmetric cryptography, how are less technical users supposed to figure it out and understand the trust model?
The point of the device keys is that you don't need a PGP key at all. PGP and carrying around a master key securely everywhere is unnecessarily complex for most people. I don't use PGP at all, but I can still use keybase to do encryption/decryption seamlessly across multiple devices which is great.
The point of a device key is that you can revoke such a key in the event it gets lost and not have people accidentally use it to send you secrets (assuming they use keybase online / are up to date )
But you could do this even if those keys were encrypted with the master PGP key.
I don't have any issues with using sub-keys; it's a very good idea actually, for the reason you mention. I just wish I had the option to tell keybase "never store those keys in cleartext, always encrypt them with the master key". Then it would ask me to decrypt the keys on startup and everybody would be happy (well, at least I would be).
I was very confused about this as well. I set up an account on my computer a few weeks ago with my yubikey, and since then I have reset my computer, so I decided to install keybase again. Unfortunately it asked me to approve the new device from an existing one, and didn't offer the option to use my yubikey instead.
I have since emailed support, and they clarified that they don't support yubikeys as the auth layer since there aren't enough users with them for them to justify this work, which is understandable, but I have to say it was pretty confusing.
This looks really cool. I'm a little concerned that "there is currently no pay model, and we're not trying to make money. We're testing a product right now, and we'd like to bring public keys to the masses." What's the harm in charging $5/month for 50GB as a beta service that might not last forever to help fund development? Maybe even keep the same level but allow people to pledge monthly to support your efforts? It just always seems to be the case that there's a lot of runway until there isn't. Building a product and making money are not mutually exclusive.
The harm is that $5 / month for 50GB is the wrong model.
If this is valuable, they think 10GB is enough to test the fundamental premise and see people hit the edges of what they'd want to pay for.
There might be a segment of the market for whom this is worth vastly more than $5 / month. There might be a segment who wants to pay that little, but it's not about storage. They may not learn these things by priming people on a per-GB storage model.
I agree. My point wasn't necessarily that they should charge for storage but more so allow the community to support them financially as they're working on their product(s). The post saying, "we're not collecting money, we're not trying to make money right now" just seems kind of strange. Keep the free model. Let people play around, but I still don't see harm in allowing someone from the community to pay keybase for working hard on a hard problem. I'd be more than happy to pay them right now just because I love the work they do and want them to succeed.
They should charge to get an enterprise version with certain guarantees. Without payment and a contractual guarantee, it can only ever be for personal use.
You say this, but we use it as a convenient way to onboard new engineers. There is a huge amount of key material to exchange to bring one of our developers online, and it's a lot easier to move it without thumb drives or passwords shouted across the room (or the famous email-tar.gpg and slack the password).
It's not reliable for storing customer data, no. But it's quite useful for a lot of quick engineering work.
Neither seems terribly relevant, to be honest. In terms of namespacing, corporations don't have their own space. What you do is have individuals send files amongst each other.
And, are you suggesting that a corporate process has to rely on technology that is always available? I'm fairly sure that they don't. If corporate Wi-Fi and Microsoft Office are any indication, anyways.
Please excuse any punctuation or capitalization today. I am training a new interface to Hacker News and the voice recognition sometimes produces slightly odd text.
I've come to love Keybase. I can be chatting with someone, and decide to share a file. So I just create a shared private folder, and copy the file there. All GnuPG authenticated and end-to-end encrypted.
I get that they don't implement forward secrecy. I gather that forward secrecy would break the GnuPG model, in that stuff couldn't be decrypted on any device with the requisite key pair.
I just used keybase fs the other day to send someone a draft of a book I'm working on, with nothing but our reply chain on HN as a starting point. It's like you're just one `cp` away from anyone who uses keybase!
Right. And if both of you are authenticated on HN, you don't need to know anything else about them. That is, anyway, if it's enough to know them by their HN username.
No, they have their own key and encryption scheme (called saltpack), which - to the best of my knowledge - is not compatible with GnuPG.
I wish they had made it compatible with modern GnuPG, making it possible to transform the container format for keys and messages (I'm not sure it's possible for messages, but it certainly must be possible for the keys). Sadly, it's not.
Ah. I'm no expert, for sure. So are you saying that the file system and chat don't use users' GnuPG keys? Upon reconsideration, that makes sense. Remembering the setup process, that is. And the fact that I don't get GnuPG unlock prompts when I'm using the app. Feel a little sheepish about that, I do. And it'd be good if Keybase were clearer.
But still, it is GnuPG authenticated, I think. Because the account is tied to both a GnuPG key and the Keybase container encryption.
I'm interested in this as a technical solution, but I'm not sure what content I actually expect to share or see shared on Keybase. Public dropbox links work pretty well for me right now for "ordinary" files.
I've always wondered, is there a possibility for a state MITM if they manage to control the root-signing authorities? Can't they always replace it with their own signed certificate?
As I understood it, they'd have to MITM both Keybase.io and every single one of the websites where a given user verifies their identity (and then there's certificate pinning which I would assume the Keybase client would know about and respect, but then again if your download of the client was compromised ..). Not impossible, but there's a lot of logistics to get right.
Also, they'd have to do it as you sign up. Once your profile is active and you've followed people and their signatures are cached on your local computer, the root CAs are no longer the single point of failure and MITM'ing becomes impractical.
That said, if a state, especially the one you're in, targets you personally, you're going to have a bad time.
I don't know much about crypto, but this argument seems extremely compelling to me. Why would I want my network of trust to be public or centralized? Can someone more knowledgeable comment on whether parent is making sense?
EDIT: In fact look, from their /docs/server_security they have this:
> Here are the attacks we are most concerned about:
> Server DDOS'ed
> Server compromised; attacker corrupts server-side code and keys to send bad data to clients
> Server compromised; attacker distributes corrupted client-side code
Why would I want my GPG security to depend on whether some company got hacked or not? This seems like a terrible idea.
Anecdotally, it seems that new files (or new revisions) are downloaded on first use.
If you have sufficient local space and use KBFS only on a single machine, then you'll never wait. If you modify a file every time you open it, and do so on alternate machines, you'll wait every time.
I've slightly lost confidence in Keybase recently, since logging on after a while and discovering that they're _heavily_ promoting their app.
It's a bit disturbing to see them act like so many other sites that push something you just don't need. Also, after asking them about this on Twitter, I didn't get a reply - for an organisation that's all about identity, that's not a great sign.
There are at least a couple eyebrow raising things about Keybase given that they play off of something as hardened as PGP. But to my surprise Werner Koch uses it and verified it with gnupg.org itself. I would have continued to abstain but it makes no sense to be holier than the creator of gpg himself.
You can verify to keybase by posting a hash using the "about" field in your user page on HN. Simply making a comment is insufficient. The details of what to do are available at keybase.
We don't use GitHub releases for our official packages. Instead (if you're on Linux) we publish Debian/RPM/Arch packages that update automatically through your system package manager. See the install instructions here: https://keybase.io/docs/the_app/install_linux
That's partly true, but it also uses a build script that doesn't include the KBFS or GUI bits, which isn't directly under our control. At some point I'll work with the Arch maintainer to figure out what we want the official packages to be, but for now the easier control we have over the AUR package has been convenient. (It's shocking how much easier it was to get our Arch packaging working, compared to deb/rpm.)
Security-wise, you could verify our signature on the .deb package that aur/keybase-bin is downloading. The Linux install instructions describe where to find the sigs.
I'd recommend doing the same thing we do in Arch: repackaging binaries from the .deb packages. We release Linux builds almost daily (if the CI tests are happy), and I think it would be silly for us to tag the repo at that rate.
I'm not a mod, but three out of your five comments contain the word "shit," and one of the five was flagged. HN has a lot to offer, but only if you put effort into it.
EDIT: Just to clarify, it's fine to call out something as bullshit, but you have to justify your reasoning. Ideally it wouldn't be so emotional, but an informative comment is valuable regardless of the clothes it wears. E.g. if you dig into the ToS and highlight clauses that illustrate why it's bullshit, that would be useful.
[0] https://developer.apple.com/library/content/documentation/Ge...