Getting traction for privacy-focused projects; or, security never sells

One of the biggest pieces of news to cross the IT community in the past decade has been Edward Snowden’s revelations about the NSA intercepting and archiving vast swaths of internet traffic—email, HTTP, and more. (Not that these revelations should come as a surprise to anyone with even a passing interest in internet architecture and current events; what Snowden revealed is a small logical step beyond what William Binney and other leakers told us years ago.)

These revelations naturally made a lot of people pretty angry, and some of those people are trying to do something about it by writing software that’s built from the ground up to protect the user’s privacy and anonymity. That’s commendable—in a very real sense our privacy can only be protected to the extent that we as developers bake it into the fabric of the programs we write.

But despite the renewed interest in privacy, anonymity, and encryption in the hacker sphere, I’m worried that it’s not going to make a big difference. And not for lack of technology or lack of effort—we’ve proven with projects like GPG and Tor that the OSS community can create and deploy truly powerful tech—but rather for lack of adoption.

After all, what good is the best encryption software if nobody uses it? Can we call a privacy-enabling product successful if you can’t use it to communicate with your doctor, your accountant, your lawyer, or your mom?

Adoption rate is the most important metric in the fight for getting our online privacy back on track. In fact, with today’s encryption schemes that require both parties to participate, it’s really the only metric that matters. If your friends aren’t using encryption-compatible software, then you won’t use it either.

We’re at a huge turning point in history now, and we developers quite literally have the future in our hands. Whether or not we get average users to use encryption today will significantly shape the zeitgeist of the next two decades.

Unfortunately today’s privacy-enabling software—with very few exceptions—fails miserably in adoption. Why?

Privacy is only as good as its UX…

We’ve had tons of great privacy-enabling software for a pretty long time. GPG is a standout in terms of functionality: Using a key of a certain strength pretty much ensures your data will remain protected until the universe explodes (or something like that). But I’d consider it a failure as a software project, because over a decade after being released, GPG is still only used by IT professionals and enthusiasts. There are lots of reasons for that, not least of which is the difficulty of conceptually understanding public-key cryptography, but anyone in IT circles will point straight to the biggest one: GPG is user-hostile.

I defy you to teach a regular computer user like your mother, grandmother, or even a teenager how to use GPG effectively. GPG’s confusing, dense, labyrinthine, and frankly tragic UX guarantees that even if you did teach them how to use it, they’d never actually want to use it on a day-to-day basis.

Hell, even I, a software developer and computer nerd my entire life, manage to screw up GPG. Just a few months ago I wanted to send a prominent security researcher an email, and I thought I’d encrypt it with GPG. After creating and publishing a keypair, finding the right public key, installing and familiarizing myself with Enigmail, and triple-checking everything (a process of at least an hour and probably longer), I finally managed to send the email—only to find I’d somehow encrypted it to myself! I’m still not sure how I pulled that off, but it wasn’t for failing to understand how GPG works, or for lack of trying. The UX is really that terrible.
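
To make that failure mode concrete, here’s roughly what the correct flow looks like in code: a minimal sketch using the python-gnupg wrapper, with hypothetical identities, passphrase, and key file. The crucial detail is encrypting to the recipient’s key rather than your own, which is exactly the step I botched.

    import gnupg

    # python-gnupg shells out to the gpg binary in the given home directory.
    gpg = gnupg.GPG(gnupghome='/home/me/.gnupg')

    # 1. Generate our own keypair (hypothetical identity and passphrase).
    key_input = gpg.gen_key_input(name_email='me@example.com',
                                  passphrase='a strong passphrase')
    my_key = gpg.gen_key(key_input)

    # 2. Import the recipient's *public* key (hypothetical filename).
    with open('researcher-public.asc') as f:
        imported = gpg.import_keys(f.read())
    recipient_fpr = imported.fingerprints[0]

    # 3. Encrypt to the RECIPIENT's fingerprint. Passing my_key.fingerprint
    #    here instead is precisely the "encrypted it to myself" mistake.
    encrypted = gpg.encrypt('The message body',
                            recipients=[recipient_fpr],
                            always_trust=True)  # skip web-of-trust checks for this sketch
    assert encrypted.ok, encrypted.status
    print(str(encrypted))  # ASCII-armored ciphertext, ready to paste into an email

Four steps and a command-line toolchain just to send one message, and the GUI wrappers aren’t much friendlier.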

The problem isn’t limited to GPG. Consider Gibberbot, an IM client that sells itself on its security features. Not only is encryption disabled by default for a chat session, but how to turn it on remains a mystery to me. On Android I tap the open-padlock button and I get a popup saying “Starting encrypted chat session…” and that’s it. The padlock remains unlocked, I get no other notifications, basically nothing has changed—so what do I have to do to actually use OTR encryption? Beats me, and if it beats me, then it’s going to beat a regular user, and the product UX has failed. And if the product UX has failed, then the product as a whole will fail to be adopted when faced with easier-to-use, unencrypted competitors—of which there’s no shortage.

…and it’s only as good as its marketing

Even if the UX of a privacy-enabling product is good and it’s released for free, it can still fail to be adopted. And a big reason for that is how the product is marketed.

Remember that if we want to beat snooping, then one of the goals of a software product must be widespread adoption. Encryption only works if both people are using it, right?

Well, it turns out that when it comes to getting regular users to use a product, what it’s called and how it’s marketed matter just as much as how easy it is to use.

Unfortunately the landscape of privacy-enabling software is littered with examples of poor naming and poor marketing. Most such software is marketed with privacy or security as its main selling point—see Gibberbot, TextSecure, RedPhone, Tox, and so on. But here’s the thing: Marketing your software with privacy or security as the #1 feature will doom your product to failure!

Why? Because the average user doesn’t think of security or privacy as features they want or need. When they search for a chat client for their new smartphone, the word “secure” is not one they’ll put in the search box. Average users are looking for free, for beautiful, for easy-to-use, for integrated, for one-size-fits-all—security is simply not on their list of requirements. This is what’ll run through their head as they browse the app store: “Why do I want to use this thing that’s for security, if the one under it says I can send text messages (or some other feature) for free?” Marketing a product with “secure” or “private” in the tagline, regardless of how good it actually is, will consign it to being used only by enthusiasts.

Even the recommendation of a friend isn’t always enough to get a poorly-marketed privacy-enabling app installed. If I were to recommend a texting app that marketed itself as “the secure way to text” to all of my friends, maybe half would look it up, and those that did would think to themselves, “Why should I go through the effort of switching to this new app? I don’t need ‘security’; only my friends can see my texts anyway. (Wrong!) I’ll download it later, maybe.” And they never do. Security is just not something the regular user thinks they need. Privacy-enabling apps have to be marketed on features that actually affect users’ day-to-day lives to stand a chance amongst the competition.

Don’t do what Donny Don’t does

There’s no shortage of poorly-marketed privacy-enabling projects out there, but I want to pick on one of them to illustrate my point. I’m picking on it out of respect for the project’s goals and out of a desire to see the project succeed, not to demean the developers; and the points I’ll make are, I think, equally applicable to a host of similar projects.

Take a look at RedPhone, a free open-source Android app that encrypts calls. Unlike Gibberbot or GPG, it actually has a pretty great UX. Calls are transparently encrypted if possible, and if not, the app works like your usual phone app. That’s the best we can ask for: encryption without the hassle when possible, and a robust phone app when not.

But RedPhone has only a tiny fraction of the installs of competing call apps, and the problem is in the marketing.

At the time of writing the Play store says RedPhone has 100,000–500,000 installs; competing call apps like LINE, KakaoTalk, and magicJack have 1,000,000 installs at the least and 500,000,000 at the most! Imagine if RedPhone were marketed like LINE, but oh, by the way, it also happens to have end-to-end encryption by default, if that’s your thing, but forget about that, look at the free calls you can make, and all your friends are using it too!

Without a better approach to marketing, RedPhone will sadly suffer the fate that all such poorly-marketed privacy-enabling software is doomed to: forever marginalized, downloaded only by enthusiasts, the paranoid, or those “with something to hide”. The competition, without bothering with encryption at all, will eat its lunch, because they market and position themselves so much better.

Tox, while not yet released, looks to be on the same track in terms of marketing and branding: Padlocks everywhere, security as the selling point instead of ease-of-use or any of the zillions of other features regular users care about more, and a name reminiscent of the word “toxic” instead of something cute or accessible.

Learning from past successes

Not all privacy-enabling software has failed to be adopted. In fact, a certain bit of privacy-enabling software has become a fundamental underpinning of the web and online commerce: our old friend, SSL/TLS.

SSL is one of the most successful (in terms of adoption) privacy-conscious software projects ever. It’s almost guaranteed to be used by any internet citizen daily. If you use email, if you bank online, if you’ve ever bought from Amazon, then you’ve used SSL, and nobody (well, almost nobody) could read what you did during that online session—and you probably didn’t even realize it.

Why is SSL a success? Because:

  1. SSL’s big feature wasn’t security. SSL’s big feature was that it enabled you to do cool things like buy stuff from the fledgling Amazon and eBay and have it shipped to your door from the comfort of your couch. Sure, those sites had padlock images on the checkout pages and “hacker-proof” logos and whatever, but the little padlock images aren’t what sold SSL; what SSL enabled you to do online is what sold SSL.

  2. Browsers weren’t marketed with security as a feature. Netscape wasn’t marketed as “the secure way to download HTML documents”, but rather as “the best way to easily access this great new thing called the world wide web that everyone’s talking about these days”. It just happened to include a powerful piece of encryption plumbing, but who cares, as long as you could use Amazon and visit Geocities?

SSL was successful because it was so easy that it was invisible (there’s almost no user interaction required, unless you ran into an invalid certificate), and because its features weren’t security, but rather the cool things that security enabled, or the other features of the browser it was bundled with.
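
That invisibility holds for programmers, too. Here’s a minimal sketch, using nothing but Python’s standard library, of fetching a page over TLS; the handshake, certificate validation, and encryption all happen inside two library calls, with no user interaction at all:

    import socket
    import ssl

    hostname = 'example.com'

    # A default context loads the system's trusted CA certificates and picks
    # sane protocol and cipher settings; that's all the "security UX" there is.
    context = ssl.create_default_context()

    with socket.create_connection((hostname, 443)) as sock:
        # wrap_socket performs the TLS handshake and validates the server's
        # certificate; an invalid certificate simply raises an ssl.SSLError.
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            tls.sendall(b'GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n')
            print(tls.recv(4096).decode('utf-8', errors='replace'))

No dialogs, no key ceremonies, no padlocks to explain: the encryption is pure plumbing.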

Thanks to this fantastic bit of (nearly nonexistent!) marketing and the cool things SSL enables, it’s now likely the most-used piece of encryption software ever, all while remaining largely invisible to the average user.

Be SSL.

Skype is another great story of mass adoption of a privacy-enabling product. (Or at least it was, before Microsoft bought it, centralized it, and opened it up to the government.) Skype wasn’t marketed as a “secure video conferencing solution”. It was marketed as a free or cheap way to call friends and family abroad, and an easy way to have video chats for free. It just happened to also be based on a decentralized peer-to-peer architecture with client-side encryption that was resistant to snooping.

Would Skype have been as successful if it had been called “SecureVideoChat”, marketed as “Video chats that protect your privacy”, and led its marketing with a bullet point titled “Client-side encryption guarantees your chats are safe”? I highly doubt it. Skype won because it gave users features that solved their problems, wrapped them up in an accessible package, and topped its marketing material with things users care about, like chatting with family for cheap, and not things they don’t care about, like the number of bits in the encryption protocol. Encryption was just a fantastic side effect.

Be Skype. (Before Microsoft, and not closed-source.)

Getting wide adoption for your privacy-enabling project

I really, really want more privacy-enabling projects to succeed. I want my mom to download SecureText instead of WhatsApp. I want to be able to chat on Gtalk with OTR by default, without a hassle. It’s in our hands to protect our communications from those who would eavesdrop on them, archive them for the rest of our lives, and use them against us decades later.

If you’re thinking of starting your own project or working on an existing project, remember that UX and marketing are often what determine whether a project is a success.

Keep UX and marketing in mind and there’s a better chance your project will succeed. The most successful privacy-enabling projects we’re going to see in the next few years are the ones that loudly solve an everyday problem for the average user while also quietly featuring robust encryption by default, without hassle. These projects will be marketed not as apps for people with something to hide, but as apps that solve your problems (and just happen to be strongly encrypted).

Here’s to the wide adoption of privacy-enabling technologies in the next five years!

Comments

  1. Anonymous Coward

    Cryptocat? What can be cuter than a cat?

    The pixelated logo, however, appeals more to the ’80s retro-gamer crowd.

    A bunny? BuddyBunny?

    I know a person who struggles with Facebook’s UX. She finally figured out how to use like 2 features, and she uses them all of the time. Why did she struggle through it? Because her friends were all there.

    What happens when you tell someone: “You shouldn’t accept a friend request on FB unless you know them personally, and can call them on the phone to verify that they really are the person who friended you”? “Yeah, right. I can see them. That’s their picture. What’s the problem?”

    The problem is that 99.9% of the time, they are right, and the out-of-band verification is unnecessary.

    So we have the UX problem, and we also have the network effect. Each of these stacks the deck against us. PGP is hard to use, plus we have to get our friends to go to a key-signing party to even be able to talk to them. And if we want to create a new anonymous/pseudonymous online identity, we can’t even use our existing network: having a trusted friend be the first signer of our anonymous key is a dead giveaway.

    Then there is key management: how can we get the masses to take care of their keys in a trustworthy way? One instance of “convenience trumps security” and the whole thing collapses.

  2. Anonymous Coward

    I call myself Anonymous Coward, but give my real email address with the assurance that it won’t be published.

    My email address isn’t revealed, but my profile picture is. Doh! That laugh is on me. The convenience of using an existing email address instead of creating a new one…

  3. Alex Cabal

    True, but in GPG’s case getting keys signed isn’t a hard requirement for doing something like sending encrypted mail to family. People can start using GPG without worrying about signing at all (at first).

    There is certainly a chicken-and-egg problem here, and in GPG’s case a part of the solution would be improving the UX in a way that lets regular folks just use it. Right now it’s so complicated that even techies like me manage to routinely mess it up—that means there’s no hope at all for any kind of regular-person traction. Until people can actually use it, the chicken-and-egg problem is moot.

    Regarding Cryptocat, it’s actually a perfect example of the kind of bad marketing I’m referring to. It has the word “Crypto” in its name, leads its marketing material with “Cryptocat lets you chat with privacy” instead of “Easily chat with friends and family for free” (or any other feature regular people would actually care about), and lists other features regular people don’t care about, like being open source and translated into many languages. It does have a cute cat in the logo, but if you ask me the logo is so stylized you can barely tell it’s a cat anyway.

  4. Alex

    I think that asking a privacy-preserving project to excel against and compete with all the other projects out there, and to slip privacy in under users’ noses, is asking a little bit too much.

    It’s very difficult to protect people against their intentions. That’s why we have laws and penalties about wearing seatbelts in cars — otherwise nobody but the “paranoids” would. But there are no such incentives for people to choose privacy-preserving technologies, especially in light of the network effects of already widespread non-secure technologies like Skype.

    Privacy has got to have a recognised value, which will offset or at least counterbalance the unfortunate dip in usability that it brings — because real privacy will always, at some point, require a human to make an informed choice.

    Using SSL as an example is not fair to privacy protecting projects. SSL only protects against casual criminals, while we expect privacy protecting technologies nowadays to (at least partially) protect against the NSA.

    The protocols are there — ZRTP, OTR, OpenPGP — I would see priority #1 as creating software with better UX than the current apps. Marketing will come, in terms of word of mouth, as more and more people realise the risks they take by leading a life completely transparent to the government.

  5. Alex Cabal

    I don’t think Americans will realize the value of privacy until it’s too late. That’s why we have to slip privacy in and market it as something else—for everyone’s own good, whether they realize it or not. It’s possible to have privacy without knowing about it—e.g. SSL, which, when correctly implemented by the server, can and does protect against direct surveillance in many ways. (Root certificate coercion notwithstanding, though self-signed certificates are a possible solution to that.)
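
    To sketch what that might look like: a client can pin the server’s certificate, checking its fingerprint against a known-good value obtained out-of-band, so a coerced root CA buys an attacker nothing. A minimal sketch in Python (the hostname and pinned hash are hypothetical):

        import hashlib
        import socket
        import ssl

        HOST = 'example.com'  # hypothetical server
        PINNED_SHA256 = '9f86d081...'  # SHA-256 of the server's DER cert, fetched out-of-band

        # Skip CA validation entirely; we verify the certificate ourselves,
        # so it can just as well be self-signed.
        context = ssl.create_default_context()
        context.check_hostname = False
        context.verify_mode = ssl.CERT_NONE

        with socket.create_connection((HOST, 443)) as sock:
            with context.wrap_socket(sock, server_hostname=HOST) as tls:
                der_cert = tls.getpeercert(binary_form=True)  # raw DER bytes of the leaf cert
                fingerprint = hashlib.sha256(der_cert).hexdigest()
                if fingerprint != PINNED_SHA256:
                    raise ssl.SSLError('certificate fingerprint mismatch; possible interception')
                # Pin matched: the channel is encrypted to exactly the key we expected.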

    We’re in a very special political and technological place in time right now where we can still make it happen. The internet is still just barely in the realm of benign neglect from a governmental standpoint, and technologies for strong privacy exist. If we don’t take advantage and make these technologies ubiquitous, we could find ourselves in a 1990s-style “PGP is illegal munitions” situation again. What the law giveth, the law taketh away—but not if it’s too entrenched in the fabric of people’s daily lives.