Getting traction for privacy-focused projects; or, security never sells
One of the biggest bits of news to cross the IT community in the past decade has been Edward Snowden’s revelations on the NSA intercepting and archiving vast swaths of internet traffic—email, HTTP, and more. (Not that these revelations should come as a surprise to anyone with even a passing interest in internet architecture and current events; what Snowden revealed is a tiny logical progression from what William Binney and other leakers told us years ago.)
These revelations naturally made a lot of people pretty angry, and some of those people are trying to do something about it by writing software that’s built from the ground up to protect the user’s privacy and anonymity. That’s commendable—in a very real sense our privacy can only be protected to the extent that we as developers bake it into the fabric of the programs we write.
But despite the renewed interest in privacy, anonymity, and encryption in the hacker sphere, I’m worried that it’s not going to make a big difference. And not for lack of technology or lack of effort—we’ve proven with stuff like GPG, Tor, and other projects that the OSS community can create and deploy truly powerful tech—but rather for lack of adoption.
After all, what good is the best encryption software if nobody uses it? Can we call a privacy-enabling product successful if you can’t use it to communicate with your doctor, your accountant, your lawyer, or your mom?
Adoption rate is the most important metric in the fight for getting our online privacy back on track. In fact, with today’s encryption schemes that require both parties to participate, it’s really the only metric that matters. If your friends aren’t using encryption-compatible software, then you won’t use it either.
We’re at a huge turning point in history now, and we developers have the future in our hands. Whether or not we get average users to use encryption today will significantly shape the zeitgeist of the next two decades.
Unfortunately today’s privacy-enabling software—with very few exceptions—fails miserably in adoption. Why?
Privacy is only as good as its UX…
We’ve had tons of great privacy-enabling software for a pretty long time. GPG is a standout in terms of functionality: Using a key of a certain strength pretty much ensures your data will remain protected until the universe explodes (or something like that). But I’d consider it a failure as a software project, because over a decade after being released, GPG is still only used by IT professionals and enthusiasts. There are lots of reasons for that, not least of which is the difficulty of conceptually understanding public-key cryptography, but anyone in IT circles will point straight to the biggest reason: GPG is user-hostile.
I defy you to try to teach a regular computer user—your mother, your grandmother, even a teenager—how to use GPG effectively. GPG’s confusing, dense, labyrinthine, and frankly tragic UX guarantees that even if you did teach them how to use it, they’d never actually want to use it on a day-to-day basis.
Hell, even I, a software developer and computer nerd for my entire life, manage to screw up GPG. Just a few months ago I wanted to send a prominent security researcher an email, and I thought I’d encrypt it with GPG. After creating and publishing a keypair, finding the right public key, installing and familiarizing myself with Enigmail, and triple-checking everything (a process of at least an hour and probably longer), I finally managed to send the email—which was somehow encrypted to myself! I’m still not sure how I managed to pull that off, but it wasn’t for failing to understand how GPG works, or for lack of trying. The UX is really that terrible.
The problem isn’t limited to GPG. Consider Gibberbot, an IM client that sells itself on its security features. Not only is encryption disabled by default for a chat session; how to enable it at all remains a mystery to me. On Android I tap the open-padlock button and I get a popup saying “Starting encrypted chat session…” and that’s it. The padlock remains unlocked, I get no other notifications, basically nothing has changed—so what do I have to do to actually use OTR encryption? Beats me, and if it beats me, then it’s going to beat a regular user, and the product UX has failed. And if the product UX has failed, then the product as a whole will fail to be adopted when faced with easier-to-use, unencrypted competitors—of which there’s no shortage.
…and it’s only as good as its marketing
Even if the UX of a privacy-enabling product is good and it’s released for free, it can still fail to be adopted. And a big reason for that is how the product is marketed.
Remember that if we want to beat snooping, then one of the goals of a software product must be widespread adoption. Encryption only works if both people are using it, right?
Well, it turns out that when it comes to getting regular users to use a product, what it’s called and how it’s marketed matter just as much as how easy it is to use.
Unfortunately the landscape of privacy-enabling software is littered with examples of poor naming and poor marketing. Most such software is marketed with privacy or security as its main selling point—see Gibberbot, TextSecure, RedPhone, Tox, and so on. But here’s the thing: Marketing your software with privacy or security as the #1 feature will doom your product to failure!
Why? Because the average user doesn’t think of security or privacy as features they want or need. When they search for a chat client for their new smartphone, the word “secure” is not one they’ll put in the search box. Average users are looking for free, for beautiful, for easy-to-use, for integrated, for one-size-fits-all—security is simply not on their list of requirements. This is what’ll run through their head as they browse the app store: “Why do I want to use this thing that’s for security, if the one under it says I can send text messages (or some other feature) for free?” Marketing a product with “secure” or “private” in the tagline, regardless of how good it actually is, will consign it to be used only by enthusiasts.
Even the recommendation of a friend isn’t always enough to get a poorly-marketed privacy-enabling app installed. If I were to recommend a texting app that marketed itself as “the secure way to text” to all of my friends, maybe half would look it up, and those that did would think to themselves, “Why should I go through the effort of switching to this new app? I don’t need ‘security’, only my friends can see my texts anyway. (Wrong!) I’ll download it later, maybe.” And they never do. Security is just not something the regular user thinks they need. Privacy-enabling apps have to be marketed on features that actually affect users’ day-to-day lives to stand a chance amongst the competition.
Don’t do what Donny Don’t does
There’s no shortage of poorly-marketed privacy-enabling projects out there, but I want to pick on one of them to illustrate my point. I’m picking on it out of respect for the project’s goals and out of a desire to see the project succeed, not to demean the developers; and the points I’ll make are, I think, equally applicable to a host of similar projects.
Take a look at RedPhone, a free open-source Android app that encrypts calls. Unlike Gibberbot or GPG, it actually has a pretty great UX. Calls are transparently encrypted if possible, and if not, the app works like your usual phone app. That’s the best we can ask for: encryption without the hassle when possible, and a robust phone app when not.
But RedPhone has a tenth of the users of the next most popular call app, and the problem is in the marketing.
Its tagline is “Secure Calls”, not “Free calls to friends and family”, or “The beautiful, easy way to make calls”. Remember: Users don’t think of security as a feature. When faced with a range of similar apps, they’ll pick the one that loudly solves the problems they think they have, every time.
The icon features an outdated candybar phone with a padlock. Again, it focuses on security in the branding when only enthusiasts care about security as a feature. The icon is the chance to create a memorable brand: why not a cute mascot, like Pidgin’s, or something pretty, glossy, and attractive? Like it or not, books are judged by their covers, and in an app store context the project icon is often the first (and last!) thing a user sees about an app.
The first five bullet points in the app description are things that normal users—mom, grandma, your teen—don’t actually care about: things like which dialer you’re using and how things are encrypted. (“What do I need encryption for, my phone calls are already private!”—wrong, Mom, wrong!) The bullet point that normal users do care about—namely, being able to make calls without using minutes on your phone—is the very last bullet point and doesn’t even use the ultra-important word “free”.
At the time of writing the Play store says RedPhone has 100,000-500,000 installs; competing call apps like LINE, KakaoTalk, and magicJack have 1,000,000 installs at the least and 500,000,000 at the most! Imagine if RedPhone were an app marketed like LINE, but oh, by the way, it also had end-to-end encryption by default, if that’s your thing, but forget about that, look at the free calls you can make, and all your friends are using it too!
Without a better approach to marketing, RedPhone will sadly suffer the same fate that all such poorly-marketed privacy-enabling software is doomed to: forever marginalized, downloaded only by enthusiasts, the paranoid, or those “with something to hide”. The competition will eat its lunch while not bothering with encryption because they market and position themselves so much better.
Tox, while not yet released, looks to be on the same track in terms of marketing and branding: Padlocks everywhere, security as the selling point instead of ease-of-use or any of the zillions of other features regular users care about more, and a name reminiscent of the word “toxic” instead of something cute or accessible.
Learning from past successes
Not all privacy-enabling software has failed to be adopted. In fact, a certain bit of privacy-enabling software has become a fundamental underpinning of the web and online commerce: our old friend, SSL/TLS.
SSL is one of the most successful (in terms of adoption) privacy-conscious software projects ever. It’s almost guaranteed to be used by any internet citizen daily. If you use email, if you bank online, if you’ve ever bought from Amazon, then you’ve used SSL, and nobody (well, almost nobody) could read what you did during that online session—and you probably didn’t even realize it.
Why is SSL a success? Because:
SSL’s big feature wasn’t security. SSL’s big feature was that it enabled you to do cool things like buy stuff from the fledgling Amazon and eBay and have it shipped to your door from the comfort of your couch. Sure, those sites had padlock images on the checkout pages and “hacker-proof” logos and whatever, but the little padlock images aren’t what sold SSL; what SSL enabled you to do online is what sold SSL.
Browsers weren’t marketed with security as a feature. Netscape wasn’t marketed as “the secure way to download HTML documents”, but rather as “the best way to easily access this great new thing called the world wide web that everyone’s talking about these days”. It just happened to include a powerful piece of encryption plumbing, but who cares, as long as you could use Amazon and visit Geocities?
SSL was successful because it was so easy that it was invisible (there’s almost no user interaction required, unless you ran into an invalid certificate), and because its features weren’t security, but rather the cool things that the security enabled, or the other features of the browser it was bundled with.
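To make the “invisible” part concrete, here’s a minimal sketch of how little an application has to do to get an encrypted channel using Python’s standard ssl module. The hostname is just an illustrative placeholder, not something from the projects discussed above; everything about certificates, cipher negotiation, and protocol versions happens under the hood.

```python
import socket
import ssl

# Modern defaults: certificate verification and hostname checking are
# both enabled without the application asking for them.
ctx = ssl.create_default_context()
assert ctx.check_hostname  # on by default; no user interaction needed

def fetch_tls_version(host: str = "example.org") -> str:
    """Open a TLS connection and report the negotiated protocol version.

    The application code never touches keys or ciphers directly; the
    handshake, verification, and encryption are all handled by the
    library, which is exactly what made SSL invisible to end users.
    """
    with socket.create_connection((host, 443), timeout=10) as raw:
        with ctx.wrap_socket(raw, server_hostname=host) as tls:
            return tls.version()  # e.g. "TLSv1.3", chosen automatically
```

That’s the whole integration surface from the application’s point of view: wrap a socket, use it like any other socket. The user, of course, sees nothing at all.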
Thanks to this fantastic bit of (nearly nonexistent!) marketing and the cool things SSL enables, it’s now likely the most-used piece of encryption software ever, all while remaining largely invisible to the average user.
Skype is another great story of mass adoption of a privacy-enabling product. (Or at least it was, before Microsoft bought it, centralized it, and opened it up to the government.) Skype wasn’t marketed as a “secure video conferencing solution”. It was marketed as a free or cheap way to call friends and family abroad, and an easy way to have video chats for free. It just happened to also be based on a decentralized peer-to-peer architecture with client-side encryption that was resistant to snooping.
Would Skype have been as successful if it had been called “SecureVideoChat”, marketed as “Video chats that protect your privacy”, and led its marketing with a bullet point titled “Client-side encryption guarantees your chats are safe”? I highly doubt it. Skype won because it gave users features that solved their problems, wrapped them up in an accessible package, and topped its marketing material with things users care about, like chatting with family for cheap, and not things they don’t care about, like the number of bits in the encryption protocol. Encryption was just a fantastic side effect.
Be Skype. (Before Microsoft, and not closed-source.)
Getting wide adoption for your privacy-enabling project
I really, really want more privacy-enabling projects to succeed. I want my mom to download SecureText instead of WhatsApp. I want to be able to chat on Gtalk with OTR by default, without a hassle. It’s in our hands to protect our communications from those who would eavesdrop on them, archive them for the rest of our lives, and use them against us decades later.
If you’re thinking of starting your own project or working on an existing project, remember that UX and marketing are often what determine whether a project is a success.
Think of the UX first. If your mom can’t use it in five minutes, the UX has failed. Be SSL, not GPG.
Never list security as the big feature. Regular users, the ones you want to get, don’t care about security or erroneously believe they’re already secure. Market the product based on the other great features it has that make regular users’ lives easier, and oh, by the way, everything’s encrypted by default and they don’t have to think about it!
How you brand your project matters. Don’t use tin-foil-hat words like “secure” or “private” in your project’s name. Don’t name it something edgy and hacker-ish like “DarkChat” or “SecretWeb”. To the public, apps with names like that are only for people “with something to hide”. Don’t name it something that could be interpreted negatively (like “Tox”, which immediately brings to mind “toxic”, or “RedPhone”, which to many boomers might bring to mind commies, rather than Batman’s phone). Be accessible, easygoing, and cute even, if you can stand it. Don’t sprinkle padlocks everywhere, like in your project’s logo or icon.
Keep UX and marketing in mind and there’s a better chance your project will succeed. The most successful privacy-enabling projects we’re going to see in the next few years are the ones that loudly solve an everyday problem for the average user, but also feature robust encryption by default and quietly, without hassle. These projects will be marketed not as apps for people with something to hide, but as apps that solve your problems (and just happen to be strongly encrypted).
Here’s to the wide adoption of privacy-enabling technologies in the next five years!