


Off Brand

Why Every CEO Needs to Think Like a Hacker, Stalker, or White Nationalist

It’s 2019. Every product should be built with bad actors in mind.

An illustration of mysterious characters surrounding several electronic devices.
Illustration: Tom Guilmard

Fans of the buzzy young startup Superhuman, a $30-a-month premium email product, rave about its speed and clever, feature-rich design. But recently one of those features proved controversial: a supercharged “read receipt” ability showing users when recipients of their missives had opened them, how many times, and from what state or country.

Recipients likely had no idea this was happening, and no easy way of opting out even if they did. You can probably imagine why you might not want any given emailer to know whether you’ve opened any given message, and where you were when you did. Late this past June, software veteran Mike Davidson wrote a sharp critique detailing this use of “hidden tracking pixels,” pointing out ways the data could be misused by, say, a hostile ex or other stalker-y type, and charged: “Superhuman Is Spying on You.”
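To make the mechanism concrete: a “read receipt” of this kind typically works by embedding an invisible, uniquely named image in the email. When the recipient’s mail client fetches that image, the sender’s server learns the message was opened (and, from the request, a timestamp and rough location). A minimal sketch in Python, with purely illustrative names and URLs (this is not Superhuman’s actual code):

```python
# Illustrative sketch of a hidden tracking pixel (not any real product's code).
import uuid

def add_tracking_pixel(html_body: str, base_url: str) -> tuple[str, str]:
    """Append an invisible 1x1 image whose URL is unique to this message.
    Any fetch of that URL signals the message was opened."""
    token = uuid.uuid4().hex  # unique per message/recipient
    pixel = f'<img src="{base_url}/open/{token}.gif" width="1" height="1" alt="">'
    return html_body + pixel, token

# Server side: a log keyed by token turns image fetches into "read receipts".
opens: dict[str, int] = {}

def record_open(token: str) -> None:
    opens[token] = opens.get(token, 0) + 1  # counts repeat opens, too

body, token = add_tracking_pixel("<p>Hello!</p>", "https://tracker.example")
record_open(token)  # simulates the recipient's mail client fetching the pixel
```

The point of the sketch is how little the recipient sees: the pixel is invisible, the tracking happens entirely on the sender’s server, and opting out requires blocking remote images altogether.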

Soon Rahul Vohra, Superhuman’s founder and chief executive officer, who had sold his previous company to LinkedIn, announced some changes. Conceding that location data could be used for “nefarious purposes,” he wrote that it would no longer be tracked or revealed to users. Separately, the read-status feature would be turned off by default (although it could be reactivated by a Superhuman user). “I am so very sorry for this,” the University of Cambridge alum wrote. “When we built Superhuman, we focused only on the needs of our customers. We did not consider potential bad actors.”


Some (including Davidson) didn’t find that satisfactory. But set aside the specific case of Superhuman, or even the deeper question of privacy, for a minute. Think instead about the broader implications of that last statement: We did not consider potential bad actors.

There’s something weirdly clarifying about this absurdly weak defense. The typical critique of large tech companies and digital design is that it is manipulative, the product of puppet masters who know how to addict and control us. It is powerful — and cynical. But maybe that’s not quite true. Maybe sometimes the problem is that digital design isn’t cynical enough.

After all, surely we know in 2019 that we should absolutely consider — and build sufficient safeguards against — potential bad actors. To cite a few obvious examples, it would have been smart to think ahead about how neo-Nazis might use Twitter, how pedophiles might use YouTube, or how a mass murderer might use Facebook Live. Less sensationally, Airbnb, designed as a way for travelers to connect with locals, is also a way for absentee landlords to displace locals altogether, arguably tightening the supply of residential rental units. (The company is wrestling with regulators over such issues in multiple cities.) Or consider Nextdoor, a platform designed to build community connection: Neighbor reports of suspicious behavior so frequently boiled down to Existing While Black that the service had to rebuild its “crime and safety” reporting feature with a variety of prompts and warnings meant to discourage users from blatantly offensive racial profiling.

And so on. The point is that not only does it seem naïve to say “we did not consider bad actors,” it seems like bad business. Designing with bad actors in mind — Design for the Worst, let’s call it — ought to be a priority: not to help them, of course, but to thwart them.


“Red teaming” (creating a group with an explicitly adversarial role, to challenge an organization’s strategy or structures) happens in military and intelligence contexts, and even in tech design, when the underlying issue is security or fending off hackers. Maybe big digital-centric companies, and small ones that aspire to scale, need a variation that’s not about fending off direct adversaries. Imagine instead a sort of Black Mirror Department, devoted to nothing but figuring out how the product can be abused — and thus how to minimize malign misuse.

Clearly any such effort would have limits: You can’t redesign the hammer in a way that preserves its core functionality yet renders impossible its potential misuse as a murder weapon.

But digital-era design really is different, not just because it can be so pervasive, but because of the way it has developed. For starters: the relative lack of outside regulation. Sure, it’s possible to misuse a car, or even an airplane. But consider the rule structures that surround, constrict, and guide the manufacture, sale, and operation of those technologies. In contrast, the digital ethos assumes that regulators and gatekeepers impede progress; crowds are wise and mobs are smart. To serve the user, you have to move fast and break things; ask for forgiveness, not permission.

This is ultimately an optimistic worldview: Tech pundits, entrepreneurs, and designers are by nature focused on the boundless wonderful possibilities of the future. When the head of Superhuman said the company had simply been trying to please its users, he was repeating familiar industry dogma. “Focus on the user and all else will follow,” Google declared years ago. “Fast is better than slow,” its early mission statement added. “Democracy on the web works.”

There are reasons why we might not want to lose this optimism. But it’s worth thinking about how it might be tempered. One of the things that made Kickstarter unusual in its early days was the founders’ thinking about the kinds of projects it wanted off the platform: they explicitly avoided letting a “let the crowd rule” attitude totally dominate, and implemented an approval system to prevent a home for creative projects from being misused as a forum to “buy Jenny a prom dress,” per their shorthand at the time.

That’s a lower-stakes example, but it’s instructive: Surely the time has come to think more rigorously about how to behave in ways that are less likely to break things, or require forgiveness. “I now recognize that we must deeply consider the overall ecosystem when designing software as fundamental as email,” the Superhuman founder wrote in his mea culpa post.

Yes. “Here comes everybody!” means everybody, bad actors included. Which is precisely why “democracy on the web” isn’t the end of a conversation, but the start of one. The gosh-we-didn’t-see-that-coming defense when the utopian future goes sideways doesn’t work anymore. Seeing that coming is part of the job.

Author of The Art of Noticing.
