
Cobra Theory in Action: How We Fucked the Internet

I want to talk about how we managed to royally screw up the internet.

It's a classic case of Cobra Theory - a bunch of tech “geniuses” and boy-wonders thought they could improve the internet and instead turned it into a dumpster fire of Brobdingnagian proportions.

What is Cobra Theory?

The story goes that British colonials in India, fed up with Delhi's abundance of cobras, offered a bounty for dead snakes. Predictably, seeing an opportunity to make some serious money, some folks started breeding cobras. When the Brits caught wind of the scheme, they axed the program, leading to a mass release of captive cobras and – surprise, surprise – even more snakes than before.

It's a great story. It's quotable, it's memorable, and it perfectly encapsulates the idea of a solution backfiring spectacularly. To some, it's the academic equivalent of an urban legend – widely circulated, rarely questioned, and frustratingly difficult to verify.

But whether the original story is fact or fiction, the Cobra Effect is a very real phenomenon. There are examples throughout history.

Hanoi in the late 19th century, under French colonial rule, had a rat problem. Some bright spark came up with the idea of paying people for rat tails. Can you see where this is going? Enterprising locals started breeding rats, cutting off their tails, and releasing the now-tailless rodents back into the sewers to breed even more rats.

Wells Fargo was a textbook case of corporate Cobra fuckery if there ever was one. They set up a system of aggressive sales goals and juicy bonuses. Hit your targets, and you could be looking at a sweet 15-20% bump in your paycheck if you're a personal banker, or up to 3% if you're a lowly teller. Miss those targets? Tough shit, they get added to tomorrow's goals. No pressure.
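Here's roughly how that rollover math plays out. A minimal sketch in Python, with invented numbers (the real quotas varied by role, branch, and year):

```python
# Toy sketch of a "miss it and it rolls over" quota system.
# The numbers (8 products a day, a realistic ceiling of 5) are invented
# for illustration; they are not Wells Fargo's actual figures.

daily_quota = 8        # products the bank expects you to sell each day
realistic_sales = 5    # what an honest banker can actually sell
carryover = 0          # unmet targets rolled into tomorrow

for day in range(1, 11):
    target = daily_quota + carryover
    shortfall = max(0, target - realistic_sales)
    carryover = shortfall
    print(f"Day {day:2d}: target={target:2d}, sold={realistic_sales}, rolled over={shortfall}")

# By day 10 the target is 35 products in a single day. At that point the
# "rational" move is no longer selling harder; it's gaming the metric.
```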

Now, put yourself in the shoes of these employees. You've got bills to pay, maybe a family to feed. You're staring down these impossible targets day after day. What do you do? Well, if you're one of the folks at Wells Fargo, you start opening fake accounts. Forging signatures. Issuing credit cards nobody asked for.

The result? About 2.1 million unauthorized accounts. That's not a typo. Millions of people suddenly had accounts they never asked for, some racking up fees, others potentially screwing with their credit scores. All because some higher-ups thought it'd be a great idea to turn their employees into used car salesmen.

When the shit hit the fan, it hit hard. Billions in fines and settlements, a CEO resignation, and the kind of reputational damage that has Wells Fargo still trying to scrub the stink off years later. The Federal Reserve even put an asset cap on them, basically saying, "You can't grow until you prove you're not a bunch of crooks."

This all started with what probably seemed like a good idea at the time. Incentivize employees, grow the business. Instead, it created a perverse incentive that encouraged fraud on a massive scale.

Look at Colombia. The problem: too much traffic and pollution in cities like Bogotá. The solution? "Pico y Placa" (Peak and Plate), a system that restricts car usage based on license plate numbers. On certain days, certain plates stay home.

Genius, right?

Wrong.

Instead of reducing the number of cars on the road, folks just went out and bought second cars with different plate numbers. Now they could drive every day of the week, environmental restrictions be damned.

The result? More cars on the road, not fewer. More pollution, not less. It's like trying to put out a fire with gasoline.

Next, they introduced a system where you could pay to bypass the restriction. Sounds fair, right? Pay to pollute. Except now you've got a situation where rich people can just buy their way out of the inconvenience, while poorer folks are left dealing with the restrictions.

And because people are paying a fixed cost for this exemption, they're actually driving more to "get their money's worth." It's like an all-you-can-eat buffet for car usage. You paid for it, might as well use it.
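Run the numbers and the buffet logic is obvious. A quick sketch with made-up prices (Bogotá's actual exemption fees are different, but the shape of the incentive is the same):

```python
# Back-of-the-envelope: why a flat exemption fee encourages more driving.
# Both prices are invented for illustration.

flat_exemption_fee = 100.0   # hypothetical monthly fee to bypass the restriction
fuel_cost_per_trip = 3.0     # hypothetical marginal cost of one extra trip

for trips_per_month in (10, 20, 40):
    total_cost = flat_exemption_fee + fuel_cost_per_trip * trips_per_month
    print(f"{trips_per_month:3d} trips/month -> {total_cost / trips_per_month:5.2f} per trip")

# 10 trips -> 13.00 per trip, 20 trips -> 8.00, 40 trips -> 5.50.
# The fee is sunk, so every extra trip makes the ones before it feel cheaper.
```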

In all these cases, we see the same pattern. Someone identifies a problem: too many snakes, too many rats, not enough cross-selling, too much traffic. They come up with a solution that seems logical on paper. But they fail to account for how people will respond to these new incentives. And instead of solving the problem, they make it worse.

That's the Cobra Effect in a nutshell. It's what happens when we're too smart for our own good, when we think we can engineer human behavior without considering the law of unintended consequences.

Human behavior is complex, adaptive, and often guided by self-interest. When we implement policies or incentives, we're not just solving a problem – we're creating a new system. And in that system, people will find ways to maximize their benefit, even if it completely derails the original intention. We set out to kill some snakes, and end up breeding a whole new population of them.

That’s a perverse incentive in action. It's what happens when you create a reward system that backfires spectacularly, encouraging exactly the behavior you were trying to prevent.

And on the internet, Cobra Effects are everywhere.

Let's start with social media. The original idea was noble enough: connect people, share ideas, maybe see what your high school crush is up to these days. But then came the need to monetize. Enter the attention economy, where your eyeballs are the product being sold to advertisers.

So what did the tech geniuses do? They designed algorithms to keep you engaged. More time on the platform equals more ad revenue. Sounds reasonable, until you realize what actually keeps people engaged: outrage, conflict, and sensationalism. Suddenly, your news feed isn't filled with your aunt's cat photos anymore. It's a non-stop parade of the most inflammatory, divisive content imaginable.

The perverse incentive? Be as controversial as possible. Say the most outrageous thing you can think of. Start a flame war. Because that's what gets the clicks, the comments, the shares. That's what the algorithm rewards. We wanted connection, but we incentivized division.
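If you want to see how blunt that incentive is, here's a toy feed ranker. The posts and weights are entirely invented; no platform publishes its real formula. But when the only objective is predicted engagement, outrage wins by construction:

```python
# Toy feed ranker: score posts by predicted engagement and nothing else.
# The posts and weights are invented for illustration; real ranking models
# are far more complex, but the incentive gradient is the same.

posts = [
    {"title": "Aunt's cat photo",            "comments": 4,   "shares": 1,   "angry_reacts": 0},
    {"title": "Careful 3,000-word analysis", "comments": 12,  "shares": 8,   "angry_reacts": 1},
    {"title": "Outrageous hot take",         "comments": 340, "shares": 210, "angry_reacts": 95},
]

def engagement_score(post):
    # Comments and shares keep people on the platform longest, so they get
    # the biggest weights. Angry reactions drive replies, so they count too.
    # Notice what isn't in here: accuracy, nuance, whether anyone felt better.
    return 3 * post["comments"] + 5 * post["shares"] + 2 * post["angry_reacts"]

for post in sorted(posts, key=engagement_score, reverse=True):
    print(f'{engagement_score(post):5d}  {post["title"]}')

#  2260  Outrageous hot take
#    78  Careful 3,000-word analysis
#    17  Aunt's cat photo
```

Nothing in that score cares whether the post is true. That's the whole problem.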

And it gets better (or worse, depending on your perspective). Remember when fact-checking was a thing? Those were the days. Now we've got a system where being first is more important than being right. News outlets, desperate for clicks in a world where print is dying, rush to publish before verifying. The perverse incentive? Prioritize speed over accuracy. Why? Because by the time the truth comes out, everyone's already moved on to the next outrage.

We told people they could make a living by building an audience online. Democratizing media and so on. But to build that audience, you need to game the algorithm. And what does the algorithm want? Consistent content. Lots of it. All the time.

The perverse incentive? Quantity over quality. Depth of thought? Nuance? Careful research? Fuck that. You need to churn out three videos a week to stay relevant. So we end up with a flood of shallow, repetitive content. But hey, at least it's optimized for engagement.

Remember when privacy was a thing we cared about?

Then we got seduced by the convenience of personalized services. Free email, free social media, free everything - as long as we're willing to let companies harvest our data. The perverse incentive? Give up more of your personal information to get better services. And now we're shocked - shocked! - that our data is being used in ways we never intended.

We know all this, and yet we can't seem to stop. We're trapped in a system of our own making, pressing the lever for another hit of dopamine like rats in a Skinner box. We've created a digital environment that exploits our psychological vulnerabilities, and then we wonder why we feel anxious, depressed, and polarized.

So what's the solution?

Hell if I know.

But I do know this: we need to start by recognizing the perverse incentives we've created. We need to understand that every time we engage with clickbait, every time we share without fact-checking, every time we give up our data for a bit of convenience, we're feeding the beast.

Maybe it means rethinking how we monetize online content. Maybe it means stronger regulations on data use and algorithm transparency. Maybe it means a fundamental redesign of our social media platforms. Or maybe it means a mass exodus from these digital Skinner boxes we've trapped ourselves in.

Whatever the solution, it's going to be uncomfortable. Because fixing this mess means giving up some of the things we've grown accustomed to. It means valuing truth over convenience, depth over engagement, genuine connection over shallow validation.

The internet is still an incredible tool. It still has the potential to connect us, to inform us, to expand our horizons in ways our ancestors couldn't have dreamed of.

But we need to stop fucking it up with our short-sighted incentive structures, and we need to question the behavior we're rewarding versus the behavior we're punishing.

Until we start dismantling these perverse incentives, we're going to keep breeding cobras.

And let me tell you, those fuckers bite.
