Coordinated with Fredrik

The $8.8 Trillion Foundation Nobody Owns



There is a number that has been rattling around in my head since recording this episode: $8.8 trillion. That is the demand-side value of open source software according to a recent Harvard Business School study. Not the market cap of the companies selling software. The value of the code itself. The actual lines sitting in public repositories, running the global economy, maintained by volunteers, hobbyists, and a rotating cast of corporate contributors who might get reassigned next quarter.

To put that in perspective, it is roughly the combined GDP of Germany and Japan.

And here is what keeps me up at night: if I buy a physical component for our energy infrastructure, I get a warranty, a supplier, and a paper trail. If it breaks, I know who to call. But the software stack underneath all of it — the grid management, cloud infrastructure, data pipelines — that sits on a foundation that legally comes with zero warranty. None. Express or implied.

That tension is what this episode is really about.

When software was just the manual

We tend to think of software as the product. But go back to the 1950s and 60s, and software was an accessory that shipped with the hardware. Nobody hoarded it because there was no reason to. IBM customers formed a user group in 1955 called SHARE, and their motto was disarmingly simple: “SHARE is not an acronym, it’s what we do.” By 1959, they had collaboratively written an entire operating system. Just engineers helping engineers get million-dollar machines to work better.

That culture crystallized into something close to a philosophy at the MIT AI Lab in the 1970s. Code was left open. If you needed a program for your experiment, you walked to a cabinet, copied the source from a paper tape, added what you needed, and put it back. It was communal by default.

Then the world changed.

A printer jam that reshaped the global economy

The proprietary turn started with a legal shift, not a technical one. IBM unbundled software from hardware in 1969 under antitrust pressure, and overnight, code got a price tag. Bill Gates fired the first cultural shot in 1976 with his open letter to hobbyists, essentially arguing that if software has value, creators deserve to get paid. Fair point from a business perspective. But to the hacker community, it felt like someone was putting fences around a public park.

The real breaking point, though, was absurdly petty. Richard Stallman wanted to fix a paper jam on a Xerox printer at MIT. He had done it before on their old printer by writing a script that notified users when their print job got stuck. But the new Xerox printer ran proprietary code. When he asked a researcher at Carnegie Mellon for the source, the guy said no — he had signed an NDA. Stallman viewed this as a moral betrayal. That printer jam radicalized him.

He launched the GNU project in 1983 and created the GPL, a legal hack that used copyright law against itself. Copyright restricts sharing. Copyleft mandates it. The GPL said: do whatever you want with this code, but if you distribute changes, you must keep them open under the same license. It was viral by design.

The hobbyist who built the engine

Stallman had the philosophy, the legal tools, and the foundational programs. But by the early 90s, GNU was missing its kernel — the part that actually talks to the hardware. Their kernel project, Hurd, was stuck in architectural perfection debates. While they were building a cathedral, a 21-year-old Finnish student named Linus Torvalds posted a casual message on Usenet in August 1991: “I’m doing a free operating system, just a hobby, won’t be big and professional like GNU.”

Probably the greatest understatement in the history of technology.

Linus licensed his kernel under Stallman’s GPL, not for ideological reasons, but because it was a fair trade: I show you my code, you show me yours. His monolithic kernel was messy, technically “wrong” by academic standards, but it was fast and it worked. History proved that worse is better. A massive, chaotic network of volunteers connected by the internet could iterate faster than any closed corporate team.

When the boardrooms noticed

Corporate America eventually could not ignore it. But “free software” sounded anti-capitalist and legally terrifying to a CIO in 1997. So in 1998, at a strategy session in Palo Alto, Christine Peterson suggested the term “open source” — stripping away the moral philosophy and replacing it with a pure engineering and business argument. Less vendor lock-in. Shared maintenance costs. Better, faster, cheaper.

Microsoft was terrified. Internally, their own engineers admitted Linux was competitive. Publicly, Steve Ballmer called it “a cancer.” Their strategy was embrace, extend, extinguish. But then IBM showed up as the white knight. In 2001, they announced a billion-dollar investment in Linux. Not because they cared about the OS — they made their money on hardware and consulting. It was a move to commoditize the operating system layer and destroy Sun Microsystems’ expensive proprietary Unix business. IBM told every Fortune 500 CIO: Linux is safe. You will not get fired for running it.

And once it was deemed safe, it started eating the world. The LAMP stack (Linux, Apache, MySQL, PHP/Python) let startups build for zero licensing cost. Facebook, Wikipedia, WordPress — all built on free infrastructure. By 2002, Apache ran 58% of all websites.

The irony peak came in 2018 when Microsoft, the “cancer” company, acquired GitHub for $7.5 billion. They finally realized the cancer was actually the cure for their own irrelevance.

The fragility underneath

Here is where it gets uncomfortable. XKCD comic 2347 shows all of modern digital infrastructure as a massive tower balanced on one tiny block: “a project some random person in Nebraska has been thanklessly maintaining since 2003.” It is funny until you realize it is basically a documentary.

Heartbleed in 2014 showed us that the encryption library securing most of the internet’s traffic was maintained by essentially one full-time person. The Log4Shell flaw in Log4j in 2021 scored a perfect 10 out of 10 on the CVSS severity scale, and within days roughly 40% of business networks had seen exploit attempts, all through a logging library so boring that nobody paid attention to it.

But the one that should genuinely scare any CEO is the XZ Utils backdoor from 2024. This was not an accidental bug. A persona calling themselves “Jia Tan,” likely a state-sponsored actor, spent two to three years earning the trust of a burned-out volunteer maintainer. They submitted helpful patches, took load off the tired guy’s shoulders, and were gradually granted repository permissions. Once they had the keys, they injected a backdoor designed to subvert SSH authentication — the way administrators securely log into servers. If it had hit stable Linux releases, attackers would have had a master key to millions of servers worldwide.

It was caught by pure luck. A Microsoft engineer named Andres Freund noticed SSH logins lagging by about half a second during routine benchmarking, got curious, dug into the binaries, and uncovered one of the most sophisticated supply chain attacks ever found. Half a second of latency saved us.
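What Freund did was, at its core, anomaly detection on timing data: benchmark a routine operation, notice a sample far above the baseline, and refuse to shrug it off. A minimal sketch of that kind of check, with entirely illustrative names, thresholds, and workloads (this is not what he actually ran):

```python
import statistics
import time

def flag_latency_regression(measure, runs=20, threshold_ms=100.0):
    """Time `measure` repeatedly and flag runs far above the median.

    `measure` is any zero-argument callable, e.g. one login handshake.
    Returns (median_ms, list_of_suspicious_samples_ms).
    """
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        measure()
        samples.append((time.perf_counter() - start) * 1000.0)
    median = statistics.median(samples)
    outliers = [s for s in samples if s - median > threshold_ms]
    return median, outliers

# Simulated workload: normally ~1 ms, but one run stalls for ~500 ms,
# mimicking the half-second sshd slowdown that exposed the backdoor.
delays = iter([0.001] * 19 + [0.5])
median, outliers = flag_latency_regression(lambda: time.sleep(next(delays)))
print(f"median={median:.1f} ms, suspicious runs={len(outliers)}")
```

The point of the sketch is the posture, not the code: a single outlier against a known baseline is a signal worth chasing, not noise to average away.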

Maintainer burnout is not just a sad open source HR problem. It is a national security vulnerability.

AI breaks the definition

And now we have AI, where the very meaning of “source code” falls apart. A traditional open source project is human-readable text files. An LLM is three things: architecture, training data, and weights. When Meta releases Llama and calls it “open source,” they give you the weights but not the training data. It is like handing someone a compiled binary without the original code. You can run it, but you cannot reproduce it or deeply audit it.
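To make the binary analogy concrete, here is a deliberately toy sketch (a two-parameter linear model, nothing like a real LLM; all names and numbers are invented). With published weights alone, inference always works; reproducing or auditing the training requires the data, which is exactly the part an open-weight release does not ship:

```python
# Toy "open-weight" release: the weights of a trained model are public,
# but the training data that produced them is not.
RELEASED_WEIGHTS = {"w": 2.0, "b": 1.0}  # what an open-weight release gives you

def predict(x, weights=RELEASED_WEIGHTS):
    """Inference needs only the weights. This always works."""
    return weights["w"] * x + weights["b"]

def retrain(data, lr=0.01, steps=1000):
    """Reproducing the weights needs the training data: the missing piece."""
    w, b = 0.0, 0.0
    for _ in range(steps):
        for x, y in data:
            err = (w * x + b) - y
            w -= lr * err * x
            b -= lr * err
    return {"w": w, "b": b}

print(predict(3))  # 7.0 -- running the model is easy with weights alone
# retrain(...) is uncallable in practice: the training set was never
# published, so the release can only be audited as a black box.
```

Hand someone `RELEASED_WEIGHTS` and they can run the model forever; hand them the weights without the data and they can never verify how those numbers came to be. That is the gap between "open weights" and "open source."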

The geopolitics here are loud. Meta releasing Llama was not charity. It was scorched earth. If capable AI models become a free commodity, OpenAI and Google lose their moat. Meta protects its core ad business by destroying competitors’ margins. It is IBM and Linux all over again, just at a different scale.

And now nation states are playing. The UAE funds the Falcon models. France backs Mistral. DeepSeek R1 from China matched GPT-4 reasoning at a fraction of the training cost and briefly crashed NVIDIA’s stock. China is using open source as a competitive wedge against American AI dominance.

Satya Nadella made a point that stuck with me: data sovereignty is not just about where your servers sit. It is about tacit knowledge. If you rely entirely on a closed API from an American tech giant, every prompt you send makes their model smarter. You are exporting your own intelligence. But if you download an open-weight model and fine-tune it locally, you keep the weights. You own the brain.

For any company running critical infrastructure, this is not a theoretical debate.

Grazer or gardener?

The episode ends with a question I have been sitting with: when I look at our tech stack, do I see a pile of free resources to consume indefinitely? Or do I see a fragile supply chain that requires active stewardship?

Because if you are not contributing back — engineering time, financial support, participating in governance — you are not a neutral user. You are part of the risk profile. You are building skyscrapers balanced on the shoulders of that tired volunteer in Nebraska.

Open source is not just code anymore. It is the invisible critical infrastructure of the modern world. And infrastructure requires maintenance.

Listen to the full episode of Coordinated with Fredrik wherever you get your podcasts.



This is a public episode. If you would like to discuss this with other subscribers or get access to bonus episodes, visit frahlg.substack.com

Coordinated with Fredrik, by Fredrik Ahlgren