How a GNU/Linux Distribution Succeeds, Part 2: Red Hat and Debian Steps to Success

October 7, 2022 - by Andrew Oram

The first article in this series looked at the key roles played by community and policy from the beginning in the Red Hat and Debian distributions of Linux. We’ll look at two other key success factors in this concluding part.
Following a professional process to develop a reliable product

After the bumblings of the aptly named Slackware distribution in the early 1990s, Debian was seen (according to Matt Welsh) as "the distribution that did it right." A huge amount of effort goes into picking the right mix of software to include in a distribution—giving users what they need without introducing bloat or fragile software—as well as into testing, packaging, distribution, security scanning, performance tuning, and support. Both Debian and Red Hat carried off these daunting tasks excellently and thus won users' trust.

When Physical Media Ruled

The earliest Linux distributions came on floppy disks, even though CD-ROMs had been invented years before. I suppose it took a while for CD-ROM readers to get into personal computers, and for support to come for these devices. Anyway, my memory (which is the best source I have for this point, unfortunately) tells me that Red Hat was one of the first distributions burned onto CD-ROMs. No one likes loading 50 or more floppy disks, so the change of medium made Linux really accessible for the first time.

Bruce Perens adds, “I bought a CD-writer for $1,400 and made Debian’s own CD image, because the early CD makers sold CDs with several distributions on them, but kept getting Debian wrong. I don’t believe that my motherboard knew how to boot from CD, and thus the Debian installer consisted of two 3.5 inch floppies and a CD-writable. The first floppy contained just the Linux kernel, and the second contained Busybox (which I wrote for the purpose of booting), a tiny root filesystem, and an installer written in the shell.”

Red Hat came along when Linux was getting buzz, and capitalized on it well in their marketing. Scott McCarty recalls when you could get Red Hat CD-ROMs at Best Buy, an American "big box" store that sells home appliances such as refrigerators.

Of course, I haven't seen software distributed on CD-ROM for many years. Red Hat changed its focus to the enterprise and then to cloud computing. McCarty says that the focus on the enterprise occurred in the early 2000s with Red Hat Linux Advanced Server 2.1. Its latest offering is software intended to provide an "automated image-building service for hybrid clouds."

SUSE Gains a Foothold

Around the same time as Red Hat, SUSE entered the market with an offering that seemed similar. But it was Red Hat that prospered (80% of the paid Linux market today, according to McCarty), while SUSE remains an afterthought. Why?

Perhaps it was simply that the U.S. presented a larger market than Germany, so Red Hat had the home team advantage. Welsh recalls that SUSE suffered from a vaguely "European feel" that Americans weren't comfortable with, along with an inappropriate governance structure and a choice of KDE (a project that started in Germany) as SUSE's default desktop when more Americans preferred GNOME.

Welsh said that after Novell bought SUSE in 2003, users and developers no longer saw SUSE as a community effort, although openSUSE persists. SUSE as a company even reported increased earnings in its most recently tallied quarter.

McCarty is more specific and concrete, citing professionalism again as Red Hat's advantage over SUSE. For instance, SUSE tended to tell customers to upgrade to the current version of their software instead of backporting bug fixes to earlier versions, whereas Red Hat understood enterprises' need to avoid disruption by running a single version for many years.

And Now For Ubuntu

Another company got its start by addressing Debian's main weakness (as I saw it): difficulty installing the software. The Debian process was not easy for technically unsophisticated users, and (according to Linux expert Don Marti) its mailing lists were sometimes unfriendly to people asking for advice on installation and getting started. These areas are where Canonical built on Debian and provided a more appealing experience through their Ubuntu distribution.

Creating a robust packaging system

Part of a professional approach to maintaining a distribution is to provide a simple but trustworthy way to install and update software. Welsh says that packaging was a new concept at the time Debian and Red Hat started, and that both projects did a superb job with it. Indeed, their packaging formats (.deb and .rpm respectively) are the only two that have really mattered for many years. (A newer entrant into the packaging area, Flatpak, is jostling things somewhat.) Perens says, for instance, that the terms “upstream” and “downstream” were probably used first at Debian in reference to package producers and projects that build on the packages.

Welsh helped to develop Debian's first package manager. He says they were thinking architecturally, and saw the importance of structuring packages into families (audio, telephony, etc.). A good packaging system attracts more software, because projects can easily provide a package, and with that software come more users.
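The family structure Welsh describes is visible in Debian's package metadata itself: every .deb declares a Section that places it in one of the archive's standard families. As a sketch (the package name, maintainer, and description here are hypothetical), a minimal debian/control file might look like this:

```
Source: hello-tone
Section: sound
Priority: optional
Maintainer: Jane Doe <jane@example.org>
Standards-Version: 4.6.0

Package: hello-tone
Architecture: any
Depends: ${shlibs:Depends}, ${misc:Depends}
Description: toy audio utility (hypothetical example)
 Plays a short test tone. The Section field above assigns the
 package to Debian's "sound" family, one of the archive's
 standard sections.
```

RPM spec files historically carried an analogous Group tag, and both formats let the package manager resolve the Depends/Requires lists automatically—the feature that made installing software "simple but trustworthy."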

Successful distributions cover all fronts

We’ve seen in this exploration of Linux distribution history that in order to succeed, everything has to be done well. A project must stay close to its community while putting money and sweat into professionalizing every aspect of the development and distribution processes. A project must do even more, though: It must remain supple and be willing to shift dramatically in response to change. Debian, for instance, did this in developing its New Maintainer process, and Red Hat in shifting to virtual machines, containers, etc.

As GNU/Linux continues to spread and take on new tasks, the need for new distributions arises. There are many popular distributions not covered in this article. I hope the article is useful to project leaders who want to make the enormous work they put into their distribution succeed.

About Andrew Oram:

Andy is a writer and editor in the computer field. His editorial projects at O'Reilly Media ranged from a legal guide covering intellectual property to a graphic novel about teenage hackers. Andy also writes often on health IT, on policy issues related to the Internet, and on trends affecting technical innovation and its effects on society. Print publications where his work has appeared include The Economist, Communications of the ACM, Copyright World, the Journal of Information Technology & Politics, Vanguardia Dossier, and Internet Law and Business. Conferences where he has presented talks include O'Reilly's Open Source Convention, FISL (Brazil), FOSDEM (Brussels), DebConf, and LibrePlanet. Andy participates in the Association for Computing Machinery's policy organization, USTPC.