Tony Mobily [opinions]
6/23/2009

2009: software installation in GNU/Linux is still broken -- and a path to fixing it

fsmsh.com/3162 [linux] [software] [installing]

GNU/Linux is slowly invading everybody's everyday life. I won't say "The year of the GNU/Linux desktop is here". Been there, done that. But, GNU/Linux is definitely imposing its presence -- think about Android, or the number of people who are currently using GNU/Linux as their main desktop.

And yet, software installation in GNU/Linux is broken. No, not broken... it's terribly broken. Why is that, and what can be done to fix it?

The current story

Most distributions today (including the great Ubuntu) are based on package managers. If you want to install a piece of software, you grab it from one of the official repositories, and your package manager will "explode it" onto your computer's file system. A program will place bits and pieces in /usr/bin, /usr/lib, /etc, and so on. This is normally done through a package manager. In Ubuntu, for example, you would probably use Synaptic. A package manager will normally solve all the "dependency problems" for you. Ah, dependencies... basically, an image viewing program might need, for example, libjpeg to function (libjpeg being a library of functions to open, save, and generally deal with JPEG files). This is a very Unix-ish approach. It works perfectly well for servers, but fails on several levels for clients. Why?
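
To make the dependency idea concrete, here is a tiny, self-contained C sketch of dependency resolution. The package names and the dependency table are invented for illustration, and this is of course not how apt or dpkg are actually implemented; the point is only the principle: before a package is installed, everything it depends on gets installed first.

    /* Toy dependency resolution: install a package's dependencies before
     * the package itself. The "repository" below is a made-up table. */
    #include <stdio.h>
    #include <string.h>

    struct package {
        const char *name;
        const char *deps[4];   /* required packages, NULL-terminated */
        int installed;
    };

    static struct package repo[] = {
        { "libjpeg",      { NULL },                      0 },
        { "libpng",       { NULL },                      0 },
        { "image-viewer", { "libjpeg", "libpng", NULL }, 0 },
    };

    static struct package *find(const char *name)
    {
        for (size_t i = 0; i < sizeof repo / sizeof repo[0]; i++)
            if (strcmp(repo[i].name, name) == 0)
                return &repo[i];
        return NULL;
    }

    static void install(const char *name)
    {
        struct package *p = find(name);
        if (!p || p->installed)
            return;
        for (int i = 0; p->deps[i]; i++)   /* dependencies first */
            install(p->deps[i]);
        p->installed = 1;
        printf("installing %s\n", p->name);
    }

    int main(void)
    {
        install("image-viewer");   /* pulls in libjpeg and libpng first */
        return 0;
    }

Run against the toy table, this installs libjpeg and libpng before image-viewer -- which is, conceptually, what Synaptic does for you behind the scenes.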

There are several drastic problems with this approach. Here is a list of the main ones -- by no means an exhaustive one, and it will probably grow as people e-mail me:

It's 2009, and GNU/Linux is still plagued by all of these problems. Even Ubuntu, a distribution I love, suffers from every one of them -- and we are talking about a distribution aimed at end users!

What the story should be

The story should be very simple:

All of this is already true of Apple's OS X. Apple got software installation just right -- although a few programs have lately started shipping with an ugly installation process.

Where does the problem come from?

Don't get me wrong: I think Ubuntu is a fantastic system, and gets a lot of things right. I think the problem stems from an issue that is more philosophical than anything else.

The issue is at the heart of this article, and deserves to be put in bold.

Every GNU/Linux distribution at the moment (including Ubuntu) confuses system software with end user software, whereas they are two very different beasts which should be treated very, very differently.

I think using dpkg/apt-get or rpm/yum for system-wide software, libraries and so on is the way to go. GNU/Linux's success in the server arena is not a coincidence: a distribution is made up of several independent "bricks" which create a majestic building.

However, using the same philosophy -- and therefore architecture -- for end-user software is just too limiting. The list of problems above is not just "a list of unfortunate drawbacks": it covers some of the major reasons why GNU/Linux hasn't achieved mass penetration in the desktop arena.

What bothers me is that while all of the other problems are being solved (vendor support among them), this one remains a persistent thorn in every GNU/Linux user's side. A painful one.

Existing material about this problem

A lot of debate -- as well as software -- exists around this issue. In terms of software, there is a whole distribution -- GoboLinux -- which follows exactly this principle: one directory per program. There is a problem with GoboLinux's approach: it applies the "one directory per thing" philosophy to everything -- including system libraries and server-side programs. GoboLinux also goes one step further by completely changing the file system layout -- an idea I am strongly against.
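
To give a rough idea of what "one directory per program" means in practice (the program name, version and exact layout below are just an example, written from memory rather than copied from a real GoboLinux install), the tree looks more or less like this:

    /Programs/Inkscape/0.46/bin/inkscape
    /Programs/Inkscape/0.46/lib/...
    /Programs/Inkscape/0.46/share/...
    /Programs/Inkscape/Current -> 0.46

Everything the program needs lives under its own directory, so removing the program means removing that one directory.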

In terms of what's been said, there are several discussions about this in Ubuntu and Debian. A good start is the "Rename Top Directory Names" idea in Ubuntu Brainstorm (http://brainstorm.ubuntu.com/idea/6243/). This link has a long list of duplicates. There are also many, many "blueprint" drafts in Ubuntu's Launchpad -- so many, in fact, that you will get lost reading them. A lot of them talk about a simplified directory structure for the system, which as a consequence would imply simplified software installation.

What's wrong with GoboLinux?

I don't think GoboLinux's approach is a winner for two reasons:

  • The Unix file system has been around for a long time -- for good reason. It does work extremely well to keep a system sane and working.

  • It would meet too much resistance in the GNU/Linux community -- for good reason.

However, GoboLinux gave us a practical example that this change can be made. It's actually possible.

Four steps to fix the problem

I can't really fix this problem. It will take a lot of effort, and a lot of courage from major players to even start heading in the right direction.

The first step is to face the truth and admit that there is a problem. This is the aim of this article, which -- I hope -- will resonate within the GNU/Linux community.

The second step is to set out a path which might eventually lead to a solution. This is what I will attempt to do in this article. The solution will be generic and I will try to borrow from as much existing software as possible.

The third step is to improve on the proposed solution; this is tricky, because there needs to be the right balance between too little and too much planning. It also requires somebody to coordinate the discussion -- somebody able to lead everybody towards a full solution. My secret dream is that somebody from Canonical, or from Red Hat, would do this.

The fourth step is implementation. This is the hard part. I am sure that implementing it will reveal problems, limitations -- and more.

My own semi-technical take

Here is my idea. I haven't programmed in C in years; this means that I might make some silly mistakes. However, I am confident I can provide a good starting point.

Here we go.

Whoever manages this system should look closely at what OS X does, because OS X's engineers had the exact same problems to solve -- and solved them successfully.
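
Just to make the idea tangible, here is a minimal, hypothetical C sketch of a bundle-style launcher. The layout it assumes -- an application shipped as a single directory with its main executable at <bundle>/bin/run -- is my own invention for this example, not an existing GNU/Linux or OS X convention; the point is that running a self-contained application directory needs nothing more than this:

    /* Hypothetical bundle launcher: given an application directory
     * (e.g. ./Frozen-Bubble.app), run the executable it carries inside.
     * The <bundle>/bin/run layout is an assumption made for this sketch. */
    #include <stdio.h>
    #include <unistd.h>

    int main(int argc, char *argv[])
    {
        char exe[4096];
        char *args[2];

        if (argc < 2) {
            fprintf(stderr, "usage: %s <bundle-directory>\n", argv[0]);
            return 1;
        }

        /* The bundle carries its own binary (and, ideally, its private
         * libraries), so "installing" it is just copying the directory. */
        snprintf(exe, sizeof exe, "%s/bin/run", argv[1]);

        args[0] = exe;
        args[1] = NULL;
        execv(exe, args);   /* replace ourselves with the application */
        perror(exe);        /* reached only if execv() failed */
        return 1;
    }

In a scheme like this, "installing" an application is just copying its directory somewhere -- including onto a memory stick -- and uninstalling it is deleting that directory.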

Conclusions

This article might start a revolution -- or it might just be yet another article complaining about installing software in GNU/Linux.

I have a dream. A dream of a world where people distribute applications as bundled directories, and these bundles work in Ubuntu, Fedora, etc -- and they keep on working when a new version of the operating system is installed. A world where software installation in GNU/Linux is easy and applications can be swapped by simply copying them onto a memory stick.

I wonder if I will ever see this in GNU/Linux.

P.S. Some will say, "if you like the way OS X does things, use OS X". My answer to that is: "I like the way OS X does things -- it works, and it solves problems -- but let's rather be inspired by it and improve on it."


License

Verbatim copying and distribution of this entire article are permitted worldwide, without royalty, in any medium, provided this notice is preserved.