Several weeks ago, the Ubuntu-related blog OMG! Ubuntu! published an extremely interesting piece entitled “Many Hands Make Light the Work; Few Make It Shine” by Benjamin Humphrey (of Ubuntu Manual fame). The article repeated and expanded upon several mantras currently popular in the Ubuntu community, specifically:
- Developers should give very careful thought to the features they add to their programs and ensure that they integrate with the desktop as a whole
- The Linux desktop has a large number of minor issues (often referred to as papercuts) which detract from its consistency and usability; these need to be fixed
- It’s not the ideas that matter, but their implementation; and if you’re going to do something half-assed, it’s worse than if you don’t do it at all
Overall, I agree with the message of the article. It shows that the design philosophy and attention to detail of the Mac community is starting to permeate the Linux community; and that is a spectacularly Good Thing.
But even though I agree with the message of the piece, I found myself in opposition to it. I wrote several diatribes in the comments, and then proceeded to defend those positions to the death. (Even though a few of them were pretty extreme.) That’s not something I do very often.
Since I can already see the strange looks and hear the unasked question, I’ll just go ahead and give it voice:
What on earth could set you off like that? (Especially in such an innocuous article.)
There is both a simple and complex answer to this question. Here’s the simple version.
You might say that the article (and especially some of the comments) touched a nerve. Actually, that’s not quite right. The article didn’t just touch a nerve, it scraped it raw, stretched it out, and encouraged Michael Flatley and his troupe to Irish stepdance all over it.
Ready for the more complicated version?
While pushing for a more polished and refined product is a Good Thing, we must exercise extreme care in the methods used to do so. As Humphrey highlights in his article, implementations matter and a poor implementation can be worse than no implementation at all.
Unfortunately, there were several poor implementations inherent in the OMG! Ubuntu! article. These weren’t things explicit in the text, but rather a few insidious ideas implicit in its criticisms. The ones that really got me riled were:
- Creation of an ideal that has never existed
- Tremendously high expectations, without the resources needed to achieve them
- The need to serve “Average Joe User”
In the remainder of this essay, I will try and explain why these ideas can be so dangerous.
Idolizing Mac OS X
When speaking with people in the Linux community, you will often hear Mac OS X held up as a paragon of perfect computing. The mantra goes something like this: “OS X is polished, intuitive, and beautiful. It represents what a desktop operating system should be.”
Yet, most of the idolization of OS X seems to come from people who don’t actually use it. For this reason, they often overlook the very significant problems which it does have in favor of an ideal that has never existed.
Since I can already hear you asking, “What problems?” I should probably give an example. So in addition to the issues with the Finder, consider the piss-poor way that Mac OS X installs and removes software.
Again, you’re probably saying, “What’s so hard about dragging a program to the trash in order to get rid of it? It’s logical and easy!”
To which the only reply I can offer is, “Nothing.” But if you think that an uninstaller simply removes the executable, I’ve got some news for you: that’s only one of the things it does, and it isn’t even the most important.
The uninstaller also gets rid of configuration files, temporary working files, and many of the program’s assets. On the Mac, there is absolutely no way to get rid of this information short of manually tracking it down and nuking it. On Windows or Linux, however, it is removed by the package manager.
A package manager also offers a single way to add and remove software. You’re not left with fifteen different ways of trying to get rid of Growl because it has decided to randomly stop working. (Again.)
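To make the contrast concrete, here is a minimal sketch. All of the file names and paths are invented for illustration (real Mac apps scatter different files under ~/Library); the point is only to show what drag-to-trash leaves behind versus what a single purge-style removal cleans up:

```python
# A hedged sketch of the two uninstall models. Names and paths are
# hypothetical; "Growl" merely stands in for any Mac app.
from pathlib import Path
import shutil

base = Path("demo/Library")

# Support files a fictional app leaves behind after its bundle
# is dragged to the trash:
leftovers = [
    base / "Preferences/com.growl.plist",
    base / "Application Support/Growl/cache.db",
]
for f in leftovers:
    f.parent.mkdir(parents=True, exist_ok=True)
    f.touch()

# The Mac model: you must hunt the stragglers down yourself.
found = sorted(str(p) for p in base.rglob("*") if "growl" in str(p).lower())
print(found)

# The package-manager model (e.g. `apt-get purge` on Debian-style
# systems) removes binaries AND configuration in one step; here we
# simulate it by clearing everything the "package" owned:
shutil.rmtree("demo")
```

The asymmetry is the essay’s point: the manual `rglob` hunt is the best a Mac user can do, while a package manager tracks every file it installed and can purge them all at once.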
And that is only one problem! Once you look beneath the hood, there are many issues with Mac OS X. For example, Apple randomly changes and breaks things; if you leave the walled garden, things aren’t guaranteed to work; and worst of all, Apple is really hard to work with, which means that if you’re doing anything advanced, you’re on your own.
Expectations and Resources
The second major danger in Humphrey’s article is the assumption that Ubuntu is a product. Sure, open source processes produce software, but it’s not a particularly good idea to think of the result as a product.
Products are produced by companies and then sold on the market. They have an expected lifespan and an iterative development cycle. They are eventually replaced by new iterations that then obsolete the older versions, which prompt you to pay for an upgrade.
That description doesn’t really describe open source development very well.
First, an open source codebase is not a product. In all likelihood, it isn’t going to be sold. And even if it were, any customer would be free to access the source code and undercut the business of the original distributor. Can you imagine Apple or Microsoft adopting such a business model?
Second, an open source program has no expected lifespan to speak of. Things that work get used and continue on. Things that don’t get abandoned. There’s a reason why LaTeX is popular in academic circles – even after thirty years. It does its job, very well.
Finally, new iterations do not necessarily obsolete the older implementations. Far more often, the newer version supplements the older version, or extends it. If you don’t want to upgrade, there is no reason to. I’m aware of many a server that still runs Python 2.4, even though it is years out of date.
In the same vein, open source isn’t really a service, either. A great deal of the code is written by volunteers, or by developers to meet a very specific need. Yet that same code is then co-opted for use in an environment that is very different from its original intent. In the original article, Humphrey notes that Gwibber and Ubuntu’s “Social from the Start” initiative feel half-baked. (And to be clear, they do.) But before we begin to castigate and pile on Gwibber’s developer, it’s important to remember that he’s a volunteer. He doesn’t work for Ubuntu or get paid for the time he spends on the program. He isn’t offering a service to the community. Gwibber’s developer has another job and other responsibilities that have little to do with his open source project.
For this reason, it isn’t always possible to greatly improve things or to make sure that the code is “fully baked” before releasing it. Nor is there always time to release fixes on a set and highly rigid (or even timely) schedule. It may be frustrating to hear, but it’s the truth.
If you want to see things happen faster … well … the source is available; you could fix the problems yourself. Or you could donate to the project the amount of money needed to fix said bug (thereby transforming the work into a service). (If you do this, estimate the amount of time you think the fix would take, then pay between $20 and $30 an hour for that time.) I’ve occasionally received such donations for Time Drive, and you can rest assured that the bugs in question were fixed quickly.
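As a quick sketch of that back-of-the-envelope math (the five-hour estimate here is purely hypothetical; substitute your own guess at the fix’s difficulty):

```python
# Rough bounty estimate for sponsoring a bug fix, using the
# $20-$30/hour range suggested above.
def donation_range(estimated_hours, low_rate=20, high_rate=30):
    """Return the (low, high) suggested donation in dollars."""
    return estimated_hours * low_rate, estimated_hours * high_rate

low, high = donation_range(5)  # 5 hours is an invented example
print(f"Suggested donation: ${low}-${high}")  # Suggested donation: $100-$150
```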
Trying to treat open source code as a product, or its development as a service is very dangerous. It tries to allocate accountability where none can exist and raises expectations where there are no resources. It causes users (and other developers) to forget that many rough edges and “unacceptable” design choices arise from a lack of resources.
But that’s not the only reason we should be wary. Trying to treat Ubuntu as a product also causes us to forget that Open Source development is qualitatively different from closed source development. In the case of open source, all of the development, testing and politics happens out in the open.
It’s impossible to make a polished product without iteration and feedback. One of the reasons why Apple appears to release polished products on the first try is that this iteration happens behind closed doors. They have the resources (and a paid developer community) to test things and round off the edges before releasing them to the public. In general, Open Source does not. If you are using the first iteration of an open source program, it might be a scary experience. It’s probably untested, or it’s only been used in a very narrowly defined scope.
Thus, if it feels like you’re being experimented on, that’s because you are. But this isn’t necessarily a bad thing.
Serving Average Joe User
Because, you see, users are part of the development team. They might not be writing the actual code, but they are still integral to its creation. Now compare that to the role users have in the proprietary world. Yes, Apple may release polished software, but that comes at a price. You get absolutely no say in the development process. You will take what they give you and be happy about it, or you can go elsewhere. Instead of having the software tailored to your needs (through feedback and iteration), Apple instead creates an imaginary abstraction that I will refer to as “Average Joe User.”
Average Joe is meant to stand in for you, your mom, and anyone else who uses Apple products. Apple attempts to understand Average Joe’s needs, and then designs their products to fill them. Of course, there is just one problem: there is no such thing as Average Joe User. He. Does. Not. Exist.
Don’t believe me? Let’s contrast how I use my computer to how my mom uses her computer. I’m an engineer and I’m a geek. I work with numbers, write software, and run simulations. In my spare time, I like to bandy words about and play with design. I use graphics programs, numerical tools, CAD, and word processors. In short, I have some rather contradictory computer needs.
My mom, on the other hand, does not. She mostly surfs the web, sometimes answers email, and listens to music through iTunes. We use our computers for wildly different things. How in the world do you come up with an Average Joe that is representative of both of us?
Is there some way to split the difference? Does my usage of the computer (admittedly on one side of the spectrum) somehow balance hers (which is on the other)? Is there some group in the middle that has some of my mom’s needs and some of mine? And does how I use the computer harm how she does?
No, there’s not. Yet, Apple pretends that there is a way to bridge the divide. They create advanced “usage scenarios” where they debate about which features are needed, and which are not. They try and simplify complex programs, and cut out the bits that are not needed; and to do this, they rely on a collective figment of imagination known as “Average Joe User.”
Sometimes this works out okay, but often it does not. Sometimes the cut features are simply not needed (what developers call bloat) by either real users or imaginary ones. But other times, they are. For example, do you remember when iMovie ’08 was released? It was a disaster – a classic example of what happens when you try and cater to the needs of Average Joe over real people.
iMovie ’06 was a wonderful program. It was highly usable, easy to understand, and functional. You could use it to create short videos or even longer productions. It’s what David Pogue used to create his weekly video for his Times column, and I know others who were able to use it in production environments.
Then, along came iMovie ‘08. Instead of building upon and refining ‘06’s core features, Apple instead opted for a total rewrite. The program was transformed from a highly stable, exceptional product for amateur filmmakers into something suitable for publishing YouTube clips. Even now, more than two years later, iMovie is a shadow of what it used to be (though it’s getting better); and it’s still buggy as hell. I’m sorry, but that is not improvement.
Whenever we talk about raising the bar or refining the UI, somehow Average Joe gets dragged in. We say silly things like:
“Average Joe just wants his computer to work”
“Average Joe doesn’t want to mess with settings or config files. In fact, he shouldn’t have to.”
“Software should Just Work and Average Joe shouldn’t have to suffer half-assed implementations.”
But in addition to the fact that Average Joe doesn’t exist (and how can an imaginary person have expectations?), there is another problem. Finding bugs, developing opinions, and collaborating are part and parcel of Open Source. Users are part of the development team (remember?), and that’s also a Good Thing.
It’s why we release early and release often. It’s why open source developers join mailing lists, publish blogs, frequent IRC and generally try to be available. We don’t want to deal with Average Joe, we want to deal with real people who are going to use the software to do Real Things.
Software that “Just Works” doesn’t allow for any of that. If it works from the moment you open the box, then there is no need to seek out the devs, complain, and help improve the product. You take what you are given, and try to be thankful for it.
How boring is that? In such a model, there’s little room for sloppiness, confusion, need, innovation or experimentation. It’s certainly not what I want, nor is it what I would expect from most users of open source. After all, many went through the hassles of finding their tech, configuring it and learning how to use it. That’s a pretty big hurdle for a crowd of luddites who don’t wish to deal with complexity.
Which leads me to my closing point. There is certainly a place for software that Just Works. People license Redmond’s OS and buy Apple hardware. Those companies then hire developers and pay them for quality assurance. They conduct focus groups and try to hammer out what Average Joe User would like. And they can do all of these things because they have more money than God.
But open source users shouldn’t expect to open the box and coo with delight. If they want things to become more polished, then they need to contribute and offer support. They need to file bug reports, provide feedback, and donate money. It doesn’t matter whether they’re developers or not. To get better, open source requires time, effort, and cash. There is no free ride.
Perhaps this sounds jaded, or cruel; but I’m past the stage in my life where I see things through the lens of idealism. If you want to see improvement, you have one of three options. You can pay for what you use (Microsoft and Apple), give up your privacy so that companies can serve soul stealing ads (Google), or you can work for it (open source).
In each case, you aren’t going to find perfection. All operating systems have problems. Macs do entertainment well, but leave much to be desired in the world of work and scientific computing. Windows works for work, but lacks the Unix underpinnings available in Mac OS X/Linux. Linux lacks for entertainment and media consumption. Regardless of which you use, you probably won’t be completely satisfied and quite a few things aren’t going to Just Work. But at least Linux users can fix it, if they want to; and that means something. If you want to add a corkboard and outliner to your writing program, you can. No one is stopping you. Want a high quality backup utility? You can have that too.
Open source empowers you, the user, to do what you want. If you’re not happy, you can do something about it; and that is significantly more than Mac or Windows users can say.