Nikolai Bezroukov. Portraits of Open Source Pioneers
For readers with high sensitivity to grammar errors, access to this page is not recommended :-)
Author's note: Alan Cox needs little introduction--most will know him for his long-standing work on the Linux kernel (not to mention his appreciation and promulgation of the Welsh language among hackers). Cox is one of the keynote speakers at EuroOSCON this October, where he will talk about computer security.
According to Alan Cox, we're just at the beginning of a long journey into getting security right. Eager for directions and a glimpse of the future, O'Reilly Network interviewed him about his upcoming keynote.
Edd Dumbill: You're talking about the next 50 years of computer security at EuroOSCON. How would you sum up the current state of computer security?
Alan Cox: It is beginning to improve, but at the moment computer security is rather basic and mostly reactive. Systems fail absolutely rather than degrade. We are still in a world where an attack like the slammer worm combined with a PC BIOS eraser or disk locking tool could wipe out half the PCs exposed to the internet in a few hours. In a sense we are fortunate that most attackers want to control and use systems they attack rather than destroy them.
ED: Linux sysadmins see a security advisory and fix practically every day now. Is this sustainable, and does it harm Linux that this happens?
AC: It isn't sustainable and it isn't going to work forever. The time between bug discovery and exploit has dropped dramatically, and better software tools will mean better and faster-written exploits, as well as all the good things.
I think it harms Linux perhaps less than most systems because Linux security has been better than many rivals. However, even the best systems today are totally inadequate. Saying Linux is more secure than Windows isn't really addressing the bigger issue--neither is good enough.
ED: You say that we're only just at the beginning of getting computer security right. What are the most promising developments you see right now?
AC: There are several different things going on. Firstly, the once-stagnant world of verification tools has finally begun to take off and people have started to make usable code verification and analysis tools. This helps enormously in stopping mistakes getting into production.
Related to this, languages are changing and developing. Many take some jobs away from the programmer and make it harder or near impossible to make certain mistakes. Java for example has done a lot to make memory allocation bugs and many kinds of locking errors very hard to make.
The second shift has been towards defense in depth. No-execute flags in processors and software emulation of them, randomization of the location of objects in memory and SELinux help control, constrain and limit the damage an attacker can do. That does help. There have been several cases now where boxes with no-execute or with restrictive SELinux rulesets are immune to exploits that worked elsewhere.
SELinux also touches on the final area--the one component of the system you cannot verify, crash test, and debug: the user. Right now, systems rely on user education and reminding users "do not install free screen savers from websites" and the like. The truth is, however, that most users don't read messages from their IT staff, many don't understand them and most will be forgotten within a month. SELinux can be used to turn some of these into rigid policy, turning a virus outbreak into a helpdesk call of "the screen saver won't install."
This last area is very important. We know the theory of writing secure computer programs. We are close to knowing how to create provably secure computer systems (some would argue we can--e.g. EROS). The big hurdles left are writing usable, manageable, provably secure systems, and the user.
It's important perhaps to point out here that secure programs, reliable programs and correct programs are all different things. Knowing how to write provably secure programs is very different from saying we know how to write reliable or correct programs.
ED: Can security in software development be meaningfully incorporated into tools, so it doesn't end up stifling the productivity of developers?
AC: The current evidence is yes. Many of the improvements actually increase programmer productivity by taking away tedious tasks like memory management, or identifying potential bugs at compile time and saving the programmer from chasing bugs for days, and because many of them use labeling techniques where you have to indicate when you mean to do unusual things--actually making code easier for other humans to analyze.
There is no evidence that sparse has slowed kernel development, that tainting features have hindered Perl, or that Java memory management has harmed productivity much.
The tools are doing by machinery what is hard to do by hand. Bad tools could slow people down, but good tools do not.
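As an illustration of the labeling technique Cox mentions, this is roughly how the kernel's __user annotation is wired up for the sparse checker (a sketch modelled on the 2.6-era include/linux/compiler.h; the exact definition varies between kernel versions):

    /* When built with sparse (__CHECKER__ defined), pointers marked __user
     * live in a separate address space and cannot be dereferenced directly;
     * a normal compile sees an empty macro and ignores the annotation. */
    #ifdef __CHECKER__
    # define __user __attribute__((noderef, address_space(1)))
    #else
    # define __user
    #endif

    /* A function taking untrusted user-space data has to say so: */
    long read_config(const char __user *buf, unsigned long len);

    /* Dereferencing buf directly makes sparse complain; the code is forced
     * through copy_from_user(), so the unusual operation is always marked. */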
ED: Isn't there a fundamental level at which security concerns and the freedom of individuals to innovate are opposed? Is there an end in sight to open source software created by small numbers of people?
AC: There are areas where they come together--obvious ones are safety critical systems. It's just possible that you don't want nuclear power station employees innovating on site, for example.
There are 'security' systems such as 'trusted computing' that can be abused by large corporations to block innovation, and unfortunately the EU ministers seem to want to help them, not their citizens. Whether the EU commission is corrupt, incompetent, or just misguided is open to debate, but the results are not pretty. We've seen that with the X-Box. Microsoft sells you a product and then threatens to sue you for using it to the full.
Those same tools, however, are valuable to end users, provided they have control over them. The same cryptographic technology that will let Apple lock their OS to Apple-branded x86 computers is there for me to keep personal data secure if a future laptop is stolen. It is a tool, but unfortunately a tool that can be easily abused.
To a homeowner a secure house is generally a good thing, but if you lose control of the key it can be a positive hindrance. TCPA is no different.
ED: Where is the ultimate driving force for implementing secure software going to come from? It seems that regulatory enforcement, such as in the pharmaceutical industry, might be the only way to properly protect the consumer.
AC: At the moment it is coming from the cost of cleaning up. Other incentives come from statutory duties with data protection, and also from bad publicity.
In the future they might also come from lawsuits--for example, if an incompetently run system harms another user--or from Government. In theory as we get better at security the expected standard rises and those who fail to keep up would become more and more exposed to negligence claims.
The bad case is that someone or some organization unleashes a large scale internet PC destroyer before we are ready and legislation gets rushed through in response. That will almost certainly be bad legislation.
Edd Dumbill is editor at large for O'Reilly Network and coauthor of Mono: A Developer's Notebook. He also writes free software for GNOME and packages Bluetooth-related software for the Debian GNU/Linux distribution. Edd has a weblog called Behind the Times.
During a talk last weekend at the Free and Open Source Software Developers' European Meeting (FOSDEM) on the challenges of maintaining a stable Linux kernel, Cox revealed that although Linus Torvalds is good at developing code, he does not enjoy some of the other jobs that go along with software development, such as bug fixing and beta testing.
"Linus is a good developer, but is a terrible engineer," said Cox. "I'm sure he would agree with that."
Cox explained that he and Torvalds sometimes have different approaches to fixing a problem, due in part to their different responsibilities. As the maintainer of the development kernel, Torvalds needs to make sure the kernel code is easy to maintain, while Cox is more interested in kernel stability and is not so worried about "hacking" the code to get it to work.
"One of the hard problems to fix is design errors," said Cox. "These are a pain because they need a lot of refactoring. Linus' approach is to re-write it to a better design. But to get a stable kernel you tend to do small horrible fixes. Linus is very keen to have maintainable code, while to have a stable kernel I'm keen to have code that works."
Cox said that Torvalds does not always let people know when he has fixed a security bug in the kernel. This can be a problem as the patch will take a while to make it to production, which means that hackers can exploit the vulnerability before it is made available to individuals and enterprises running Linux.
"Linus has this bad habit of fixing security holes quietly," said Cox. "This is a bad idea as some people read all the kernel patches to find the security holes."
Linux enjoys a reputation as a particularly secure operating system, compared to rivals such as Microsoft's Windows. Last month a mailing list was set up to help Linux kernel developers share information on security flaws.
Deciding what bugs to fix in the Linux kernel is not always easy, particularly as fixing it can impact other applications. Cox said he gives top priority to bugs that are reported soon after the release candidate is made available.
"Release candidates will pick out a lot of the stupid bugs, and what are plain stupid ideas," said Cox. "Two or three days after the release candidate we will have 150 emails with the same bugs." These early issues can be easy to fix as they are often obvious bugs. "Early problems you get are normally very easy to fix," said Cox. "As soon as the release comes out bug reports say 'You've broken this'. Almost immediately you go, 'Whoops, that's my mistake'. Ten minutes later the fix is in the development tree."
But kernel bugs that appear easy to fix can be misleading. "Sometimes you see a fix and think 'this is perfect, move my fix into the kernel tree'," said Cox. "Later you think, 'I must have been drunk. Don't apply that patch'."
Most of the time our ability to influence politicians is remarkably limited. They ignore letters, and often all the major parties reflect only large proprietary interests, ensuring you get CDs that won't play in a car, get arrested for helping the blind read protected ebooks, and are prevented from writing and using software by the patent lobby, which is intent on locking up technical creativity the same way the Soviets locked up the typewriter.
In about a week you get an opportunity to send the EU politicians a message they cannot ignore - but sadly only 18% of UK citizens will bother to do so. While most British people would like to ignore the EU, the simple fact is that it is the EU that passed the EUCD, it is the EU that can fight software patents and it is the EU that is currently working to create even more draconian "intellectual property rights" laws.
This letter is aimed at the other 82% of hackers, open source enthusiasts, or just people who want the rights to use CDs they paid for fairly and honestly. That little piece of cardboard is your chance to call the EU to account, and thanks to the EU voting system you can make a difference; in fact you count fivefold due to the expected low turnout.
There are two parties that are fundamentally opposed to things like software patents. The Green-EFA alliance (Green Party, Plaid Cymru and friends) have been fighting the patent fight from the beginning, including organising events in Brussels, as well as fighting to make the EU more democratic (to stop unelected bureaucrats overturning the will of the parliament). The UKIP (UK Independence Party) is opposed on the grounds that EU legislation like software patents clogs up British business and harms Britain as a sovereign nation. Unlike our parliamentary elections, the EU voting system means it is not a two-horse race.
Defeating software patents now needs an absolute majority in the parliament. That is going to be hard to achieve, but you get to adjust the make-up of the parliament, and every vote is going to count.
Please, even if you were not going to vote, vote for either the UKIP or Green-EFA alliance members. Ideally pick the one of the two that is most likely to win in your area, but if you have philosophical reasons for favouring one of the two (such as a dislike of the EU), please go vote for the one you favour. These are the people who will have to decide how to fix the EUCD; these are the people who will have to decide on software patents. Whether you believe in the EU or not, the people you vote for (or the pro-patent, pro-DRM people who will get in by default if you do not vote) will dictate your future rights.
The turnout in the UK is expected to be 18%. That favours anyone who can mobilize and get out and vote. It's a one-off opportunity to kick the pro-patent lobby somewhere that hurts.
Vote, get your friends and families to vote, get LUGs to vote en-masse. Call out the troops - it's payback time...
Alan Cox
FOSDEM - What is your feedback about your sabbatical year?
Alan Cox - I enjoyed the MBA a great deal. I've learned a lot of useful stuff that helps when tying computing into the real world. I'm still working on the thesis and still need to interview more folks who are using Linux on the desktop in business, or planning to do so.
FOSDEM - Some security websites published unpatched security issues affecting the stable kernels. There is no highly critical remote hole right now, but how can we improve the way the security fixes are made?
Alan Cox - The obvious improvement is more tools so that they don't happen in the first place. I'm personally of the opinion that responsible security disclosure involves telling the developers first, and perhaps giving them 14 days to respond and resolve the problem. If you don't force a time then large vendors tend to take forever, if you release immediately then many people can be harmed before a fix exists.
FOSDEM - Should we, for example, name a security maintainer who would handle all the security advisories and bugfixes for stable kernels?
Alan Cox - Definitely. We sort of have that for the vendor kernels but not officially for the base kernel. For 2.4 Marcelo is part of vendor-sec so he's both 2.4 maintainer and security guy. 2.6 is less clear.
It also has to be more than one person. It's no good if a serious hole occurs and the named security person is flying to Australia that day, or ill or whatever.
FOSDEM - You are working for a well-known Linux distribution. Does your employer impose any sort of constraint on you, or do they allow you some freedom?
Alan Cox - Red Hat primarily pays me to work on the kernel. I'm mostly trusted to use my own judgement on what that means, and guided by the hot issues customers see. There are things I get through Red Hat, such as vendor pre-production systems and documents that are restricted, but nobody in Red Hat demands that I run Red Hat products, for example. Except for the little boxes (running Debian), I do run Red Hat Fedora, but that's by choice.
FOSDEM - Linux is now developed by professionals, who are paid by companies having sales targets. Is this kind of development less fun? What is, in your opinion, the consequence of the fact that Linux is now more and more developed by professionals?
Alan Cox - Less fun for some, more for others. It's harder to do research type 'blue sky' projects with Linux in some ways but there are people who love total reliability, verification and quality and those kind of skills are becoming more and more demanded in the Linux world. Big Linux servers have to stay up and companies demand more and more stability and quality as a result.
The kernel itself definitely has changed; it's much more "finished" now. There is no real feeling that there are big pieces of catching up to do. The desktop is perhaps today more like the kernel was a few years ago.
FOSDEM - Can GNU/Linux or *BSD take any advantage of the access to the source code of OpenSolaris?
Alan Cox - The licensing really prevents code sharing. We have multiply licensed code that we share with BSD, so in theory third parties can usefully contribute code to all the systems. It may also be useful for driver/hardware information, if there is actually any hardware Solaris drives that Linux does not.
FOSDEM - From your valuable insight and broad overview of the whole Linux kernel, what are the five items which have to be addressed by upcoming kernel releases (apart from hardware support, which obviously cannot be addressed by the kernel team alone)?
Alan Cox
- Better performance on small machines
- Virtualisation (Xen etc.)
- More security features
- Resolving the X and kernel video muddle properly
- World domination
FOSDEM - There has been a great improvement between the 2.4 and 2.6 kernel versions. A lot of developers have been hacking since the first release of the 2.6 kernel, but nothing has been started on a 2.7 version. What are your comments on this?
Alan Cox - I'm still watching this experiment with interest - it reflects the changes in the kernel from development to mostly finished. No conclusions as yet, beyond the need for 2.6.x.y subreleases of fixes for each 2.6.x.
FOSDEM - What do you expect from your FOSDEM talk?
Alan Cox - A lot of hard questions. FOSDEM seems to have a reputation for being a real developer conference so it should be a lot of fun. I hope the beer is good.
A large part of the software industry has never heard of the science of quality assurance - or if it has, it doesn't believe in it. Thus spake Alan Cox, Wales' most famous Red Hat employee and one of the most influential voices in the IT world. Currently wrapping up his MBA at Swansea University, Cox has clearly been spending a lot of time thinking about what the software world can learn from everyone else about quality.
Cox was speaking at the launch of an advanced technical computing group for Wales, run by IT Wales, part of Swansea University's computer science department. IT Wales' other activities include running events for SMEs in South and West Wales, and working to retain IT skills in Wales by matchmaking computer science graduates with Welsh businesses.
The advanced technical computing group aims to bring best practice to Welsh software engineers from organisations such as the British Computer Society, the Natural Computing Forum and the Welsh e-Science Centre. Activities kick off in January 2005.
Cox, a graduate of Swansea University, discussed a number of trends which are allowing developers to produce better quality software. While some of these trends relate specifically to the computing world, others are simply a case of that world putting into practice the kinds of techniques which have been seen as essential in traditional industry for some time.
Starting with the statement that "all software sucks", Cox compared software engineering to its counterpart on the hardware side of the equation, where the economic incentives for getting it right first time are indisputable; with hardware, a single error can cost millions.
Using microprocessor manufacturers as an example, Cox said, "They put over 100 million gates/transistors on a tiny piece of silicon. On that piece of silicon there are more lines than there are on a roadmap of London - and they work. There are very very few errors in a microprocessor."
When software doesn't work the way it should, it's easy and cheap to ship an upgrade or a patch to the users, who are then inclined to accept buggy software as the normal state of affairs, Cox said.
Even though there has been a movement for some time to introduce traditional engineering concepts such as quality assurance to software development, Cox sees today's software engineering as "the art of writing large bad programs rather than small bad programs".
Of the much-vaunted 'holy grail' of reusable objects, Cox said, "As far as I'm concerned these all generally suck too. Part of the problem is that they're sold as products, and the original idea behind a lot of reusable products is that you write it once. If you write it once, it has to do everything. If it does everything it's complicated, and if it's complicated, it's broken. That's not always the case but it is quite frequently the case."
As for QA, "Everybody in the real world will agree - the moment a project is behind deadline, quality assurance tends to go out the window. People go through the specification and everything marked 'optional' becomes 'version 2', and everything marked 'QA needed' becomes, 'we'll find out from the users if it works,'" Cox said.
Another factor that's led to the current state of affairs is that of canny software companies which shift bad software as quickly as possible, on the basis that once the end user has one piece of software for the job it becomes harder to switch to another one - in that context, Cox considers Microsoft's release of early versions of MS Windows as a very sound economic and business decision.
Compounding the situation even further is the incentive for businesses to deny all knowledge and point fingers when software errors are uncovered. If there are several parties responsible for the maintenance of a piece of software, he said, it's in everybody's interests that the other person fixes the bug because the customer will assume that whoever fixes the bug was responsible for it. Most businesses, particularly SMEs, don't have that luxury.
Happily, it seems there are good reasons why this situation can't go on for much longer. One large incentive for improving matters is security. "We're looking at very large numbers of PCs being taken over every day, used as zombie machines, fed software which makes them dial the internet via Ghana, and in particular, something known as zero day holes. In other words, someone who's finding a security flaw and exploiting it before the rest of the world knows."
"The update side is becoming a problem. You take a WinXP machine, you plug it onto the internet, on average you have 20 minutes before it is infected with something, if it's not behind a firewall. That is considerably less time than you need just to download the updates. These are becoming economic issues, because they're starting to cost businesses all over the world astronomical amounts of money."
So, how does one make the world a better place by writing better software? For starters, Cox says, we need to accept that humans are fallible and that software engineers, no matter how well trained, will make large numbers of mistakes in their software - so we should start using the right tools to keep the error count as low as possible.
Here, then, are Alan Cox's hot tips and tools for writing better software...
Execute-only code: One of the classic ways of attacking a web server with a known security hole is to feed that server a command that triggers the security hole, and which contains a piece of code that is run as a result. Cox cited recent developments in microprocessor design which allow execute-only and read-only areas of memory, and which provide protection against such damage because, for instance, any data fed in to trigger a security hole won't run if it's not in executable memory.
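A minimal userspace sketch of the same separation (the buffer contents and sizes here are illustrative, not from the talk): data arriving from outside is mapped readable and writable but not executable, so on an NX-capable processor a jump into it faults instead of running the attacker's payload.

    #include <stdio.h>
    #include <string.h>
    #include <sys/mman.h>

    int main(void)
    {
        /* Map a page for incoming (attacker-controlled) data: readable and
         * writable, but deliberately NOT executable. */
        size_t len = 4096;
        unsigned char *buf = mmap(NULL, len, PROT_READ | PROT_WRITE,
                                  MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        if (buf == MAP_FAILED) {
            perror("mmap");
            return 1;
        }

        /* Pretend this is network input that happens to contain machine code. */
        memset(buf, 0xC3, len);              /* 0xC3 = x86 "ret" */

        /* ((void (*)(void))buf)();  <- with these permissions on NX-capable
         * hardware this call faults rather than executing the payload,
         * which is exactly the protection described above. */

        munmap(buf, len);
        return 0;
    }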
Firewalling by default: "Red Hat has been doing this for four years now, Microsoft is finally doing it, Apple has been reasonably intelligent about this for a long time as well. You don't leave your front door open just in case you need to walk in and out. It's much much safer to have your front door shut. So by having firewalling by default, it actually allows users to accept, there is probably insecure software on my computer system. It may have bugs in it. But if the rest of the world can't get at my software, I don't care - not too much."
Languages are very important, particularly when it comes to the issue of memory allocation.
"If computer programmers get the memory allocation wrong, why are we letting the computer programmers do the memory allocation? The computer can do this. The world has moved on since the design of languages like Fortran and C.""So for other newer languages, we have garbage collection, we have sensible memory allocation, and this means we can take things away from the programmer, so that providing the language has done it right, the programmer cannot make that mistake anymore. And this works out incredibly effectively when you look at the kind of bugs you get in software. Even when just doing it by getting programming interfaces right, we see huge improvements."
"I looked at this for some of the Linux desktop code. And instead of using standard C functions for a lot of the memory handling for text, it has a library which doesn't allow the programmer to screw it up. If you look at the history of this kind of error, almost none of them occurred in desktop [environment] compared to a very large number that were found elsewhere in applications on Linux. So it tells us that using the right tools works."
Validation tools: "They used to be very expensive, they're getting a lot cheaper. So we know, for example, that if a given function takes a lock, it should also get rid of the lock in all paths. So in the cases where the error code forgets to do that, we catch it."
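The rule such a tool enforces is easy to see in a small sketch (pthreads, hypothetical function names): every path out of the function must release the lock, and a lock-balance checker flags the early return that forgets.

    #include <pthread.h>

    static pthread_mutex_t table_lock = PTHREAD_MUTEX_INITIALIZER;

    int update_entry(int key, int value)
    {
        pthread_mutex_lock(&table_lock);

        if (key < 0) {
            /* Error path: omitting the unlock below is exactly the kind of
             * mistake a lock-balance checker reports ("lock held on return"). */
            pthread_mutex_unlock(&table_lock);
            return -1;
        }

        /* ... modify the shared table using key and value ... */
        (void)value;

        pthread_mutex_unlock(&table_lock);
        return 0;
    }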
Type safety: "Things like type safety are now taken for granted. When I was an undergraduate at Swansea University, we thought it was a novelty when the C compiler told you if you passed a floating value to a function instead of an integer."
Tainting: "The idea is that when you've got untrusted data, you actually tell the computer this data is untrusted, because then you can look through how the untrusted data is used, and what other data it creates. And you can look for cases where you're doing stuff with untrusted data that you shouldn't be - like relying on it. And so we catch human mistakes before we ship them to the consumer."
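Perl's taint mode is the best-known implementation of this; the same idea can be sketched in C by carrying a taint flag alongside the data and refusing to use it anywhere sensitive until it has been validated (types and function names here are purely illustrative):

    #include <stdbool.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Anything read from the outside world starts out tainted. */
    struct input {
        char value[64];
        bool tainted;
    };

    static struct input read_from_network(const char *raw)
    {
        struct input in = { .tainted = true };
        snprintf(in.value, sizeof in.value, "%s", raw);
        return in;
    }

    /* Validation is the only way to clear the flag. */
    static bool validate_as_number(struct input *in)
    {
        char *end;
        strtol(in->value, &end, 10);
        in->tainted = (in->value[0] == '\0' || *end != '\0');
        return !in->tainted;
    }

    static void run_query(const struct input *in)
    {
        if (in->tainted) {            /* the check a taint system automates */
            fprintf(stderr, "refusing to use unvalidated input\n");
            abort();
        }
        printf("SELECT * FROM t WHERE id = %s\n", in->value);
    }

    int main(void)
    {
        struct input in = read_from_network("42");
        if (validate_as_number(&in))
            run_query(&in);
        return 0;
    }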
Rule verification: "If you have rules in your software, you know how certain bits of it should behave, you can start to use software in some cases to verify or to validate these rules."
Good interfaces: "This is another surprisingly effective one. If you look at a lot of other businesses, if you're a car manufacturer and you find you've got a lot of faulty cars coming off the production line because someone's put a part in backwards, the first thing you do is make a new version of that part which has a knob on it or something so it won't fit backwards. That's the immediate reaction. So we've started to do this kind of thing in software. So we have things that are simple and hard to misuse."
"An example of this is, with locking, instead of having one function for taking a lock and another function for releasing the lock, which inevitably means that someone always has an error handling or an unusual case where they forget, you have a single function which calls another function locked; it takes the lock, calls the function, and drops the lock. All of a sudden it's another mistake you can't make because the computer won't let you, because fundamental to your language, fundamental to the way you're coding, is the idea that this lock must be released. And it turns out you can do a lot of these things in languages like C++ by being a bit clever."
Defensive interfaces: "Locks with corrupt flags is another example. One of the things the telco industry cares about is that systems stay up. So eventually your software crashes with somebody owning the lock - someone currently has the sole right to some critical data structure. And in this case what the telecoms people do with newer systems is that after a certain amount of time, the system has a watchdog, much like your video recorder does. If the video recorder or your DVD player crashes, it just reboots after a certain amount of time, as if nothing has happened. This is great until you've got locking, and you kill a particular part of your phone switch and it owns some critical part of the system."
"[With] defensive interfaces, I can now take a lock and I can be told, 'I'm giving you this lock, but be aware that something terrible happened to the last user of it' - which means that when you take this lock you can actually start to take defensive actions."
Mathematical models: "People have started to use mathematical models for things like defect rates. Turns out all the models exist - the large part of industry that actually makes physical objects has known about them for a considerable number of years. They tell you interesting things like when you should release software beta. Providing you've got a good estimate of the cost of finding faults yourself, and the quality of the fault finding relative to your beta testers, you can actually do the maths to tell you when you should be going into beta testing."
Scripted debugging: "Traditionally you think of your debugger as something that you use after your software has crashed. But a debugger turns out to be very useful in quality assurance, because you have a lot of things in your software which you can't easily inspect. You can actually use a debugger as part of your QA testing to go in at the end of the run and say, are all the internal values right? Does the software appear to have behaved as we expected on the inside as well as on the outside?"
Brute force testers: "These are beta testers, and users of dot-zero versions of software, of course. And tools like CrashMe, which is one of the ones we use for Linux. And there are application level equivalents of this. The basic idea is, generate random input, feed it to the application, keep doing this until the application breaks. It's surprisingly effective. In a recent study they did this with Windows application software, feeding random Windows events to it, so effectively it simply sat there at full computer speed continuously clicking randomly, closing and opening dialog boxes, picking menu items, and typing. And about half the Windows software they subjected to this particular torture, crashed."
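This is the technique now called fuzzing, and the core of a brute force tester is only a few lines: generate random input, hand it to the code under test in a child process, and record any crash. A sketch in C, where parse_record() is a stand-in for whatever parser or application entry point is being tortured:

    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/types.h>
    #include <sys/wait.h>
    #include <unistd.h>

    /* Stand-in for the real code under test; replace with the function
     * you want to exercise. */
    static void parse_record(const unsigned char *buf, size_t len)
    {
        (void)buf; (void)len;
    }

    int main(void)
    {
        srand(12345);                        /* fixed seed so crashes reproduce */

        for (long iter = 0; iter < 100000; iter++) {
            unsigned char buf[256];
            size_t len = (size_t)(rand() % (int)sizeof buf);
            for (size_t i = 0; i < len; i++)
                buf[i] = (unsigned char)rand();

            pid_t pid = fork();
            if (pid == 0) {                  /* child: feed it the random input */
                parse_record(buf, len);
                _exit(0);
            }

            int status;
            waitpid(pid, &status, 0);
            if (WIFSIGNALED(status))         /* the application broke: log it */
                printf("iteration %ld: killed by signal %d\n",
                       iter, WTERMSIG(status));
        }
        return 0;
    }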
Root cause analysis: "I've got a friend who works on aeroplanes, and he has the wonderful job of, when a piece of an aeroplane falls off, cracks, or something before it was supposed to, they go to him and say 'why did it happen?'. And it's then not a case of saying 'oh, this analysis is wrong', it's saying 'how did this analysis come to be wrong? How did it make this wrong decision? Where else have we made this decision?' People are starting to do this with software."
"The OpenBSD Project started doing it with security in particular, and found it very effective. Every time somebody found a mistake, they'd take the entire software base for these systems - bear in mind, working in the open source world you have a lot of source code, so it's much easier - and you look, with the aid of automated search tools, for every other occurrence of the same problem, in all your software. Because if someone's made a mistake once, we know lots of other people will have made the mistake.
"All of this sort of analysis then leads back to things like, what tools didn't we use? Are our interfaces wrong? And because you're able to actually start digging in and get data, you can start to understand not only the 'oh, it's failed, I'll fix it', sort of the car mechanic approach to software maintenance, but actually the need do the kinds of things that should be done and which go on elsewhere, where you say 'Why did this fail? Where else have we got this? Where else will it fail? What should I do proactively? How do I change the software component involved so it can't happen again, or so that it blows up on the programmer when they make the mistake, not blows up on the user when they run the software?".
Document trails: "I've worked for several large software companies, before I worked for Red Hat, and trying to answer questions like, 'Who wrote the first version of this software?' and 'What other code is this function in?' can be interesting."
"So you're looking at an ISDN router and you say, that's a security hole. And you have no idea where else this code appears in your company's product line. So you have no ability to test all the cases. Someone has to test each one individually, and possibly get it wrong, possibly find the code. So document trails are also a big help; where did this code come from, where is it going, what things do we know programmers get wrong with it? Actually carrying the documentation around with this software not only makes you get the documentation right so you can tell the programmer, by the way, people always get this wrong, but more importantly, you can fix it so they can't get it wrong. Because after all, programmers don't read documentation - you know that."
Rigorous reviews: "The effect of having to explain it to a second person is sometimes truly startling, as people try to explain what the code is doing and then realise that what they've written doesn't do the same thing."
Statistics: "And the final one which turns out to be really useful is statistics. Because if you've got enough copies of a piece of software out there, you can actually do statistical analysis, and so we've been doing this now with Linux, and you can start asking questions like, is there a 90% probability that all of these mysterious crashes with this kind of pattern, happened on a machine with a particular physical device, like a particular SCSI controller? Did 90% of them happen on a machine with a USB keyboard? We've actually pinned down hardware problems in this way - in one case we managed to pin down a fault in a particular brand of disk drive, because we looked at it and we realised it is directly correlated to this particular make of disk. And we went to the disk vendor, who ignored us, and eventually enough Windows people hit the problem that Microsoft went to the disk vendor, whereupon it got fixed."
A video of the presentation is available at IT Wales.
One of the head programmers behind Linux, Alan Cox talks exclusively to Builder Australia about the uptake of Linux, Microsoft’s plans to share its source code and his Linux predictions.
You have been working for Red Hat since January 2000. What is your title and what are you doing?
I was contracted a bit before that for a while. I am a fellow; most of what I'm doing is working on the kernel, some of it other applications that need fixing. I also deal with awkward bugs, things that matter to important customers with support contracts, which reassures them that we can fix bugs whenever they turn up.
What do you think of large organisations looking at Linux for their solutions?
I see a lot of big financial institutions looking at and deploying very large amounts of Linux on servers. Desktop people are looking at it more and more, I think, because OpenOffice availability, accessibility and the user interface, among other considerations, have got to a point where the stuff is useful, but not necessarily perfect just yet. The one problem they've got is the lack of available software, which in big business normally isn't a problem because you have large numbers of people who need very small sets of software, but smaller businesses can be a challenge because you're trying to use the same PC apps for email, for accounts, word processing, so you may find the PC apps might not be there yet.
Recently Microsoft have announced they are releasing their source code to governments... (interrupts)
To a few governments. It leaves the question: "If they have to release the source code to governments, what does it say to the companies they won't release it to?" It's also interesting to note that in some countries, where perhaps the relationship between government, business and the people is not so good, giving it only to the government gives the government the power to use all the security holes in Windows, so they may have done harm to business and individuals. But it (releasing the source code) is a step in the right direction.
Do you think this step is a threat to Linux?
No. There is a real difference when the offer is "you can have a quick look at some of our source code, but you can't do anything with it, and you have to be a government and we have to like you". In addition to which, just having the source code isn't really that useful, because unless you have the code for the compiler and you can rebuild all the code and verify it, how do you know the source code they have given you has anything to do with the actual operating system you're running? And that is a very real question. On the other hand, we've had military and government security people who've actually taken a basic Linux system, including the compiler, audited that compiler, and built and audited every single subset of the pieces of Linux they use, so they can definitively say that it doesn't have any back doors in it - at least in their version of it - and they are happy with it, and any back doors are ones they have put in themselves for their own use.
What do you think of Sun's announcement to release Mad Hatter midway through this year?
It's just another distribution. I think it's quite clear where all the work is coming from. I don't see where the value added is, unlike companies like Dell, but maybe they have things they'll do.
Where do you see Linux in 2 years?
I think in two years we'll see more Linux on the desktop. It will be very interesting to see what happens; it's very hard to judge.
How about 5 years?
Hopefully world domination! (laughs). We shall see; it could be that in five years' time someone comes along with something much neater, and we'll all be thinking, "those old operating systems were awfully clunky."
What desktop environment do you prefer to use?
I'm mixed: OpenOffice, a lot of GNOME stuff and one or two KDE apps. If you have Red Hat they all look the same, so it ceases to be a question of which desktop you run and becomes a question of which program is the best, which environment I prefer to write software in; it gets rid of that divide.
One subject you brought up in your talk is documentation in open source. What grade do you give it at the moment?
It varies by project. Some projects are absolutely brilliant; a lot of it is pretty poor, certainly compared with, for example, Solaris, where the Sun people have had this long-standing, very tight discipline about documenting everything in a very definite and clear format. So we have some way to go in that field.
In yesterday's Q and A session, there were concerns raised about the bar for starting to work on the kernel getting higher and higher. Is this a real concern?
There is a thing about the core of the kernel, because it's a very very complex, very very refined piece of software. In terms of writing device drivers it has actually gotten easier, as there is a lot more infrastructure in the kernel, so there is a lot less code you have to write and a lot more code to copy. Being open source, the way you write a driver is to find something similar, copy it and go from there. It's perhaps a little different in the Windows world.
When is the next Red Hat release?
We don't pre-announce releases, and I can't speak for Red Hat.
There are many standards bodies around. What do you think of standardisation?
A lot of them go way back before Linux. Most of the standards bodies' work is useful. The problem sometimes comes when standards bodies standardise things that are dumb, and generally free software ignores those standards. Most of the standards bodies are not necessarily interested in free software, but they see free software as part of the universe they are trying to build standards into. Having one set of standards for free software and one set of standards for Windows doesn't really work; the only person you hurt at the end of the day is the customer, as they can't switch easily.
How have you found Linux.conf.au 2003?
Really good. There is some good technical stuff here. The other main conference I go to is the Ottawa one, which is the other technical one. Things like LinuxWorld, which is marketing and press releases, just aren't my thing.
Part I: In an interview with ZDNet UK, the Linux 'kernel hacker' gives his views on the GPL, 64-bit computing and why grandmothers should want to use Linux
Alan Cox is generally referred to in the open-source developer community as a "kernel hacker" -- someone whose programming responsibilities cover the Linux kernel, or core, itself. Thousands of developers all over the world, from hobbyists to IBM engineers, are constantly contributing to open source software, so Cox's role of organising and applying improvements is vital.
Cox makes use of the decentralised nature of the Internet to work from his home in Swansea, despite the fact that his employer Red Hat is based in the US; in fact, at the moment he prefers not to visit the States, because of concerns about the Digital Millennium Copyright Act (DMCA).
He spoke with ZDNet UK in Swansea in a wide-ranging interview touching on the latest challenges for Linux at the high- and low-end, the arrival of revolutionary 64-bit hardware and why it's hard to argue with the economics of open-source software.
Q: What were some of the biggest developments for Linux last year?
A: I guess the 2.4 kernel coming of age is one of them. We now have a good solid, very scalable kernel, which all the vendors are shipping, that's improved no end on four- and eight-way machines.
Is scalability mainly an enterprise issue?
It's pretty important for the big enterprises. It helps everybody. If you look at the direction the market is going, you've got things like Intel's new Pentium 4 with hyper-threading, where a single CPU is effectively two, so you've got scaling issues going on in single-processor machines. Beyond that a lot of the user interface work is really starting to show. The Nautilus file manager... some really, really good work being done with KDE and also configuration tools.
What's the importance of improving graphical user environments like KDE and GNOME?
Within a very large organisation what they want is a small number of people who can maintain large numbers of machines, and they'll be very skilled people. The moment you go into smaller businesses and outside of the big-business, large-government role, you need machines where anybody can say, "Ah, what's happened here?" and fix it. So it doesn't necessarily become a case of needing the power, you start to need the usability. Even if the usability means you can't do some of the clever things, these people still need to be able to do things like set up a simple firewall and configure their email and stuff. So that is very important. On a technical level, other things would be clustering work being started.
IBM recently introduced its first dedicated Linux mainframe server. Linux on the mainframe seems to be getting a lot of attention at the moment.
It actually goes back a fair way. It's been around for a year or so. Now it's really starting to take off with this server consolidation thing, and IBM has sort of hit the wave with the blade server people as well. If you can run a thousand copies of Linux on one machine, what are the savings of not having a room full of computers? I think they just have been at the right place at the right time in a sense. They've got hardware which is very fault-tolerant and they can ship it now while everyone else is very much -- all the PC style hardware doing this is very new.
IBM is spending a lot of money promoting its Linux plans. Do you see any problem with IBM integrating Linux into its corporate strategy?
There are one or two points of friction for IBM. Certain drivers for the S/390 (mainframe) are closed source, which has proved problematic for customers, because they can't upgrade to the versions of the kernel they want, because they can't get the right drivers from IBM. On the whole, though, I think it would be fair to say that IBM have been extremely good citizens of the open source world, they've contributed a lot of very, very good code. I don't think that's a big issue, just this one S/390 issue.
What are the most important developments coming up for the Linux kernel?
In the desktop world there are a set of transitions for the legacy-free PCs which we have to be ready for -- we're pretty much in the right spot. So you see machines where USB is basically the only plug-in interface. ACPI (Advanced Configuration and Power Interface) is becoming a requirement on machines, so you have to support the ACPI configuration. We've covered the Pentium 4 hyper-threading. There are more scaling questions, because we have more memory, bigger disks, again and again and again. Possibly the Intel IA-64 processors, depending on if they take off, and the AMD Hammer could be a very, very big thing. That looks like that will actually be a consumer-oriented 64-bit processor. It will be able to run 64-bit and 32-bit as well.
Who will mainly benefit from 64-bit?
We get large numbers of people, particularly people with large financial analysis systems, electronics design automation, even things like SAP, where believe it or not, 4GB of address space, 4GB of memory, is just not enough. The SAP people have to actually try and squash their code into it, the software is so powerful. For those kinds of roles 64-bit is basically essential, and at the moment a lot of these people are stuck on very expensive proprietary systems, and once they can go 64-bit, they'll be able to dump a lot of this hardware, move over to mainstream PC platforms and save an absolute fortune.
The consumer level impact is, I suspect, just speed improvements here and there. Obviously being able to do 64-bit integer math is good for certain kinds of 3D work, so it might help the gaming people. There's all sorts of other applications where having 64-bit just happens to be a help. But the really, really big gains are for the big electronic design automation systems.
It opens up a whole new market for Linux, doesn't it?
It's effectively extending the PC itself, not just Linux, extending the whole PC into another market area where currently there's a barrier. In some ways it could be as big a shift as the 386 was in opening the 32-bit world to the PC. From that the PC became the ubiquitous 32-bit system; now it could become the ubiquitous 64-bit system.
Last spring you met with some of the other main Linux programmers in the first Linux Summit. Was that useful, and will it happen again this year?
Provisionally so, at the Ottawa Linux Symposium (OLS), I believe, which is the main kernel developer forum. It was strange in a way. The official part of it was actually very non-productive. The amount of work that got done over beer and at three in the morning cannot possibly be overestimated. A lot of it is meeting people. Meeting somebody occasionally you get to understand them enough that you can follow things in email or other discussions that you just would not pick up normally. So yeah, it is a big, big help.
Sometimes lots of things come out of it. At the last OLS I went to there were people literally sitting around with their laptops and they would just look over each other's shoulders and get to talking and pick up on things that never would have occurred to them before.
How militant are you about which licences people use for their software, and how they use them?
People who are not following the (free software) licence are pirates, it's as simple as that. It's no different if you take GPL (GNU General Public Licence) code and don't give people the source code, or if you make copies of movies and sell them to people, it's the same thing. In terms of other software, it really depends on the people who write it. I don't think you have a right to dictate how somebody controls their own work, apart from the very, very basic standard you'd expect.
Ximian recently decided to switch class libraries for Mono (a clone of Microsoft's .Net) away from the GPL. How controversial is that to you?
I've only looked briefly at the reasoning behind it, but I think at least part of the reasoning was that, for something like Mono they wanted people to be able to link proprietary code with the free software code and mix them up and get it to work. Really you have to ask the Ximian people about that. I don't really have a problem with it; it's their software.
You feel it's important for Linux that free software licences are able to coexist with proprietary licences, don't you?
We're being very careful with that. We specifically allow people to use all the system call entry points for Linux for driver software, and the main libraries you need to build applications are under (a different) licence. So the library itself you have to provide source for, but not the application. Because obviously Oracle are not going to give source code to their tools. But you don't want to create a system where you arbitrarily shut people out -- that's the Microsoft world. It's actually ironic, because Microsoft has started putting licences on Windows libraries now which basically forbid you from writing free or open source software using their Windows libraries. They're specifically trying to shut out and control. They're monopolists.
Their role as a monopoly changes the way they approach software, you're saying.
An application and operating system should be totally different things. They're different works. It should not be Microsoft's business how (an application) is written and vice-versa. And as a monopolist, even more so -- the fact that you can say "oh, you're not allowed to licence your code like this and run it on our system," that's 90 percent of the desktops, bang, gone. So as monopolists they have duties beyond the norm.
Do you feel it will be Linux's ultimate role to be ubiquitous on the desktop the way Microsoft is today?
It may play in the same markets, but it can never play that Microsoft kind of role, because being open source, you can't control people, you can't force the prices up 40 percent every year, you don't need to force people to upgrade because it's a service-based industry anyway. So looking at it from that perspective it's very different. In terms of being able to get it into lots of very different market areas, then yes, I think that is important.
See Part II: The battle for the desktop
Cox, one of the chief contributors to the Linux kernel, says 64-bit computing will open up a huge new market for open-source software. He also finds fault with proposed Microsoft-influenced guidelines for reporting software security bugs
The advent of affordable 64-bit computing could be the best thing to happen to Linux in a long time, opening up a new market potentially as important as the original PC market, according to Linux "kernel hacker" Alan Cox. He also criticised new guidelines suggested by the Internet Engineering Task Force (IETF) covering the reporting of software security holes.
New processors emerging from AMD and Intel -- whose main focus has until now been desktop chips -- will allow many companies currently locked into expensive computer systems to switch to mainstream chips and open software like Linux, Cox said. The new Itanium line from Intel and the upcoming Hammer range from AMD offer similar performance to the RISC processors made by the likes of IBM and Sun Microsystems, but aim to achieve desktop-level prices.
"Large numbers of people will be able to dump a lot of expensive hardware," said Cox in an interview with ZDNet UK. "It will effectively extend the PC into a whole new market area. It could be as big as the 386." The 386 was an Intel processor introduced in the late 1980s, known to Linux developers as the first consumer processor powerful enough to run industrial-strength software like the Unix operating system.
The 386 was instrumental to the early growth of Linux, a Unix-like operating system that many say could replace Windows as the dominant software on PCs. Once reasonably powerful hardware was available for a low cost, a large number of programmers began installing and improving Linux -- Cox, then a student at the University of Wales, Swansea, among them. The result, more than ten years later, is the software that runs a good number of the servers on the Web and many of the protocols that make the Internet work.
AMD's Hammer is particularly promising, Cox said, because it will run on both consumer and server platforms right away. Unlike Itanium, Hammer is optimised to run software based on the current x86 instruction set as well as 64-bit software. Itanium places the emphasis on 64-bit code, leaving the consumer market to the Pentium 4 for the near term.
Cox, an employee of Linux vendor Red Hat, is now one of the chief developers on the Linux core -- or kernel -- and is largely responsible for coordinating and integrating the contributions of hundreds of developers around the world. Linus Torvalds, who initiated the Linux project as a student in Finland, still has the final say on modifications to the kernel.
A controversy arose recently over whether the job of applying "patches" was getting to be too big for one person, but Cox says he feels the solution that emerged, involving automating the kernel changes, is ultimately satisfactory. "The free software community has a way of self-correcting when problems arise," he said.
Linux is based on the "free", or open-source, development model which requires developers to make the original programming code of their software improvements freely available to other developers.
Communal debugging is central to open-source development, and Cox bridles at recent attempts to change the way bugs are reported to software vendors. A recent draft protocol from the Internet Engineering Task Force (IETF), for example, has been criticised for stigmatising those who report security holes before the software vendor has had a chance to create a patch, and Cox tends to agree with some of the criticisms.
"It's too prone to let things run and run and run," he said. "If the vendor hasn't fixed the bug in 28 days, then tough -- after that you're not reporting bugs, you're covering up for a company's incompetence, and there's a very big difference."
The immediate challenges for Linux developers include extending its capabilities for both power users and grandmothers, Cox believes. On the high-end, scalability is becoming an increasingly important issue, while it's also crucial to make Linux accessible through simple interfaces, he said.
The emerging use of hyperthreading within Intel's Pentium 4 processor means that Linux must scale even within the chip. Hyperthreading is designed to improve performance by allowing the chip to behave like a two-processor system. It is present in all Pentium 4s but is only made use of so far in the new server P4, called Xeon.
Progress was made in this direction last year, said Cox. "The 2.4 kernel is coming of age," he said. "It is getting more scalable, especially with four- and eight-way machines. That's the direction the market is going."
On the other hand, simplicity is more important than power if Linux is to penetrate certain markets, like small businesses and the home, Cox argues. He applauded the advances last year made by organisations focused on user interfaces, such as GNOME and KDE.
The home market is the toughest market to crack in many ways, he said, because of the particular needs of consumers. Ultimately, though, the all-purpose PC as championed by Microsoft may prove to be less attractive to home users than a simpler, less expensive machine specialised for applications like Web use and productivity tools. Linux is ideal for such machines because of its low cost, reliability and flexibility.
"You could question whether the consumer PC market will survive in its current form," he said.
The shift to specialised, Web-centric devices is made easier by the fact that the new applications users want no longer require a monolithic, standard operating system, but simply a standards-compliant Web browser, Cox said. Microsoft argues that users will always want a uniform operating system so that everyone can run the same applications; it also says its operating system monopoly makes things easier for developers by providing them with a huge, homogenous market.
Cox's views, not surprisingly, tend in the opposite direction. "There shouldn't be one worldwide operating system," he said. "Peoples' needs are all very different."
One Linux innovation that was less than a complete success was last year's "Kernel Summit", where many of Linux's main contributors, including Torvalds and Cox, met face to face to coordinate their plans. The official part of the programme wasn't very productive, although things picked up after hours.
"The amount of work that got done over beer at 3 am cannot possibly be calculated," he added.
Alan Cox is one of the most influential IT innovators in the world. A graduate of the University of Wales, Swansea, he has been a key developer of the Linux kernel for nearly a decade. Currently working for Red Hat® writing kernel and application code, Cox was previously responsible for the original Linux multiprocessing support, and for much of the early work on networking. Here we ask him about his changing role at Red Hat, and learn about the benefits Linux brings to business.
itwales.com: You're a leading kernel developer on Linux. What exactly does your role entail?
Alan: Mostly I am involved in making sure changes get integrated and that the changes are of a high enough quality. Often this also means working through longer-term plans for the Linux kernel. It also gets to be fun because many of the contributors have conflicting aims and it is necessary to find solutions that work for all cases -- from Linux on a PDA to Linux on mainframes.
itwales.com: The Linux kernel is a modular one. What benefits does this bring to the OS user?
Alan: Modularity is an essential part of a reliable system. If you cannot change one part of the system without needing to modify the rest of the system you cannot fix a bug without risking introducing thousands more.
The modularity is more important to developers. With developers working on Linux on all continents, it's essential that everyone can make changes without full communication.
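A minimal sketch of that modularity in practice (the module name and log messages below are invented for illustration): a loadable module like this can be compiled, inserted and removed on a running kernel without rebuilding or rebooting the rest of the system.

    #include <linux/init.h>
    #include <linux/kernel.h>
    #include <linux/module.h>

    MODULE_LICENSE("GPL");
    MODULE_DESCRIPTION("Illustrative hello-world module");

    /* Called when the module is inserted (e.g. via insmod or modprobe). */
    static int __init hello_init(void)
    {
            printk(KERN_INFO "hello: module loaded\n");
            return 0;
    }

    /* Called when the module is removed (e.g. via rmmod). */
    static void __exit hello_exit(void)
    {
            printk(KERN_INFO "hello: module unloaded\n");
    }

    module_init(hello_init);
    module_exit(hello_exit);

Because loading and unloading such a module leaves the rest of the kernel untouched, a driver can be fixed or replaced without risking everything else on the machine.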
itwales.com: You recently stated that you plan to work more closely with customers. How do you see your role changing in the near future?
Alan: Red Hat is starting to pick up a number of enterprise customers. These people pay for -- and expect -- a very high standard of service. That includes improving the OS kernel to provide facilities that they depend on in legacy platforms they want to discard. One of the things Red Hat has to be able to do is to deliver those facilities.
itwales.com: The Linux OS is renowned for being stable, fast and virtually virus-immune. How have you achieved this?
Alan: Open development. People have spent ten years looking over each other's code, refining the existing code and spotting security holes. The same process of peer review that ensures university research quality and that bridges don't fall down has simply been applied to software, something that, as an engineering discipline, should always have been the case.
itwales.com: As a student, you installed Linux on the Swansea Computer Society computer. Is this how you began experimenting with the Linux kernel and became interested in Linux?
Alan: At the time the 386-based PC began to take off, it was apparent that this was the better longer-term option for the society. We had two old minicomputers kindly donated by the computer centre, but we needed to move on. It also appealed to quite a few of the society people because it was not a closed box. The computer society's goal was educational, and a bunch of students attempting to debug their own kernel certainly proved the point.
itwales.com: 'Open source' means software is owned by everyone, and anyone can contribute to it. Is the sharing of ideas important to you? Was choosing to work within the free software community an ethical decision?
Alan: Technically the software is still owned by the person who wrote it, but that is more of a credit thing -- which is important in the community. For me it wasn't really an ethical decision; it's simply the right way to do engineering. You don't build reliable bridges by refusing to let anyone see the plans.
There is a real problem in both the US and Western Europe today with people trying to own and control ideas, but that is something bigger than just software or free software. Ironically it is having the same effects on free software as on other things -- all the great innovation is moving to Eastern Europe, India and South America.
itwales.com: What are the advantages of an open community when it comes to product development?
Alan: From the developer point of view it means there is a huge range of talent. No matter how obscure a problem or a requirement is there will be someone who wants to solve it and who understands the field.
It also allows the sharing of development work. A large part of a computer system nowadays is generic, and the revenue is in customisation and services. In the open community the cost of building the generic parts of a system is shared, not duplicated. For researchers it has turned out to be a very big blessing too. It is possible to take an open source OS and modify it to test research theories and algorithms in real-world environments without building costly throwaway mock-ups. Furthermore, if it works out, it can be folded into the main project.
itwales.com: Linux has yet to be widely adopted as an OS by businesses, but the expense of Windows' new XP operating system might change that. How are you targeting businesses?
Alan: Larger companies are adopting Linux rapidly for server systems in particular. Getting further into that market is now mostly about growing the quality of high-end support services.
The desktop is more challenging because desktop users are an extremely varied bunch of people. It demands a high-quality, easy-to-use environment -- which is now mostly there -- and it demands a large application portfolio, which tends to be a chicken-and-egg problem.
At the moment the desktop market for Linux is growing in two areas: firstly, in providing large numbers of easily managed desktops running either custom or very standardised software (such as the Star Office suite); secondly, in the technical desktop market, where the tools wanted are primarily the powerful development tools Linux has had for many years.
The ever-rising price of MS Office is increasingly pushing companies to look at Star Office, both on Windows and on Linux. In many ways the effective forty per cent hikes in Microsoft pricing have been the biggest driver of Linux on the desktop.
itwales.com: Are the merits of Linux's business applications attracting users?
Alan: The main things that attract business at that level are the pricing, reliability and the reduced business risk. The fact that there are multiple suppliers of the operating system gives a great deal of comfort to companies using it. In addition the license ensures that they can always get a custom change made for their own use, even if the main distributors are not interested. In the open source world one example of this was Y2K. When packages had Y2K problems and were no longer maintained by their authors, anyone or any group of users could fix them or pay for the fixing work. There was no "enforced upgrade" risk.
itwales.com: It's been said that in the last year, particularly with IBM's use of Linux technology, Linux has become a mass-market alternative to Windows. Was 2001 a turning point for Linux?
Alan: It didn't strike me as a turning point. There has been a continuous trend in the increasing use of Linux particularly server side. With some of the big names now using and supporting it, visibility has increased.
itwales.com: Do you think Linux markets itself effectively to businesses?
Alan: That is really a job for the vendors, and I think they are doing a good job. There is a difference between effective marketing and claiming to be the one true solution to all problems. Linux is not the one true solution (if such a thing can ever truthfully exist), but we are working on it.
itwales.com: Why should an SME choose Linux as an operating system?
Alan: Because it will save them money and do the job better. If at this time that isn't true for their application set then they shouldn't choose it. The desktop monopoly has perhaps clouded things but with any tool the same fundamental rules apply, be it a hammer or a web server. Is it the right price, is it reliable, will it do the job?
itwales.com: How does it save SMEs money, specifically?
Alan: As an SME you can pick from multiple vendors, or download it yourself. You can install it on as many machines as you like without expensive software auditing. If you need specialist features you can go to a company with experience directly in the matter. You can buy support from where you feel happiest, including companies that actually listen to their customers. No single company controls the ability to modify the software.
In many ways the lack of a per seat license to install the software is a side effect of the recognition that it's more efficient to develop openly. The better overall pricing, improved reliability and removal of vendor lock-ins are the really important factors.
itwales.com: How can Linux overcome Microsoft's dominance at the desktop? Will you have to come up with radical new technology?
Alan: In part this depends on the legal settlements. One of the big problems right now is getting Linux pre-installed on a PC. When you investigate why this is hard you end up looking back at questionable monopolist influences.
With the settlement, the large number of civil lawsuits pending, possible EU action, and the question now raised in the US about whether business practices of not paying dividends are in fact allowable or an illicit tax haven, there are several chances for justice to be done.
Beyond that, the open source model is faster and more cost effective. It improves more rapidly, and for less investment. It's very hard to compete against a fundamentally more efficient model.
itwales.com: Microsoft recently implied that it's going to seriously target Linux in 2002 as a competitor, plus any vendors that support it (such as IBM). They are especially concerned with the server marketplace, and aim to find out about the use of Linux in their customer base. How can Linux combat this "assault" from the IT giant?
Alan: Primarily by being cheaper, more reliable and higher quality. End users believe their own experiences over a salesman. Company directors talk to each other as well as to sales people. In terms of advertising, IBM have already been running Linux TV advertising in the USA.
itwales.com: In recent years, commentators have warned of a fragmentation of Linux (in a similar fashion to Unix). Because the OS is open source, programmers can come up with different versions, and applications may not run on every version of the OS. Do you think a level of competition will be introduced by this?
Alan: Competition and product differentiation don't have to mean incompatibility, and in fact the incompatibility story is mostly a marketing myth put about by a certain large vendor. The Linux companies care about compatibility a great deal, and one recent result of this was the Linux Standard Base, which defines precisely the base behavior of the core Linux software that applications rely on. You can expect to see compliance statements in the next series of vendor releases.
itwales.com: You resigned from the Usenix ALS committee earlier this year, reportedly because Dmitry Sklyarov, the Russian programmer, was arrested in the US. What do you think of the situation in the US at the moment with regard to the Digital Millennium Copyright act?
Alan: At the moment I consider the USA not a safe place for a software engineer to visit. Money and lobbyists buy many things, but when it comes to the courts I don't think that the DMCA's aim of sending people to jail for even discussing security flaws is going to stand well against the US constitution. Until then I'd rather play safe.
These things happen. Right now the UK Government is busy trying to pass the similar European copyright directive into law in a way which may well make it a criminal offence to help a blind person read an electronic book if it has been protected by some mechanism that interferes with their screen reading software. It also puts web caches that do filtering (for example pornography filtering for schools) on questionable legal ground.
itwales.com: What is your opinion on the Government's involvement with Microsoft? Do you think that governments, as a rule, should use open source technology?
Alan: When the prime minister is appearing at product launches by a company twice found by courts to be abusing a monopoly, and facing billions of dollars in lawsuits, you have to ask questions.
Governments should evaluate open source technologies, certainly. The fact they get the source code and can audit it has been a reason for some countries to adopt open source; pricing is another. However, I don't think it's right that government should have fixed rules beyond "fair review". There may be situations where proprietary software is genuinely the right choice.
itwales.com: In terms of its skills base and its WDA initiatives, do you think Wales is improving as a venue for software development?
Alan: In some ways -- and the lack of London pricing means it is cheaper for an SME to get the staff (as well as a higher standard of living for the staff) than in the South East. Right now we seem to have a problem in that all the IT-literate people move to the South East because there is little Welsh IT employment. As a result of them moving there is no expertise here, so there are not enough Welsh IT companies. Thus the cycle continues.
It is a very hard problem, and one I am glad I don't have to solve!
1) Is Linux heading towards a major rather than minor computing role, or is it still too early to tell how things will unfold?
I think it is still hard to predict. The market is changing so fast. Not only do we have the shift from the desktop to servers and thin-client machines happening -- and it is happening, at least in consumer spaces, with things like NTL's TV-set internet stuff and the ongoing projects from other big vendors -- we also have the exploding mobile market, the phone/PC fusion and more. So it's hard to guess.
Linux does well as a server OS, and it does extremely well in thin-client and embedded environments too. The configurability, and the fact that it isn't owned by a competitor to the companies using it, make it a big win for them.
The desktop will be the hardest battle. The Microsoft lawsuit might help there if there is enough will in the US Government to split Microsoft and force the units to act alone. We now have most of the GUI environment we need for end users.
2) What do you think has been most integral to Linux's growing success and visibility in the past 18 months?
a) In terms of the OS itself (eg, why is Linux such a contender?) and
b) In terms of marketing success (eg commercial deals, product announcements, backing of big name players, etc).
The biggest thing of all has been the rest of the world seeing free software go from a crazy concept to a marketed advantage, marketed in a way that the business community understands. The stock market flotations also gave it a definite air of reality. You can get news out of hype, and there has been a lot of Linux hype from companies, either because they sell Linux products or because they see it as a stick to bash Bill Gates with (or both). Hype doesn't actually create a large user base and sustained deployment, which Linux, unlike Java, is getting.
In terms of Linux itself, I think the GNOME and KDE desktops have been the big shift in thinking. Linux has picked up people who believe in open source and who want Linux to be part of a bigger community than just the computer wizards.
3) Where would you like to see Linux go today?
- what could be done to most enhance the OS itself and why would that element be important (user interface, killer app, or whatever) and
- in terms of commercial success. Is total world domination (in the commercial space) important?
The big step is the desktop. There are vendors with clear interests in this sort of area - Helixcode and Eazel, for example, who are doing real money work on the GNOME desktop. The other big area to deal with is high-availability clustering. Wensong Zhang did the first free Linux clustering for web server failover and the like, but those are only the baby steps. To do full clustering, and to be able to position Linux to replace things like VMS as a highly available (and here we are talking minutes of downtime a year) clustering solution, is no small job. I personally think the desktop, or at least the thin-client end of the desktop, is the most important. Linux has good security features, which makes it ideal for things like call centre environments, although perhaps less so for their staff. Building a thin-client Linux environment with hotdesking, sensible shared file store and good network efficiency is a golden opportunity.
Total domination is bad. The Microsoft dominance has already badly misled people about how to choose systems. Instead of 'what tool do I use for the job' it's 'well, it was shipped with the box'. Linux is a tool, Windows is a tool and so are numerous other systems. It's really important people go back to looking for the right tool for the job. That will not always be Linux. No single tool can do everything well.
4) If it's important, or if you think that Linux is going to take a larger role in the commercial world anyway, where do you think it will have its greatest success, and how might that change the computing industry?
For example: at first, Linux seemed a desktop OS for enthusiasts, then began to sneak onto servers and has staked out some serious web server territory, of course. But IBM is eyeing it, for example, as being as important to applications as TCP/IP is to the internet. Software companies will port to Linux, they say, because it's easier than porting to Unix flavours or NT. That benefits hardware companies and big app companies like SAP, according to that view. I'm interested in how you think things might develop; whether you'd agree with the IBM scenario, for example.
Linux is the sum of contributions, so it will go where the contributors take it. Right now that is everything from IBM mainframes to pocket computers. Some of it is through investment and funding, and a lot of it is because someone just thought it would be neat if Linux did XYZ and had fun working on it. I can see Linux becoming the generic OS for embedded applications and servers, where people will pick Linux unless they need some given feature that favours another system. I don't know what will happen on the desktop.
Only time will tell.
5) There's much talk now of Linux fragmenting into flavours. Do you think this could happen?
The ISVs don't and won't stand for it. The customers won't stand for it, and the community definitely wouldn't. There is a lot of work right now to get definitive standards via the Linux Standard Base (linuxbase.org). That has real backing from the big vendors because they want to ship 'Foo for Linux', not 'Foo for Red Hat', 'Foo for Mandrake' etc. Non-Linux people are quite interested in this too. We may end up creating a common interface for applications on the PC in any Unix-like environment. That is definitely good for customers.
Ultimately nobody who wants to build the Linux space either commercially or for fun has any business or technical reason for creating an incompatible variant. In fact the pressure is entirely to improve compatibility.
6) Can Linux and the open source attitudes of the developer community surrounding it co-exist with Linux's commercial success? Do you see strains arising out of these developments?
The developer community on the whole seems not to care too much.
Certainly the developers I work with, both in and outside Red Hat, are more loyal to Linux than to their companies. There are certainly marketing people who see other vendors as the enemy. Programmers never cared for marketing people's views anyway ;)
7) Linux companies hit celestial IPO heights recently, then crashed. Is this a 'correction'? A failure of Linux to meet expectations in the commercial world? A pointer that an open source OS won't make anybody money?
I think there is plenty of scope to make money, as it already seems to be doing for a lot of people. It's certainly not going to generate Microsoft-like profits, and that's probably a good thing for everyone long term. It is about time people got fair prices.
The stock valuations just remind me that the US technology stock market is basically a gambling den. It seems to be based on riding hype to the highest value you dare then trying not to be the last one to sell. In time I imagine the Linux valuations will settle down.
8) Finally, what motivated your own involvement with Linux and why does it remain so central to your life?
Well, I got into it by accident. I wanted an OS to debug my multi-user game on, and Linux hacking became a sort of hobby.
Now I'm paid to give away code and watch it empower people all over the world - especially developing countries. I don't know many jobs that can compete with that one.
Alan
At the Linux Expo we had a chat with Alan Cox about everything from kernel hacking to Jolt cola. Check out the transcription by hitting [DETAILS].
Interview with Alan Cox
Please note that this is a transcription of a conversation.
Everyone knows you as the kernel hacker in the Linux world. What do you do when you are not hacking the kernel?
At the moment I am in the middle of trying to buy a house. It involves a house where all the windows need replacing and a few other things like that. That's one of the things keeping me occupied. Trying to figure out how to get rid of all the hardware is the other. Two things are hard: one is finding space for it, and the other is getting stuff you don't need back to people, so there is a fair bit of space management going on.
How is Telsa taking all of this?
Better than she took two MicroVAX 2s in the living room a few years ago.
Those are the ones about the size of a fridge aren't they?
That's about right, yes.
Did you manage to get Linux running on them?
I was just looking after them temporarily while waiting for someone who was a bit late taking them off my hands. They get a little bit touchy after a while.
What else do you do, hobbies-wise?
Well, Linux is my hobby. I have been playing around with various things; I have a spider plant I am growing and stuff like that.
Any cacti?
The machine room would probably do very well for a cactus.
Getting on to the more technical side, what are the main elements of the kernel that you *enjoy* hacking?
I have great fun with all of it. Most of what I am doing now is coordinated across large amounts of the kernel, so there is not really any specific bit of the kernel that I am concentrating on anymore. In some ways that's more fun, because every day you are dealing with a different piece of the code with a bunch of people.
On average how much time do you spend hacking away?
Dunno, probably about 60 hours a week.
A fair bit then...
I do sleep, which is contrary to rumor.
The Linux community seems to have given you the status of the archetypal hacker. Do you think that you fit the stereotype of a hacker; sitting there with a pointy hat drinking loads of Jolt cola?
Well, unfortunately the Swansea supplier of Jolt cola no longer sells Jolt cola. I am currently investigating whether you can buy it by the crate wholesale.
We have had similar problems. We wanted to buy it from Think Geek but unfortunately they don't ship it over here.
I know there are UK suppliers, so somebody needs to get into the business of selling Jolt in the UK, by the crate load over the Internet.
Yeah...we should maybe think about that.
On the kernel front, everyone thought the jump from 2.0 to 2.2 happened too slowly -- it took over a year to come out. You seem to have gone quite quickly from 2.2 to 2.4, and that kind of worries me. Why has it taken so little time?
Part of it is how many things you stick in each time. With 2.2 we just kept thinking, "we will just squeeze this and this in". We could probably have stopped two or three times in the 2.2 build-up. We thought 2.1.40-something was pretty stable and we would go from there; we didn't do that though -- we might as well try this as well. It's always longer than you intend anyway. We thought that we wanted to get a release out, so we wouldn't put journaling in the standard kernel. Journaling is great, but it will have to miss if we are to get it out.
Are you going to put the Reiser FS in the 2.5 kernel?
As I understand it, we will take all the code for the journaling file systems and create a single layer that everyone can use; with ext3, Reiser, JFS and XFS, a lot of what they need is common, so we want to get them together to build the right common code. Also, NWFS, the NetWare file system, is apparently now journalled.
What's the first thing you're going to be adding to 2.5?
I have no idea. It depends what the first patches are that I get.
How did you originally get into coding the kernel? I assume you did user level stuff before the real low level stuff...
I got involved with a multi-user game. Having done that, I kind of hacked UNIX-like behavior into the Amiga so I could test MUD on it. I wasn't as arrogant as Linus, so I didn't believe I could write my own kernel. Arrogance and stupidity were two of the starting ingredients in some ways! Finally there was a reason to own a PC; PCs had got better but there was nothing to run on them. 386BSD was announced, but I downloaded the earlier release as 386BSD required a floating-point chip, and those were about £50 extra; I wasn't far past being a student, and I wasn't very rich. I managed to scrape together this old 386 with 4MB. I did some stuff with the university computer society; the university went Ethernet. We got the new TCP/IP stuff implemented and bang...ok...debug it a bit...bang...debug it a bit...oh it's staying up...log in...bang...and we were actually the first people putting the first Linux networking code on really busy networks. The campus network was busy and had lots of protocols on it, so we were the first people seeing lots of bugs, and we started fixing them because we kind of wanted the university bulletin board up for more than three hours a day. With 0.99 to 1.12 we were having more uptime hours than anyone else. Fred took over the networking for a while, and he wanted to rewrite it, while we were trying to work with the original code, so I sent out a bunch of patches called net2debugged and Linus started applying them, and somehow I ended up in charge of the networking code. 2.0.29 was really the last one where I was in charge of the networking.
One of the most fundamental questions to ask is what is your favorite editor!?
Joe. The first editor I really worked with was Wordstar, and Joe has Wordstar keys.
Have you ever been tempted over to FreeBSD or Windows?
Windows...no. If Linux had not been around then I would probably be using FreeBSD. It is a really technically excellent Operating System.
Do you enjoy the press attention?
I try to keep the journalists away...at shows I can't avoid them. At the show they hassle me, when I leave they don't, and thats fine.
KDE or GNOME?
Currently I run GNOME. Saying that, I run whatever I happen to prefer; it just happens that GNOME fits my desktop how I like it. I am not pro either of them. The reason I didn't use KDE was the Qt non-free license, but now it has a free license it doesn't matter, so it's just a choice of one or the other.
Linux Journal: How did you first learn about Linux? What were you doing in your own life at the time?
Alan Cox: I was hacking bits of ideas for my own OS and working on a MUD called AberMUD. I had pondered getting a decent PC, since the Amiga was getting a bit long in the tooth. 386BSD came out, and it looked like there was finally an OS worth running on x86 hardware. Linux came out around the same time, but didn't need an FPU, so I started running Linux.
LJ: What attracted you to it, compared to FreeBSD, proprietary Unix systems, or lucrative areas such as Windows? What made you want to help with development?
Alan: Linux was a lot easier to set up in the early days; MCC Linux and then SLS made it really easy to install by the standards of the time. I looked at the BSD systems, but I liked the way the GPL meant I was writing code that nobody could run off with. I didn't really choose not to hack 386BSD; I was just having too much fun with Linux to bother.
LJ: What part of Linux were you personally interested in and working on? How are you still involved with Linux development?
Alan: Initially, I was working on the networking code after Ross Biro stopped maintaining it, and Fred van Kempen basically dropped the mainstream code to rewrite it. I ended up maintaining the code and getting it going. Nowadays, I do the stable kernel releases and a fair bit of patch merging and debugging of drivers.
LJ: What was most important to you about Linux? What's the very best thing about Linux?
Alan: I think the most important thing about Linux is that it gives people the ability to do what they want. The "Penguin Powered" logos people love should really be "Penguin Empowered". That, I think, is the best thing about Linux, too. We've given the computer back to the user.
LJ: How important was the GNU project, and how did the GNU Hurd factor in to your thinking? Should Linux be known as GNU/Linux?
Alan: I knew about GNU several years before. In fact, in many ways, Linux exists because GNU chose to pursue the Hurd rather than using UZI as their Unix OS core, as they could have done. GNU/Linux is perhaps overstating it, but ignoring the FSF contribution is even worse. Richard has perhaps made a few enemies by insisting on GNU/Linux — but it does remind people.
It's really x11/BSD/GNU/....../Linux.
LJ: What was it like to be working with others over the Internet at a time when several computer luminaries thought that organizing successful software development over the Internet was difficult, if not impossible? Did you realize how revolutionary this approach was?
Alan: I don't think it was revolutionary. People had been doing it for a long time before that, a very long time. The modern Internet and large-scale access just made that project a bit easier — the lack of people and slow networks killed earlier equivalent projects like UZI and OMU, stopping them from spreading.
One thing Linux taught me: far too many people write about software design, but have never run a real-world computing business. The folks who seem to get software design right are mostly engineers. They want it to work, they want to solve the problem and they aren't totally obsessed by reusable components, object orientation, Java ... whatever the meme of the month is.
LJ: What are you doing with your life now? What's a typical day like? How do you find time for work and Linux, and how do you balance free software with the need to make a living (or the desire to become rich)? What do you do for fun?
Alan: I work for Red Hat. I work from home, hacking free software — it's great. It's been a bit busy, but I have a lot of fun when I'm visiting shows and abroad, both meeting the people and also getting to see other places. It isn't just the shows; I've visited the depths of Iceland with a bunch of mad 4x4 drivers, spent a week in a very snowy Vienna, and been to the Glengarry Highland games in Canada.
LJ: Who do you think other than Linus has had the most influence over the Linux community, and why?
Alan: Umm, that might be me, although I try not to. It really depends on which part of the community you mean, or even outside of the community ... People like Eric have had a big influence on business folks, which I certainly don't have.
LJ: What do you think is the most important addition or change needed by Linux in order for it to succeed further? In what direction does Linux development need to go? Where is Linux's future the brightest? What is the #1 biggest threat to Linux today?
Alan: Probably the biggest thing Linux needs now is better applications and user-space tools. We need to take Linux to the level where you can give it to your grandmother, and not expect a phone call back except to say "thanks". The biggest thing the kernel needs now is documentation.
I'm not sure what the threats to Linux really are. The biggest one is probably Linux fragmenting. I don't think that is going to happen in the mainstream, but we are already seeing a few vendors pulling that way in embedded space.
The app vendors and the users, I don't think, will tolerate a vendor going off at a tangent.
LJ: How do you feel about Linux's current popularity? Would you have preferred it stayed contained in the hacker community? Would it have survived on the fringes?
Alan: It was a bit of a surprise. On my first trip to Red Hat, they had about six people, and the new boy was this Donnie Barnes guy. Now they are heading for five hundred.
Linux would have survived on the fringes, I think. There has always been a market for things people can actually play with and tune.
LJ: Would it have survived without the IPOs and financial backing? What impact has the commercialization of Linux had? How do you feel about Linux profiteering and the people who make millions off of other people's volunteered efforts?
Alan: I'm working for a vendor. I get regular mail from people trying to find Linux-aware folks to hire. I think those who wrote code for fun have plenty of opportunity to reap rewards. Even when I wasn't working for Red Hat, it didn't bother me. I wrote it for fun, and the fact that people found it useful was a greater reward than money. We've made it possible to put computers into places that could never have afforded Microsoft products.
LJ: How can Linux compete with Microsoft in the desktop sector, and will we be able to hold the commercial sector if we don't take the desktop as well? Can we take the desktop without ruining the spirit of Linux by dumbing it down? Where will our next areas of growth and expansion be?
Alan: Trying to predict the desktop is hard. Firstly, I think it's safe to say that the PC desktop of today is probably the dinosaur of tomorrow. Most end users want simpler systems. They want to trade flexibility for ease of use, and power for size.
There will be plenty of people who choose the full PC, some because they enjoy it and some because they need all the power. Those, I suspect, are the minority.
The machines of tomorrow are mostly going to be web-oriented, or very mobile (or both). People will expect them to just work. Folks like Palm have taken the first stumbling steps in this direction, with huge success.
Linux is a good OS for building embedded systems with, and to be able to extensively customize. GNOME and KDE will give people a good battle on the desktop and beyond.
LJ: How do you feel about commercial applications being written for Linux, and proprietary software and protocols in general? Do you run Linux more for philosophical reasons or practical reasons? If something that appeared to be better came along, would people jump ship? Conversely, would we stay with Linux even if it somehow degenerated, took a wrong turn, or stopped progressing?
Alan: I don't believe open source works for everything. There are some cases where the ideas in the code truly have value, but not many. I don't currently use any proprietary software, except Netscape. And Mozilla is now within a hair's breadth of replacing that.
I like the flexibility and the control of free software. Most of my experiences with proprietary software have either been getting screwed as a user, or being part of a large company that had to threaten its suppliers with lawsuits to get service. Proprietary software could work, but not while software companies spend all their time and effort lobbying the U.S. government to be basically exempt from all reasonable law on quality and fairness to the consumer, instead of giving people what they want.
LJ: Do you think the community should support only open-source/free software? How would the community survive hard times if there were a lag or down time in the continuing success of the open-source methodology? Is the free software philosophy strong enough and with enough adherents to pull us through?
Alan: The real community consists of those who contribute — not just code, but bug fixes, documentation, etc. That community can survive a lot. Some of the rest of the Linux user base is certainly there because "it's cool", "it's not Microsoft" and for commercial reasons. It's up to the free software methodology to work out for those people. If it does, they will go that way.
LJ: How do you feel about the different licenses? GPL, LGPL, QPL, etc.?
Alan: The old Qt license was a problem. I don't see one with their current licensing. Too many licenses can be a problem, but the basic few we have now seem to fit a wide spectrum of beliefs well.
LJ: Is there a world outside of computers? Are you ever afraid that you'll wake up one day and feel you have wasted your life in front of a computer?
Alan: Most of what I am doing, even that involving computers, is about people and interacting with people. That's true for most computing use. People are using e-mail, IRC, messaging systems and web-based discussion systems a lot. It's about people communicating.
Alan Cox is a long-time Linux kernel hacker, Red Hat Software employee, and general all-around great guy. More importantly, he's one of the people who work behind the scenes to make Linux a great product through his relentless efforts in improving and enhancing the Linux kernel--as evidenced by the numerous "ac" monikers after kernel revisions.
Linux Today Editor Paul Ferris caught up with Alan on the LinuxWorld Expo show floor as the show was winding down. Here, Paul and Alan discuss important issues: how free software is making a difference in the world, what vendors are making a difference in the Linux world, why the commercialization of Linux may not make a difference in the larger scheme of things, and how Scrappy Doo didn't make a difference in the Scooby-Doo ethos. For those of you who would like to know what Alan is doing on a day-by-day basis, he keeps a diary online at www.uk.linux.org/diary/.
LT: So you spend 10 hours a day doing Linux stuff....Do you enjoy your day pretty much? Do you work at home?
Alan: Yes, I work at home. It's very rare that I'm with the Red Hat people, maybe once a year.
LT: A lot of internal Red Hat discussion probably goes by on your e-mail, so you don't have to deal with it?
Alan: Occasionally it's corporate, but it's not normally an issue. I stay in touch with them all so I know what's going on. I keep an eye on things so I can keep track of where things are going.
LT: What about kids or pets?
Alan: No children, no pets.
LT: Your wife?
Alan: Yes, Telsa.
LT: You like to cook a lot, I take it?
Alan: I have fun cooking [laugh]--it's not necessarily a good thing for the people who have to eat it.
LT: Do you make your own recipes?
Alan: No, I started off because I decided that cooking might actually be something interesting after all. I can vaguely cook now.
LT: What kind of food do you like mostly?
Alan: Oh, Chinese food mostly. One good thing about being in south Wales is that you can get an awful lot of good Indian food.
LT: You like spicy things? Hot stuff?
Alan: Within limits. There's stuff that's sort of hot and spicy and it's nice hot and spicy. And then there's the stuff they serve people who have had 12 pints of beer and are staggering home in the evening.
LT: Where you can't tell how hot it is?
Alan: Which I wouldn't go anywhere near! It's the sort of stuff that's hot to prove how cool you are for being able to eat it.
LT: Switching subjects, I understand you like to watch Scooby Doo.
Alan: Rather a lot, yes.
LT: Every episode?
Alan: The stuff with Scrappy Doo is best avoided, I think. [Ed. note -- Scooby Dum is pretty pointless as well.]
LT: They always get their guy, that's the important thing. You know: "I would've gotten away with it, too, if it hadn't been for these meddling kids!"
Alan: [laugh] That's the stuff! Although that particular phrase doesn't occur in very many of them.
LT: I grew up watching that on Saturday-morning cartoons. I'm kind of dating myself there with that. So things are pretty quiet, then? You have to do a lot of trade-show kind of stuff?
Alan: I'm trying to do a lot less this year. I try to pick more technical stuff, I try to pick more interesting ones. I went to Portugal last year and had a great time in Iceland. That was one that Eric Raymond couldn't make. So I agreed to go with Bruce Perens instead. I was trying to pick different but interesting places.
LT: What do you think of some of the commercialization changes to Linux? There's been some more commercial companies adding stuff to Linux. Has that been a good thing?
Alan: I'm not so sure. Some of it is, some of it isn't. There are a lot of companies trying to push proprietary products on top of Linux. As far as I'm concerned this isn't really much of a move on the Windows markets; companies are still tying themselves down to vendors.
What is starting to become apparent is that there are a lot of vendors whose stuff was previously proprietary who are very interested in the open-source model, who are saying a lot of our revenues are really in the support side--maybe there really is something in this. We're starting to see a lot of these companies get into open source.
Three years ago, IBM wouldn't really have been anything but a nuisance. It's remarkable.
LT: But then yesterday they announced their journaling file system on top of Linux would be open source.
Alan: There's the JFS stuff yesterday, and they've provided several drivers. There's the S/390 port -- the S/390 being one of the most powerful computers that can run Linux. There's a lot of good stuff that's been coming out of IBM. HP's been doing a lot of stuff with the printers, [as well as] the HP-PA port. All of these really big companies, companies we thought would be sort of last on the list, are really starting to get it.
LT: You've done some work with the PA-Risc (HP) port, haven't you?
Alan: I've helped a little. I was trying to do some stuff with the PA, but Linus sort of stepped in and said, "OK, you're going to be doing the 2.2 releases." So I've done rather less on the PA port than I'd have liked. I've had some people saying, "We'd like to give you one of these," but I had to say, "I don't have the time to fiddle with that as well."
LT: I'd personally like to see the PA-Risc port working. I really like their hardware and I kind of have a soft spot for HP-UX since it was my first UNIX exposure, version 6.0 of HP-UX.
Alan: [laugh] The HP hardware is actually very nice, but HP-UX is on my hate list.
LT: It's kind of creaking now in terms of comparison.
Alan: One of the things that a lot of these companies haven't picked up on is the importance of easy-to-use user interfaces.
And the user interface is going to be more and more important because you're going to try and push not just Linux but all sorts of other technologies to people who basically want the Internet to work like a TV set.
If it's got any more buttons than a TV, or if you can't just turn it off when you get confused, then it's not going to fly. It's not really just about people who are not inside the tech community, or people who are too clueless to learn. These people have more important things to do with their time than to try to learn to run a PC.
They just want to be able to send e-mail or look on the Internet. They don't want to know about disk partitioning or most of the things that go on in PCs.
LT: And I think that some of the new Linux boxes that are already set up for the user are already there. There's no question that KDE or GNOME are both user interfaces that are very easy for people to use.
Alan: But that's still only the PC. The PC itself, for a lot of these things is still too complex. I think that a lot more important things will be things like set-top boxes.
One of the things I like about the Palm Pilot--I hate it for a lot of reasons, like the handwriting--but when you use a Palm Pilot and have no clue what you are doing, you have a set of basic functions that anybody can use. Yet as you get to know a bit more about it, as you learn to use it, you discover all of these extra features and all these extra clever things you can do.
They're not thrown in your face, so it doesn't intimidate you. You can simply treat it like a very simple notepad. And it works like a notepad. It's sort of the first PC you can give to your grandmother, in many respects.
That's the kind of interface technology we need in Linux to go beyond the PC world.
LT: So that we attract the people that are at the very beginning.
Alan: Not just the people who are at the very beginning--the people who simply do not want to know.
LT: I think that this likely runs contrary to the wants of people that are into Linux because it's cool or whatever right now.
Alan: Linux should be flexible. It's no good having that kind of dumbed-down interface on your main Web server.
You can't fix the problem because there isn't a button for it. It's an area where open source should be good--because it's extremely tailorable. You're not stuck with the user interface that came with the product.
LT: It's not a marketing decision either, it's a democratic decision.
Alan: Right. If you get something like the GNOME desktop and there's lots and lots of icons and the file manager and so on, and all you want to do is give people four icons, four applications, and an e-mail package to use, you can take all of the rest off and get rid of the panel. You can burn it on a CD, and you can sell that as part of your own product. You can actually customize Linux to a particular niche environment. You can build a business on it. You can see people starting to do that, the TV set-top boxes are only one example.
There are people building Linux firewalls that fit on one floppy disk. People are building businesses on this kind of technology.
LT: It's pretty freaky. It's partly because you can get in there and take out things that you don't really need.
Alan: Not only that--you're allowed to sell, support and to modify the product for other people. It gives you the flexibility of the control.
LT: How long have you been doing Linux now?
Alan: Around eight years.
LT: From the very beginning practically?
Alan: I was using Linux for a while. I sent Linus a patch for 0.95 or thereabouts, which Linus rejected because someone else had already fixed the problem. Then I sent him another one for process accounting shortly afterward, which eventually got into the kernel at some point after 1.2. It had been written and rewritten by 10 people in the process.
Then came the networking stuff. What actually happened was that I'd been using Linux a bit at home, writing a multiuser game. It kind of did the job. It was a real user environment, and it worked like a proper computer.
LT: So you were writing a game?
Alan: I was involved in most of the design work and a little writing of a multiuser game called ABERMUD. It was probably the first generally available multiuser game released onto the Internet.
Although certainly not the first of its kind, it was actually a cheap knock-off of a game written in 1982/83, and we thought we ought to be able to do this too. Before that, I'd been doing some game work with Adventure International in the U.K.
So, that was my background. I had sort of assumed that I would go to university and finish, and if I could find a way to make a lot of money I'd go and write computer games. I got sidetracked from that.
But what happened was that the university computer society got a 4-megabyte 386 [Intel] machine, and we stuck it on the university LAN. And it fell over--within minutes. So we started debugging the network code.
At the time we probably had one of the biggest multiprotocol networks that a Linux box was attached to. So we were finding a lot of the bugs that other people weren't, and I started fixing them.
LT: A 386. That would be pretty sluggish, too, although we don't remember things that way.
Alan: It wasn't that bad. You have to remember that the kernel was a lot smaller then.
LT: I remember 286es as being blazingly fast. But you know something, I encountered a 286 running DOS the other day and I thought I'd stepped in the mud. It's amazing how our tolerance for speed changes from year to year. Once you sit down and use a 600-MHz Pentium III and then go to something like a 200-MHz Pentium....
Alan: It probably depends upon what you are doing. There are a lot of systems out there running on 8-MHz microprocessors that feel just fine.
LT: Yes. If I'm in raw text mode on a console, which is what I switch to if I'm doing real work, it's different.
Alan: Well, that's what people were using, something like that, until around Word 5, text mode. People only flipped into graphic mode to do a final check through. It was too slow to draw the page.
People's expectations have moved on. They want to do more. The ways of working have changed. There are things people do with a computer today that you wouldn't have done before. You just say now, "I've got all this computer power--you sort it out."
LT: I feel like some of that demand is affecting our stress levels, making us less tolerant, and not just of computer things. Waiting in line these days, I see a lot more people who are impatient. They've increased their bandwidth at home, so why should they have to wait for anything? I see it bleeding over into areas where it shouldn't.
Alan: Certainly people have forgotten that being happy is somewhat more important than owning all the best stocks and the other things that people sort of spend all their time running around trying to achieve.
LT: On the happiness question--what do you consider happiness? Where does it come from for you?
Alan: Making other people happy, I think, is probably the world's number-one cause of happiness. That's always the way I've looked at it.
LT: It's not something that fades with time. All this other stuff sort of fades and rusts. Things lose their value over time, but a good deed never seems to be forgotten.
Alan: The funny thing about Linux is that there's so much going on -- things like the UNESCO distribution of Linux into third-world countries and stuff. Not only is it something to run on cheap hardware, especially if you're careful about which distributions you pick and what your expectations are, but it's also something that is literally sustainable.
So you don't have this problem where you've shared all of these computers and software technology out to the country, and then they have to all come back to Redmond and say, "OK, we need this and we need that and the other."
The translations and such can all be done locally. For things like translation and localization, that's exactly how it should be, because I've never seen something translated well unless it's been done in the country for which it was targeted.
LT: What's the name of the distribution? UNESCO?
Alan: The UNESCO thing? Part of it has to do with providing more sustainable resources to other countries. Pull up the Red Hat press from a couple of years ago and you should be able to find out more about it. You've got a big archive on Linux Today, so you should be able to find it there.
LT: We're redesigning Linux Today a bit, but the archives will always be available. There's some really interesting stuff in there.
Alan: Keeping some of the earlier stuff would be important to me.
LT: We use it ourselves researching our own stuff. To me it's a snapshot of history, and it's sort of like the bloodline for a beast we call the Linux Community. It is really, it's a circulatory system. If, for example, you're interested in security, you likely get our security newsletter and so on...
Alan: If you want to go back before that, some of us have tried to keep all of the old mailing-list archives. I've got some 1993-1995 Linux lists, and Ted hopefully has 1991 to 1992. So maybe based on that we can put up an archive of all of the original Linux messages, going back to the very, very first days of the actual mailing lists. ...
LT: There's some history there...
Alan: A fair bit of history. The first SAMBA list. SAMBA digest number one. [laugh] The origin of fvwm -- where the F really stood for "feeble", because Rob Sanders was trying to write a very small window manager and terminal so that he could use X on a 4-megabyte machine.
So it was the feeble virtual window manager. But the feeble has mysteriously disappeared over the years.
LT: No, a different word is always suggested... I think the history is important here. This whole Expo floor, to a new person just walking in, appears to have just sprung up from nothing. Literally overnight.
Alan: I'm pretty sure that at the first Linux Expo I went to, the whole stand would fit inside the Red Hat booth today. It's not that long ago if you think about it. The first time I met Red Hat was at one of the Expos; they brought me over.
We had a tour of the office, which was like two small rented office buildings. About a five-minute tour and we'd seen everything. And now there are like 450 people with the Red Hat and Cygnus combo. And it really has just sort of... well, it's kind of like at one point I was just sort of fiddling around and playing with an interesting little toy I had and then the rest of the world arrived.
It has been really amazing.
LT: I feel like I'm in the current now and I'm being swept along. Take Linux Today, for example: there's so much news now. This was my favorite news site -- I say was, not meaning that it isn't now, but meaning that I'm no longer able to keep up.
Alan: That's one of the problems I find with it. As a site, there's too much going on. If you don't look at it for two or three days, you missed so much stuff.
LT: You almost feel like you've missed part of a soap opera, don't you?
Alan: [laughter] You need better filtering so you can sort out the stuff you're more interested in.
LT: Well, we're definitely working on that, as well as some other issues.
Alan: I know this kind of thing. What used to happen was, I'd go in the morning, read my e-mail, and click across the Linux news sites. But that click across the Linux news sites started getting longer, and longer...
LT: It's quickly coming to a point where the amount of things happening is beyond the comprehension of most people.
Alan: I have to agree. I follow less and less of it. It's not so hard in my case because I can generally ignore most of the proprietary software developments.
I have some interactions with some of the vendors like Oracle and such. But we're getting useful stuff back from them because people like Oracle and SAP are coming to us and saying, "Linux works great, but we've got a little bit of a benchmark problem here, and that's showing up in some of these things that don't scale very well."
SAP is actually providing kernel fixes for them as well, as are some of the other vendors. Because of course, they all want their software to run really well on Linux. And if that happens to mean a few kernel fixes it's a good investment for them.
LT: This is another major advantage. Of course there's no way that a proprietary vendor can hope to compete with this.
Alan: It's not so hard for a big company [to get the attention of a vendor with proprietary code], but if you're a small company it's very hard to get something [like a requested software feature] in, or even to get noticed by the people who otherwise control the software. Whereas in the free-software world you send in the patch.
LT: You can get as involved as you want.
Alan: Right.
LT: How much involvement is IBM having when you look at the integration of the port to the 390?
Alan: Basically, they put out the S/390 code, and they sent me a copy a few days before they did the official release so I could look through and comment on it. On the basis of the idea that I wouldn't jump their press releases and spoil their nice PR arrangement. There were a couple of places where I where had to [alter some minor things regarding official device names]. Other than that, the code went right in.
The only one that was really funny: in one of the console drivers there were lots of gaps, bunches of blank lines. I thought, what's with all of these bunches of blank lines in here? You could have at least tidied that file up. They said, "Oh, that's where the lawyers took out the comments."
Just on the one driver. I have no idea what IBM's issue was. But obviously IBM has a chunk of people who are supposed to sit there and protect IBM. But the S/390 code is really nice code.
LT: Have they looked at the kernel and done anything like what SAP did?
Alan: Not that I'm aware of in that way. But now they're offering their journaling file-system stuff.
LT: Yeah, we posted it the other day.
Alan: I haven't looked at it yet.
LT: I was going to ask you how it compares to EX3, Stephen [Tweedie]'s stuff.
Alan: Stephen's stuff is interesting in the sense that you can take an EX2 file system [EX2 is the default Linux file system type], make it journaled, and you can easily turn it back into EX2 trivially, whereas the others are all new file formats. As to whether we end up with one or a collection of them, I don't know. I mean there's obviously ways to support lots of file systems. We support things like the QNX file system, not because we expect all people to go off and store it as their main file system, but because people have got QNX hard disks.
LT: We want to thank you for the time for the interview. I especially want to thank you for your work.
Alan: Mostly other people's work. I mostly organize it is all.
LT: I know, I know--that you're part of the collective, but you I feel are a big part of the success of that work. Doing your own things. Linux means way more to me than an operating system means to most people. I can tell you in all sincerity that your work has affected my life in a really positive way. I'm very thankful.
Alan: Oh, good.
LT: I swear a lot less at my computer. It has automation tools where I want them, and it just plain works thanks to people like yourself. So, thank you, Alan.
Alan: You're welcome.
Number two -- with a bullet. A fierce intelligence backs up the soft voice and unconventional appearance of Linux honcho Alan Cox. By Mark Anderson, The Ottawa Citizen
Q: But, in fact, you're the number two guy, after Linus Torvalds.
A: Number two in terms of co-ordination, not in terms of work getting done.
Q: How did you achieve your position?
A: I've been working with Linux pretty much from the beginning. At the time Linux was first released (1991) I was looking for an alternative development platform to Windows 3.0, which was just hopeless for what I wanted to use it for. I installed Linux and started fixing it. And kept on fixing it.
Q: Linux has already captured a sizable piece of the server market. The next challenge, presumably, is the desktop. How critical is it to produce a Windows-like interface for Linux? Is that the key to penetrating the PC market?
A: It doesn't have to be a Windows-like interface. It has to be a good interface. There is a sizable contingent in the Linux community who think Macintosh and NeXT provide better models for the development of a graphical user interface than Windows. Either way, you want to have file managers, icons and all those good things people have come to expect.
... ...
Q: Indeed, the ability to fix identified problems quickly is one of the main strengths of the distributed Linux development paradigm.
A: Right. There's no central authority making decisions. There's no waiting around for a manager to give the go-ahead on a project. If someone doesn't think something is working right, and he wants it fixed, he just goes ahead and fixes it.
Q: As more corporations climb aboard the Linux train, are tensions building between the corporate developers and the traditional Linux "hobbyists"?
A: The main source of tension comes when the corporate development teams contribute to parts of the Linux system, and they then delay releasing their modifications while their PR and marketing people work up the appropriate spin. The traditional Linux community isn't used to that. They want development to be done in a continuous flow.
Q: One of the beautiful things about Linux is its international flavour. Linus Torvalds is from Finland, you're from Britain. It's not California-centric, as so much software is. How important is this to the Linux community, and Linux users?
A: It's very, very important to a lot of countries, especially Third World countries, because American software is expensive. With Linux, developing nations can download the operating system, modify it to suit their needs, make copies, and no money flows out of the country.
LT: So you spend 10 hours a day doing Linux stuff....Do you enjoy your day pretty much? Do you work at home?
Alan: Yes, I work at home. It's very rare that I'm with the Red Hat people, maybe once a year.
LT: A lot of internal Red Hat discussion probably goes by on your e-mail, so you don't have to deal with it?
Alan: Occasionally it's corporate, but it's not normally an issue. I stay in touch with them all so I know what's going on. I keep an eye on things so I can keep track of where things are going.
LT: What about kids or pets?
Alan: No children, no pets.
LT: Your wife?
Alan: Yes, Telsa.
... ... ...
LT: I grew up watching that on Saturday-morning cartoons. I'm kind of dating myself there with that.
So things are pretty quiet, then? You have to do a lot of trade-show kind of stuff?
Alan: I'm trying to do a lot less this year. I try to pick more technical stuff, I try to pick more interesting ones. I went to Portugal last year and had a great time in Iceland. That was one that Eric Raymond couldn't make. So I agreed to go with Bruce Perens instead. I was trying to pick different but interesting places.
LT: What do you think of some of the commercialization changes to Linux? There's been some more commercial companies adding stuff to Linux. Has that been a good thing?
Alan: I'm not so sure. Some of it is, some of it isn't. There are a lot of companies trying to push proprietary products on top of Linux. As far as I'm concerned, that isn't really much of a move on the Windows market--companies are still tying themselves down to vendors.
What is starting to become apparent is that there are a lot of vendors whose stuff was previously proprietary who are very interested in the open-source model, who are saying a lot of our revenues are really in the support side--maybe there really is something in this. We're starting to see a lot of these companies get into open source.
Three years ago, IBM wouldn't really have been anything but a nuisance. It's remarkable.
LT: But then yesterday they announced their journaling file system on top of Linux would be open source.
Alan: The JFS stuff yesterday, they've provided several drivers. There's the S/390 port, which is one of the most powerful computers that can run Linux. There's a lot of good stuff that's been coming out of IBM. HP's been doing a lot of stuff with the printers, [as well as] the HP-PA port. All of these really big companies, companies we thought would be sort of last on the list, are really starting to get it.
LT: You've done some work with the PA-Risc (HP) port, haven't you?
Alan: I've helped a little. I was trying to do some stuff with the PA but Linus sort of stepped in and said, "OK, you're going to be doing the 2.2 releases." So I've done rather less on the PA port than I'd have liked. I've had some people saying, "We'd like to give you one of these," but I had to say, "I don't have the time to fiddle with that as well."
LT: I'd personally like to see the PA-Risc port working. I really like their hardware and I kind of have a soft spot for HP-UX since it was my first UNIX exposure, version 6.0 of HP-UX.
Alan: [laugh] The HP hardware is actually very nice, but HP-UX is on my hate list.
LT: It's kind of creaking now in terms of comparison.
Alan: One of the things that a lot of these companies haven't picked up on is the importance of easy-to-use user interfaces.
LT: And HP-UX isn't there. I had to do so many changes to CDE and the shell to make it even livable for myself.
Alan: Yes, it's similar to Solaris. One of the first things you did was to get the GNU tape. It's surprising that these vendors never really seemed to pick up on that.
LT: Because they could have done that out of the box.
Alan: Yes, they could have done it. It would be nice to see some of these vendors shipping GNOME or KDE--and things like a shell where the arrow keys work!
LT: Exactly! Do you know how many times I complained to someone at HP about this very problem? I complained on the support line, I complained to their PR people, and I've gotten nowhere. It was a very simple thing--I even said, "Look, load bash, and you can go back and forth through your commands." Because I was on the support end, we would sell a machine and out of the box it would be so hostile to a new user.
Alan: Certainly this has been one of the things that has sort of given UNIX a bad reputation. I know certainly that these are easy things to fix. I know that a long time back, when we were using DOS at work, I started using DR-DOS solely for things like the arrow keys. It was worth a product switch for it.
LT: You could fix some of these things with DOS, but if it doesn't come that way out of the box it's pointless, as a new user wouldn't find it.
Alan: And the user interface is going to be more and more important because you're going to try and push not just Linux but all sorts of other technologies to people who basically want the Internet to work like a TV set.
If it's got any more buttons than a TV, or if you can't just turn it off when you get confused, then it's not going to fly. It's not really that people outside the tech community are too clueless to learn; these people have more important things to do with their time than to try to learn to run a PC.
They just want to be able to send e-mail or look at the Internet. They don't want to know about disk partitioning or most of the things that go on in PCs.
LT: And I think that some of the new Linux boxes that are already set up for the user are already there. There's no question that KDE or GNOME are both user interfaces that are very easy for people to use.
Alan: But that's still only the PC. The PC itself, for a lot of these things is still too complex. I think that a lot more important things will be things like set-top boxes.
One of the things I like about the Palm Pilot--I hate it for a lot of reasons, like the handwriting--but when you use a Palm Pilot and have no clue what you are doing, you have a set of basic functions that anybody can use. Yet as you get to know a bit more about it, as you learn to use it, you discover all of these extra features and all these extra clever things you can do.
They're not thrown in your face, so it doesn't intimidate you. You can simply treat it like a very simple notepad. And it works like a notepad. It's sort of the first PC you can give to your grandmother, in many respects.
That's the kind of interface technology we need in Linux to go beyond the PC world.
LT: So that we attract the people that are at the very beginning.
Alan: Not just the people who are at the very beginning--the people who simply do not want to know.
LT: I think that this likely runs contrary to the wants of people that are into Linux because it's cool or whatever right now.
Alan: Linux should be flexible. It's no good having that kind of dumbed-down interface on your main Web server.
You can't fix the problem because there isn't a button for it. It's an area where open source should be good--because it's extremely tailorable. You're not stuck with the user interface that came with the product.
LT: It's not a marketing decision either, it's a democratic decision.
Alan: Right. If you get something like the GNOME desktop and there's lots and lots of icons and the file manager and so on, and all you want to do is give people four icons, four applications, and an e-mail package to use, you can take all of the rest off and get rid of the panel. You can burn it on a CD, and you can sell that as part of your own product. You can actually customize Linux to a particular niche environment. You can build a business on it. You can see people starting to do that, the TV set-top boxes are only one example.
There are people building Linux on one floppy-disk firewalls. People are building businesses on this kind of technology.
LT: It's pretty freaky. It's partly because you can get in there and take out things that you don't really need.
Alan: Not only that--you're allowed to sell, support and modify the product for other people. It gives you flexibility and control.
LT: How long have you been doing Linux now?
Alan: Around eight years.
LT: From the very beginning practically?
Alan: I was using Linux for a while. I sent Linus a patch for 0.95 or thereabouts, which Linus rejected because someone else had already fixed the problem. Then I sent him another one for process accounting shortly afterward, which eventually got into the kernel at some point after 1.2. It had been written and rewritten by 10 people in the process.
Then came the networking stuff. What actually happened was that I'd been using Linux a bit at home, writing a multiuser game. It kind of did the job. It was a real user environment, and it worked like a proper computer.
LT: So you were writing a game?
Alan: I was involved in most of the design work and a little writing of a multiuser game called ABERMUD. It was probably the first generally available multiuser game released onto the Internet.
Although it was certainly not the first such game--it was actually a cheap knock-off of a game written in 1982/83--we thought we ought to be able to do this too. Before that I'd been doing some game work with Adventure International in the U.K.
So, that was my background. I had sort of assumed that I was going to finish university and, if I could find a way to make a lot of money, go and write computer games. I got sidetracked.
But what happened was that the university computer society got a 4-megabyte 386 [Intel] machine, and we stuck it on the university LAN. And it fell over--within minutes. So we started debugging the network code.
At the time we probably had one of the biggest multiprotocol networks that a Linux box was attached to. So we were finding a lot of the bugs that other people weren't, and I started fixing them.
LT: A 386. That would be pretty sluggish, too, although we don't remember things that way.
Alan: It wasn't that bad. You have to remember that the kernel was a lot smaller then.
LT: I remember 286es as being blazingly fast. But you know something, I encountered a 286 running DOS the other day and I thought I'd stepped in the mud. It's amazing how our tolerance for speed changes from year to year. Once you sit down and use a 600-MHz Pentium III and then go to something like a 200-MHz Pentium....
Alan: It probably depends upon what you are doing. There are a lot of systems out there running on 8-MHz microprocessors that feel just fine.
LT: Yes. If I'm in raw text mode on a console, which is what I switch to if I'm doing real work, it's different.
Alan: Well, that's what people were using, something like that, until around Word 5, text mode. People only flipped into graphic mode to do a final check through. It was too slow to draw the page.
Peoples' expectations have moved on. They want to do more. The ways of working have changed. There are things today that you wouldn't do with a computer before. You just say now, "I've got all this computer power--you sort it out."
LT: I feel like that kind of demand is affecting our stress levels, making us less tolerant about more than just computer things. These days I see a lot more people getting impatient waiting in line. They've increased their bandwidth at home, so why should they have to wait for anything? I see it bleeding over into areas where it shouldn't.
Alan: Certainly people have forgotten that being happy is somewhat more important than owning all the best stocks and the other things that people sort of spend all their time running around trying to achieve.
LT: On the happiness question--what do you consider happiness? Where does it come from for you?
Alan: Making other people happy, I think, is probably the world's number-one cause of happiness. That's always the way I've looked at it.
LT: It's not something that fades with time. All this other stuff sort of fades and rusts; things lose their value over time, but a good deed never seems to be forgotten.
Alan: The funny thing about Linux is that there's so much going on. Things like the UNESCO distribution of Linux into third-world countries, for instance. Not only is it something to run on cheap hardware, especially if you're careful about which distributions you pick and what your expectations are, but it's also something that is literally sustainable.
So you don't have this problem where you've shared all of these computers and software technology out to the country, and then they all have to come back to Redmond and say, "OK, we need this and we need that and the other."
The translations and such can all be done locally. For things like translation and localization, that's exactly how it should be, because I've never seen a translation done well unless it was done in the country it was targeted at.
LT: What's the name of the distribution? UNESCO?
Alan: The UNESCO thing? Part of it has to do with providing more sustainable resources to other countries. If you pull up the Red Hat press releases from a couple of years ago, you should be able to find out more about it. You've got a big archive on Linux Today, so you should be able to find it there.
LT: We're redesigning Linux Today a bit, but the archives will always be available. There's some really interesting stuff in there.
Alan: Keeping some of the earlier stuff would be important to me.
LT: We use it ourselves researching our own stuff. To me it's a snapshot of history, and it's sort of like the bloodline for a beast we call the Linux Community. It is really, it's a circulatory system. If, for example, you're interested in security, you likely get our security newsletter and so on...
Alan: If you want to go back before that, some of us have tried to keep all of the old mailing-list archives. I've got some 1993-1995 Linux lists, and Ted hopefully has 1991 to 1992. So maybe based on that we can put up an archive of all of the original Linux messages. So back from the very very first days of the actual mailing lists. ...
LT: There's some history there...
Alan: A fair bit of history. The first SAMBA list. SAMBA digest number one. [laugh] The origin of fvwm, which states that the F really stood for "feeble", because Rob Sanders was trying to write a very small window manager and terminal so that he could use X on a 4-megabyte machine.
So it was the feeble virtual window manager. But the feeble has mysteriously disappeared over the years.
LT: No, a different word is always suggested... I think the history is important here. This whole Expo floor, to a new person just walking in, appears to have just sprung up from nothing. Literally overnight.
Alan: I'm pretty sure the first Linux Expo I went to, the stand would fit in the Red Hat booth today. It's not that long ago if you think about it. The first time I met Red Hat was at one of the Expos, they brought me over.
We had a tour of the office, which was like two small rented office buildings. About a five-minute tour and we'd seen everything. And now there are like 450 people with the Red Hat and Cygnus combo. And it really has just sort of... well, it's kind of like at one point I was just sort of fiddling around and playing with an interesting little toy I had and then the rest of the world arrived.
It has been really amazing.
LT: I feel like I'm in the current now and I'm being swept along. Linux Today, for example, there's so much news now, this was my favorite news site. I say was, not meaning that it isn't now, meaning that I no longer am able to keep up.
Alan: That's one of the problems I find with it. As a site, there's too much going on. If you don't look at it for two or three days, you missed so much stuff.
LT: You almost feel like you've missed part of a soap opera, don't you?
Alan: [laughter] You need better filtering so you can sort out the stuff you're more interested in.
LT: Well, we're definitely working on that, as well as some other issues.
Alan: I know this kind of thing. What used to happen, I used to go in the morning, read my e-mail, and click across the Linux news sites. But that click across the Linux news sites started getting longer, and longer...
LT: It's quickly coming to a point where the amount of things happening is beyond the comprehension of most people.
Alan: I have to agree. I follow less and less of it. It's not so hard in my case because I can generally ignore most of the proprietary software developments.
I have some interactions with some of the vendors like Oracle and such. But we're getting useful stuff back from them because people like Oracle and SAP are coming to us and saying, "Linux works great, but we've got a little bit of a benchmark problem here, and that's showing up in some of these things that don't scale very well."
SAP is actually providing kernel fixes for them as well, as are some of the other vendors. Because of course, they all want their software to run really well on Linux. And if that happens to mean a few kernel fixes it's a good investment for them.
LT: This is another major advantage. Of course there's no way that a proprietary vendor can hope to compete with this.
Alan: It's not so hard for a big company [to get the attention of a vendor with proprietary code] to get attention, but if you're a small company it's very hard to get something [like a requested software feature] in. Even to get noticed by the people who otherwise control the software. Whereas in the free-software world you send in the patch.
LT: You can get as involved as you want.
Alan: Right.
LT: How much involvement is IBM having when you look at the integration of the port to the 390?
Alan: Basically, they put out the S/390 code, and they sent me a copy a few days before they did the official release so I could look through and comment on it, on the understanding that I wouldn't jump their press releases and spoil their nice PR arrangement. There were a couple of places where I had to [alter some minor things regarding official device names]. Other than that, the code went right in.
The only one that was really funny: in one of the console drivers there were lots of gaps, bunches of blank lines. I thought, what's with all of these bunches of blank lines in here? You could have at least tidied that file up. They said, "Oh, that's where the lawyers took out the comments."
Just on the one driver. I have no idea what IBM's issue was. But obviously IBM has a chunk of people who are supposed to sit there and protect IBM. But the S/390 code is really nice code.
LT: Have they looked at the kernel and done anything like what SAP did?
Alan: Not that I'm aware of in that way. But now they're offering their journaling file-system stuff.
LT: Yeah, we posted it the other day.
Alan: I haven't looked at it yet.
LT: I was going to ask you how it compares to ext3, Stephen [Tweedie]'s stuff.
Alan: Stephen's stuff is interesting in the sense that you can take an ext2 file system [ext2 is the default Linux file system type], make it journaled, and then turn it back into ext2 trivially, whereas the others are all new on-disk formats. As to whether we end up with one or a collection of them, I don't know. I mean, there are obviously ways to support lots of file systems. We support things like the QNX file system, not because we expect people to go off and use it as their main file system, but because people have got QNX hard disks.
LT: We want to thank you for the time for the interview. I especially want to thank you for your work.
Alan: Mostly other people's work. I mostly organize it is all.
LT: I know, I know--that you're part of the collective, but I feel you are a big part of the success of that work, doing your own things. Linux means way more to me than an operating system means to most people. I can tell you in all sincerity that your work has affected my life in a really positive way. I'm very thankful.
Alan: Oh, good.
LT: I swear a lot less at my computer. It has automation tools where I want them, and it just plain works thanks to people like yourself. So, thank you, Alan.
Alan: You're welcome.
Phil-14 asks:
Do you think that non-x86 versions of Linux will forever remain on the periphery, or will Linux actually become a force for platform independence?
Alan Answers:
The focus of developers is always going to reflect the hardware people have. As non-x86 machines become more prevalent, the importance of those kernel ports will grow with them. PowerPC, for example, has gone from being a fringe BeBox project to a major platform. Embedded systems and palmtops are likely to increase the number of non-x86 Linux platforms. The price squeeze is also going to take its toll -- there are simpler, cheaper processors, and in the end that pricing will begin to count big time.
Intel clearly see IA64 eventually replacing x86. It may be that in time the x86 port is viewed in the same way as the 680x0 port. It may also be that IA64 is a turkey, we all run AMD K7s, and x86 lives on. It's a guessing game.
asad asks:
Do you see yourself still working on Linux 5 years from now? What about other people on the kernel mailing list? And do you think the quality of the code people now write for Linux is still up to the standards of the old days?
Alan Answers:
I have no idea what I will be doing in five years. Somehow I suspect it will involve Linux and Red Hat a great deal.
The code standards haven't changed much. Linus is very keen on having clean, modular and maintainable code. We have ugly code in there, but it's mostly in specific drivers, and quite frequently it is coping with ugly hardware.
Linus is picky, but Linus always was picky. Linux is as good as it is because he is prepared to be a right pain about doing things properly.
aheitner asks:
Okay, there's not a Linux hacker on the face of the planet who wouldn't kill to have your job.
- - Paid by RHADL.
- - Wake up when you want, work when you want.
- - Go to all the big trade shows.
- - Work with the likes of Linus and all the other regulars.
- - Get free toys (and I mean good toys) like PA-RISC systems from HP and Athlons from AMD.
But it wasn't always that way. Back before even RedHat paid you, back when you hacked on your aging spare equipment, what drew you to it? How did you know this was what you wanted to do before you knew about all the perqs involved, or that there would ever be perqs?
Alan Answers:
I'm actually paid by the support side of Red Hat, not RHAD labs. That might sound strange, but if you are selling high-end support to people, they want to know you have someone to fix the really bizarre, and you also need people who can. So in many ways I'm in support.
I did it because it was fun. I'd been doing other free(ish) software stuff like AberMUD before that. I got into Linux to have a better development platform for AberMUD, then got sidetracked somewhat into hacking the OS.
As for the toys. New toys are fun whether they are expensive or not. The important thing is new. So I had lots of cheaper strange devices I was hacking on before - like the Macintosh 68K port, and Linux 8086.
The advantage of working for Red Hat is much more time than money: So many toys so few hours.
Techno_Jesus asks:
I'm always concerned with redundancy, and I think Linux kernel development could benefit from it the same way our servers can. If something were to happen to Linus (albeit very tragic), would you or someone else be able to take the weight that he bears for the kernel development process? I fear that the community is putting all its eggs in one basket and perhaps you are the only viable replacement.
Alan Answers:
It used to be "what happens if Linus gets hit by a bus" now its "What happens if Linus and Alan both get hit by busses". I guess someone like DaveM would take over (yes next years question is "What happens if Linus, Alan and DaveM get ...")Kindjal asks:
How is the whole linux-on-sgi thing going? You were originally the guy behind that....what's your opinion on the sgi embracing linux stuff happening now?
Alan Answers:
I was hardly the guy behind it. Dave Miller did the original ground work while at SGI. Ralf Baechle did a lot of the other work along with Miguel de Icaza. Almost all the work I did was on fixing up userspace packages and building install and bootstrap tools.
The SGI I had is now in the hands of some other people who are still working on it. There is a lot of work within SGI on the new x86 machines and some work on the MIPS boxes. Check out oss.sgi.com
Amphigory asks:
I notice here that you were involved in the creation of the nano-x project at some point. What is your opinion on the continued viability of X-Windows? Should the open source community be focusing on developing something better, or is X the best we can hope for?
Alan Answers:
Ben Pfaff wrote a library for Debian called Bogl that did basic drawing but was very compact -- ideal for non-X boot disks. I had a copy of an old Minix library called mini-X and I stuck them together. Alex Holden and others then decided to actually clean it up and make it work usefully.
It's good enough to play minesweeper, and it probably doesn't need much work to be usable as, say, a Mozilla front end, or to port gdk (and thus gtk/gnome) to it.
For most things X11 is far superior. X is bad at some things -- notably code size and handling fast 3D games. These are all getting fixed. X is a very flexible framework and there is little wrong with X itself as a system. XFree86 4.0 should do a lot to polish up the implementation. It's rarely a good idea to throw out 15+ years of work because it has a few glitches. X is probably relevant to everything but small PDA devices or set-top boxes. My interest in Nanogui is with the Psion5/Geofox Linux port, where you have very tight storage constraints.
Borg[9 of 9] asks:
Alan, with the upcoming 2.4 kernel, is there any work being done to address Linux TCP/IP performance issues? Are there any plans on making the IP stack multi-threaded, and what about the stack spin-lock issues on SMP machines?
Alan Answers:
DaveM, Alexey Kuznetsov and others have been working on this very hard. It is one reason the 2.3.x TCP/IP is currently a little wobbly. It is, however, all happening.
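[Aside: to see why a single coarse lock is the SMP scaling concern this question raises, here is a minimal userspace C sketch. It is not kernel code and is not how the 2.3.x stack is actually locked; it simply shows several POSIX threads all contending for one spin lock, so the work inside the critical section is serialised no matter how many CPUs the box has. The thread and iteration counts are illustrative only; finer-grained locking is what removes this kind of bottleneck.]

/* Toy illustration of coarse-lock contention. Build with: cc spin_demo.c -pthread */
#include <pthread.h>
#include <stdio.h>

#define THREADS 4
#define ITERS   1000000

static pthread_spinlock_t lock;
static long counter;                    /* shared state guarded by the single lock */

static void *worker(void *arg)
{
    (void)arg;
    for (int i = 0; i < ITERS; i++) {
        pthread_spin_lock(&lock);       /* every thread queues up on this one lock */
        counter++;                      /* the "work" is fully serialised */
        pthread_spin_unlock(&lock);
    }
    return NULL;
}

int main(void)
{
    pthread_t t[THREADS];

    pthread_spin_init(&lock, PTHREAD_PROCESS_PRIVATE);
    for (int i = 0; i < THREADS; i++)
        pthread_create(&t[i], NULL, worker, NULL);
    for (int i = 0; i < THREADS; i++)
        pthread_join(t[i], NULL);

    printf("counter = %ld\n", counter); /* THREADS * ITERS, however slowly it got there */
    return 0;
}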
John Fulmer asks:
I'm a network security person and have always been intrigued by the concept of the 'hardened' BSD kernel (which isn't really THAT hard), and some of the role-based and compartmentalized systems out there.
What is the status of the security of Linux (overflows and root hacks), and what do you see as the overall direction, if any, of Linux's security beyond the standard UNIX security model?
Alan Answers:
OK, Linux 2.0 is absolutely traditional. Later 2.0 adds securelevel, which gives a little more security at a usability price. Linux 2.2 uses capability sets, so you can give processes finer-grained rights. You can also revoke rights for all processes, which can be useful in higher-security environments.
There are people playing with role-based models on Linux, although not in the mainstream kernel tree. There are also projects like Stackguard designed to catch buffer overflow attacks. Ultimately the only real way to improve security is careful auditing of packages. On the whole this works: almost no packages that have been audited have had security holes logged against them afterwards. The Linux security audit project is the place to get involved with this. Anyone want to audit the perl interpreter?
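[Aside: for readers who have not met the 2.2 capability sets, a minimal sketch of what "finer-grained rights" looks like from user space follows. It assumes the libcap library (cap_get_proc, cap_set_flag, cap_set_proc), which is one common wrapper over the underlying capget/capset interface -- an illustration, not the only way to do it. The program drops CAP_NET_RAW so the process can no longer open raw sockets, even if it is still running as root.]

/* Sketch: drop CAP_NET_RAW from the current process.
 * Assumes libcap is installed; build with: cc drop_cap.c -lcap */
#include <sys/capability.h>
#include <stdio.h>

int main(void)
{
    cap_t caps = cap_get_proc();        /* read this process's capability sets */
    if (caps == NULL) { perror("cap_get_proc"); return 1; }

    cap_value_t drop[] = { CAP_NET_RAW };

    /* Clear CAP_NET_RAW from both the effective and the permitted sets;
     * once it is out of the permitted set it cannot simply be re-raised. */
    if (cap_set_flag(caps, CAP_EFFECTIVE, 1, drop, CAP_CLEAR) == -1 ||
        cap_set_flag(caps, CAP_PERMITTED, 1, drop, CAP_CLEAR) == -1) {
        perror("cap_set_flag");
        return 1;
    }

    if (cap_set_proc(caps) == -1) { perror("cap_set_proc"); return 1; }
    cap_free(caps);

    /* From here on, opening a raw socket fails with EPERM. */
    return 0;
}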
emil asks:
While I realize that you might not be completely objective about this question, what do you think of the design of the HURD, as it compares to Linux?
I once asked Linus himself this question and he replied in rather annoyed tones that "the HURD is a great academic design that would never work in practice" (or something along those lines).
Richard Stallman has been steadfast in refusing to endorse Linux as the GNU kernel. Does he raise these objections merely for emotional reasons, or does he see the HURD as having real technical advantages to the current monolithic design?
Alan Answers:
HURD is a great concept. Like most great concepts, it isn't efficiently implementable, I suspect. HURD is a GNU vision, and every project needs some lofty, probably unachievable goal.
The HURD design is more about Richard Stallman's ideas about how a system should work to promote community than about high-performance OS design. Linux is a bit more pragmatic about things. We took ideas from the microkernel world (like loadable device drivers) but we didn't take the accompanying partitioning and performance loss.
HURD is a rich flexible environment where the user has a lot of power to say "no I don't like that, I'll write my own code and use it" - even for things like filesystems. Right now HURD is a research project. Maybe one day it will become a useful OS.
Tekmage asks:
How has the multicultural and multilingual participation affected the development of Linux as a whole? Have you begun to see evidence of third-world participation affecting the progress of Linux yet, or is it still in the "hope to see soon" category? What needs to be done/changed to assist in the cross-cultural adoption of Linux? (Unicode?)
When can we expect to see a Linux Universal Translator Engine? :-)
Alan Answers:
We have UTF-8 Unicode support in the kernel for stuff like the console and file names. We have Japanese X11 fonts. GNOME and KDE have a lot of translations, although mostly to left-to-right rendered Western European languages. It's a start.
What do you call third world? Really it's a gradation, and a lot of it is based on perception. I get code and patches from countries as culturally varied as India and Iran. There is a noticeable amount of activity in Brazil, both with the kernel and other packages. The real third-world countries don't yet have the infrastructure to support the Linux development model. Linux helps to give them the tools to create that infrastructure, and I am sure that in time it will come, and I will start getting kernel patches from these countries too.
Developing countries are also in a good position to benefit from the opening of the market. It doesn't matter where I am on the globe: provided I have part-time connectivity, electricity and computers, I can do Linux development work for companies anywhere in the world -- this is one of the other great things about my job. If I wanted to move country, there are almost no logistical barriers to doing that and continuing to work for Red Hat.
Actually promoting cross cultural adoption of Linux is hard. It has to come from people in those countries. Maybe I can get away with putting together a French language Linux distribution in the UK, but to put together a good distribution for any significantly different culture I think you need to be part of it. I expect a lot of the growth in support for other languages, and cultural needs to come directly from people hacking the code in the countries that need it.
Linux development isn't centralised in Redmond, so you can go out and do this. Most of the time we can communicate worldwide. Not always. It's really hard sometimes to follow Japanese Linux projects in Europe and the USA. I guess the reverse is probably true.
... ... ...
LWN: You seem to have taken on an increasingly organizational role in the kernel development process. Does that sit well with you, or would you rather get back to more full-time hacking?
AC: It just sort of happened. I'm anticipating the amount of co-ordination will go down again now that 2.2 is basically sorted out. I hope so anyway as I want to get the I2O code finished, debug the 3c527 ethernet driver I am writing and use the work the vMac people have done to attempt to get palette loading and floppy disks working on the Macintosh 68K.
LWN: And how do you manage to process (and produce) all that mail every day and still get something done?
AC: That is actually fairly easy. I've always been a fast reader and most of the stuff I dig through requires little in the way of a reply - a lot of the mail I send is just pointing people at each other to avoid duplication of work. Much of the rest is tracking bugs and patches.
All the patch merging and bug chasing is getting something done, so the mail scanning is definitely not time wasted.
LWN: Kernel development over the past year has been interrupted occasionally by "Linus burnout" episodes. What are your thoughts on the sustainability of the current development model? Is there anything you would change?
AC: If I was the kernel organiser, there are quite a few things I'd do differently. Right now Linus applies all the patches and builds the trees, I'd much rather there were a group of people directly merging patches into the kernel tree and Linus sitting watching it and vetoing things rather than doing all the merge work, too.
The model has changed over 2.1.x and it has evolved to a kind of compromise that seems to work very well. Linus is still applying all the patches but there are people now collating and feeding Linus tested sets of patches in small cleanly organised groups. Larry McVoy's new version control toys may solve some of the remaining problems.
LWN: Do you anticipate taking on responsibility for the 2.2 series like you have with 2.0, or will somebody else have to step up for that one?
AC: I never really anticipated it with 2.0.x; it simply happened because I was collating all the patches. My guess is that, by the time 2.2 gets into that long term maintenance state, 2.0.x will be basically dead.
LWN: If you were to point at the biggest unmet need in the Linux world, the project most in need of volunteers currently, which would that be?
AC: That's always a hard question to answer. One problem is that, as Fred Brooks observed, adding manpower to a late software project can actually make it later. This is true with free software as well.
Clear areas that could do with more work are better GUI tools for the kernel facilities. A nice graphical ISA PnP manager is one example. Others include some end user friendly tools for the new 2.2 bandwidth control and management functions.
The free software world doesn't really, however, work like a managed corporate structure. If someone is going to do something as a volunteer, it has to be something they find fun. Watch freshmeat, and also watch for calls from people like the FSF (e.g., they are currently after more documentation people), and see if something tickles your fancy.
LWN: What do you think was the most significant event in the Linux world in 1998? Any idea what will be the most interesting development of 1999?
AC: The most visible one was clearly the sudden discovery of Linux by the suits and very much tied to the Mozilla event. I'm not sure what surprises 1999 will hold. Microsoft's current attempts to commit corporate suicide are bound to have some effect on the Linux world in 1999. Whether they will be the most significant is hard to tell. The other one will be if large PC vendors start to make machines with Linux or no OS available. The recent fun with the Windows Refund saga will undoubtedly help this. It's actually important that it becomes easy to buy a machine without an OS. It would be bad for people like the FreeBSD community if most of the people fighting for OS choice simply said "OK, now you can have Linux" and left it at that.
My guess is 2000/2001 will be when the really big stuff happens. That I suspect is the time scale for big Unix vendors to begin openly switching to Linux. For vendors whose revenue stream is primarily support and hardware, the math is simple enough.
LWN: Along those lines, what are your thoughts on the future of the BSD variants? Will Linux be their undoing? Is our relationship with the BSD systems what it should be?
AC: I don't think Linux will kill FreeBSD. I can see one of OpenBSD or NetBSD dying. At one point I'd assumed NetBSD was doomed, but it has very definitely stayed alive.
In the longer term, I expect that Linux will help them. Supporting the Linux kernel API (something they already do fairly well) will give them the same application base that Linux is creating. With SCO, BSDI and apparently Solaris going to support the Linux kernel API, we should see a lot of applications for Linux running just fine on anyone's favourite OS.
LWN: How do you feel about the increasing corporate interest in Linux? Does Linux risk "losing its soul" as some people fear?
AC: Linux has always reflected its user base so I'm sure that some parts of it will turn more corporate. I don't actually see it as a big problem. No large corporation can "own" Linux or take away the right to freely distribute and change it.
Personally I don't mind if someone releases a Linux distribution aimed totally at corporate IT managers. I'm sure the technical community will use words like "boring, out of date, slow to change" about such a distribution. I've met corporate IT managers - words like "boring and slow to change" have them excited.
Linux already has this spectrum - from the corporate style Caldera use, through the "easy to use/install" of Red Hat and the "pure and free" vision of Debian. It's richer for it now; I don't see why it should be poorer for it in future.
MT: ...and, may we understand that "switching to Linux" means IBM quitting developing AIX and concentrating only on Linux, for example?
AC: I couldn't see IBM switching entirely to Linux. Linux is a very good general-purpose OS. Right now there are plenty of specific things that each vendor has specialised in as part of their business that Linux doesn't do. It's the question of which product they offer first when you say "I want to buy a web server", rather than which products they offer in total.
MT: The second question is about this part of your diary (http://www.linux.org.uk/diary/)
March 1st:
> Tomorrow LinuxWorld starts and everyone can go and celebrate all the
> binary vendor software being released for Linux, and then perhaps
> start open source projects to make it look slow and obsolete. One
> thing Linux and especially the current desktop projects like Gnome
> and KDE are going to do is to weed the 'good' commercial software
> out from the junk.
Since I am not a native English speaker, I do not have much confidence in my understanding. Would you please confirm whether I am correct or wrong?
My understanding is that you think junk binary software will vanish because of the open source projects; however, if the software is 'good' enough, it will survive. Put another way, businesses that sell software (not support) will not completely vanish in the future. If it is 'good', software that is used widely by ordinary people will still sell in the future, even if it is not specially customized and does not apply advanced technology like machine translation, etc.
Am I correct? If I am wrong, would you please let me know what you meant with this part of diary?
AC: I think poor quality binary software will vanish. Anyone who is selling something that can easily be written freely to the same standard will find that it gets rewritten.
Good quality binary only software I'm sure will survive. I can see the fact its binary becoming a negative point, but not the only thing that matters. Perhaps the conversation in a board meeting in 2001 will go
"We could use Oracle for our database"
"But thats binary only, which makes it a risk"
"True but it is the only thing that scales to our data size"
MT: And, would you tell us what 'good' means?
- if it is 'good', how should it be? (features, functionality?)
- which existing binary product would you call 'good'? (if any)
AC: "Good" means 'does the job people want it to do, and does it well' -- it's not a very precise definition.
I'd call Quake a good binary product, for example. The Linux community loves Quake, and bought lots of copies. Writing a free Quake clone would have been a big task at the time it was released.
Interview with Alan Cox -- Looks like this one is lost :-(
5. Alan Cox In Charge Of 2.0 Development Linuxcare
[27 Mar 1999 - 31 Mar 1999] (13 posts): [PATCH] linux/net/ipv4/arp.c, kernel 2.0.36 (& 2.0.37-pre9)
Brian Moyle posted a small patch to the list to fix the obsolete function arp_set_predefined() (http://lxr.linux.no/source/net/ipv4/arp.c?v=2.2.5#L340) to work with external loopback with crossed routing. The test setup he used to verify the patch was to connect two NICs together using an external cross-over cable; he swapped route entries (to force packets out the opposite interfaces), and then issued pings, telnets, and ftps to both addresses. He unplugged the cable to verify that the pings, etc. were appropriately halted.
Alan Cox was unamused. He said, "The kernel knows perfectly well that it's silly to send packets to yourself via external interfaces. This patch isn't needed. The bug is your routing table and trying to set up this configuration."
Pavel Machek objected that he could also benefit from the patch, and Pete Popov was also nonplussed. To Alan he replied, "why is it silly? Because it's not useful to you or any other kernel hacker? Neither you nor any other kernel hacker can envision all the possible ways linux might be used. We find the patch _extremely_ useful in our production and test line. It allows us to fully test and manufacture multiple I2O LAN adapters in a single system and it would be a lot more convenient if the patch was part of each linux installation, rather than having to compile a new kernel each time." He also insisted that the bug really was with the kernel.
But Alan stood firm, with:
There are two reasons I'm not putting this in:
- It's a very specialised need. The patch is fine if you need it, add it - it's more specialised than other "not needed" patches people keep out of the main tree.
- It's on a common code path. Sum the number of times that path is taken by the number of users of Linux versus the 3 or 4 who actually need it.
There isn't anything wrong with the patch, it's just not relevant to the userbase as a whole and it's on a regularly executed path.
Richard B. Johnson had some soothing words to offer the pro-patch contingent. He said, "I think you could also send it to Alexy and if he likes it, Alan will probably be persuaded to put it into the mainstream. I think it is a very good idea. If it is implemented in such a way that it has essentially zero impact upon normally-configured networking, I think you have a winner. However, you are going to have to convince several important people. These people are reasonable, but not easy. Don't give up yet."
One other interesting feature of this thread is that it clears up any lingering doubts about who is in charge of the 2.0 series. Maybe if Linus Torvalds wanted to throw his weight around he could take over 2.0 development -- and maybe he couldn't; but for all intents and purposes 2.0 is a kernel fork belonging to Alan.
One large vendor likes to talk about the risks of Open Source software, but the strange thing is, the risk is actually in closed technologies. This article looks at the real risks in following a proprietary software path.
- 1. Copyright and Licensing
- 2. The Risks of Closed Source Computing
- 3. Credits
[November 11, 1998] Slashdot Feature: Cathedrals, Bazaars and the Town Council -- the "Town council" or "The committee for the administration of the structural planning of the Linux kernel" effect, by Alan Cox (I applied bold to the phrases that I consider the most relevant -- NNB). Mirror: Cathedrals Bazaars and Town Councils
These are some of my thoughts on the Bazaar model that I figure are worth sharing. It's also a guide to how to completely screw up a free software project. I've picked a classic example of what I think is best dubbed the "Town Council" effect (although town councillors may think otherwise).
...The first thing to understand is that really good programmers are relatively unusual...
Secondly you need to understand that a lot of the wannabe real programmers are very good at having opinions. Many of them also catch buzzword disease or have some speciality they consider the "one true path". On the Internet talk is cheap.
The third part of any software project is what we shall call "the masses". They range from people who don't program but contribute massively in other areas -- documentation, helping users and artwork -- to the sort of people that are often used to argue that you should require a license to connect to the Internet.
<discussion of Linux 8086 project deleted>
The problem that started to arise was the arrival of a lot of (mostly well meaning) and dangerously half clued people with opinions -- not code, opinions. They knew enough to know how it should be written but most of them couldn't write "hello world" in C. So they argue for weeks about it and they vote about what compiler to use and whether to write one - a year after the project started using a perfectly adequate compiler. They were busy debating how to generate large model binaries while ignoring the kernel swapper design.
Linux 8086 went on; the real developers had many of the other list members in their kill files so they could still communicate via the list, since there were simply too many half-clued people milling around. It ceased to be a bazaar model and turned into a core team, which to a lot of people is a polite word for a clique. It was an inevitable defensive position in the circumstances.
In the Linux case the user/programmer base grew slowly and it grew from a background group of people who did contribute code and either had a basis in the original Minix hacking community or learned a few things the hard way reboot by reboot. As the project grew people who would have turned into "The committee for the administration of the structural planning of the Linux kernel" instead got dropped in an environment where they were expected to deliver and where failure wasn't seen as a problem. To quote Linus "show me the source".
If someone got stuck they posted questions, and there was and is a sufficiently large base that someone normally has both the time and the knowledge to reply. In the Linux 8086 case the developers had long since walled themselves off. A better ratio of active programmers to potentially useful wannabe programmers would have rapidly turned some of the noise into productivity. The project would have gained more useful programmers, and they in turn would have taught others. As with any learning exercise, you are better off having only a few trainees at a time.
There is an assumption some people make that you can't turn the "lesser programmers" into real programmers. From personal experience in the Linux project, there are plenty of people who, given a little help and a bit of confidence boosting, will become one of the best. There are many who won't, but enough that will. [1]
<deleted>
The lessons from this project, and others that went the same way (and sometimes died - remember the earlier Linux word processor projects) are fairly clear:
- Release code right from the start. It doesn't matter if it's not very useful. The best way to sort out a town council is to simply do the job and then tell them it has been done. Linux, KDE and GNOME have all taken this attitude and all done well from it. You can argue about the right way to program for a lifetime. Once there is code out there, people (whatever their skill) can play with it.
- Appreciate that there are people who, with a bit of help, will contribute a great deal to a project. If their first patches are buggy, don't put them down; explain why there is a problem and suggest solutions or places to look for examples of solutions. Every minute spent answering real questions and helping someone work on a project will be paid back ten-fold to the project, and incalculably to society.
- Don't forget non-programmers. I find it sad that many people, when asked to "name the five most important Linux kernel people", rarely name some of the most important folk of all -- the all too forgotten people who maintain web sites, change logs, mailing lists and documentation. They are just as important.
Linus says "Show me the code". That is a narrow view of a real project. When you hear "I'd love to help but I can't program", you hear a documenter. When they say "But English is not my first language", you have a documenter and translator for another language.
- Try and separate useful people from the noise. It is hard to separate people trying to help from a mass of pointless discussion and in the Linux 8086 case I definitely did the wrong thing by giving up on that goal. How to remove just those who talk and do not do anything is a research topic 8).
So next time someone wants to vote on a project, or discuss issues for a month and then implement it -- be warned. They may end up with the right solution. The odds are, however, in your favour if you carry on regardless. Just ask them to send you a patch when it works.
Beware "We should", extend a hand to "How do I"...
Alan
[1] As an example of this claim: the original author of the Linux IPv6 code used to sit on IRC from Portugal, playing with a few basic ideas and asking questions. After we helped him figure out some of the kernel internals, he wrote probably 75% of the Linux IPv6 stack and was last seen working in the USA for Cisco.
They (or rather he as it appears to be) _refused_ us access to the system. I have basically given up on Mindcraft's Microsoft funded pranks.
The careful use of the word "support" to imply we somehow validate his test is misleading. I sent him about four emails.
Only on May 4th did he offer any kind of open testing. The response I sent him was very simple
> I think I would prefer to see an open benchmark done on your configuration
> and test plan by an alternative testing body. That would also be better
> for all parties.
I think I speak for much of the community in feeling that. No such test has yet occurred. I have no faith that Mindcraft can or would deliver an accurate assessment of the general question of Linux v NT performance.
Bjorn: Linux development needs to go where the developers lead it. That's a Zen-like answer, isn't it?
You shouldn't even try to predict what unconstrained developers can achieve; you will almost always be wrong. Instead, you will almost always be amazed and overwhelmed. The only "threat" to Linux is if the world should decide not to communicate openly, especially if it were to stop using open protocols. Linux would survive even that, but its expansion might slow down, at least for a short time.
LJ: How do you feel about Linux's current popularity? Would you have preferred it stayed contained in the hacker community? Would it have survived on the fringes?
Bjorn: I really like the almost exponential rise of the popularity of Linux-based systems, since a good product deserves success and people deserve a good product. Linux would have survived without this "explosion", but it's more fun this way ...
LJ: Would it have survived without the IPOs and financial backing? What impact has the commercialization of Linux had? How do you feel about Linux profiteering and the people who make millions off of other people's volunteered efforts?
Bjorn: I have no problem whatsoever with people making money from Linux-related activities, as long as Linux stays open, which it will. The "popularity explosion" would quite likely not have happened without financial backing. The "hard-core hackers" would have been involved independently of this explosion, but the end users would not.
LJ: How can Linux compete with Microsoft in the desktop sector, and will we be able to hold the commercial sector if we don't take the desktop, as well? Can we take the desktop without ruining the spirit of Linux by dumbing it down? Where will our next areas of growth and expansion be?
Bjorn: To be completely honest, I don't really care. If it's important to other people, then they will make sure Linux is a strong and worthy competitor in whatever area they feel is important to them. It's not important enough for me to spend any significant time on. I'm only interested in getting access to an environment that fills my needs, which is what my Linux-based system does. If I need something completely new in my environment, then I will build it. If that is useful for other people, then that's a nice side effect.
LJ: How do you feel about commercial applications being written for Linux, and proprietary software and protocols in general? Do you run Linux more for philosophical reasons or practical reasons? If something that appeared to be better came along, would people jump ship? Conversely, would we stay with Linux even if it somehow degenerated, took a wrong turn, or stopped progressing?
Bjorn: If people want to create closed proprietary applications, they should. If people want to buy those applications, then both the seller and the buyer will (hopefully) be happy. I don't have a problem with that, as long as it is also possible to create open alternatives. What I would have a problem with is if someone tried to lock competitors out and customers in by creating closed proprietary protocols or file formats, or cheating by other means, such as patenting more or less obvious software algorithms.
Personally, I will almost always prefer an open alternative, since then I can learn new things, and especially since I can fix problems and extend the functionality when and how I decide to. I know what I want and expect from my system, and as long as the Linux-based distributions fill my need for a high-quality, open, UNIX-compatible environment, I will stay on. What the rest of the world decides to do is not really that important to me. It will not have a strong impact on my decisions.
LJ: Do you think the community should support only open-source/free software? How would the community survive hard times if there were a lag or down time in the continuing success of the open-source methodology? Is the free software philosophy strong enough and with enough adherents to pull us through?
Bjorn: The community consists of individuals who have the power and responsibility to make their own choices. If anyone wants to support closed software, let them. Those who feel otherwise will create open software. For some reason, I come to think about Pandora's box: once open, it can't be closed again. So unless you outlaw the Internet and the process of thinking, there is no way of stopping this. It's way past critical mass.
LJ: How do you feel about the different licenses: GPL, LGPL, QPL, etc.?
Bjorn: I admit I do not always fully subscribe to the FSF "political" agenda, since I accept that some software is closed and proprietary. My principles are:
- I demand open protocols, APIs and file formats, since I want to decide what communicates with what and how.
- I want open tools, since I can then adjust them to do what I want.
- I like open applications, since I can then fix bugs without having to wait for the next release.
This means I can live with closed applications as long as they use open protocols, APIs and file formats. I'm a bit ambivalent about the "GPL virus", which means I prefer the LGPL for libraries. It should be possible to translate your "business edge" into applications without having to reveal everything to everyone. On the other hand, I like to learn new things by reading the source.
So, if there is a closed application which I feel a great need to modify, and no suitable open-source equivalent, then I will quite likely spend time re-writing it. I will also quite likely make the source freely available. The license I will use will probably be some version of the open-source licenses. The bottom line is that the people who do the release should decide for themselves what license to use. It's nobody else's business.
LJ: Is there a world outside of computers? Are you ever afraid that you'll wake up one day and feel you have wasted your life in front of a computer?
Bjorn: There is definitely a world outside of computers, and I try to enjoy it as much as possible. I have never felt a need to waste any significant part of my time regretting my past actions. I have enjoyed most of what I have done, and have learned something useful from the rest.
LJ: What part of Linux were you personally interested in and working on? How are you still involved with Linux development?
Nick: I was interested in the areas that I needed to have working for me. I contributed patches to libc4 when I found problems that affected me. I contributed tab expansion for the tty layer in the kernel when I wanted to use a dumb terminal that couldn't handle hardware tabs.
I added the dummy network driver to ease the use of Linux with a dialup connection.
Normally, my involvement is restricted to tracking the Linux kernel mailing list and browsing the patches. I'll submit minor patches from time to time, but I am not a mainstream contributor.
LJ: What was most important to you about Linux? What's the very best thing about Linux?
Nick: I like the model of being able to modify the source to fix a problem or add an enhancement you need. You can then submit it back, and if it is seen as generally good, it gets included.
LJ: How important was the GNU project, and how did the GNU Hurd factor into your thinking? Should Linux be known as GNU/Linux?
Nick: To me, GNU Hurd is an interesting project, but it never appeared to be the way of achieving my ambition of a home UNIX machine. The GNU project was very useful in getting a broad suite of user applications, without which the kernel is not of much use. Most important of these has to be the compiler, gcc. However, while the GNU project should be acknowledged for its part in assisting Linux, I don't agree with having the title "GNU/Linux" applied to all Linux distributions using GNU software.
LJ: What was it like to be working with others over the Internet at a time when several computer luminaries thought that organizing successful software development over the Internet was difficult if not impossible? Did you realize how revolutionary this approach was?
Nick: To me, it wasn't that revolutionary at the start. I had seen it in action with the various source newsgroups (alt.sources, comp.sources.unix, comp.sources.misc), where I could make changes, submit them back to the author and see them in the next release. Initially, Linux wasn't that different; it was an OS kernel rather than an application. It just grew to a much larger scale.
Over time, it has become more amazing. As the size grew and the number of contributors increased, it has been amazing to see the same success. It has been good to see Eric Raymond's writings helping to clarify exactly what the phenomenon is.
LJ: What are you doing with your life now? What's a typical day like in your life? How do you find time for work and Linux, and how do you balance free software with the need to make a living (or the desire to become rich)? What do you do for fun?
Nick: My job involves the development of business-to-business e-commerce solutions, mainly on NT or Solaris, but not Linux. This isn't bad, because it allows me to separate work and play in a clean way.
I have to try to spend some time away from the computer, as my wife is not into computing. Pam is very understanding and accepts that playing with computers is a hobby. Away from the computers, we go scuba-diving, hill walking and cycling, and we are avid readers.
LJ: Who do you think other than Linus has had the most influence over the Linux community, and why?
Nick: Alan Cox has been the person I have noticed most. He has been involved with major input to various parts of the kernel (networking, SMP, sound). More importantly, in my opinion, he has performed the very valuable job of maintaining the stable branch of the kernel. Although the development branch is where the kernel action is, many people just want a stable kernel for their production machines.
LJ: How do you feel about Linux's current popularity? Would you have preferred it stayed contained in the hacker community? Would it have survived on the fringes?
Nick: I think without the popularity, the pace of development would have slowed by now. I would probably have continued using it, but I think it would have remained a hacker's plaything and dwindled over time.
LJ: Would it have survived without the IPOs and financial backing? What impact has the commercialization of Linux had? How do you feel about Linux profiteering and the people who make millions off of other people's volunteered efforts?
Nick: I wish anyone well with their efforts to profit from Linux. Overall, the companies are creating a net benefit to the Linux community. For example, Red Hat and SuSE are each in the position to employ important hackers, which means they don't suffer from real work getting in the way of their Linux work. This is one of the hazards of free software development.
LJ: How do you feel about commercial applications being written for Linux, and proprietary software and protocols in general? Do you run Linux more for philosophical reasons or practical reasons? If something that appeared to be better came along, would people jump ship? Conversely, would we stay with Linux even if it somehow degenerated, took a wrong turn, or stopped progressing?
Nick: I think there is a place for commercial applications being written for Linux. Just because the OS and many of the standard applications are free doesn't mean they all have to be. If a company has to invest in producing an application for Linux, then they have the right to charge for it.
On the other hand, I don't like proprietary protocols. I think the success of the Internet is largely due to heterogeneous machines sharing common protocols. Having open protocols gives you the opportunity to inter-operate.
LJ: How do you feel about the different licenses? GPL, LGPL, QPL, etc.?
Nick: I recognize that there will never be an open-source solution for every application, so you will never have everything available under the GPL. Without the libc being LGPL, there would never be the range of business applications available for Linux that there currently is.
LJ: Is there a world outside of computers? Are you ever afraid that you'll wake up one day and feel you have wasted your life in front of a computer?
Nick: I don't expect to think I've wasted my life. Even if I unplug myself tomorrow, to pursue my career as a hermit, I've still enjoyed the experience so far.
LJ: What attracted you to it, compared to FreeBSD, proprietary UNIX systems or lucrative areas such as Windows? What made you want to help with development?
Pauline: The source, and the way it did things. You could follow the changes in the source by the patches and develop a real gut feeling about the OS. Money was of no concern to me in those days, being a student, so Windows didn't speak to me, especially 3.1 without network support!
LJ: What part of Linux were you personally interested in and working on? Are you still involved with Linux development? If so, how?
Pauline: I started doing just little improvements to new patches I saw drifting by on the mailing list, and after that, when I got my own IP connection, developed a real itch to connect all my computers to the 'net. This called for some real programming, and since Alan Cox at the same time added code for NAT and such stuff, I immediately saw the connection and created the IP masquerading module. After a few iterations, this was approved by Alan and added to the kernel. Later, the ftp-rewriting modules were developed. Nowadays, I'm mainly interested in video4linux, for which I maintain the in-kernel zoran driver and some VCR projects.
LJ: What was most important to you about Linux? What's the very best thing about Linux?
Pauline: The speed at which it progresses and the way you could participate in it. Normally, in those days, you would have to wait for your supplier to release a new version, which is almost always commercial/marketing-driven. Not so with Linux.
LJ: How important was the GNU project, and how did the GNU Hurd factor into your thinking? Should Linux be properly known as GNU/Linux?
Pauline: Very important. Without it, we would have only a kernel without any real working tools. Minix came with its own set of limited tools, but it lacked the real UNIX feel. FSF/GNU provided this missing link. GNU Hurd never entered my thinking; it was promised a long time ago, and as of today, I have never seen it working. It's okay to have discussions about monolithic and message-based kernels, but I tend to be pragmatic. Show me a working kernel, please. :) Linux, in my opinion, should not be known as GNU/Linux. It's the OS that counts. To me, GNU/Linux would denote a distribution, like Red Hat Linux. Hey, a distro made by GNU, why not?
LJ: What was it like to be working with others over the Internet at a time when several computer luminaries thought that organizing successful software development over the 'net was difficult, if not impossible? Did you realize how revolutionary this approach was?
Pauline: I never realized we were doing revolutionary stuff 'til I read the papers by Eric Raymond. At that time, I felt really proud :)
LJ: What are you doing with your life now? (occupation, family, etc.) What's a typical day like in your life? How do you find time for work and Linux, and how do you balance free software with the need to make a living (or the need to become rich)? What do you do for fun?
Pauline: Currently, I'm running my own small business which specializes in writing software and Internet consultancy. Furthermore, I'm one of the board members of a small Dutch Internet Provider (IAF). Of course, I still keep up with the latest developments, but sometimes time gets really short.
A typical day in my life starts out by reading my mail — lots of mail. Mainly these are questions from customers of IAF, some are work-related and some are private. 80:15:5, I would say. This takes all morning. Of course, I'm continuously interrupted by the phone, people asking for this and that ...
After business-closing hours, I keep working for a bit, tying up loose ends. During these hours, I find the time to work actively on the projects I'm involved in, but not all these projects are Linux-related!
I have been living with my boyfriend for 7 years now, and he also takes up some of my ample free time, but it is well-spent time, I hasten to add :)
Fun — funny you mentioned it, I have heard of the word, of course, but... but... I read a lot, SF mostly. No real physical activities, I found them too tiresome... :)
LJ: Who do you think, other than Linus, has had the most influence over the Linux community, and why?
Pauline: Alan Cox, who maintains the 2.2 series, has done a lot of stuff for Linux and is still on top of a lot of things; he deserves an A+ in my book. David Miller, as the network guru, deserves some praise, too. I would say various people have a lot of influence in their own field (for IDE drivers, one looks to Andre; for SCSI, to Gerard; and so on), so it's difficult to say who has the most influence, but all are very gifted in their field!
LJ: What do you think is the most important addition or change needed by Linux in order for it to succeed further? In what direction does Linux development need to go? Where is Linux's future the brightest? What is the #1 biggest threat to Linux today?
Pauline: Full-blown USB. USB devices are crawling out of the wall everywhere, and not supporting all of them would be a deadly sin. Whether Linux succeeds further will not depend on the kernel itself. We need a good desktop with good office tools which are accessible to Joe Average, who is still a drag-and-drop kind of guy. Linux's strengths are still its development speed and, looking at common usage, the server market. It makes a tremendous network/web/mail server with outstanding firewall capabilities, which will only get better in the upcoming 2.4 series.
The biggest threat to Linux would be if Linus got hit by a bus and couldn't manage to read his mail for four months. Of course, other people would stand up, but since there are a lot of capable people on the 'net, a lot more discussing would happen. Linus' word is law (most of the time *grin*), and people take his word as final. Not many people I know would dare to disagree with him in public :) So, not having Linus around would waste a lot of valuable time in discussions.
LJ: How do you feel about Linux's current popularity? Would you prefer it had stayed contained in the hacker community? Would it have survived on the fringes?
Pauline: I feel good about one side, bad about the other. I don't want Linux to be just hype, which blows over when the next OS becomes available. I don't really care for Joe Average; I just want a good OS. Selfish of me, I know. It would not have stayed in the hacker community, but for me, it would be sufficient for Linux to gain a good foothold in the server market. The user business is a difficult, low-profit market, and doesn't bring much to us developers.
LJ: Would it have survived without the IPOs and financial backing? What impact has the commercialization of Linux had? How do you feel about Linux profiteering and the people who make millions off of other people's volunteered efforts?
Pauline: I think it would have survived without financial backing, but all development would have gone at a slower pace. Also, we would have seen good people move on into the business world and more new people starting out. As it is now, the older guys stay around and help the newer people by preventing them from making the same mistakes they did. This, of course, speeds things up tremendously!
Other people taking advantage of Linux makes me unhappy sometimes, but then again, sometimes I get good feedback or even hardware as a way of saying thanks for my help in the community! Need I say more? Hacker's paradise :) I'm in it for the fame, not the money. Not that I don't make money from Linux; for my business, I regularly install Linux servers for customers, but I don't charge them for the OS, just for my time.
LJ: How do you feel about commercial applications being written for Linux, and proprietary software and protocols in general? Do you run Linux more for philosophical reasons or practical reasons? If something that appeared to be better came along, would people jump ship? Conversely, would we stay with Linux even if it somehow degenerated, took a wrong turn, or stopped progressing?
Pauline: Commercial apps are okay with me. I would not buy them, I think, if there was a free alternative on the Internet, but I wouldn't stop them. Proprietary software is okay, but don't force it on people. When I buy hardware, I want it to have enough information so I can write a Linux driver for it (if it isn't there already), or else I simply won't buy the card. So, most proprietary hardware is out in my book.
I don't think people would jump to an alternative OS if it came along. I think the Linux community would try to incorporate the good things of that OS and go on to other innovative designs. A wrong turn is all in the eye of the beholder. Some people think their patch is an absolute necessity for the kernel, and some don't. This will always be part of open-software development.
LJ: How do you feel about the different licenses? GPL, LGPL, QPL, etc?
Pauline: I like the concept of "I develop something for free; you may use it, but don't make money off of it, otherwise I want a piece of the cake". However this is legally phrased, I don't really care :)
Linux Journal: How did you first learn about Linux? What were you doing in your own life at the time?
Drew Eckhardt: I saw a Usenet posting from Linus along the lines of "I've thrown together this Minix-like system to teach myself about the i86 architecture. You guys might want to check it out ..." I was an 18-year-old student studying computer science at the University of Colorado. One condition attached to my Woz scholarship was working for the university as a jack-of-all-trades system administrator, with Evi Nemeth as my boss. She gave me a login, told me about the man command, explained sudo, and let me loose with root access in the CU-CS undergraduate lab.
It didn't take long for me to decide that UNIX was hacker friendly — "hacker" in the classical meaning of cobbling bits and pieces together in an elegant manner to make them do interesting things.
LJ: What attracted you to it, compared to FreeBSD, proprietary UNIX systems, or lucrative areas such as Windows? What made you want to help with development?
Drew: At that time, the only freely redistributable BSD was the Jolitzes' project. They said that since it was a research system, it didn't matter if it ran on everyone's system, and weren't accepting patches. At the other extreme, Linus accepted my changes to the boot blocks and IDE/WD1003 driver and had new releases out within the day. Since I wanted to have UNIX on my machine without spending rent money on more (supported) hardware, I preferred Linus' approach. Later, Linux ran better on small systems, I had inertia, and the AT&T lawsuit kept me away from the BSDs.
At that time, I'd never run any of the commercial PC Unices. Later, I used SCO in a commercial environment and found it slower and less stable than Linux. Money wasn't really a consideration. I started doing UNIX in my free time because its interfaces and tools were more programmer friendly and elegant than the Microsoft products. Recently, I've discovered that competent people's salaries are more a function of their negotiating skills than the area(s) they've decided to work in.
With regard to development, I wanted to run some free UNIX on my hardware. Since I didn't like what Bill Jolitz was doing, that meant Linux. When I first got it, the boot blocks didn't work on my system (I converted the source to A86 with a Perl script and tweaked them until they worked). After that, the disk driver wouldn't come up when the system had non-IDE/WD1003 drives. Linux worked well on my 45M MFM drive, although I figured it would work better with my 85M SCSI drive.
I was too impatient to wait for someone to fix these problems, and solutions (albeit not necessarily the most correct or elegant) weren't too difficult. Farther on, I continued to contribute to the Linux kernel because it was fun. I did the SCSI-HOWTO to cut down on the number of times I answered the same questions in e-mail or on Usenet.
LJ: What part of Linux were you personally interested in and working on? How are you still involved with Linux development?
Drew: The SCSI subsystem. And, no. As I began playing with more interesting projects professionally, I no longer had a void that needed filling in my spare time. Developing for the Linux kernel and user land would also be too close to what I do at work (proprietary FreeBSD VFS code and user-land system software). The few UNIX hacks I've thrown together at home have been under FreeBSD because of its more coherent build process.
LJ: What was most important to you about Linux? What's the very best thing about Linux?
Drew: It was an opportunity to play with an interesting non-trivial software project. The best thing about Linux is the size of the community, because of the number of programmers within it who've contributed device drivers and user-land ports.
LJ: How important was the GNU project, and how did the GNU Hurd factor in to your thinking? Should Linux be known as GNU/Linux?
Drew: The GNU project was very important, in two ways. One, it provided a usable, free tool chain which allowed anyone to contribute to the project. Two, it demonstrated that large, distributed software projects were viable. There is a third reason, somewhat. All the workstation vendors (Sun, DEC, HP) had used substantial amounts of BSD code and made minimal contributions back to the freely available sources. The GPL has forced companies to contribute their changes back to the Linux effort and to release their drivers in source form, so that bugs can be fixed even if the company is unwilling to do so.
The motivation behind Linux was a useful system. The HURD seemed like it would be a playground to explore interesting ideas, even if they made for a less practical system. With a 4M 386-33, I preferred the "useful system" approach.
"GNU/Linux" was RMS' attempt to exploit the success of Linux. Tom Christiansen looked at how much of his Linux machine was GNU, X consortium, or BSD. If I recall correctly, he found that less than 20% of our code came from the Free Software Foundation. Other sources each contributed more to the distribution he was using.
LJ: What was it like to be working with others over the Internet at a time when several computer luminaries thought that organizing successful software development over the Internet was difficult, if not impossible? Did you realize how revolutionary this approach was?
Drew: In hindsight, the development effort wasn't too different from commercial environments where developers hide in their offices, work on some subsystem and release the code as certain functions are completed. Unfortunately, both those commercial situations and the Linux kernel lack the design and implementation gatherings (I'm reluctant to invoke the negative connotations attached to "meeting") which provide the multiple viewpoints needed for the cleanest, smallest, fastest and most maintainable implementations.
Linux and some commercial projects also suffer from the lack of formal automated regression testing, making it hard to pin down what changes resulted in bugs or performance degradation. Having never worked on a non-trivial project before Linux, I had no baseline to compare it to. I didn't know what was missing from the development process until I lucked into my third, full-time professional position.
LJ: What are you doing with your life now? What's a typical day like? How do you find time for work and Linux, and how do you balance free software with the need to make a living (or the desire to become rich)? What do you do for fun?
Drew: I'm a software engineer. The company I work for builds digital video servers for the broadcast (commercial spot-playout, news show broadcast, east to west coast time delay) and post-production (computer effects) markets. Our boxes run FreeBSD on custom i86 hardware. I'm still single, alternating between periods of dating and swearing off women as an unneeded frustration.
A typical day? Depending on who I'm working with (people playing with the same subsystem working 9-to-5 hours call for an earlier arrival), machine shortages during prime working hours and whether I'm currently suffering from a bicycling addiction that calls for lunchtime rides, I generally head in to work between 10 a.m. and 2 p.m. Since I'm allergic to commuting, I don't work outside town. Currently, this makes for a 3-mile journey each way by foot, bicycle or motorcycle.
A couple of hours later, I assemble a sandwich or inhale junk food for lunch. Dinner calls for dining out at 7 or 8, with the restaurant selection depending on who's going (I enjoy almost everything except American and European vegetables, although one of my co-worker dining companions can't handle "foofy" garnishes and most foreign food). Assuming no deadlines are coming up, 8-10 hours of work later I go home, feed my fish, and often overdose on pinball.
All said, I make a comfortable living developing in an all free-software environment.
LJ: Who do you think other than Linus has had the most influence over the Linux community, and why?
Drew: Probably Alan Cox, for what he's done with the Linux network code and various other projects.
LJ: What do you think is the most important addition or change needed by Linux in order for it to succeed further? In what direction does Linux development need to go? Where is Linux's future the brightest? What is the #1 biggest threat to Linux today?
Drew: In the server environment, clustering is most important. On the desktop, Microsoft compatibility. As for threats, in the server market, I could say Microsoft because in theory they could produce better server software. On the UNIX desktop, I could say FreeBSD, because its CVS repository makes it much easier to track the latest changes, and its installation is more complete and coherent.
However, there's a lot to be said for inertia. "Everyone knows" Microsoft's operating systems crash often and their server software is insecure and doesn't scale. It will take a lot to overcome that. A lot of people could run one of the BSDs, although they don't offer enough obvious advantages for Joe Average User to switch. Jane Doe is going to install Linux rather than BSD because that's what her friends have and mass media is covering.
LJ: How do you feel about Linux's current popularity? Would you have preferred it stayed contained in the hacker community? Would it have survived on the fringes?
Drew: Sure. Linux would have survived as long as there were developers running it.
LJ: Would it have survived without the IPOs and financial backing? What impact has the commercialization of Linux had? How do you feel about Linux profiteering and the people who make millions off of other people's volunteered efforts?
Drew: Commercialization has given free software credibility, which allowed people to use it in more commercial environments. It's also allowed for a few IPOs, which have enriched a large number of stock portfolios to varying degrees.
With regard to profiteering, I'm a big fan of capitalism. I also tip reasonably when I dine out, and consider tipping the right thing to do. An analog in the Linux community would be more source, service and financial contributions from successful commercial developers.
LJ: How can Linux compete with Microsoft in the desktop sector, and will we be able to hold the commercial sector if we don't take the desktop as well? Can we take the desktop without ruining the spirit of Linux by dumbing it down? Where will our next areas of growth and expansion be?
Drew: Linux can compete with a killer application that doesn't run elsewhere. Historically, platform-specific killer apps have sold many copies of operating systems and the hardware needed to run them. Sonic the Hedgehog sold Sega consoles. Visi-calc sold CP/M and the Z80. Turbo-Tax is the reason many of the most dedicated UNIX users have Microsoft on their machines. Better Windows compatibility or sales of VMware may help too, because they would allow people to run Linux applications without rebooting to run their "needed" Microsoft applications.
Various Unices will remain the choice for dedicated Internet servers as long as the UNIX software is more robust, secure, scalable and faster than Microsoft's offerings. With more solid software from Microsoft, UNIX servers will still remain the choice for some time because of reputation and inertia.
Drool-proofed user interfaces don't preclude expert-friendly interfaces. Apart from AIX, the graphical system administration facilities provided with many workstation Unices still allow you to edit files. Pop-up search boxes can still use regexes. Hieroglyphic icons don't rule out a command line. As long as the UNIX user-land programs aren't replaced, we'll be no worse off, although I'd like to see more flexible (scriptable, regex empowered, etc.) and efficient (educated people can type "find" faster than they can locate a picture that looks a bit like binoculars and click on it) interfaces on new software.
LJ: How do you feel about commercial applications being written for Linux, and proprietary software and protocols in general? Do you run Linux more for philosophical reasons or practical reasons? If something that appeared to be better came along, would people jump ship? Conversely, would we stay with Linux even if it somehow degenerated, took a wrong turn, or stopped progressing?
Drew: Commercial applications for Linux are great. More functionality is good. Free applications would be better, though. In niche markets, we'll always have proprietary software because those markets can't or won't fund new products, and software companies can't guarantee they'll sell the support needed to pay for development after the fact.
In the general consumer market, proprietary software's days may be numbered. Legal issues aside, buying shrink-wrapped proprietary software is a bit silly when you can get the same software on a CD for a dollar. I think people pay more to get the packaging, documentation, support and other services available to retail purchasers. Free software formalizes the existing situation and provides benefits to both the software producers (who don't need to develop new products from scratch, allowing for lower labor costs and shorter time to market) and consumers (who can get their bugs fixed and new features developed, even if the original company is willing to do neither).
Proprietary protocols are more likely to have holes which are later exploited by crackers, and they are a bit immoral and potentially illegal.
LJ: How do you feel about the different licenses? GPL, LGPL, QPL, etc.?
Drew: The GPL has both benefits (it forces companies to distribute derived works under the same terms) and drawbacks (it forces companies to distribute derived works under the same terms).
LJ: Do you think the community should support only open-source/free software? How would the community survive hard times if there were a lag or down time in the continuing success of the open-source methodology? Is the free software philosophy strong enough and with enough adherents to pull us through?
Drew: Even RMS has conceded (through his actions) that the community should support both proprietary and free software. The last time I encountered RMS, his laptop booted using proprietary firmware.