Linux is Not "About to Fork" like Unix
Sunday, November 21 2004 @ 02:46 PM EST

You may have seen Paul Krill's concern that the Linux kernel may be about to "fork" and go the way of Unix. He recently attended the SDForum conference, which covered the following topics: Open Source in the Enterprise: Practical Considerations; Vendor Perspectives; Success Stories; Open Source Legal Issues; and Investing in Open Source. One of the keynote addresses was by Andrew Morton, maintainer of the Linux 2.6 kernel, and something he said worried Krill. Never fear, Mr. Krill. The kernel is safe, as you will see. Here's what Krill wrote:

"Linux could be about to fork. In a worrying parallel to the issue that stopped Unix becoming a mass-market product in the 1980s - leaving the field clear for Microsoft - a recent open source conference saw a leading Linux kernel developer predict that there could soon be two versions of the Linux kernel.

"Each version of the kernel requires applications to be compiled specifically for it. Today, only one Linux kernel is current but Andrew Morton, lead maintainer of the Linux kernel for Open Source Development Labs (OSDL), said that he expects the 2.7 version of the platform to fork to accommodate large patch sets.

"Commenting on the planned 2.7 release of the Linux kernel, Morton said OSDL expects a few individuals with big patch sets will want to include them in the kernel. But there will be no place to put them - presumably because functionality for major kernel changes won't be applicable to all or even most users. One example might be between desktops and servers. So at some point, Linux founder Linus Torvalds will fork off version 2.7 to accommodate the changes, Morton said at the SDForum open source conference."

Not to worry. I contacted Mr. Morton and asked for clarification, and you'll find his response reassuring.

Here's what he wrote:

"Paul has misinterpreted the word 'fork'. I was referring to the software engineering process of branching off a stable release of your product so that development can continue against the tip-of-tree codebase.

"We did this for the 2.0 kernel series, the 2.2 series and the 2.4 series. One day we'll do it for the 2.6 series."

Linux has always been developed like this, so rather than being worrying, it is actually good news: a sign of Linux progress to come. That's OK. We can't all be techies, but everyone can go back to a relaxing Sunday, safe in the assurance that there is no doom on the horizon for Linux.
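For the technically curious, here is the context for those version numbers. In those days the kernel used an even/odd convention: even minor numbers (2.0, 2.2, 2.4, 2.6) were stable series, and odd ones (2.1, 2.3, 2.5, and the anticipated 2.7) were development branches where the big, disruptive patches went. Below is a small, standalone C sketch of that convention. It borrows the arithmetic of the kernel's own KERNEL_VERSION macro from <linux/version.h>, but the program itself is just an illustration I've put together, not kernel code:

    /* A small standalone sketch (not kernel code) of the kernel's
     * version numbering. The KERNEL_VERSION arithmetic matches the
     * macro in <linux/version.h>; everything else is illustration. */
    #include <stdio.h>

    #define KERNEL_VERSION(a, b, c) (((a) << 16) + ((b) << 8) + (c))

    static const char *series_kind(int minor)
    {
        /* Historically, even minor numbers meant a stable series,
         * odd minor numbers a development branch. */
        return (minor % 2 == 0) ? "stable" : "development";
    }

    int main(void)
    {
        /* 2.6.9 was the current stable kernel in late 2004; 2.7
         * would be the development branch Morton was describing. */
        printf("2.6.9 -> code 0x%06x, a %s series\n",
               KERNEL_VERSION(2, 6, 9), series_kind(6));
        printf("2.7.0 -> code 0x%06x, a %s series\n",
               KERNEL_VERSION(2, 7, 0), series_kind(7));
        return 0;
    }

Compile and run it with any C compiler, and it prints that 2.6 is a stable series while 2.7 would be the development fork Morton was talking about: a routine branching, not a schism.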

Of course, a little research on Mr. Krill's earlier articles makes for a fun Sunday in itself, I must say. Here's a catchy title: "Maybe SCO has a point". In that article, written back in January when SCO sent its letter to Congress about free and open source software allegedly endangering the economy, Krill articulated his concern about software developers:

"However, if the trend of giving away software continues to gather momentum, how do developers and software companies put bread on the table? Work a second job?

"This question is something I've pondered before, and now SCO seems to be backing me up.

"The capitalist economy is based on selling products and services for the top dollar that the market demands. If the user community begins to expect its software free of charge, what happens to the innovation and incentive to improve software, or to even build it at all?"

Of course, that was back in January. I'm not so sure anyone would want to associate themselves with SCO today. But in the earlier article he refers to, titled "How can open source fly?", he expressed this worry:

"Pardon me, but am I missing something here? Do people make money in a capitalist system by giving things away?"

And in a November 17, 2004 blog entry, "Can't say I didn't tell you so," he writes:

"Previously, I have asked what happens to the software industry if enterprises expect to acquire their software for free via open source.

"Well, it looks like this expectation is coming to fruition. A Wells Fargo executive at the SDForum's 'Open Source Entering the Mainstream' conference this week said the tide has switched from companies being suspect of open source to now openly seeking it out as an alternative to commercial products.

"Which leads us back to the question of what happens to innovation if at some point there is no money being made on the actual selling of software. . . . How long can open source be sustained if developers have to work a separate job to pay the bills and deal with open source as a hobby? I guess that depends on the devotion of the developers, whose dedication to their craft is certainly admirable and even enviable. . . .

"I have to wonder, though, whether the days are numbered for software companies to make billions of dollars a year by selling software."

As you can see, he gets the big picture. Microsoft's artificially-high-priced software business model is doomed. But is that bad? Won't we all, businesses included, then have money to spend and invest in other ways? Is that not good for the economy?

And what about the question of programmers paying their mortgages? More and more, FOSS programmers work for vendors like Red Hat and Novell. Andrew Morton gets paid. He is sponsored by OSDL and works full-time on Linux kernel development. So does Linus. And so forth. Many companies, not just vendors, see it as in their corporate interest to assign some of their coders to work on the kernel, so that the things they need for their business purposes get incorporated. I believe IBM is oriented toward making money in a capitalist structure. Morton has this advice for corporations on how to get what they want incorporated into the kernel:

"I'd like to encourage corporate developers to become more involved and to contribute more to Linux. There are some cultural and even legal problems, as well as some common pitfalls for corporations working with the kernel and the open source development process.

"Developers should always work with their management and in-house counsel to be clear about IP issues and internal processes. Each company needs to individually address these issues.

"Companies should try to avoid what I call the 'SourceForge Syndrome.' A company will set up a big project and then beaver away at it independently for months. Suddenly a 50,000-line patch appears on a mailing list and no one understands what it does or why it does it. Such large patches usually have significant architectural problems and even duplication of other efforts. By this stage the originating development team is deeply invested in their current code and may even have run out of budget for rework, but the code may be unacceptable. It's generally a disaster.

"Companies must understand that there are significant advantages for them to merge their code into the mainstream kernel. This reduces their maintenance effort and costs, increases their tester and reviewer base, and encourages other developers to contribute additional feature work. Other people will magically fix your bugs for you. . . .

"Rather than setting up an external SourceForge project, you should aim to get a small core of your feature into the base kernel and develop against that, introducing new features on a frequent basis. . . .. Companies may want to nominate a lead individual as their contact point with the rest of the kernel development team."

Many do write it just for fun, though. It may seem hard to accept that folks would write software without charge, but by now, after more than a decade, is it not an established fact that they will and they do? It's comparable to novelists, in a way. You can't stop them from writing, even if there may not be a publisher, because it's a creative outlet and they enjoy doing it. Working on the Linux kernel is satisfying because there is a use for your work, probably some functionality you have coded that you personally want, on top of the creative fun. And you know the entire world will benefit from your labor. So let's just posit that folks will do it, on the solid empirical ground that we can see they have and they do.

If you are curious about how it all works, you might enjoy reading "Keeping the Kernel," by Andrew Morton. He explains how the process works, and it's interesting to see an example of how the openness of the work benefits the kernel, in a sidebar titled "Building a New Scheduler":

"Inspired by a paper on an anticipatory scheduler by Peter Druschel and Sitaram Iyer (http://www.cs.rice.edu/~ssiyer/r/antsched), a young Australian developer named Nick Piggin came out of the woodwork to ask Jens Axboe [another seasoned kernel developer] and Andrew Morton questions about the existing I/O scheduler. Per Morton's suggestion and with Morton's encouragement, Piggin implemented the scheduler on Linux. Six months and 140 patches later (90 percent of the contributions from Piggin), the new scheduler was stabilized and merged into the mainline kernel."

It's not a free-for-all, though, where anybody can just do whatever they want with the kernel. Morton provides clear instructions on how to become a contributor and how long it usually takes to get accepted into the process:

"Approximately 1,000 individuals contributed to the 2.5 and 2.6 kernels. For the Linux kernel the '20/80' rule very much applies. About 20 people did 80 percent of the work. Or maybe it was '10/90.'

"The kernel has become a lot more complex than it was in the 2.0 days, and the learning curve is longer and steeper. A committed developer who's competent in C can begin to contribute usefully after as little as a few weeks of study. (For an example, see the sidebar 'Building a New Scheduler.') In general, it probably takes six to twelve months of full-time work to reach the level of a mainstream kernel contributor."

I think we can see from this that while a few work at this full-time, many do not, yet they are still able to contribute meaningfully. That's how it works. The best way to get started, Morton writes, is the way he got started, fixing bugs:

"Of course, new developers are encouraged and are always welcome. We always need help with drivers, bug fixing, tuning, and testing. A new developer may choose to go through the kernel Bugzilla database and identify any problems that appear to be unresolved. See if you can reproduce the problem, work it with the originator, and then develop a fix. Similarly, the various development mailing lists are a good source of current bug reports."

Here's how Morton says he got familiar with the code, "I spent two years working partly on network drivers, but mainly on bug fixing. It didn't matter where in the kernel the bug was, I'd try to fix it. That was a great way to learn how the kernel worked."

So, while there are adjustments to be made in the software industry, you can see from the way it is already playing out that innovation and advances are coming from a merging of two groups: programmers who love to program and corporations that need things incorporated into the kernel for their business purposes. And what's wrong with that?

