Bottom Line - Is Linux Mature Yet?
Sep/Oct 2003 by Sam Williams
It's been 12 years since Linus Torvalds uploaded the first version of the Linux kernel, five years since major businesses such as IBM, Hewlett-Packard and, yes, Novell made their first major investments in Linux, and at least a full year since Wall Street banks began replacing their Unix servers with low-cost, reliable Linux servers.
And yet, for some reason, the doubts still linger.
A cynic might point to a few companies with a vested interest in stoking those doubts, but that line of argument pays more attention to the people paid to out-talk Linux vendors than to the people in a position to weigh what Linux has to offer. For the purposes of this article, it seems better to focus on the customer side and to accept any lingering doubts as evidence of larger uncertainties in the IT marketplace.
Given these uncertainties, it seems strange that some corporate technology strategists have moved so decisively to embrace an operating system and development model that others still dismiss as "immature." What do these companies see that others do not?
Setting aside game theory and all its attendant lessons on "early adopter" and "late adopter" strategies, let's take a look at the basic facts common to both sides of the Linux debate.
There's More Than One Linux
This may seem like a common sense statement to anybody who has followed the Linux phenomenon over the last decade, but it's something Linux evangelists and their opponents in the Unix and Windows worlds quickly push to the background.
Those weighing Linux on a strategic level would do best not to make that mistake.
In just 12 short years, Linux has leapfrogged from its original, native environment, the Intel 386 microprocessor, to bigger and better chips: Pentium, Alpha, ARM, StrongARM and even the various RISC architectures favored by Apple and Sun.
This colonization pattern is not unprecedented. Unix followed a similar cross-platform proliferation in the early to mid-1970s. Still, in the current marketplace, where most commercial operating systems cleave closely to their native hardware platforms, the Linux growth pattern is noteworthy. Where Unix once fragmented into a host of proprietary, closed-source spin-offs, Linux, for the most part, has retained the image of a single, solid entity.
This image rests on two fundamental tenets of the Linux design philosophy: 1) Torvalds & Co., having learned from Unix history, have explicitly avoided any development pathway that would cut off a significant sub-section of the operating-system marketplace, and 2) the GNU General Public License, the license that regulates Linux development, forbids proprietary offshoots while encouraging the free exchange of ideas and innovations from all corners of the Linux marketplace. Simply put, developers of device- or application-specific Linux variants have a strong incentive to retain cross-platform compatibility with other Linux variants.
The end result: Customers can buy SuSE Linux with Ximian applications for the desktop, Red Hat running the Apache server for enterprise-scale applications and MontaVista's Hard Hat Linux for embedded applications demanding hardened "realtime" availability.
This situation is similar to the one encountered by Microsoft Windows users. Companies buy Windows 2000 for the desktop, Windows NT or Windows Server for network-based services and Windows CE for handheld and embedded devices.
The only major difference, of course, is that Microsoft is a single company and a strict guardian of the Windows source code, whereas Red Hat, SuSE and MontaVista are three distinct companies forced to compete on service and quality rather than on exclusive access to source code.
When it comes to maturity, one would never throw a blanket characterization over the entire Windows franchise. Why do the same for Linux? Break the Linux marketplace into its three major sub-realms, desktop, embedded and server systems, and one quickly encounters a sliding scale of maturity and robustness. Most IT experts would give Windows the edge at the desktop level, noting the dominance of certain office application standards. When it comes to servers, however, a marketplace where no single application or suite dominates, most agree that Linux is equally suited for mission-critical tasks.
"Linux on the server level has gone through the knee of the S curve," says Robin Bloor, president of Bloor Research, an IT analysis & consultancy firm. "It's getting bought without much resistance or concern."
Bloor credits this acceptance to a number of factors. Chief among them is the ability of companies to give Linux a full test run before implementation. Thanks to the GNU General Public License, the operating system's governing license, source code for all Linux variants must be freely available. Whether borrowing the software from another user or purchasing it from a vendor, this availability gives systems administrators the ability to tinker and tune the operating system for customized use.
Over time, Bloor, like many other executives, has come to accept Linux as a stable platform, especially for server-based applications. These applications, he says, are key to the operating system's growing name recognition at the CTO/CIO strategic buying level.
"At the end of the day, it's applications that sell boxes," Bloor says. "The wealth of applications [on Linux servers] right now is second only to Windows and it will probably overtake Windows in a couple of years."
One customer who can vouch for Linux maturity at the network server level is Garner McNett, president of Dallas-based Cargo Data Management Corp. McNett's company made the switch from Unix to Linux in the midst of a pre-Y2K overhaul for a main client. Since then, the company has taken on a major new account, Internet-based accounting and cargo tracking for Philippine air carrier Cebu Pacific Air, without a hitch.
"You've got people on these islands out in the Pacific using a dial-up connection, sending data back and forth," McNett says. "You could not do that in Windows. This is a character-based system with a terminal emulator. It's extremely fast, extremely functional and there's no overhead to drag around. Best of all: it never goes down."
Cost Is Subjective
Thanks to the GPL, Linux has long enjoyed a highly favorable cost profile. Indeed, one of the factors that sped Linux adoption in the late 1990s was the ability of individual systems administrators to download and install free copies for the management of corporate networks on a trial basis without running those decisions past purchasing departments.
Recently, however, some have called the operating system's low-cost reputation into question, perhaps prematurely. Numerous studies stretching costs over a three-to-five-year period have given Windows 2000 the edge over Linux in terms of "total cost of ownership." The primary reason: an extended time horizon puts greater weight on salary and service costs, and certified Linux specialists, like their Unix brethren, tend to pull down higher salaries than Windows specialists.
However, an independent 2002 Robert Frances Group survey of IT executives in Global 2000 companies provided the following average salary breakdown: Solaris $85,844, Linux $71,400 and Windows less than $3,000 behind at $68,500.
When the study looked at total cost of ownership, however, it found that the number of Linux servers managed per employee was four times higher than the number for Windows. It also found that Linux servers experienced less downtime and fewer debilitating security problems than Windows servers. As a result, per-node costs for Linux came to $12,010, compared with $29,509 for Solaris and $52,060 for Windows.
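The arithmetic behind that reversal is worth spelling out: a higher salary per administrator can still mean a lower staffing cost per server if each administrator manages enough machines. The toy calculation below uses the survey's salary averages, but the servers-per-administrator counts are illustrative assumptions (the study reports only that Linux admins managed four times as many servers as Windows admins), so the resulting figures sketch the staffing component only, not the study's full per-node totals, which also fold in hardware, downtime and security costs.

```python
# Toy sketch of the staffing component of per-server cost.
# Salaries: Robert Frances Group averages quoted above.
# Servers per admin: ASSUMED for illustration; the study says only
# that Linux admins managed ~4x as many servers as Windows admins.
salaries = {"Solaris": 85_844, "Linux": 71_400, "Windows": 68_500}
servers_per_admin = {"Solaris": 15, "Linux": 40, "Windows": 10}  # assumed

staffing_cost_per_server = {
    os_name: salaries[os_name] / servers_per_admin[os_name]
    for os_name in salaries
}

for os_name, cost in staffing_cost_per_server.items():
    print(f"{os_name}: ${cost:,.0f} per server per year in admin salary")
```

Under these assumed ratios, Linux's higher salaries are swamped by the larger number of servers each administrator handles, which is exactly the pattern the survey describes.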
"The three factors that seem to be coming up over and over again are reduction in server counts (doing the same job with half as many boxes), hardware savings (this is usually the Sun to Wintel comparison), and finally, software cost savings and flexibility of choice," says Chad Robinson, senior research analyst at Robert Frances Group. "Companies can buy Red Hat and then use Red Hat's commercial support channels. They may also download a free distribution and choose a support vendor from a growing array of commercial options. This flexibility of not being married to a specific vendor's 'stack' or 'road map' is adding to the growing popularity of Linux in the enterprise."
Finally, there is the issue of vendor choice. In "Your Open Source Plan," an article in the May 2003 issue of CIO magazine, E-Trade executive Josh Levine notes that, thanks to Linux, his company slowly made the switch from Unix servers, each costing $240,000 with a $25,000 yearly maintenance contract, to $4,000 Intel servers with minimal maintenance charges.
"We get to manage the [hardware] vendor as opposed to the vendor managing us," Levine says. "They can't hide behind an operating system."
Linux Involves A (Slight) Change In IP Philosophy
As noted above, the GNU General Public License is a radical departure from other commercial software licenses. Those who embrace Linux as a cornerstone or even as a significant piece of their IT strategy would do well to familiarize themselves with both the language and philosophy that underlies the license.
Companies that do so must adjust their thinking to accommodate the liberal agenda of the free software movement, the collective architect of both the GPL and many of the software elements that go into Linux. That agenda emphasizes the rights of developers to share and modify all software programs. Once through this political prism, however, many will experience the quiet epiphany that Red Hat Chief Technology Officer Michael Tiemann first noted when reading the free software movement's urtext, Richard M. Stallman's GNU Manifesto. "On the surface it read like a socialist polemic," notes Tiemann in the 1999 book Open Sources. "But I saw something different. I saw a business plan in disguise."
Driving that business plan, as Tiemann goes on to note, is the economic efficiency that comes with offloading a major cost center, operating system development, onto a community of motivated, talented programmers, while retaining the more profitable corners of the software marketplace (applications, services and support) for better-targeted exploitation.
For the application developer, the GPL's implications are not so radical. Based on the same copyright foundation as all commercial software licenses, it draws its strength from the mutual respect most companies and creators already show for existing software licenses. In other words, applications that don't borrow Linux source code can still be made Linux-compatible without the need for relicensing.
"From a licensing perspective, there's no real difference between building an application for Linux and building an application for Windows," notes James Bottomley, software architect for SteelEye, a company that offers business continuity solutions and services for both platforms. "It's as easy to sell proprietary applications atop a Linux platform as it is to sell proprietary applications atop a Windows platform. In fact, we do both."

From a Linux services perspective, the GPL is best seen as a market generator. By making it impossible for any single company to declare ownership or bar access to the Linux source code, the GPL pushes companies to compete in other realms such as quality of service and overall usability.
"The GPL is designed to benefit the customer, not the vendor," notes Bob Waldie, chairman of SnapGear, a company that develops Virtual Private Network firewall applications for Linux.
Which brings us back to the original perspective: you as a customer trying to assess the maturity level of Linux. If a license designed to encourage, rather than discourage, competition for the end-user dollar seems hard to believe, then maybe Linux isn't the proper choice just yet. For a growing number of companies looking to guard those dollars, however, Linux has proved itself more than ready to handle the enterprise IT load.
"You're witnessing a product in severe evolution," says Bloor, who likens the lingering uncertainty over Linux to the lingering uncertainty that surrounded relational databases in the late 1980s.
"If you remember back then, there were still doubts about transaction processing capabilities," Bloor says. "By 1995, nobody was asking the questions anymore, because relational databases had proven their capabilities. The same is happening for Linux. Every time you look at it, you'll be able to find things it doesn't do, but because of several factors, those things will soon be buttoned up."