
Re: Linux is Capitalism

I found it amusing that Steve Ballmer, working for a company that had
just been found guilty of violating the Sherman Act, who had personally
admitted to a number of illegal activities under oath, and whose boss
had admitted to even more illegal activities under oath, was calling
Linux "Communist".

Here was one of the biggest advocates of totalitarian control, a
"fascist" in the IT industry, calling Linux, a panoply, communism.

Both Linux and Unix were products of IT service and support
organizations.  Many UNIX systems were first used to supplement
mainframes and proprietary minicomputers, and were implemented and
adopted primarily as an outgrowth of the popularity, not of AT&T Unix,
but of BSD Unix.  AT&T tried to cash in on that popularity after
divestiture, but their product was so inferior to BSD 4.0 that they
eventually had to negotiate terms with Berkeley to exchange
technologies.

The irony is that most of the students who had been working with BSD
Unix from 1976 to 1983 were beginning to graduate and form their own
companies.  Eventually hundreds of companies were adopting UNIX and
offering UNIX on their various platforms.  Zilog offered a UNIX
processor, Motorola optimized their 68010 and 68020 processors for
UNIX, as did National Semiconductor, Harris, Sperry, and several other
companies.  AT&T and IBM both created UNIX-powered machines, including
personal computers designed to run UNIX instead of MS-DOS.

In 1990, the X11 graphical system had been sufficiently refined and
enhanced to compete favorably with the Apple Mac, and Unix-powered
workstations produced by IBM, HP, and Sun could run circles around
Windows.  There were two problems.  The first was that most of these
machines cost 3-10 times more than comparable Windows 3.0 machines.
The second was that IT managers never did the due diligence: if they
had realized what a huge productivity hit choosing Windows over UNIX
actually was, they probably would have chosen UNIX.

UNIX workstations were incredibly reliable: they could run for weeks
without rebooting, one rarely lost work, and one could run several
applications concurrently.  In fact, even with the most basic Unix
workstation in 1990, you had almost exactly the same experience that
most Windows XP users now enjoy nearly 15 years later.  Can you imagine
how much more profitable companies would have been if people had had
the capabilities of Windows XP back in 1990, or even 1991?

Actually, I understate the case.  Most UNIX workstations had even
higher resolutions: Sun workstations, for example, offered 1600x1200
color resolution and higher, on 21 inch monitors.

In 1991, Linus had taken a course in operating systems based on
Tanenbaum's book.  He didn't have the expensive equipment required to
debug more complex microkernel-based systems, so he built a simple
system which provided the best features of both the microkernel and the
monolithic kernel.

Over the next 4 years, the UNIX community became intrigued by the
possibility of having a UNIX-like PC that could be mass-produced for
less than $1000 per workstation.  The idea captured their imagination,
and they began to contribute in extraordinary ways.

In the 1980s, a book called "In Search Of Excellence" had become the
"Bible" for American business executives.  One section suggested that
companies should "Stick to the Knitting", that they should do one thing
and do it well.  Banks and insurance companies shouldn't try to become
software companies or real-estate companies; an OS vendor shouldn't try
to sell applications.

As a result, companies often created very useful tools and utilities,
especially for UNIX, which they had no interest in bringing to market.
Many of these companies would allow their IT staff to publish the
source code of less "strategic" components to public repositories such
as usenet .binaries archives, simtel20, and sunsite.  Most of these
utilities were shared freely, often published under the GPL.

When Linux was developed, it had no UNIX code in the kernel, but it was
designed to easily run this massive repository of thousands of programs
contributed to the FSF, Sunsite, and other public archives.  Rather
than redesign the kernel to match the UNIX applications, Linus
encouraged the development of libraries which could provide the
"personalities" of various UNIX systems based on public information,
and then test and debug those libraries using the UNIX applications
themselves.

The strategy actually worked brilliantly.  The irony is that it wasn't
a strategy to Linus.  He just didn't want to be bothered with anything
other than the kernel, and as a result others took on the libraries,
drivers, tools, and applications.

Linus didn't even want to market Linux.  That was actually started by
SoftLanding and later Pat Volkerding of Slackware.  Bob Young helped
form Red Hat, and provided some excellent marketing and support skills.
One of his best efforts was his attempt to create some very plausible
estimates of the size of the Linux market.  Even when the Linux market
had grown to over 10 million users, this was barely a statistical blip
on the radar screen compared to the 700 million Windows users active at
the time.

The interesting thing is that Linux became very attractive to smaller
IT companies who wanted to be able to create a profitable product
without the restrictions imposed by Microsoft.  Many of these companies
had products which they had developed for UNIX, and just wanted to port
them to Linux.  Others were consulting companies that wanted a low-cost
platform and open source tools, enabling them to do more work in less
time, without having to pay Microsoft royalties that could instead be
channeled into new projects, upgrades, enhancements, and other "scope
creep" opportunities.

When Microsoft introduced Windows NT 4.0 in 1996, they tried to promote
it as an alternative to UNIX.  Many UNIX developers were willing to
give it a shot, and IT managers asked them to take on projects.  Many
of the estimates turned out to be way off, because many of the tools
and paradigms which worked so well for Linux and Unix would not
function well under Windows.  Approaches used to try to overcome these
limitations often tripled the development time, and then took almost
twice again as long to test and debug.  In some cases, the projects
never could be made functional under production loads and had to be
abandoned.
By 2000, Microsoft had released Windows 2000, and it was pretty obvious
to those familiar with Unix and Linux development technologies that
Windows 2000 still wouldn't "fit in".  Windows 2000 made a really good
workstation, but it just didn't measure up to Linux or Solaris or AIX
or HP-UX as a server.

For many companies there was a two-pronged strategy for dealing with
the end-of-life of Windows NT 4.0.  Some applications simply couldn't
be migrated away from Windows, and would have to move to Windows 2000
or 2003.  On the other hand, there were many applications that had
been ported to Linux, Solaris, and AIX.  Furthermore, since Linux could
run on the Windows NT hardware being phased out, Linux was a natural
fit for simply replacing existing servers.  In addition, even though
those machines were not as fast as AIX and Solaris servers, Linux was
so efficient and flexible that one Linux box could replace 4-6 Windows
machines, which meant that the migration could be managed more easily.

The irony is that in 2000 Steve Ballmer was assuming that the ONLY
people contributing to Linux or using Linux were volunteers with no
economic interests.  In practice, consulting firms ranging from IGS to
Accenture to PWC to CSC were looking for "single source code, any
platform" solutions that could be developed on Linux PCs and ported up
to huge UNIX servers.  There were thousands of vendors who had
developed UNIX software for Solaris, AIX, HP-UX, SCO, and SGI who saw
Linux as a way to reach whole new markets.

The irony is that Linux is much like an iceberg.  The tip of the
iceberg is the Open Source software, the license sales, and the "Linux
machines".  But under the water is a huge block, about 10 times larger,
which includes commercial applications, lesser-known freely distributed
Linux distributions, and after-market installations.
