
Why Open Source Software / Free Software (OSS/FS, FLOSS, or FOSS)? Look at the Numbers!

David A. Wheeler
Revised as of April 16, 2007

This paper provides quantitative data showing that, in many cases, using open source software / free software (abbreviated as OSS/FS, FLOSS, or FOSS) is a reasonable or even superior approach compared to its proprietary competition, according to various measures. This paper’s goal is to show that you should consider using OSS/FS when acquiring software. This paper examines market share, reliability, performance, scalability, security, and total cost of ownership. It also has sections on non-quantitative issues, unnecessary fears, OSS/FS on the desktop, usage reports, governments and OSS/FS, other sites providing related information, and ends with some conclusions. An appendix gives more background information about OSS/FS. You can view this paper at (HTML format). A short presentation (briefing) based on this paper is also available. Palm PDA users may wish to use Plucker to view this longer report. Old archived copies and a list of changes are also available.

1. Introduction

Open Source Software / Free Software (OSS/FS) (also abbreviated as FLOSS or FOSS) has risen to great prominence. Briefly, OSS/FS programs are programs whose licenses give users the freedom to run the program for any purpose, to study and modify the program, and to redistribute copies of either the original or modified program (without having to pay royalties to previous developers).

The goal of this paper is to convince you to consider using OSS/FS when you’re looking for software, using quantitative measures. Some sites provide a few anecdotes on why you should use OSS/FS, but for many that’s not enough information to justify using OSS/FS. Instead, this paper emphasizes quantitative measures (such as experiments and market studies) to justify why using OSS/FS products is in many circumstances a reasonable or even superior approach. I should note that while I find much to like about OSS/FS, I’m not a rabid advocate; I use both proprietary and OSS/FS products myself. Vendors of proprietary products often work hard to find numbers to support their claims; this page provides a useful antidote of hard figures to aid in comparing proprietary products to OSS/FS. Others have come to the same conclusions, for example, Forrester Research concluded in September 2006 that “Firms should consider open source options for mission-critical applications”.

I believe that this paper has met its goal; others seem to think so too. The 2004 report of the California Performance Review, a report from the state of California, urges that “the state should more extensively consider use of open source software”, and specifically references this paper. A review at the Canadian Open Source Education and Research (CanOpenER) site stated “This is an excellent look at the some of the reasons why any organisation should consider the use of [OSS/FS]... [it] does a wonderful job of bringing the facts and figures of real usage comparisons and how the figures are arrived at. No FUD or paid for industry reports here, just the facts”. This paper has been referenced by many other works, too. It’s my hope that you’ll find it useful as well.

The following subsections describe the paper’s scope, challenges in creating it, the paper’s terminology, and the bigger picture. This is followed by a description of the rest of the paper’s organization (listing the sections such as market share, reliability, performance, scalability, security, and total cost of ownership). Those who find this paper interesting may also be interested in the other documents available on David A. Wheeler’s personal home page.

1.1 Scope

As noted above, the goal of this paper is to convince you to consider using OSS/FS when you’re looking for software, using quantitative measures. Note that this paper’s goal is not to show that all OSS/FS is better than all proprietary software. Certainly, there are many who believe this is true from ethical, moral, or social grounds. It’s true that OSS/FS users have fundamental control and flexibility advantages, since they can modify and maintain their own software to their liking. And some countries perceive advantages to not being dependent on a sole-source company based in another country. However, no numbers could prove the broad claim that OSS/FS is always “better” (indeed you cannot reasonably use the term “better” until you determine what you mean by it). Instead, I’ll simply compare commonly-used OSS/FS software with commonly-used proprietary software, to show that at least in certain situations and by certain measures, some OSS/FS software is at least as good or better than its proprietary competition. Of course, some OSS/FS software is technically poor, just as some proprietary software is technically poor. And remember -- even very good software may not fit your specific needs. But although most people understand the need to compare proprietary products before using them, many people fail to even consider OSS/FS products, or they create policies that unnecessarily inhibit their use; those are errors this paper tries to correct.

This paper doesn’t describe how to evaluate particular OSS/FS programs; a companion paper describes how to evaluate OSS/FS programs. This paper also doesn’t explain how an organization would transition to an OSS/FS approach if one is selected. Other documents cover transition issues, such as The Interchange of Data between Administrations (IDA) Open Source Migration Guidelines (November 2003) and the German KBSt’s Open Source Migration Guide (July 2003) (though both are somewhat dated). Organizations can transition to OSS/FS in part or in stages, which for many is a more practical transition approach.

I’ll emphasize the operating system (OS) known as GNU/Linux (which many abbreviate as “Linux”), the Apache web server, the Mozilla Firefox web browser, and the OpenOffice.org office suite, since these are some of the most visible OSS/FS projects. I’ll also primarily compare OSS/FS software to Microsoft’s products (such as Windows and IIS), since Microsoft Windows has a significant market share and Microsoft is one of proprietary software’s strongest proponents. Note, however, that even Microsoft makes and uses OSS/FS itself (it has even sold software using the GNU GPL license, as discussed below).

I’ll mention Unix systems as well, though the situation with Unix is more complex; today’s Unix systems include many OSS/FS components or software primarily derived from OSS/FS components. Thus, comparing proprietary Unix systems to OSS/FS systems (when examined as whole systems) is often not as clear-cut. This paper uses the term “Unix-like” to mean systems intentionally similar to Unix; both Unix and GNU/Linux are “Unix-like” systems. The most recent Apple Macintosh OS (Mac OS X) presents the same kind of complications; older versions of MacOS were wholly proprietary, but Apple’s OS has been redesigned so that it’s now based on a Unix system with substantial contributions from OSS/FS programs. Indeed, Apple is now openly encouraging collaboration with OSS/FS developers.

1.2 Challenges

It’s a challenge to write any paper like this; measuring anything is always difficult, for example. Most of these figures are from other works, and it was difficult to find many of them. But there are some special challenges that you should be aware of: legal problems in publishing data, the reluctance of many OSS/FS users to publicly admit it (for fear of retribution), and dubious studies (typically those funded by a product vendor).

Many proprietary software product licenses include clauses that forbid public criticism of the product without the vendor’s permission. Obviously, there’s no reason that such permission would be granted if a review is negative -- such vendors can ensure that any negative comments are reduced and that harsh critiques, regardless of their truth, are never published. This significantly reduces the amount of information available for unbiased comparisons. Reviewers may choose to change their report so it can be published (omitting important negative information), or not report at all -- in fact, they might not even start the evaluation. Some laws, such as UCITA (a law in Maryland and Virginia), specifically enforce these clauses forbidding free speech, and in many other locations the law is unclear -- making researchers bear substantial legal risk that these clauses might be enforced. These legal risks have a chilling effect on researchers, and thus make it much harder for customers to receive complete unbiased information. This is not merely a theoretical problem; these license clauses have already prevented some public critique, e.g., Cambridge researchers reported that they were forbidden to publish some of their benchmarked results of VMware ESX Server and Connectix/Microsoft Virtual PC. Oracle has had such clauses for years. Hopefully these unwarranted restraints of free speech will be removed in the future. But in spite of these legal tactics to prevent disclosure of unbiased data, there is still some publicly available data, as this paper shows.

Another problem is that many users of OSS/FS are reluctant to admit it. ZDNet UK’s November 25, 2005 article “Why open source projects are not publicised” by Ingrid Marson examines this. For example, it notes that many are afraid of retribution. Obviously, this makes some data more difficult to obtain.

This paper omits or at least tries to warn about studies funded by a product’s vendor, which have a fundamentally damaging conflict of interest. Remember that vendor-sponsored studies are often rigged (no matter who the vendor is) to make the vendor look good instead of being fair comparisons. Todd Bishop’s January 27, 2004 article in the Seattle Post-Intelligencer Reporter discusses the serious problems when a vendor funds published research about itself. A study funder could directly pay someone and ask them to directly lie, but it’s not necessary; a smart study funder can produce the results they wish without, strictly speaking, lying. For example, a study funder can make sure that the evaluation carefully defines a specific environment or extremely narrow question that shows a positive trait of their product (ignoring other, probably more important factors), require an odd measurement process that happens to show off their product, seek unqualified or unscrupulous reviewers who will create positive results (without careful controls or even without doing the work!), create an unfairly different environment between the compared products (and not say so or obfuscate the point), require the reporter to omit any especially negative results, or even fund a large number of different studies and only allow the positive reports to appear in public. The song “Meat the Press” by Steve Taylor eloquently expresses this kind of thing: “They can state the facts while telling a lie”.

This doesn’t mean that all vendor-funded studies are misleading, but many are, and there’s no way to be sure which studies (if any) are actually valid. For example, Microsoft’s “get the facts” campaign identifies many studies, but nearly every study is entirely vendor-funded, and I have no way to determine if any of them are valid. After a pair of vendor-funded studies were publicly lambasted, Forrester Research announced that it will no longer accept projects that involve paid-for, publicized product comparisons. One ad, based on a vendor-sponsored study, was found to be misleading by the UK Advertising Standards Authority (an independent, self-regulatory body), who formally adjudicated against the vendor. This example is important because the study was touted as being fair by an “independent” group, yet it was found unfair by an organization that examines advertisements; failing to meet the standard for truth for an advertisement is a very low bar.

Steve Hamm’s BusinessWeek article “The Truth about Linux and Windows” (April 22, 2005) noted that far too many reports are simply funded by one side or another, and even when they say they aren’t, it’s difficult to take some seriously. In particular, he analyzed a report by the Yankee Group’s Laura DiDio, asking deeper questions about the data, and found many serious problems. His article explained why he just doesn’t “trust its conclusions” because “the work seems sloppy [and] not reliable” (a Groklaw article also discussed these problems).

Many companies fund studies that place their products in a good light, not just Microsoft, and the concerns about vendor-funded studies apply equally to vendors of OSS/FS products. I’m independent; I have received no funding of any kind to write this paper, and I have no financial reason to prefer either OSS/FS or proprietary software. I recommend that you prefer studies that do not have financial incentives for any particular outcome.

This paper includes data over a series of years, not just the past year; all relevant data should be considered when making a decision, instead of arbitrarily ignoring older data. Note that the older data shows that OSS/FS has a history of many positive traits, as opposed to being a temporary phenomenon.

1.3 Terminology and Conventions

You can see more detailed explanations of the terms “open source software” and “Free Software”, as well as related information, in the appendix and in my list of Open Source Software / Free Software (OSS/FS) references. Note that those who use the term “open source software” tend to emphasize technical advantages of such software (such as better reliability and security), while those who use the term “Free Software” tend to emphasize freedom from control by another and/or ethical issues. The opposite of OSS/FS is “closed” or “proprietary” software.

Other alternative terms for OSS/FS, besides either of those terms alone, include “libre software” (where libre means free as in freedom), “livre software” (same thing), free-libre / open-source software (FLOS software or FLOSS), open source / Free Software (OS/FS), free / open source software (FOSS or F/OSS), open-source software (indeed, “open-source” is often used as a general adjective), “freed software,” and even “public service software” (since often these software projects are designed to serve the public at large). I recommend the term “FLOSS” because it is easy to say and directly counters the problem that “free” is often misunderstood as “no cost”. However, since I began writing this document before the term “FLOSS” was coined, I have continued to use OSS/FS here.

Software that cannot be modified and redistributed without further limitation, but whose source code is visible (e.g., “source viewable” or “open box” software, including “shared source” and “community” licenses), is not considered here since such software doesn’t meet the definition of OSS/FS. OSS/FS is not “freeware”; freeware is usually defined as proprietary software given away without cost, and does not provide the basic OSS/FS rights to examine, modify, and redistribute the program’s source code.

A few writers still make the mistake of saying that OSS/FS is “non-commercial” or “public domain”, or they mistakenly contrast OSS/FS with “commercial” products. However, today many OSS/FS programs are commercial programs, supported by one or many for-profit companies, so this designation is quite wrong. Don’t make the mistake of thinking OSS/FS is equivalent to “non-commercial” software! Also, nearly all OSS/FS programs are not in the public domain. The term “public domain software” has a specific legal meaning -- software that has no copyright owner -- and that’s not true in most cases. In short, don’t use the terms “public domain” or “non-commercial” as synonyms for OSS/FS.

An OSS/FS program must be released under some license giving its users a certain set of rights; the most popular OSS/FS license is the GNU General Public License (GPL). All software released under the GPL is OSS/FS, but not all OSS/FS software uses the GPL; nevertheless, some people do inaccurately use the term “GPL software” when they mean OSS/FS software. Given the GPL’s dominance, however, it would be fair to say that any policy that discriminates against the GPL discriminates against OSS/FS.

This is a large paper, with many acronyms. A few of the most common acronyms are:

Acronym   Meaning
GNU       GNU’s Not Unix (a project to create an OSS/FS operating system)
GPL       GNU General Public License (the most common OSS/FS license)
OS, OSes  Operating System, Operating Systems
OSS/FS    Open Source Software/Free Software

This paper uses logical style quoting (as defined by Hart’s Rules and the Oxford Dictionary for Writers and Editors); quotations do not include extraneous punctuation.

1.4 Bigger Picture

Typical OSS/FS projects are, in fact, an example of something much larger: commons-based peer-production. The fundamental characteristic of OSS/FS is its licensing, and an OSS/FS project that meets at least one customer’s need can be considered a success. However, larger OSS/FS projects are typically developed by many people from different organizations working together for a common goal. As the declaration Free Software Leaders Stand Together states, the business model of OSS/FS “is to reduce the cost of software development and maintenance by distributing it among many collaborators”. Yochai Benkler’s 2002 Yale Law Journal article, “Coase’s Penguin, or Linux and the Nature of the Firm” argues that OSS/FS development is only one example of the broader emergence of a new, third mode of production in the digitally networked environment. He calls this approach “commons-based peer-production” (to distinguish it from the property- and contract-based models of firms and markets).

Many have noted that OSS/FS approaches can be applied to many other areas, not just software. The Internet encyclopedia Wikipedia, and works created using Creative Commons licenses (Yahoo! can search for these), are other examples of this development approach. Wide Open: Open source methods and their future potential by Geoff Mulgan (who once ran the policy unit at 10 Downing Street) and Tom Steinberg, with Omar Salem, discusses this wider potential. Many have observed that the process of creating scientific knowledge has worked in a similar way for centuries.

OSS/FS is also an example of the incredible value that can result when users have the freedom to tinker (the freedom to understand, discuss, repair, and modify the technological devices they own). Innovations are often created by combining pre-existing components in novel ways, which generally requires that users be able to modify those components. This freedom is, unfortunately, threatened by various laws and regulations such as the U.S. DMCA, and the FCC “broadcast flag”. It’s also threatened by efforts such as “trusted computing” (often called “treacherous computing”), whose goal is to create systems in which external organizations, not computer users, command complete control over a user’s computer (BBC News among others is concerned about this).

Lawrence Lessig’s Code and Other Laws of Cyberspace argues that software code has the same role in cyberspace as law does in realspace. In fact, he simply argues that “code is law”, that is, that as computers are becoming increasingly embedded in our world, what the code does, allows, and prohibits, controls what we may or may not do in a powerful way. In particular he discusses the implications of “open code”.

All of these issues are beyond the scope of this paper, but the referenced materials may help you find more information if you’re interested.

1.5 Organization of this Paper

Below is data discussing market share, reliability, performance, scalability, security, and total cost of ownership. I close with a brief discussion of non-quantitative issues, unnecessary fears, OSS/FS on the desktop, usage reports, other sites providing related information, and conclusions. A closing appendix gives more background information about OSS/FS. Each section has many subsections or points. The non-quantitative issues section includes discussions about freedom from control by another (especially a single source), protection from licensing litigation, flexibility, social / moral / ethical issues, and innovation. The unnecessary fears section discusses issues such as support, legal rights, copyright infringement, abandonment, license unenforceability, GPL “infection”, economic non-viability, starving programmers (i.e., the rising commercialization of OSS/FS), compatibility with capitalism, elimination of competition, elimination of “intellectual property”, unavailability of software, importance of source code access, an anti-Microsoft campaign, and what’s the catch. And the appendix discusses definitions of OSS/FS, motivations of developers and developing companies, history, licenses, OSS/FS project management approaches, and forking.

2. Market Share

Many people think that a product is only a winner if it has significant market share. This is lemming-like, but there’s some rationale for this: products with big market shares get applications, trained users, and momentum that reduces future risk. Some writers argue against OSS/FS or GNU/Linux as “not being mainstream”, but if their use is widespread then such statements reflect the past, not the present. There’s excellent evidence that OSS/FS has significant market share in numerous markets:

  1. The most popular web server has always been OSS/FS since such data have been collected. For example, Apache is the current #1 web server. Netcraft’s statistics on web servers have consistently shown Apache (an OSS/FS web server) dominating the public Internet web server market ever since Apache grew into the #1 web server in April 1996. Before that time, the NCSA web server (Apache’s ancestor) dominated the web from August 1995 through March 1996 - and it is also OSS/FS.

    Netcraft’s survey published April 2007 polled all the web sites they could find (totaling 113,658,468 sites) and found that, counting by name, Apache had 58.86% of the market, while Microsoft had 31.13%.

    [Figure: Market Share for Web Servers Across All Domains, August 1995 - April 2007]

    However, many web sites have been created that are simply “placeholder” sites (i.e., their domain names have been reserved but they are not being used); such sites are termed “inactive.” This means that just tracking the names can be misleading, and somewhat vulnerable to rigging.

    Which eventually happened. In April 2006 there was a one-time significant increase in IIS sites (versus Apache) among inactive sites, entirely due to a single company (Go Daddy) switching from Apache to IIS when serving inactive sites. While it is more difficult for a single active site to switch webservers, it is trivial for a hosting organization to switch all its inactive sites. Go Daddy’s president and COO, Warren Adelman, refused to discuss whether or not Microsoft paid or gave other incentives to move its inactive (parked) domains to Windows, leading a vast number of people (including me!) to believe that Go Daddy was paid by Microsoft to make this change, just to try to make Microsoft’s market share numbers look better than they really were.

    Thus, since 2000, Netcraft has been separately counting “active” web sites. Netcraft’s count of only the active sites is arguably a more relevant figure than counting all web sites, since the count of active sites shows the web server selected by those who choose to actually develop a web site. Apache does extremely well when counting active sites; in their study published in April 2007, Apache had 58.50% of the web server market and Microsoft had 34.44%. Here is the total market share (by number of active web sites):

    [Figure: Market Share for Active Web Servers, June 2000 - April 2007]

    Years ago, Netcraft’s September 2002 survey reported on websites based on their “IP address” instead of the host name; this has the effect of removing computers used to serve multiple sites and sites with multiple names. When counting by IP address, Apache has shown a slow increase from 51% at the start of 2001 to 54%, while Microsoft has been unchanged at 35%. Again, a clear majority.

    CNet’s “Apache zooms away from Microsoft’s Web server” summed up the year 2003 noting that “Apache grew far more rapidly in 2003 than its nearest rival, Microsoft’s Internet Information Services (IIS), according to a new survey--meaning that the open-source software remains by far the most widely used Web server on the Internet.” The same happened in 2004; in fact, in just December 2004 Apache gained a full percentage point over Microsoft’s IIS among the total number of all web sites.

    Apache’s dominance in the web server market has been independently confirmed by E-Soft’s Security Space - their report on web server market share published April 1st, 2007 surveyed 23,331,627 web servers in March 2007 and found that Apache was #1 (73.29%), with Microsoft IIS being #2 (20.01%). E-Soft also reports specifically on secure servers (web servers supporting SSL/TLS, such as e-commerce sites); Apache leads there too, with 52.49% market share, as compared to Microsoft’s 39.32%. See E-Soft’s site for more information.

    Netcraft has noted that by April 2007 some domains appear to be running lighttpd, but claim to be running Apache instead. For this paper’s purpose a lighttpd server claiming to be Apache does not harm the validity of the result, though. Both lighttpd and Apache are OSS/FS, so the market share of OSS/FS webservers would be the sum of them (and other OSS/FS web servers) anyway.

    Obviously these figures fluctuate monthly; see Netcraft and E-soft for their latest survey figures.
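
    To make the percentages above concrete, here is a minimal sketch of how hostname tallies of the kind Netcraft publishes turn into market-share figures, and why OSS/FS servers can be counted together. The counts below are invented for illustration (scaled so that Apache matches the 58.86% quoted above); see Netcraft for real survey data.

```python
# Hypothetical hostname tallies (NOT real Netcraft data), scaled so that
# Apache's share matches the 58.86% figure quoted in the text.
counts = {"Apache": 5886, "Microsoft IIS": 3113, "lighttpd": 150, "Other": 851}

total = sum(counts.values())
shares = {server: 100.0 * n / total for server, n in counts.items()}

# Since Apache and lighttpd are both OSS/FS, the combined OSS/FS share is
# their sum (plus any other OSS/FS servers lumped into "Other").
oss_fs_share = shares["Apache"] + shares["lighttpd"]

print(f"Apache: {shares['Apache']:.2f}%")       # 58.86%
print(f"OSS/FS combined: {oss_fs_share:.2f}%")  # 60.36%
```

    The same tallying can be done over only “active” sites, which is why the active-site and all-site percentages above differ.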

  2. GNU/Linux is the #2 web serving OS on the public Internet (counting by physical machine), according to a study by Netcraft surveying March and June 2001. Some of Netcraft’s surveys have also included data on OSes; two 2001 surveys (their June 2001 and September 2001 surveys) found that GNU/Linux is the #2 OS for web servers when counting physical machines (and has been consistently gaining market share since February 1999). As Netcraft themselves point out, the usual Netcraft web server survey (discussed above) counts web server hostnames rather than physical computers, and so it doesn’t measure such things as the installed hardware base. Companies can run several thousand web sites on one computer, and most of the world’s web sites are located at hosting and co-location companies.

    Therefore, Netcraft developed a technique that indicates the number of actual computers being used as Web servers, together with the OS and web server software used (by arranging many IP addresses to reply to Netcraft simultaneously and then analyzing the responses). This is a statistical approach, so many visits to the site are used over a month to build up sufficient certainty. In some cases, the OS detected is that of a “front” device rather than the web server actually performing the task. Still, Netcraft believes that the error margins world-wide are well within the order of plus or minus 10%, and this is in any case the best available data.

    Before presenting the data, it’s important to explain Netcraft’s system for dating the data. Netcraft dates their information based on the web server surveys (not the publication date), and they only report OS summaries from an earlier month. Thus, the survey dated “June 2001” was published in July and covers OS survey results of March 2001, while the survey dated “September 2001” was published in October and covers the operating system survey results of June 2001.

    Here’s a summary of Netcraft’s study results:

    OS group        Percentage (March)  Percentage (June)  Composition
    Windows         49.2%               49.6%              Windows 2000, NT4, NT3, Windows 95, Windows 98
    GNU/Linux       28.5%               29.6%              Linux
    Solaris         7.6%                7.1%               Solaris 2, Solaris 7, Solaris 8
    BSD             6.3%                6.1%               BSDI BSD/OS, FreeBSD, NetBSD, OpenBSD
    Other Unix      2.4%                2.2%               AIX, Compaq Tru64, HP-UX, IRIX, SCO Unix, SunOS 4 and others
    Other non-Unix  2.5%                2.4%               MacOS, NetWare, proprietary IBM OSes
    Unknown         3.6%                3.0%               not identified by Netcraft OS detector

    Much depends on what you want to measure. Several of the BSDs (FreeBSD, NetBSD, and OpenBSD) are OSS/FS as well; so at least a part of the 6.1% for BSD should be added to GNU/Linux’s 29.6% to determine the percentage of OSS/FS OSes being used as web servers. Thus, it’s likely that approximately one-third of web serving computers use OSS/FS OSes. There are also regional differences, for example, GNU/Linux leads Windows in Germany, Hungary, the Czech Republic, and Poland.

    Well-known web sites using OSS/FS include Google (GNU/Linux) and Yahoo (FreeBSD).

    If you really want to know about the web server market breakdown of “Unix vs. Windows,” you can find that also in this study. All of the various Windows OSes are rolled into a single number (even Windows 95/98 and Windows 2000/NT4/NT3 are merged, although they are fundamentally very different systems). Merging all the Unix-like systems in a similar way produces a total of 44.8% for Unix-like systems (compared to Windows’ 49.2%) in March 2001.

    Note that these figures would probably be quite different if they were based on web addresses instead of physical computers; in such a case, the clear majority of web sites are hosted by Unix-like systems. As stated by Netcraft, “Although Apache running on various Unix systems runs more sites than Windows, Apache is heavily deployed at hosting companies and ISPs who strive to run as many sites as possible on one computer to save costs.”
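
    The share arithmetic used above can be reproduced from the March 2001 figures (the GNU/Linux March share of 28.5% follows from the 44.8% Unix-like total quoted in the text). In this sketch, the assumption that roughly two-thirds of the BSD share is OSS/FS is mine, not Netcraft’s:

```python
# March 2001 Netcraft OS shares (percent), as quoted in the text.
shares = {
    "Windows": 49.2, "Linux": 28.5, "Solaris": 7.6,
    "BSD": 6.3, "Other Unix": 2.4,
    "Other non-Unix": 2.5, "Unknown": 3.6,
}

# Merging all Unix-like systems, as the text does:
unix_like = shares["Linux"] + shares["Solaris"] + shares["BSD"] + shares["Other Unix"]
print(f"Unix-like total (March 2001): {unix_like:.1f}%")  # 44.8%, vs. Windows' 49.2%

# Rough OSS/FS estimate: GNU/Linux plus part of the BSD share.
# ASSUMPTION: treat 2/3 of "BSD" as OSS/FS (FreeBSD/NetBSD/OpenBSD, as
# opposed to the proprietary BSDI BSD/OS); Netcraft does not break this out.
oss_fs = shares["Linux"] + (2.0 / 3.0) * shares["BSD"]
print(f"Approximate OSS/FS share: {oss_fs:.1f}%")  # about one-third
```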

  3. GNU/Linux is the #1 server OS on the public Internet (counting by domain name), according to a 1999 survey of primarily European and educational sites. The first study that I’ve found that examined GNU/Linux’s market penetration is a survey by Zoebelein in April 1999. This survey found that, of the total number of servers deployed on the Internet in 1999 (running at least ftp, news, or http (WWW)) in a database of names they used, the #1 OS was GNU/Linux (at 28.5%), with others trailing. It’s important to note that this survey, which is the first one that I’ve found to try to answer questions of market share, used existing databases of servers from the .edu (educational domain) and the RIPE database (which covers Europe, the Middle East, parts of Asia, and parts of Africa), so this isn’t really a survey of “the whole Internet” (e.g., it omits “.com” and “.net”). This is a count by domain name (e.g., the text name you would type into a web browser for a location) instead of by physical computer, so what it’s counting is different than the Netcraft June 2001 OS study. Also, this study counted servers providing ftp and news services (not just web servers).

    Here’s how the various OSes fared in the study:

    Operating System  Market Share  Composition
    GNU/Linux         28.5%         GNU/Linux
    Windows           24.4%         All Windows combined (including 95, 98, NT)
    Sun               17.7%         Sun Solaris or SunOS
    BSD               15.0%         BSD Family (FreeBSD, NetBSD, OpenBSD, BSDI, ...)

    A part of the BSD family is also OSS/FS, so the OSS/FS OS total is even higher; if over 2/3 of the BSDs are OSS/FS, then the total share of OSS/FS would be about 40%. Advocates of Unix-like systems will notice that the majority (around 66%) were running Unix-like systems, while only around 24% ran a Microsoft Windows variant.

  4. GNU/Linux was the #2 server OS sold in 1999, 2000, and 2001. According to a June 2000 IDC survey of 1999 licenses, 24% of all servers (counting both Internet and intranet servers) installed in 1999 ran GNU/Linux. Windows NT came in first with 36%; all Unixes combined totaled 15%. Again, since some of the Unixes are OSS/FS systems (e.g., FreeBSD, OpenBSD, and NetBSD), the number of OSS/FS systems is actually larger than the GNU/Linux figures. Note that it all depends on what you want to count; 39% of all servers installed from this survey were Unix-like (that’s 24%+15%), so “Unix-like” servers were actually #1 in installed market share once you count GNU/Linux and Unix together.

    IDC released a similar study on January 17, 2001 titled “Server Operating Environments: 2000 Year in Review”. On the server, Windows accounted for 41% of new server OS sales in 2000, growing by 20%, but GNU/Linux accounted for 27% and grew even faster, by 24%. Other major Unixes had 13%.

    IDC’s 2002 report found that Linux held its own in 2001 at 25%. All of this is especially intriguing since GNU/Linux had 0.5% of the market in 1995, according to a Forbes quote of IDC. Data such as these (and the TCO data shown later) have inspired statements such as this one from IT-Director on November 12, 2001: “Linux on the desktop is still too early to call, but on the server it now looks to be unstoppable.”

    These measures do not measure all server systems installed that year; some Windows systems are copies that have not been paid for (sometimes called pirated software), and OSS/FS OSes such as GNU/Linux and the BSDs are often downloaded and installed on multiple systems (since it’s legal and free to do so).

    Note that a study published October 28, 2002 by the IT analyst company Butler Group concluded that on or before 2009, Linux and Microsoft’s .Net will have fully penetrated the server OS market from file and print servers through to the mainframe.

  5. GNU/Linux and Windows systems (when Windows CE and XP are combined) are the leaders and essentially even in terms of developer use for future embedded projects, according to Evans Data Corporation (EDC). Their Embedded Systems Developer Survey, fielded in July 2002, asked developers “For each of the following operating systems, please indicate whether you are targeting the OS on your current project or your next project.” They collected data from 444 developers. Their results: 30.2% of embedded developers use or expect to use Linux, while 16.2% say they will use Windows CE and another 14.4% say they will use Windows XP Embedded. If the two Windows systems are combined, this gives Windows Embedded operating systems a statistically insignificant edge over Embedded Linux (at 30.6% vs. 30.2%). However, Embedded Linux has nearly double the growth rate, and combining two different Windows systems into a single value is somewhat misleading. Wind River’s VxWorks embedded OS, the current embedded software market leader, “trails slightly behind Embedded Linux for current project use, and VxWorks’ modest gain of just 2.9% for expected use in future projects drops it to a distant third place position, ending up with less than half the usage rate of the two neck-and-neck future project usage leaders (Windows Embedded and Embedded Linux).”

  6. An Evans Data survey published in November 2001 found that 48.1% of international developers and 39.6% of North Americans plan to target most of their applications to GNU/Linux. In October 2002, they found that 59% of developers expect to write Linux applications in the next year. The November 2001 edition of the Evans Data International Developer Survey Series reported on in-depth interviews with over 400 developers representing over 70 countries, and found that when asked which OS they plan to target with most of their applications next year, 48.1% of international developers and 39.6% of North Americans stated that they plan to target most of their applications to GNU/Linux. This is surprising since only a year earlier less than a third of the international development community was writing GNU/Linux applications. The survey also found that 37.8% of the international development community and 33.7% of North American developers have already written applications for GNU/Linux, and that over half of those surveyed have enough confidence in GNU/Linux to use it for mission-critical applications.

    Evans Data conducted a survey in October 2002. In this survey, they reported “Linux continues to expand its user base. 59% of survey respondents expect to write Linux applications in the next year.”

  7. An IBM-sponsored study suggested that GNU/Linux has “won” the server war as of 2006: 83% of companies expected to deploy new workloads on GNU/Linux in the coming year, versus only 23% for Windows. The November 9, 2006 article The war is over and Linux won by Dana Blankenhorn summarizes this IBM-sponsored study, which determined that 83% of companies expect to support new workloads on Linux next year, against 23% for Windows. He noted, “Over two-thirds of the respondents said they will increase their use of Linux in the next year, and almost no one said the opposite.”

  8. Half of all mission-critical business applications are expected to run on GNU/Linux by 2012. A survey of IT directors, vice presidents, and CIOs carried out by Saugatuck Research, reported in January 2007, suggests that nearly half of all companies will be running mission-critical business applications on Linux in five years’ time.

  9. An Evans Data survey made public in February 2004 found that 1.1 million developers in North America were working on OSS/FS projects. Evans Data’s North American Developer Population Study examined the number of software developers using various approaches. It found that more than 1.1 million developers in North America were spending at least some of their time working on Open Source development projects. That’s an extraordinarily large number of people, and it doesn’t even account for developers in other countries. Many only develop part-time, but that many people can develop a lot of software, and having a large number of people increases the likelihood of helpful insights and innovations in various OSS/FS projects.

  10. A 2004 InformationWeek survey found that 67% of companies use OSS/FS products, with another 16% expecting to use it in 2005; only 17% have no near-term plans to support OSS/FS products. The November 1, 2004 InformationWeek article Open-Source Software Use Joins The Mix by Helen D’Antoni reported the results from InformationWeek Research, which measured adoption of “open-source architecture” and found that adoption is widespread. The survey also found other interesting results: “In general, companies don’t view open-source software as risky. It often functions alongside [proprietary] and internally developed software, and because of this acceptance, open-source code is being used more broadly. Its use is evolving as companies look for cost-effective ways to manage software expenses.” Of those companies using OSS/FS, they found that 42% of companies implement production database operations using OSS/FS, with 33% more considering it; only 25% are not using or considering OSS/FS for production database use.

  11. A Japanese survey found widespread use and support for GNU/Linux; overall use of GNU/Linux jumped from 35.5% in 2001 to 64.3% in 2002 of Japanese corporations, and GNU/Linux was the most popular platform for small projects. The book Linux White Paper 2003 (published by Impress Corporation) surveys the use of GNU/Linux in Japan (it is an update to an earlier book, “Linux White Paper 2001-2002”). This is written in Japanese; here is a brief summary of its contents.

    The survey has two parts, user and vendor. In “Part I : User enterprise”, they surveyed 729 enterprises that use servers. In “Part II : Vendor enterprise”, they surveyed 276 vendor enterprises who supply server computers, including system integrators, software developers, IT service suppliers, and hardware resellers. The most interesting results are those that discuss the use of Linux servers in user enterprises, the support of Linux servers by vendors, and Linux server adoption in system integration projects.

    First, the use of Linux servers in user enterprises:
    System                   2002     2001
    Linux server             64.3%    35.5%
    Windows 2000 Server      59.9%    37.0%
    Windows NT Server        64.3%    74.2%
    Commercial Unix server   37.7%    31.2%

    And specifically, here’s the average use in 2002:
    System                   Avg. units   # samples
    Linux server             13.4         N=429 (5.3 in 2001)
    Windows 2000 Server      24.6         N=380
    Windows NT Server        4.5          N=413
    Commercial Unix server   6.9          N=233
    Linux servers were the fastest-growing category from the previous year: the average number of Linux server units per enterprise increased 2.5-fold, from 5.3 units to 13.4 units.
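    The “2.5-fold” figure follows directly from the averages in the table; a quick check (using only the survey numbers quoted above):

```python
# Check the growth figure quoted above: average Linux server units per
# enterprise rose from 5.3 (2001) to 13.4 (2002) in the Japanese survey.
units_2001 = 5.3
units_2002 = 13.4
growth = units_2002 / units_2001
print(f"Growth: {growth:.1f}-fold")  # prints "Growth: 2.5-fold"
```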

    Second, note the support of GNU/Linux servers by vendors:
    System                   Year 2002 Support
    Windows NT/2000 Server   66.7%
    Linux server             49.3%
    Commercial Unix server   38.0%
    This is the rate of vendors that develop or sell products supporting Linux servers; note that Linux is already a major OS when compared with its competitors. The reasons for supporting Linux servers were also surveyed, and they turn out to differ from the reasons in some other countries (for a contrast, see the European FLOSS report):
    Reason                                  Share
    Increase of importance in the future    44.1%
    Requirement from their customers        41.2%
    Major OS in their market                38.2%
    Free of license fee                     37.5%
    Most reasonable OS for their purpose    36.0%
    Open source                             34.6%
    High reliability                        27.2%

    Third, note the rate of Linux server adoption in system integration projects:
    GNU/Linux servers are #1 (62.5%) in small projects of less than 3,000,000 Yen ($24,000 US, at the report’s rate of 1 Million Yen = $8,000 US), and GNU/Linux has grown in larger projects of more than 50,000,000 Yen ($400,000 US) from 20.0% to 39.0%. In projects over 100,000,000 Yen ($800,000 US), Linux is adopted by 24.4% of the projects (mainly as a substitute for proprietary Unix systems). Note that many projects (especially large ones) use multiple platforms simultaneously, so the values need not total 100%.
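    The dollar figures above are straightforward conversions at the report’s stated rate; as a small illustrative sketch (the function name is mine, not the report’s):

```python
# Convert the project-size thresholds quoted above from yen to US dollars,
# at the article's stated rate of 1 million yen = $8,000 US.
USD_PER_MILLION_YEN = 8_000

def yen_millions_to_usd(millions: float) -> int:
    """Convert a project size in millions of yen to US dollars."""
    return int(millions * USD_PER_MILLION_YEN)

print(yen_millions_to_usd(3))    # 24000  -- the "small project" threshold
print(yen_millions_to_usd(50))   # 400000 -- the "larger project" threshold
print(yen_millions_to_usd(100))  # 800000 -- the largest project category
```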

    Note that the Japanese Linux white paper 2003 found that 49.3% of IT solution vendors support Linux in Japan.

  12. The European FLOSS study found significant use of OSS/FS. The large report Free/Libre and Open Source Software (FLOSS): Survey and Study, published in June 2002, examined many issues including the use of OSS/FS. This study found significant variation in the use of OSS/FS by country: 43.7% of German establishments reported using OSS/FS, 31.5% of British establishments reported using OSS/FS, while only 17.7% of Swedish establishments reported using OSS/FS. In addition, they found that OSS usage rates of larger establishments were higher than those of smaller establishments, and that OSS usage rates in the public sector were above average.

  13. Microsoft sponsored its own research to “prove” that GNU/Linux is not as widely used, but this research has been shown to be seriously flawed. Microsoft sponsored a Gartner Dataquest report claiming only 8.6% of servers shipped in the U.S. during the third quarter of 2000 were Linux-based. However, it’s worth noting that Microsoft (as the research sponsor) has every incentive to produce low numbers, and these numbers are quite different from IDC’s research on the same subject. IDC’s Kusnetzky commented that the likely explanation is that Gartner used a very narrow definition of “shipped”; he thought the number was “quite reasonable” if it only surveyed new servers with Linux, “But our research is that this is not how most users get their Linux. We found that just 10 to 15 percent of Linux adoption comes from pre-installed machines... for every paid copy of Linux, there is a free copy that can be replicated 15 times.” Note