MS Windows NT: Boosting Competition in the Enterprise
Just as the PC has democratized desktop computing, client/server technology is freeing the entire corporate computing environment from its dependence on the mainframe. Instrumental in the growing success of the client/server model was the launch of Microsoft's Windows NT operating system in 1993. A new kind of operating system that addresses the needs of network users and operators in an open computing environment, Windows NT is widely and inexpensively licensed to any computer manufacturer that wants it – and its application programming interfaces (APIs) are developed with the help of industry input and published for anyone to use.
This paper looks at the many economic, technical and performance factors behind the popularity of Microsoft Windows NT Server, which currently accounts for around 40% of installed network operating systems. Its key findings are developed in the sections that follow.
The Evolution of Client/Server Computing
The nature of corporate computer systems has changed enormously in recent decades. From the mainframe-dominated world of the 1960s and 1970s, in which remote text-only terminals were connected to centralized banks of mainframes and (later) minicomputers, the industry underwent a far-reaching transformation in the early 1980s as customers began to appreciate the huge price/performance advantages of the personal computer.
The PC's democratizing effect on computing was powered by Intel's open microprocessor architecture and by Microsoft's open operating-system platform, which was cheaply licensed to any computer maker that wanted to use it. Attracted by a fast-growing user base – and by Microsoft's policy of making its application programming interfaces (APIs) widely and freely available – large numbers of independent software developers started designing applications for the operating system. The operating system's appeal was further enhanced by Microsoft's continuing efforts to add new and easy-to-use APIs (which allow developers to make use of an operating system's services), and to provide a vision for future development.
In the early years of the PC, most corporate computer systems bridged these two separate technology models. Mission-critical enterprise computing remained the province of mainframes and minicomputers, while the computing needs of individual employees and their workgroups were increasingly met by standalone desktop PCs. Data transfer between these two was sporadic, and often accomplished by disk-based media – an inefficient way to run an enterprise. This hybrid solution still left enterprise computing dependent on centralized mainframes – machines that were scalable only at significant expense, and that required users to share centralized computing resources even if they were carrying out unrelated tasks.
In view of the manifest technological and economic limitations of the old mainframe model, corporations in the late 1980s began to adopt computer systems based on the "client/server" model. In client/server computing, networks of PCs are connected to PC-based central "servers" that store data and provide shared computing services for everyone on the network. The client/server model provides users with their own individual PCs for independent tasks – such as financial and budget analyses, which require many recalculations – so they don't burden shared computing resources in the way they did on a mainframe. In short, it brings real computing power to the people. But users can still access shared resources, such as printers, file servers, databases and email, through the servers on the network. Better still, client/server networks are easily scalable: as users are added, the number and power of servers is increased incrementally. This confers huge economic benefits.
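The division of labor described above can be sketched in miniature. The following is a hypothetical Python sketch (not any real network operating system): one server process holds shared data, and any client on the network queries it over a socket rather than storing the data locally.

```python
import socket
import threading

def serve_once(host="127.0.0.1"):
    """Start a tiny 'server' that answers a single client request.
    It owns the shared data, in the client/server style."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))            # port 0: let the OS pick a free port
    srv.listen(1)
    port = srv.getsockname()[1]

    def handler():
        conn, _ = srv.accept()
        name = conn.recv(1024).decode()
        # Shared resource held centrally; clients need no local copy.
        prices = {"widget": "9.99", "gadget": "14.50"}
        conn.sendall(prices.get(name, "unknown").encode())
        conn.close()
        srv.close()

    threading.Thread(target=handler, daemon=True).start()
    return port

def client_query(port, name):
    """A 'client' PC: does its own work locally, but asks the server
    for shared data when it needs it."""
    c = socket.create_connection(("127.0.0.1", port))
    c.sendall(name.encode())
    reply = c.recv(1024).decode()
    c.close()
    return reply
```

The point of the sketch is the separation: the client machine is free for its own independent tasks, and only the shared lookup travels over the network.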
The PC-based client/server model has also gained strength from a number of fundamental advances in information technology. The rise of the Internet – and the World Wide Web – is blurring the boundaries between private and public networks, and driving customer demand for sophisticated server technology that can blend the protocols of corporate computer networks with those of the Internet. At the same time, a massive increase in the power, capacity and reliability of PC-based servers and software is making it possible for enterprise-computing systems to benefit from the PC computing model, too.
Instead of being forced to rely on expensive, proprietary hardware and software (which often has to be purchased from a single vendor), companies increasingly prefer to use open PC technology for their mission-critical enterprise-computing needs. This change has been reflected in the structure of the industry. Compaq Computer, for example, has moved from being a manufacturer solely of PCs to being a major player in enterprise computing – a strategic metamorphosis that will be aided by its purchase of Digital Equipment Corporation, a company which built its business on the old proprietary computing model. And in 1993 Microsoft, for its part, introduced Windows NT, a scalable, open operating system designed for seamless operation of enterprise computing systems, from the desktop to mission-critical servers.
All this is transforming enterprise computing into one of the most competitive segments of the information-technology industry. In the past, enterprise computing was dominated by vertically integrated companies (such as IBM, Sun Microsystems and Digital Equipment Corporation) that sold everything from hardware to software and all the other components in between – typically all in a single box, for a single price. Having invested in entire computer systems made by one – and only one – vendor, customers found it hard to switch without scrapping all their hardware and software, and making substantial new investments in an alternative technology.
PC-based client/server technology, by contrast, offers corporations enormous choice in hardware, applications, service and support options. A decision to move from one manufacturer's server to another's no longer means having to scrap an entire enterprise network and start over. And at the heart of this flexibility is Microsoft's Windows NT Server operating system.
As client/server computing started gaining momentum, it initially introduced a new set of costs. Instead of working together as a single unit, servers were generally deployed like individual, small-scale mainframes; as a result, the more servers there were, the harder it was to manage a network. Writing applications that insulated users and administrators from dealing with multiple servers was difficult, because the operating system and network services were not fully integrated – each server was, in effect, an island, and different servers served different functions. Network users looking for a specific piece of information had to know on which machine it was physically stored. Both users and administrators found that computing had in fact become more complicated under the client/server model than it had been under the old centralized mainframe model.
In the early 1990s these complexities led some industry observers to speculate that the client/server model would be short-lived. Customers, however, wanted a cost-effective solution, not a return to the mainframe era. Microsoft's response was to pioneer a totally new kind of operating system that not only addressed the needs of network users and operators, but did so in an open computing environment where the operating system would be widely and inexpensively licensed to hardware manufacturers, and APIs would be developed with industry input and published for anyone to use.
This open approach to operating system development and licensing – which has brought cheap and effective computing to individuals at work, at school and at home, and has driven the great success of the PC industry – is at the heart of Microsoft's Windows NT Server model. It contrasts with the strategy of vendors of competing operating systems, such as the Apple Macintosh and various versions of UNIX. They generally refuse to license their operating-system software widely to hardware manufacturers, which means prospective customers must purchase their entire computing system – often everything from microprocessor to follow-on services – from a single vendor. Naturally this leads to higher margins for the vendor – and higher prices for customers. That, in turn, leads to lower sales, the development of fewer applications, and less innovation in both hardware and software.
In a client/server environment, Windows NT Server is more reliable and scalable than any other distributed system in the marketplace, and substantially easier to deploy, manage and use – one reason why it has found favor among small business users. By building on the high-volume, low-cost PC model and on Microsoft's open licensing practices, Windows NT Server gives enterprise customers more choice and lower costs. Research commissioned by Microsoft and Compaq from the independent Business Research Group (BRG), of Newton, MA, estimates that total ownership costs per server are 36% lower for Windows NT Server than for Sun Microsystems' Solaris on SPARC. (Solaris is a version of UNIX developed by Sun Microsystems; SPARC is Sun's microprocessor architecture.)
Although Microsoft's Windows NT Server was first released only five years ago, it offers users access to three times as many applications as are available for Sun and other UNIX-based operating systems. And it gives customers a choice of more than 230,000 Microsoft-certified support professionals, as well as more than 10,000 solution providers (who tailor software and hardware to customers' requirements). Windows NT Server removes the barriers that formerly existed between the desktop and the enterprise, and between small, medium and large businesses. This means much lower costs for corporate customers, whatever their size.
Customers have responded enthusiastically to the benefits offered by Windows NT Server. According to the BRG study, 26% of the Fortune 1000 information-systems managers who were surveyed said they used Windows NT Server for their most mission-critical applications in a distributed computing environment, compared with 28% for all UNIX vendors (of which Sun accounts for only 6%). BRG also found that users of Windows NT Server spent 38% less on hardware and operating systems – and with 22% fewer servers, they also spent 52% less overall on their entire systems. At the same time, Windows NT Server users spent 68% less on value-added software, including development tools, databases, applications and utilities.
It is the evident benefits of Windows NT – rather than any so-called "leveraging" of Microsoft's success with desktop operating systems – that explain why Windows NT is proving so popular with corporations eager to cut their computing costs and deploy more effective solutions. But why is Windows NT Server so much more cost-effective and productive than rival UNIX server solutions? Several factors stand out, and they are examined in the sections that follow.
Corporate customers demand more than cost-effectiveness, however. They also tell us they want cutting-edge interoperability, openness and all the features that will make their enterprise productive in today's increasingly digital economy. These factors are examined in detail below.
Enterprises today demand open operating systems that offer flexibility, portability and easy interoperability. The economic benefits of open computing systems have been clearly demonstrated by the PC model, which has cut the cost of computing more than a million-fold in less than two decades. The risk of acquiring a computer system that becomes technologically obsolete is also greatly reduced, since the customer is not dependent on the success or failure of a single vendor. From the earliest development of Windows NT Server, Microsoft has actively promoted a strategy of "any client to any server, any server to any client, and any server to any server" – a stark contrast to UNIX-based networks, which generally require a third-party driver to achieve connectivity. In addition, UNIX operating systems often support only the hardware of those vendors building and selling them.
Windows NT Server delivers on each of the factors necessary for wholly open and interoperable computer networking – it is the most open server operating system in the marketplace, as the following makes clear:
Building and Supporting Industry Standards
In every industry that relies on open standards, competing protocols often emerge as the technology evolves: competition drives innovation. Examples outside the computer industry include the rivalry between VHS and Betamax in VCRs (which VHS won in the consumer marketplace primarily because its longer playing time allowed users to record an entire movie on a single cassette), and the race to develop an audio-only standard for DVD. The client/server business – and the computer industry in general – relies on numerous standard protocols.
In many cases, the standards themselves emerge as the result either of competition or of work by a single company to establish open protocols. In cases where a required protocol did not exist, Microsoft has worked with everyone from standards committees to groups of interested parties to ensure that any new standard aids openness across the entire industry. One example is the Dynamic Host Configuration Protocol (DHCP), which responded to customers' need for dynamically assigned TCP/IP addresses. Prior to DHCP there was no standard way to do this. Today, DHCP is supported by Sun, Novell and other vendors, as well as by a variety of enterprise customers.
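The problem DHCP solves can be sketched roughly as follows (a hypothetical Python model of dynamic leasing, not the protocol's actual packet exchange): a server hands out addresses from a shared pool, each with an expiring lease, so no client needs a permanently assigned address.

```python
import time

class LeasePool:
    """Toy model of DHCP-style dynamic address assignment: addresses
    are leased from a pool for a limited time and reclaimed on expiry.
    Address ranges and lease lengths here are illustrative only."""

    def __init__(self, addresses, lease_seconds=3600):
        self.free = list(addresses)
        self.lease_seconds = lease_seconds
        self.leases = {}   # client id (e.g. MAC) -> (address, expiry)

    def request(self, client_id, now=None):
        now = time.time() if now is None else now
        if client_id in self.leases and self.leases[client_id][1] > now:
            # Renewal: the client keeps its current address.
            addr = self.leases[client_id][0]
        else:
            self._reclaim_expired(now)
            addr = self.free.pop(0)   # IndexError if the pool is exhausted
        self.leases[client_id] = (addr, now + self.lease_seconds)
        return addr

    def _reclaim_expired(self, now):
        # Return expired addresses to the pool for reuse by other clients.
        for cid, (addr, expiry) in list(self.leases.items()):
            if expiry <= now:
                self.free.append(addr)
                del self.leases[cid]

pool = LeasePool(["10.0.0.%d" % i for i in range(2, 5)], lease_seconds=60)
first = pool.request("aa:bb:cc:01")
second = pool.request("aa:bb:cc:02")
```

Renewal before expiry returns the same address, while expired leases free their addresses for other machines – the behavior that lets a small address pool serve a larger, changing population of clients.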
The standards governing how different software components communicate with each other provide a good example of how Microsoft's Windows NT Server supports a totally open industry model, while at the same time trying to advance the state of the technological art. Here, two competing protocols have been vying to become the accepted industry standard: Component Object Model (COM) has been developed by Microsoft in collaboration with a large number of independent software developers; Common Object Request Broker Architecture (CORBA) has been promoted by a UNIX-centric group of companies.
Each of these protocols serves a similar purpose – to enable software components across a network to identify themselves to other software components, regardless of platform or operating system. Both COM and CORBA have always been fully open; both are implemented on a number of platforms. As enhancements to COM specifications are made, Microsoft notifies all independent software developers, in order to ensure their applications are compatible; the Object Management Group, which controls CORBA, does likewise. Microsoft has also licensed COM to a variety of UNIX and other system vendors, such as Digital, Hewlett-Packard and Silicon Graphics, ensuring its cross-platform performance. (Microsoft even supports COM on Sun systems, despite the fact that Sun does not support COM.) In addition, DCOM, the interoperability protocol for COM, is based on industry-standard DCE RPC, and runs across standard TCP/IP or other transports.
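The discovery mechanism at the heart of COM can be illustrated with a simplified sketch (Python standing in for real COM, which is a binary standard): every interface is identified by a GUID, and a caller asks a component at run time, via QueryInterface, whether it supports a given interface. The IUnknown GUID below is COM's real one; the class itself is a hypothetical stand-in.

```python
# The actual GUID of COM's root interface, IUnknown.
IID_IUNKNOWN = "00000000-0000-0000-C000-000000000046"

class ComObject:
    """Minimal stand-in for a COM component: interfaces are looked up
    by GUID, mirroring IUnknown's QueryInterface/AddRef/Release."""

    def __init__(self):
        self._refs = 1
        # Map of supported interface GUIDs to their implementations.
        self._interfaces = {IID_IUNKNOWN: self}

    def query_interface(self, iid):
        # Return the requested interface, or None
        # (real COM returns the E_NOINTERFACE error code instead).
        obj = self._interfaces.get(iid)
        if obj is not None:
            obj.add_ref()   # COM rule: a returned interface is AddRef'd
        return obj

    def add_ref(self):
        self._refs += 1
        return self._refs

    def release(self):
        self._refs -= 1
        return self._refs
```

Because the negotiation happens by GUID at run time, the caller never needs to know how, or on what platform, the component was built – which is what makes cross-platform and cross-language implementations of the model possible.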
The COM approach has proved attractive to a wide range of developers throughout the computer industry, and many more of them are now using COM than CORBA. Strategy Analytics/Giga Information Group estimates that COM is now a $670 million business – one that will grow to more than $2.6 billion by 2001. More than 150 million computers now use COM, and it is central to many of the industry's most popular applications, such as Microsoft Office, Lotus SmartSuite, Lotus Notes and Sybase PowerBuilder. The competitive, evolutionary process that produced COM is a constant in the computer industry, and one that ensures customers benefit from the most robust, reliable and open standards.
Microsoft has also worked hard to implement emerging technologies that have been developed by direct competitors. One example is Sun's Java programming language. Notwithstanding claims to the contrary by Sun, Microsoft's Java Virtual Machine (JVM) – the piece of software that enables programs written in Java to run – has consistently been judged by independent reviewers to be better at running Java than, for instance, the Netscape VM for Sun Solaris and IBM OS/2 Warp. In April 1998 PC Magazine made Microsoft's Java environment its Editors' Choice, stating that:
For the second year in a row, Microsoft has produced the fastest and most reliable Java implementation available … The Microsoft Java environment came close to a perfect score on our compatibility tests, where it ran all but one applet each on Microsoft Windows 95 and NT … [T]he compatibility rate for VMs running on Windows … is almost 20 percentage points better than VMs running under [Sun's] Solaris.
Microsoft's Java strategy is to meet the demands of both customers and developers, which is why we have committed significant resources so that Java developers can take advantage of Windows features when writing Java software, or can write least-common-denominator, cross-platform Java programs. But Microsoft believes different applications have different requirements and success factors – and developers are in the best position to determine what will make their application successful. While the Java market is characterized by significant hype fueled by competitive ambition, Microsoft has drawn on its extensive knowledge of the developer market to focus on where Java can actually deliver. For example, there will always be multiple languages, and portability is unlikely to be the key success factor for every application.
More developers than ever are targeting the Windows platform. Our research shows that more than 90% of all Java developers are doing their development on Windows. More than 50% of Java developers are using Java to build platform-specific applications – a number that is growing rapidly, because that's what customers are demanding (and that's where the technology delivers). Microsoft's strategy is not merely to run Java better than any other JVM; it is to help developers meld the productivity of Java with the power of Windows, so that they can build the kind of no-compromise applications that customers are demanding.
Supporting the Developer Community
At the heart of Windows NT Server's success is its openness: it supports the greatest number of hardware platforms and the most applications, and its APIs are fully documented and available to anyone who wants them. This is fundamental to the Windows NT Server model: the more hardware platforms and applications that are available for an operating system, the more productive and cost-effective it will be for customers. Achieving this degree of openness means involving software developers and hardware designers at the earliest possible stages of operating-system design – and at every stage of its subsequent evolution. Microsoft has single-mindedly pursued this strategy with Windows NT Server, as it has with all its operating systems.
For Windows NT Server 5.0, which is not scheduled to debut until 1999, Microsoft has already held two professional developers' conferences and many design previews. The specifications of Windows NT Server 5.0 have been available for some time – indeed, those for several of its applications and systems services (such as transactions, data access and directory) have been previewed since early 1995. The first beta of the new operating system was made widely available in the final quarter of 1996 at Microsoft's professional developers' conference. Microsoft has also made volumes of information available via the World Wide Web, through the Microsoft Developers' Network, and through our Site Builder Network. In all, more than 8,000 APIs are freely available to developers working on Windows applications of all kinds.
Software that is still under development – from Microsoft or others – is sometimes derided as "vaporware." Allegations that Microsoft somehow acts improperly by providing customers and developers with information about software that is still under development were thoroughly investigated by the Department of Justice in the investigation that culminated in 1994, and the Department found no basis to assert any claim against Microsoft. Indeed, as even our competitors acknowledge, third-party developers want to know the specifications of forthcoming operating systems as far in advance as possible. This is pro-competitive: it means more applications on the market.
Although Microsoft has no contractual or legal obligation to notify third-party developers of either initial specifications or changes to those specifications, it would undermine the Windows NT Server model if significant changes were made without their knowledge. Further, we actively solicit developers' input, so that we build what their customers tell them they need. The earlier Microsoft lets developers know of changes, the more effectively they can incorporate them into their applications – and the better the Windows NT Server platform will perform for customers. For Microsoft to do otherwise would be commercial suicide. The claim that Microsoft somehow "hides" APIs from developers is both false and illogical: to ensure that as many compatible applications as possible are on sale when a new operating system is launched, Microsoft makes all necessary APIs widely available, and, indeed, extensively promotes their use.
Listening to the Customer
Rapid product improvement – in part through continuous integration of new features into existing products – is a hallmark of the modern computer industry. PCs increasingly include integrated features – such as disk drives, CD-ROM drives, modems and speakers – that were in the past only available separately. Microprocessors, themselves built on integrated-circuit technology, have continuously absorbed new functions and features since their introduction in 1971. And the same is true of every computer operating system on the market today.
Whether they are buying a single computer or a mission-critical enterprise system, customers say they want operating systems to be steadily improved with new features and functions – not an ever-growing tangle of separate products, each with its own system requirements, commands and price tag. Integration adds value to the operating system by reducing system-management costs, and allowing customers to spend more time and resources on their business. By extending the existing capabilities of the operating system (e.g., adding Web access to applications), integration simplifies installation, configuration and use, and improves reliability and productivity.
And, clearly, Microsoft is not the only operating-system vendor that is integrating technology to serve customers better.
Such integration makes perfect sense for customers. For example, directory services are critical to implementing and managing the security of an operating system. Integration of directory services improves simplicity of installation, system configuration and management. Integrating directory services with the operating system also means that they can be integrated with other key distributed services, such as application services and network services. Administrators benefit from centralized management, and developers don't have to undertake systems integration to build these solutions. And customers are assured that a single system-management infrastructure is being used to manage objects, whether they are being accessed by the system, by an application, or by an administrator or end-user.
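The argument can be made concrete with a toy sketch (hypothetical Python, not Microsoft's actual directory service): a directory maps logical resource names to the servers that hold them, and the same entries carry access-control information, so a single infrastructure serves users, applications and administrators alike.

```python
class Directory:
    """Toy network directory: resources are registered under logical
    names, so users locate them without knowing which physical server
    holds them. Names and servers below are illustrative only."""

    def __init__(self):
        self._entries = {}

    def register(self, name, server, allowed_users=()):
        # One entry carries both location and security information,
        # so one management infrastructure covers both concerns.
        self._entries[name] = {"server": server, "acl": set(allowed_users)}

    def locate(self, name, user):
        """Return the server holding `name`, or None if the entry is
        missing or the user is not authorized to see it."""
        entry = self._entries.get(name)
        if entry is None:
            return None
        if entry["acl"] and user not in entry["acl"]:
            return None
        return entry["server"]

corp = Directory()
corp.register("payroll-db", "server07", allowed_users=("alice",))
corp.register("print-queue", "server02")   # no ACL: open to all users
```

Contrast this with the "island" servers described earlier, where users had to know on which machine a resource was physically stored: here the lookup, and the security check, happen in one place.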
The same benefits of integration hold true for dial-in networking. Users want to access documents and information on their servers whether they are at their desk, at home or on the road. To minimize management and training costs, corporations want to manage and secure this access in the same way they secure the rest of their network resources. To achieve this, most operating systems now offer remote access or dial-in networking as a standard feature. And for simplicity of installation, configuration and management, it must be tightly integrated.
Internet services are also increasingly being integrated into operating systems. Almost every enterprise-scale operating system now includes Web-browsing technology, Web authoring tools and a Java Virtual Machine, simply because customers demand it. Windows NT Server is no exception: it has Web-browsing and server technology and a JVM. Microsoft included Web server technology (known as Internet Information Server) as a feature of Windows NT Server because customers had stressed their need for the "page and link" metaphor of the Web to access the files and applications that they have always stored and run on Windows NT Server. IIS technology has proved popular because it is easy to secure, reliable, fast and very scalable. It also allows site administrators to control all Web sites from a single location.
As with Windows NT Server, Web-server technologies are also integrated into both Sun's Solaris and Novell's NetWare, presumably because their customers have similar requirements. This is a highly competitive business, and one that offers both customers (who can easily switch between Web servers) and developers a high degree of choice.
Until the introduction of PC-based client/server technology, corporations had two choices for their enterprise-computing needs: continue to use their "legacy system" mainframes and minicomputers, with all the high costs associated with poorly scalable, proprietary computer systems; or purchase proprietary client/server hardware and software from companies such as Sun Microsystems or IBM – a solution that improved on the price/performance ratio of aging mainframes and minicomputers, but still left customers committed to a vertically integrated, proprietary model with high purchase, usage and service costs.
Microsoft's Windows NT Server is helping to free enterprise systems from the tyranny of proprietary networks. On every measure – price/performance, ease of use, manageability, interoperability, openness, scalability, availability of applications, flexibility and many others – Windows NT Server easily beats out the competition. These are precisely the values that antitrust law is designed to promote – competitive pricing and product improvements for customers. And that is why, only five years after it was launched, Windows NT Server accounts for around 40% of installed network operating systems – and its popularity is increasing rapidly.
But the popularity of Windows NT Server is not the result of "controlling the network via the desktop," as some of Microsoft's competitors have claimed – it is the result of listening to our customers, developing robust, long-term solutions to their needs, and adding the features they ask us for. Nor has Windows NT Server's success been achieved through "predatory pricing" – it has been achieved by following the low-price, high-volume model that Microsoft has pursued on the desktop, rather than by sticking to the defunct, high-price proprietary model favored by rivals such as Sun Microsystems. And Windows NT Server's success is not a function of locking customers into a proprietary network in which everything from the microprocessor to the operating system is sold and serviced by a single vendor. It is a function of offering customers more choice, from more vendors, than they had ever thought possible in enterprise computing.
Microsoft's Windows NT Server is succeeding, quite simply, because it offers corporate computer users what they want.
Last updated January 12, 2000