Intel Server Trends

  • Written By: R. Krause
  • Published: May 9, 2000

The product space being examined is Intel-based servers (still known as "PC servers" in some areas). In this note, we examine how five key trends affect users, vendors, and products.

In the Intel-based server market, there are five trends of note:

  1. Market consolidation
  2. Consolidation of servers within a company
  3. Focused functionality, e.g., "server appliances"
  4. Movement toward Linux/Linux vs. NT
  5. Stratification/segmentation

Each of these trends addresses a different area of the server world and how the market views it:

  • Market consolidation relates to the vendor landscape

  • Server consolidation relates to the supply and demand of classes of server

  • Focused functionality addresses the feature set customers can now expect as well as the partitioning of features/functionality

  • The Linux movement addresses the operating environment and architecture for servers

  • Stratification/segmentation deals with the growing divide between high-end and low-end systems, as well as the reduction in the number of categories into which servers are classified

Trends and Discussion

1) Market consolidation


"Market consolidation" is the name for the phenomenon where an ever-increasing share of a given market is owned by an ever-decreasing (or static) set of vendors. In the case of Intel-based servers, the four key vendors are the "Big Four" - Compaq, Dell, IBM, and HP. In the last two years, market share for the Big Four has gone from approximately 65% of the market to 75% or higher - an increase of roughly ten share points, or about 15% relative growth.
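The distinction between share points gained and relative growth is easy to conflate; a quick arithmetic check of the 65%-to-75% figures above (the function name is ours, for illustration):

```python
def share_growth(old_share: float, new_share: float) -> tuple[float, float]:
    """Return (share-point increase, relative growth) between two market-share figures."""
    points = new_share - old_share          # absolute change in share
    relative = points / old_share           # change relative to the starting base
    return points, relative

# Big Four share went from ~65% to ~75% of the market.
points, relative = share_growth(0.65, 0.75)
print(f"{points * 100:.0f} share points, {relative * 100:.1f}% relative growth")
# → 10 share points, 15.4% relative growth
```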


For the user, this trend is a double-edged sword.

On one hand, this narrows the field of viable vendors from which to choose. In cases where customers had not pre-determined which vendor's products would be used, diversity (i.e., non-consolidation) left the field wide open, perhaps too wide open. (The lack of "definition" in the market meant that just about any vendor could stick a CPU or two in a chassis, add a SCSI or RAID controller and some RAM and disk drives, and presto! now there's a server!) Consolidation also means fiercer competition among the remaining vendors. Increased competition usually benefits users, either through reduced pricing as vendors fight to increase market share, or through increased functionality.

On the other hand, users may now find their choices effectively limited, similar to network TV before the rise of cable. While the product selection available is adequate and suitable for the needs of most customers, there will always be some percentage whose needs go unmet. For users to feel a severe impact, we believe the Big Four would have to become the Big Three or Big Two. Even with further consolidation, we expect each vendor to provide approximately ten separate models from which to choose. (Currently, HP provides 11, Dell and IBM provide ten, and Compaq provides 11, plus three more about to be retired.) At some point, though, vendors may decide that offering multiple alternatives is a losing strategy.


As with users, market consolidation can be double-edged.

On the positive side, as consolidation continues, the companies left standing are reasonably strong and need only focus on the remaining competitors. Even if a small player develops a server with groundbreaking functionality or technology (10% probability), the technology will quickly be assimilated by one or more of the big players. A variant of this is the development of new markets (e.g., server appliances). This market was created by small players, and while some of the Big Four took longer than others to catch on, all of them are now in the game.

On the downside, the competition between the major players will intensify. Before consolidation, big vendors could take advantage of their superior strength and perceived corporate viability to overwhelm the smaller players. With four strong players vying for the same IT dollars, the match is more even, thus more intense.


Although consolidation does not guarantee product line reduction by the survivors, history indicates it is a likely outcome in the long term. An example is Compaq's reduction of ProLiant server models from 12 to nine. This reduction is primarily due to Compaq having too many models covering the same market "space". However, it is unlikely Compaq would feel as great a need to cull models if the competition (i.e., Dell) weren't breathing down its neck.

2) Server Consolidation


In general terms, "server consolidation" translates into "replace a large bunch of smaller servers with a smaller bunch of large servers". This is due to the confluence of a number of factors:

  1. Processing power and density have increased. Eight-CPU servers are now available from the Big Four, providing far more transaction-processing power than was available from servers two to three years ago.

  2. Computing requirements have increased. As applications and operating systems have consumed more and more computing resources, a commensurate increase in computing power is needed. However, this power is not always needed full-time, leaving headroom for other applications to be migrated from smaller machines. In addition, company growth often forces IT directors to upgrade their computing infrastructure incrementally.

  3. Floor space is costly. Despite the increase in computing power, customers - especially ones with large computer centers - are trying to fit as much as they can into ever-smaller spaces. Since outward/sideways expansion does nothing to lower the per-square-foot cost, users prefer to expand upward. This leads to the use of large equipment racks, into which appropriately designed servers can be mounted.

  4. Administration of large numbers of servers can be difficult. It is much easier for a system administrator (sysadmin) to keep an eye on 10 eight-CPU systems than on 40 two-CPU systems, even with the ability to set alarms, guardbands, and other monitoring and administration methods. In addition to allowing closer management of the monitoring software, keeping servers in close proximity to each other also provides better visual feedback in the event of a fault.


For the user/customer, this means there is now the ability in some cases to move everything from a bunch of single-processor boxes into one box. Some companies have instituted trade-in policies to provide further incentives for customers. The benefits of consolidation are most appropriate for users who are trying to move a bunch of smaller applications (such as databases) into a centralized computing structure, or who have two or three larger applications that they want to run on one system.

The downside to server consolidation is the increased risk associated with having a single point of failure. Most high-end general purpose servers now come with redundancy and hot-swappability for power, fans, disks, and PCI cards such as network interface cards, but all that may still not be enough to prevent unwanted downtime. A typical Windows NT uptime "guarantee" is 99.5% - sounds impressive until you realize that translates into about 43 hours of downtime per year. While maintenance (one component of downtime) can often be scheduled, there will almost certainly still be failures.
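The uptime-to-downtime conversion quoted above is worth spelling out; a minimal sketch, using an 8,760-hour (non-leap) year:

```python
HOURS_PER_YEAR = 365 * 24  # 8,760 hours in a non-leap year

def annual_downtime_hours(uptime_pct: float) -> float:
    """Hours of downtime per year implied by an uptime percentage."""
    return (100.0 - uptime_pct) / 100.0 * HOURS_PER_YEAR

# The 99.5% NT "guarantee" quoted above:
print(annual_downtime_hours(99.5))           # ~43.8 hours per year
# For comparison, four 9s and six 9s:
print(annual_downtime_hours(99.99) * 60)     # ~53 minutes per year
print(annual_downtime_hours(99.9999) * 3600) # ~31.5 seconds per year
```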


Server consolidation means vendors will need to continue focusing on delivering high-performance, high-value systems. For the reasons described above, a large number of customers will want to move away from their older one- and two-CPU servers. This will increase the already strong price pressure on small/workgroup servers. Vendors are willing to take minuscule profit margins (typically 15%) on workgroup servers in exchange for the higher margins (typically 35% or more) delivered by high-end servers. In essence, manufacturers view the low-end servers as a necessary evil.


As more applications get consolidated onto large servers, tuned performance will increase in importance. This does not mean each box will be individually "tuned". Rather, it means that manufacturers will increasingly review the application load and type anticipated by the user, and provide a "pre-packaged" system that addresses most of the cases expected. Dell is already providing server sizing tools on its website. These tools are for specific applications, such as Microsoft Exchange, SAP R/3, Oracle 8i, and Novell ICS, and provide customers with rough guidelines as to what their system might need for CPUs, memory, and storage. Because it is becoming more difficult for a particular server to be "all things to all people" we expect to see more of this sizing/tuning focus in the future.
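A sizing tool of this kind boils down to mapping an anticipated load to a rough hardware recommendation. The sketch below illustrates the idea for a hypothetical mail server; the thresholds and resource figures are invented for illustration and are not Dell's actual rules:

```python
def size_mail_server(mailboxes: int) -> dict:
    """Hypothetical rough sizing for a mail server, keyed off mailbox count.

    The tiers and figures below are invented for illustration; real vendor
    tools derive their recommendations from application-specific benchmarks.
    """
    if mailboxes <= 500:
        return {"cpus": 1, "ram_mb": 256, "disk_gb": 18}
    elif mailboxes <= 2000:
        return {"cpus": 2, "ram_mb": 512, "disk_gb": 54}
    else:
        return {"cpus": 4, "ram_mb": 1024, "disk_gb": 108}

print(size_mail_server(1200))  # {'cpus': 2, 'ram_mb': 512, 'disk_gb': 54}
```

The output is a starting point, not a guarantee - which is exactly how the vendors' own tools present their guidelines.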

3) Focused Functionality


The flip side of the more powerful general purpose (GP) server, such as those used for server consolidation, is the focused server, popularly referred to as the "server appliance". This segment grew out of a couple of needs:

  1. The need to perform a few specific and focused tasks very well.
    GP servers that are "tuned" to provide optimal all-around performance usually perform no task exceptionally well. A wise man once said "You can't optimize for more than one factor at a time". This holds true for servers as well - if you optimize for database retrieval, you are causing another factor (such as Web serving or caching) to degrade.
  2. The unwillingness of users to pay for lots of extra/unneeded functionality.
    This is the natural by-product of #1 - you need to perform Task A really really well, but you don't want to be forced to pay for the overhead/infrastructure required to perform Tasks C and D.

The rise of the Internet helped force the issue, making people focus on the above factors a lot more than they had previously. For example, companies were accustomed to using large GP servers to interface to the Web. When someone finally realized that you could provide specific, targeted functionality to enhance Web surfing by means of caching technology (storing commonly accessed functions or pages at a local site, rather than on the Web proper), it opened up an entirely new market segment. Server appliances, no longer limited to caching, are the fastest growing segment of the server market. Some estimates are in the 75% CAGR range, about triple the growth rate for the general purpose server market.


Users benefit from server appliances by paying only for the functionality they need. Most users who need/want server appliances will actually need a mix of GP servers and appliances. However, this will still save most users money because at least part of the unneeded functionality will no longer drain money. In addition, targeted appliances usually provide improved performance. In the case of Web caching, being able to load a page in three seconds instead of 30 means more productivity and less frustration at the desktop. (Yes, saving 27 seconds is not much of a productivity boost. The idea is that everyone saves that much time on most of their Web surfing [business-related, of course] throughout the business day.)


Most of the Big Four Intel server vendors have appliances available to customers. Implementations vary from loading software such as Novell's ICS onto an existing system (Dell) to OEMing someone's "hot box" and loading caching software onto it (IBM) to OEMing a unit without modification (Gateway) to developing their own system from scratch (Compaq and startups such as Cobalt and CacheFlow). Presently, it is unclear which strategy will yield the best long-term results.

Consolidation in this market has already started: Quantum bought Meridian, maker of the Snap! server, and IBM bought Whistle Communications, maker of a small caching server. We expect some modest proliferation for the next year, followed by a longer period of consolidation.


As mentioned earlier, server appliances made their biggest splash in the caching market. This is primarily due to the glamour associated with all things Webby. Other areas of significant growth include Web serving (lots of ISPs want tons of small servers to provide redundancy and high performance) and Network-Attached Storage (NAS). Currently, the NAS market is "owned" by Network Appliance, although there are other players such as Procom (recently involved in OEMing to Hewlett-Packard) and Auspex (who appears to be losing significant ground and whom we do not expect to survive independently).

4) Linux vs. NT


Few issues generate more heat and emotion than the operating system battles of the last three years. The server OS market used to consist of Unix (of various flavors), Novell NetWare (really a Network Operating System - NOS - more than a "base level" OS), and various proprietary OSes. The rise of Windows NT added a fourth leg to this market. As NT gained market share and came to dominate the business desktop, people started seeking an alternative. The Unix market per se was too fragmented to mount a unified "defense" against the encroachment of NT. It appeared that NT would overrun the server OS space much as Windows had demolished other desktop OSes.

In the early 1990s, a then-unknown Finnish student named Linus Torvalds created a free Unix-like operating system, dubbed it Linux, and published the source code for the world to see. The idea was to have the programming world improve Linux a little bit at a time. "From these humble beginnings," as Bulwer-Lytton might write, sprang the OS that attained the #2 share in the server OS market for 1999. You can still get Linux for free, although most corporate entities choose to pay Red Hat, Turbo Linux, Caldera, or one of the other Linux distributors, in order to get documentation and customer support.

Windows NT is still too strong in corporations to be overtaken anytime soon. However, as more and more companies opt for Linux, NT's (or Windows 2000's) position will become more tenuous, and perhaps real competition will arise.


Corporate users should not choose Linux solely because it is "not Microsoft". Linux has a number of factors in its favor, such as scalability, robustness, and (if configured properly) security. Each situation where Linux is being considered must be evaluated on its own merits, and not just because Linux is the Next Big Thing.

A key concern of customers is the fragmentation of the Linux market, similar in some ways to that of Unix in general. We expect consolidation to start by the end of 2000. We also expect Red Hat (currently owning approximately 65% of the commercially-available Linux market share) to be one of the survivors - no surprise there. Although Corel's Linux distribution is supposed to be the easiest to load, plus able to run Windows applications, it is a desktop-oriented product, so we don't include it in our assessment.


Unsurprisingly, hardware vendors waited until 1999 to provide Linux in their servers. So far Dell is the only vendor with enough courage to factory-install Linux on its servers, but we expect others to follow suit in 2000. (Note that a lot of the Linux-oriented statements to come from server vendors in 1999 were merely "We will support Linux on some/all of our models", without any commitment to factory-install it.)

Vendors will need to intensify their commitment to Linux, at least for the short term. We believe Linux will not go away in the next 1-2 years, so vendors should make the most of it, or cede that market to Dell and whoever else is willing to commit to Linux. Most hardware vendors have formed relationships with at least one or two Linux distributors. We expect 2000 will bring semi-exclusive alliances, similar in ways to the Dell/Intel hardware alliance. We expect those alliances to be driven as much from the software side as the hardware side.


Presently there is no "one size fits all" distribution of Linux. As mentioned earlier, this perpetuates fragmentation in the marketplace, providing an opening for Microsoft to apply some more FUD. To combat the continuing arrows expected from Microsoft, at least one (and probably more) of the Linux distributors needs to bundle "missing" functionality into their version. Although this might be considered a parallel to the bundling that got Microsoft in trouble with the Department of Justice (DOJ), this can be thought of as a matter of long-term survival for the Linux vendors. Corporate IT managers will do what's best for their company, and if Linux does not provide as much needed functionality as Windows NT/2000, then the IT manager will stick with the safe bet of Microsoft.

5) Stratification/segmentation


By stratification, we mean the quasi-polarization of the server market into groups with a more tenuous connection than existed in the past. Specific groups we see arising are the very-high-end servers, the mid-range workhorse servers, and the focused servers such as server appliances. Stratification is the by-product of two other trends: server consolidation and focused functionality. As each of these markets grows, it will become more difficult to justify three, four, or five classes of GP server, as has recently been the norm.


For users, stratification means less long-term selection for "class" of machine. The typical customer could once buy servers for classifications such as workgroups, departments, applications, enterprises, and "super-enterprises", which were generally mapped to the size of the population being supported. In the long term, we expect those classifications to change to appliances, workgroup/department, and enterprise. These will be mapped by both functionality and size of supported population.

As stratification continues and variety decreases, competition will become more intense between the major suppliers. We expect the eventual winners to provide customers with low-cost products, flexible designs that can be reconfigured easily either by the factory assembler or the advanced user, and a seamless supply-chain process including rapid delivery of completed product.


Stratification will herald the next round of consolidation. As products have fewer classes into which they can be placed, a premium will be placed on those companies who can produce good, flexible designs in the shortest time, with minimum effort and maximum reuse. Not every vendor is up to the task, and so pressure will mount to produce or "move over".

We expect Compaq and Dell to maintain their lead roles in the server market. We expect IBM and HP will have a tougher time keeping up with the pace, although they provide other benefits such as support for multi-architecture environments.

It should be noted that as this trend continues, it will repeat the trend from the desktop PC market. We predicted some time ago that once servers became less "black magic" and more commoditized, the server market would repeat the PC market trend, as has every maturing market since capitalism was invented. The vendors who cannot keep ahead of the trend wave must seek out other avenues of differentiation, such as service and support.


For the general purpose servers, we expect to see designs become even more modular than they already are. Savvy vendors presently develop chassis which can be used for a multitude of products. We expect to see this refined even further, although we don't expect to see chassis build themselves, yet. We do expect chassis design eventually to allow the user to both diagnose and repair their unit with a minimum of fuss and bother. Of course, by that time, servers should have uptime in the "six 9s" range (i.e., 99.9999% uptime, equal to around 30 seconds of downtime per year, about the same as the telephone system used to maintain). We say that with tongue only partly in cheek - server uptime is becoming another Holy Grail. Today's upper-limit guarantees of 99.99% uptime equate to just under an hour of downtime per year - pretty darn good when many Windows users have to reboot their desktop at least once every few days.


These trends show that, for the most part, the generalized Intel server market has reached maturity. The main players are known, and mostly in control. The barriers to entry are high and getting higher. Even the nascent server appliance market - presently the only area not dominated by any of the Big Four - can be viewed as evolutionary or revolutionary, depending on one's definition. (We see it as both, actually: revolutionary from a market/marketing standpoint; evolutionary from a technology and "next logical step" standpoint. We tend to think of it in automotive terms: appliances are analogous to the two-seater sports car; general-purpose servers to the family sedan or station wagon.) The various consolidation and segmentation trends also point toward market maturity. The Linux/NT battles do not change the fundamental focus of the server market, only the dynamics of selection within it.

Market maturity does not equate to stagnation - there are still pitched battles going on between Compaq and Dell for domination of the Intel server world, while IBM and HP (distant #3 and #4 in the US market) keep fighting back with newer systems and increasing innovation. All of the Big Four, and some of the smaller players in the general-purpose market, are trying to pack ever-increasing functionality into the same space.

In summary, this market is still active and vital, and the next 2-3 years should provide even more benefit for the user.
