April 28, 2006
  The New Economics of Virtualization
Alex Vasilevsky
Founder and CTO
Virtual Iron Software

Welcome to my first blog. Obviously, virtualization is a white-hot technology space, but there is a lot of confusion about it and still too much unfulfilled promise. My goal here is to paint a bigger picture of the possibilities of server virtualization and how it will change IT and the server industry as we know them today.

Because this is the first entry, I'm going to start at a high level. My hope, though, is that we can quickly dig deeper into topics such as the future of the server industry and the impact that consolidation and virtualization will have on it. I'd also like to discuss what's holding users back from getting more value out of their existing virtualization solutions, and how open source virtualization and innovation will usher in monumental changes within the data center.

I'd also like to hear your ideas about the same.

But first, what is virtualization?

In the context of this blog, when I talk about virtualization I am referring to server virtualization.

In general, server virtualization is the masking of server resources (including the number and identity of individual physical servers, processors, and operating systems) from server users. The intention is to spare the user from having to understand and manage complicated details of server resources while increasing sharing and utilization.

Usually, server virtualization software runs between the operating system and the hardware; this software creates "virtual servers" that appear to an operating system as hardware but are in actuality a software environment. This way, multiple copies of an operating system each think they are running on their own server, when in reality they are all sharing one physical computer.
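To make that concrete, here is what a "virtual server" definition might look like under the open source Xen hypervisor, which I'll come back to later in this post. This is a hypothetical sketch; every name and size in it is illustrative rather than taken from a real deployment.

    # A hypothetical Xen guest definition (Xen configuration files use Python
    # syntax, e.g. /etc/xen/web01.cfg). All names and sizes are illustrative.
    # The guest operating system treats this as "its" hardware, while the
    # hypervisor maps it onto a slice of one physical machine.
    name   = "web01"                         # the identity the guest believes is its own
    memory = 512                             # MB of RAM presented to the guest
    vcpus  = 1                               # virtual CPUs multiplexed onto physical cores
    kernel = "/boot/vmlinuz-2.6-xen"         # paravirtualized guest kernel
    disk   = ["phy:/dev/vg0/web01,xvda,w"]   # host block device exposed as the guest's disk
    vif    = ["bridge=xenbr0"]               # virtual NIC bridged onto the physical network
    # The host administrator would boot it with something like:  xm create web01.cfg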

Server virtualization is not new; during the 1960s, IBM pioneered the use of virtualization to allow partitioning of its mainframes. However, the technology was relegated to the dusty heap of computing history as cheaper servers and PCs became widespread.

And then the resurgence started, and virtualization is now on the cusp of becoming a mainstream technology. The question is: why the sudden resurgence of this old technology?

The answer is quite simple - too many underutilized servers, taking up too much space, consuming too much power, and in the end costing too much money. In addition, this server sprawl has become a nightmare for over-worked, under-resourced system admins.

To save money, companies are consolidating their data centers and standardizing the applications they run their businesses on. The number one project in most IT shops is data center consolidation. Nicholas Carr, in his blog, asks if the server industry is doomed (an interesting topic for another day) and provides some great data. For example, he states:

"The chemicals giant Bayer, for instance, has been consolidating its IT assets worldwide. In the U.S. alone, it slashed its number of data centers from 42 to 2 in 2002, in the process cutting its server count by more than half, from 1,335 to 615."

The technology that helps companies consolidate their data centers is server partitioning - the ability to carve up one physical server into many virtual servers. This allows data center managers to place multiple virtual servers, each with its own unique operating system instance, on a single physical server. By doing this, they can consolidate their physical infrastructure, preserve their investment in existing operating systems and applications, and get more from their hardware investments. The potential benefits and ROI of virtualization are numerous. It can help control hardware and operational costs, and it promises to deliver new levels of agility, manageability, and utilization.
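To put rough numbers behind that ROI claim, here is a back-of-the-envelope consolidation model in Python. All of the inputs are invented for illustration (they are not figures from Bayer or from any customer); the point is simply how quickly the host count falls once lightly loaded servers are repackaged as virtual servers.

    # A back-of-the-envelope consolidation model. Every input below is a made-up
    # example, not a figure from this post or from any customer.
    import math

    physical_servers       = 200   # existing one-application-per-box servers
    avg_utilization_pct    = 10    # typical CPU utilization of each of those servers
    per_vm_overhead_pct    = 2     # budget per VM for hypervisor overhead and load spikes
    target_utilization_pct = 60    # how hot we are willing to run a consolidated host

    load_per_vm_pct = avg_utilization_pct + per_vm_overhead_pct    # 12% per virtual server
    vms_per_host    = target_utilization_pct // load_per_vm_pct    # 60 // 12 = 5 VMs per host
    hosts_needed    = math.ceil(physical_servers / vms_per_host)   # ceil(200 / 5) = 40 hosts

    print(f"{physical_servers} physical servers -> {hosts_needed} consolidated hosts "
          f"({vms_per_host}:1 consolidation ratio)")

With these made-up inputs, 200 single-application servers collapse onto 40 consolidated hosts - a 5:1 ratio - before even counting the power, space, and administration savings.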

Most companies are beginning to buy into its potential and are making significant investments, but are many getting their money's worth? Is the pace of innovation keeping up with the opportunity to solve the critical business problems we're dealing with in the data center?

I say no way. We've barely scratched the surface of what virtualization can truly do. Existing solutions are inflexible and expensive - sometimes costing even more than a new server. On top of it all, they don't perform well.

Despite all the hype, the IT executives we speak with continue to be frustrated with the lack of solution options and with the slow pace of innovation in a software market that is controlled by a small number of vendors.  The lack of choice gives these vendors significant pricing and account control.  Compounding matters, once users do select one of the few commercial alternatives, they are locked into that proprietary solution.  Some solutions also require modifications to the operating system - not something to ever be taken lightly.

Fortunately, emerging alternatives and increased vendor competition are offering users new choices while driving the cost of the software down.

For example, Xen, an oft-hyped open source project and an emerging layer of low-level software that allows multiple operating systems to share the same physical hardware, has the potential to give users better performance at much more attractive price points. Xen has a broad ecosystem (Virtual Iron is one of the key contributors) that includes all the major processor manufacturers, server companies, and operating system providers. These companies are working together to deliver virtualization functionality that is based on broadly adopted industry standards. This community has formed an extended testing team, further driving quality improvements. The entire ecosystem is benefiting from having so much talent contribute to the development of Xen-based solutions and run the software through its paces. Open source technologies like Xen have a history of providing improved functionality, better performance, and lower total cost of ownership than proprietary technologies. Unlike proprietary technologies, Xen is free and, as a result, will rapidly make its way into commercial offerings and end-user solutions. Costs will come down, making it more cost-effective to deploy virtualization to every server throughout an enterprise's IT infrastructure.
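As one small illustration of that standards-based ecosystem, the sketch below uses the libvirt Python bindings, an open source management API that grew up alongside Xen, to list the guests running on a Xen host. The choice of libvirt and the connection URI are my own assumptions for the example, not anything prescribed above.

    # A minimal sketch using the libvirt Python bindings (my example, not something
    # named in the post). It lists the guests running on a local Xen host and
    # assumes libvirt and its Python bindings are installed on that machine.
    import libvirt

    conn = libvirt.open("xen:///")              # connect to the local Xen hypervisor
    for dom_id in conn.listDomainsID():         # numeric IDs of the running guests
        dom = conn.lookupByID(dom_id)
        state, max_mem, mem, vcpus, cpu_time = dom.info()
        print("%-12s vcpus=%d mem=%d MB" % (dom.name(), vcpus, mem // 1024))
    conn.close()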

Indeed, history also shows that open source offerings, once generally accepted, tend to catch up fast to their proprietary counterparts. Although the current proprietary offerings have a few years' head start on Xen, we expect that gap to close quickly. Hypervisor support for chip-assisted virtualization quickly negated several years' worth of VMware's development effort. In addition, the Xen project and ecosystem have clearly reached critical mass, and the Xen hypervisor is emerging as the de facto standard base for server virtualization. The tidal wave of innovation has begun, and it opens up a whole new set of alternatives and economic opportunities for users.
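For readers who want to see whether their own hardware already offers that chip-assisted support, here is a quick Linux-only sketch. Intel VT is advertised as the "vmx" CPU flag and AMD-V as "svm" in /proc/cpuinfo; the helper function is mine, introduced only for illustration.

    # A quick, Linux-only check for chip-assisted virtualization: Intel VT shows
    # up as the "vmx" flag and AMD-V as the "svm" flag in /proc/cpuinfo.
    def hardware_virt_flags(path="/proc/cpuinfo"):
        flags = set()
        with open(path) as f:
            for line in f:
                if line.startswith("flags"):
                    flags.update(line.split(":", 1)[1].split())
        return flags & {"vmx", "svm"}

    found = hardware_virt_flags()
    if found:
        print("Chip-assisted virtualization available:", ", ".join(sorted(found)))
    else:
        print("No VT/AMD-V flags reported by this CPU.")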

Edited: 05/01/2006 at 07:17 AM by alex

    Posted By: Alex Vasilevsky @ 04/28/2006 04:07 PM     Alex's Blog  

May 31, 2006

Comments


 
I am trying to understand the strategies of various players in this market.
Here is a simple argument without going into the nitty-gritty of current events. Perhaps the next step is to take this single strand of logic and explain the exceptions and why key players are deviating from the logic below.


Prediction 1: Cross-platform compatibility will lead to overall higher prices for Mac and Windows products, and thus computer systems will overall become costlier. The expectation that computer systems will become cheaper due to competition from virtualization is only a myth.

Prediction 2: Obviously, though systems will overall become costlier, users will be happier on average since they can access the benefits of a dual (Windows/Mac) machine, and thus gain a higher level of satisfaction and the ability to satisfy their diverse preferences.

Let us analyze why:

If a market is saturated (assuming it is, since the market's future growth rate will probably be far slower than it is today), then it is expected in any IT sector that large vendors will initiate cross-platform compatibility.

Let's look at it this way.

Scenario without cross-platform compatibility: If Microsoft lowers its price for Windows XP, it benefits Microsoft completely, since people perhaps purchase more PCs (which become affordable overall) and more MS Office. Thus, MS can increase its market share by lowering the price of Windows when there is no cross-platform compatibility. Lower prices benefit Microsoft COMPLETELY in terms of higher market share and more sales. BUT what if the market is saturated? That is, consumers have already invested in either LINUX or WINDOWS, the likelihood of LINUX customers switching wholly to Windows is minimal, and vice versa. This is likely the situation nowadays, with most business customers' IT infrastructures either largely LINUX/UNIX or largely WINDOWS. Thus, if the market is saturated, Microsoft does not benefit from lowering the price of XP.


Scenario with cross-platform compatibility: Microsoft, or Apple for that matter, has no incentive to lower its prices, since if the XP price is reduced, then Mac users will buy XP, and more potential Mac buyers will also likely go ahead with their planned Mac purchases since they can also use XP via Boot Camp. Therefore, if Microsoft lowers prices, it benefits both Microsoft and its rival Apple. Therefore, Microsoft does not have as much incentive to lower prices (and neither does Apple).

Posted By: Nilesh Saraf @ 05/31/2006 10:44 PM
