Risks and benefits of virtualization
Posted by Dominic Todd, Marketing Communications Specialist,
Stratus Technologies, 23 Jul 2018
Virtualization is more than just an industry buzzword or an IT trend. The technology allows multiple instances of an operating environment to run on a single physical device. These virtual machines (VMs) run applications and services just like a physical server would, eliminating the cost of purchasing and maintaining additional servers. Virtualization offers other benefits as well, such as faster delivery of applications and resources. It can also increase the productivity, efficiency, agility, and responsiveness of IT by freeing up resources to focus on other tasks and initiatives. However, virtualization carries risks of its own.
How has virtualization evolved?
To better understand the business rationale for virtualization, as well as its potential risks, we need to go back to the days when mainframes ruled the computing world.
Mainframes were used by large organizations to manage their most critical applications and systems. But mainframes could also act as servers, with the ability to host multiple instances of operating systems at the same time. In doing so, they pioneered the concept of virtualization.
Many organizations quickly saw the potential. They began to split workloads among different departments and users, giving each dedicated computing resources for more capacity and better performance. This was the beginning of the client-server model.
In most cases, an application ran on a server that was accessed by many different PCs. Other advances, such as the proliferation of Intel x86 technology, helped make client-server computing faster, cheaper, and more efficient.
Everything worked well until the model's popularity became a problem. Eventually, it seemed that everyone in the company wanted their application hosted on its own server. The resulting "server sprawl" quickly overwhelmed even the largest data centers.
Space was not the only concern. All of these servers were expensive and required extensive support and maintenance. Overall IT costs rose, and many companies began looking for a new approach.
One solution: virtualizing servers built on x86 technology. With virtualization, a single physical server can host multiple virtual machines while providing the isolation and functionality each application requires.
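To make that consolidation concrete, here is a minimal sketch of enumerating the VMs packed onto one physical host. It assumes a Linux server running KVM with the libvirt-python bindings installed; the connection URI and output format are illustrative, not part of any specific product.

```python
import libvirt  # pip install libvirt-python; requires libvirt on the host

# Connect read-only to the local KVM/QEMU hypervisor.
conn = libvirt.openReadOnly("qemu:///system")
try:
    # Each domain is one virtual machine sharing this physical server.
    for dom in conn.listAllDomains():
        state = "running" if dom.isActive() else "stopped"
        # dom.info() returns (state, maxMemKiB, memKiB, nrVirtCpu, cpuTime)
        _, max_mem_kib, _, vcpus, _ = dom.info()
        print(f"{dom.name():20s} {state:8s} {vcpus} vCPUs, {max_mem_kib // 1024} MiB")
finally:
    conn.close()
```

Every VM the script prints is a workload that would once have demanded its own physical server.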
New Approach Raises New Concerns
All of this worked well, except for a new risk: the virtualization layer, the hypervisor, could fail. Worse, a single failure in a virtualized environment could trigger a domino effect in which every virtualized application fails too, creating an unacceptable risk of downtime. To avoid this scenario, many companies chose to virtualize only their non-production systems. That way, if a failure occurred, critical systems would be unaffected.
As the technology improved, organizations realized that hypervisors could deliver the performance and stability they needed, and they began to virtualize all of their applications, even business-critical workloads.
At first, the effort was straightforward and seemed to open the way to many significant benefits. At the same time, it introduced new hardware and availability risks. Consider, for example, a company running 20 business-critical virtual machines on a single server, only to have that server fail.
How long will it take to resolve the problem? How much will the downtime cost? What long-term impact will it have on customers, prospects, and the company's reputation? All of these questions are reasonable, but often there are no satisfactory answers.
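To put rough numbers on those questions, here is a back-of-envelope sketch of the direct cost of such an outage. Every figure in it (revenue per VM-hour, hours to recover) is an illustrative assumption, not data from a real incident:

```python
# Back-of-envelope outage cost -- all figures are illustrative assumptions.
vms_on_failed_server = 20        # business-critical VMs on the failed host
revenue_per_vm_hour = 10_000     # assumed revenue each VM supports, $/hour
hours_to_recover = 4             # assumed time to restore service

direct_cost = vms_on_failed_server * revenue_per_vm_hour * hours_to_recover
print(f"Estimated direct cost of the outage: ${direct_cost:,}")
# -> Estimated direct cost of the outage: $800,000
```

And that figure counts only lost revenue during the outage, before any damage to reputation or customer trust.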
This scenario points to the need for the right hardware infrastructure and always-available systems as part of any successful virtualization strategy. We'll cover these topics, as well as some common misconceptions, in our next article. Stay tuned.