For enterprise IT, evaluating different virtualization options can be a challenge, especially for organizations that don’t have any experience with it. That’s no reason to shy away from the technology, though, as virtualization is a great way to manage resources and reduce costs. To get some expert insight on the matter, we caught up with Matthew Portnoy, who literally wrote the book on virtualization.
Published on Aug. 29, 2016, the second edition of Virtualization Essentials contains an in-depth explanation of virtualization, as well as chapters on hypervisors, VMs and availability. The book also covers a broad range of processes, such as managing CPUs, memory, storage and networking for VMs. Here, Matthew Portnoy answers questions about virtualization basics and gives advice to those looking to implement virtualization.
What are the advantages of virtualization? Are there disadvantages?
Matthew Portnoy: The advantages of virtualization are still the same as they were at the beginning — more efficient and reliable use of resources at lower costs. With virtualization moving into other areas of the data center, such as storage and networking, and the addition of virtual appliances — load balancers, VPNs [virtual private networks], firewalls — costs continue to be driven down and efficiencies are still improving.
If there’s a disadvantage to virtualization, I would say it’s also still the same as the initial challenges: that a poorly planned deployment will perform poorly. That might be said about physical deployments, as well, though virtual infrastructures often mask issues that might be more obvious in a physical deployment. One common example is not providing enough I/O for storage requirements, though users now allocate enough memory and CPU resources, which was not always the case.
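Portnoy's storage example lends itself to back-of-the-envelope arithmetic. The sketch below is purely illustrative — the VM names, per-VM IOPS figures and per-spindle estimate are assumptions, not from the interview — and shows the kind of sanity check that catches an under-provisioned datastore before deployment:

```python
# Back-of-the-envelope check: does a datastore's I/O capacity cover
# the combined peak demand of the VMs placed on it?

# Hypothetical peak IOPS demand per VM (illustrative figures only).
vm_iops_demand = {"db-vm": 4000, "app-vm": 1200, "web-vm": 600}

# Rough per-spindle IOPS for a 10K RPM disk, times spindle count.
IOPS_PER_SPINDLE = 140
spindles = 24
datastore_iops = IOPS_PER_SPINDLE * spindles

total_demand = sum(vm_iops_demand.values())
print(f"Demand: {total_demand} IOPS, capacity: {datastore_iops} IOPS")
if total_demand > datastore_iops:
    print("Under-provisioned: storage I/O will bottleneck these VMs")
```

With these example numbers the demand (5,800 IOPS) exceeds the capacity (3,360 IOPS) — exactly the situation Portnoy describes, where CPU and memory are sized generously but storage I/O is the hidden bottleneck.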
What are the biggest misconceptions regarding virtualization?
Portnoy: Virtualization is not a panacea for bad practices. You have to pay as much attention to resources in the virtual world as you do in the physical. You can architect and deploy efficient, performant, highly available and secure virtual environments, but it doesn’t just happen. You have to understand the product you are working with, know what its limitations and benefits are, and act accordingly.
The mindsets that used to limit which applications could be virtualized are mostly gone now, thanks to a combination of maturing administrators, more capable hardware and experience. There are things that you can take advantage of in the virtual environment that can’t be duplicated in the physical world — page sharing, for example, or new security models in virtual networking — that still make people hesitant or skeptical. But a well-deployed virtual environment is still usually less expensive per workload, more available, more manageable and more secure than its physical counterpart.
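Page sharing, which Portnoy mentions as a virtual-only advantage, lets a hypervisor store identical guest memory pages once instead of once per VM. The following toy model (all figures are assumptions for illustration; real implementations such as VMware's transparent page sharing are far more nuanced) estimates the host memory such deduplication could reclaim:

```python
# Rough model of page sharing: identical 4 KB guest pages across
# VMs on one host are stored once instead of once per VM.

PAGE_KB = 4
vms = 10
pages_per_vm = 524288        # 2 GB of 4 KB pages per VM (assumed)
shared_fraction = 0.25       # assume 25% of pages are identical across VMs

# Each shared page keeps one copy instead of `vms` copies.
shared_pages = int(pages_per_vm * shared_fraction)
saved_pages = shared_pages * (vms - 1)
saved_gb = saved_pages * PAGE_KB / (1024 * 1024)
print(f"Estimated host memory saved: {saved_gb:.1f} GB")
```

Under these assumed numbers, ten 2 GB VMs with a quarter of their pages in common would free roughly 4.5 GB on the host — capacity that simply has no equivalent on discrete physical servers.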
What types of organizations are more likely to implement virtualization?
Portnoy: At this time, there are no organizations that couldn’t implement virtualization if they chose to. In certain industries, there are use cases that might drive organizations to virtualize sooner — healthcare providers and virtual desktops come to mind. Today, even smaller companies probably have some virtualization in their IT departments, whether they know it or not.
One early driver was that VMware could provide High Availability to workloads on a cluster without any additional hardware or software. Organizations could get better uptime merely by being on the platform, whereas in the physical environment, those unprotected workloads would be prone to both planned and unplanned downtime.
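The High Availability idea above boils down to admission-control arithmetic: when a host fails, the surviving hosts must have enough spare capacity to restart its VMs. This toy model (host names and RAM figures are hypothetical; real HA admission control also weighs CPU, slot sizes and reservations) captures the core check:

```python
# Toy cluster HA model: can the surviving hosts restart the failed
# host's VMs? Capacities and usage are in GB of RAM (hypothetical).

hosts = {"esx1": 256, "esx2": 256, "esx3": 256}       # total RAM per host
vm_ram_by_host = {"esx1": 180, "esx2": 170, "esx3": 160}  # RAM in use

def can_tolerate_failure(failed):
    """True if the remaining hosts have enough spare RAM, combined,
    to restart the failed host's VMs."""
    spare = sum(hosts[h] - vm_ram_by_host[h] for h in hosts if h != failed)
    return spare >= vm_ram_by_host[failed]

print({h: can_tolerate_failure(h) for h in hosts})
```

In this example the cluster survives any single host failure, which is the uptime benefit Portnoy describes: protection comes from the platform itself, with no extra clustering hardware or software per workload.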
What steps should companies take when evaluating virtualization products?
Portnoy: Like with other products, the best questions to ask are: What problem am I trying to solve? What is the cost of trying to solve this? What will it cost me if I do nothing? What are the transformational effects and costs on the organization of deploying this particular platform?
I often heard that VMware [was] expensive, but it solved certain issues for companies and returned enough value in a short enough time to make it an easy choice for many people. If Hyper-V solves the issue at a lower price and the feature differences are not relevant, well, that’s a good answer. Nothing beats kicking the tires, so if you have the time, download the products and either compare them using your use cases or have a trusted partner assist your efforts.
How can IT best partner with the C-suite and other departments to evaluate virtualization products?
Portnoy: It comes back to the value IT is providing to the business. Initially, virtualization drove huge hardware costs out of the data center while increasing availability and decreasing deployment times. This allowed companies to provide a more stable platform for their applications and allowed them to roll out new applications much faster, decreasing time to market for certain corporate initiatives.
Virtualization provides some really interesting disaster recovery products, again, at lower deployment costs and lower operational costs than traditional models. In areas prone to natural disasters — hurricanes, tornados, snowstorms and so on — virtual platforms offered business continuance in the event of [a] disaster at a lower barrier to entry. New technologies like long-distance vMotion, [which offers] the capability to migrate a running VM across continental distances without interruption, offer other possibilities. Virtual networking can significantly improve environment security, which is an important topic for any public-facing company today.
What are the top challenges companies face when trying to implement virtualization?
Portnoy: The technology itself is fairly mature today, so much of the challenge is in the people side of the equation. Change means unknown, and unknown is uncomfortable, and uncomfortable and unknown are two words that application owners try to stay far away from. Education of the executive and application owners is probably the single largest way to smooth that transition. Once they understand the value that virtualization provides, they are usually proponents of the implementation.
For application owners, focusing on how virtualization can help them mitigate risk with higher availability, improve testing and quality assurance through VM cloning, and improve performance with dynamic server upgrades — without incurring downtime — makes a compelling case.
Executives are looking to be more agile, more cost-effective and to protect the interests of their company. Virtualization provides these benefits as well.
How do you recommend balancing costs with the desire to have the latest technology?
Portnoy: If everyone had unlimited resources … But, seriously, cost is the practicality that drives most projects. If the latest technology provides a solution to an acute problem that needs to be solved, then, usually, cost is no longer the gating factor. But that is not typical.
It still comes back to what value can be provided with a platform, and whether the problem it solves is worth the money spent. Project dollars are competed for in most organizations, and unless there is a strong case for deploying a particular product, those funds can quickly find another home.
Source: TechTarget, Ryann Burnett