Virtualizing your IT infrastructure lets you make better use of multiple servers while reducing costs. Virtualization uses software to create virtual machines, allowing you to redistribute system resources across important tasks without purchasing additional equipment. Even if you own only two servers, virtualization lets you allocate spare capacity to other tasks.
What are virtualization and virtual machines?
Information technology has brought many useful and interesting things into the life of modern society. Every day, inventive and talented people find new applications for computers as effective tools for production, entertainment and collaboration. A multitude of software and hardware tools, technologies and services keep increasing the convenience and speed of working with information. It is becoming harder and harder to pick out the truly useful technologies from the stream pouring down on us and to learn to use them to best advantage. This article discusses another highly promising and genuinely effective technology that is rapidly making its way into the world of computers: virtualization.
In a broad sense, virtualization means hiding the real implementation of a process or object behind the representation presented to its users. The product of virtualization is something convenient to use that in fact has a more complex, or entirely different, structure from the one perceived when working with it. In other words, representation is separated from implementation. In computing, the term “virtualization” usually means abstracting computing resources and presenting the user with a system that “encapsulates” (hides) its own implementation. Put simply, the user works with a convenient representation of an object, and it does not matter to him how the object is actually arranged.
The term “virtualization” appeared in computing in the 1960s, together with the term “virtual machine”, which denotes the product of virtualizing a hardware and software platform. At the time, virtualization was more an interesting technical curiosity than a promising technology, and through the sixties and seventies it was developed only by IBM. With the advent of the experimental paging system in the IBM M44/44X, the term “virtual machine” was first used, replacing the earlier “pseudo machine”. Later, on the IBM System 360/370 series mainframes, virtual machines could be used to run earlier versions of operating systems. Until the end of the nineties, no one but IBM ventured to use this original technology seriously. In the nineties, however, the prospects of the virtualization approach became apparent: with the growth in hardware capacity of both personal computers and server systems, it would soon be possible to run several virtual machines on the same physical platform.
In 1997, Connectix released the first version of Virtual PC for the Macintosh platform, and in 1998 VMware patented its virtualization techniques. Connectix was subsequently acquired by Microsoft, and VMware by EMC, and the two are now the major potential rivals in the future virtualization market. Potential, because at present VMware is the undisputed leader, but Microsoft, as always, has a trump card up its sleeve.
Since their inception, the terms “virtualization” and “virtual machine” have acquired many different meanings and been used in different contexts. Let's try to figure out what virtualization really is.
Types of Virtualization
The concept of virtualization can be broadly divided into two fundamentally different categories:
- platform virtualization
The products of this type of virtualization are virtual machines: software abstractions that run on top of real hardware and software platforms.
- resource virtualization
This type of virtualization aims to combine or simplify the presentation of hardware resources to the user, yielding abstractions of equipment, namespaces, networks and so on.
However, because deploying and maintaining a virtual infrastructure is complex and expensive, and the return on investment is hard to evaluate correctly, many virtualization projects have failed. According to studies by Computer Associates of companies that attempted virtualization, 44 percent could not describe the result as successful. This holds back many companies planning virtualization projects. Another problem is the shortage of genuinely competent specialists in this field.
What awaits virtualization in the future
2006 was a key year for virtualization technologies: many new players entered the market, and numerous releases of virtualization platforms and management tools, along with a considerable number of partnership agreements and alliances, suggest the technology will be in great demand. The virtualization market is in the final stage of its formation. Many hardware manufacturers have announced support for virtualization technologies, a sure key to the success of any new technology. Virtualization is getting closer to people: interfaces for using virtual machines are being simplified, informal conventions on tools and techniques are emerging, and migration from one virtual platform to another is getting easier. Virtualization will certainly occupy its niche among the essential technologies and tools for designing enterprise IT infrastructure. Ordinary users will find uses for virtual machines too: as the performance of desktop hardware grows, it will become possible to run several user environments on one machine and switch between them.
Hardware manufacturers are not standing still either: beyond existing hardware virtualization techniques, systems will soon appear that natively support virtualization and provide convenient interfaces for the software built on them. This will speed the development of reliable and efficient virtualization platforms. It is possible that any installed operating system will be virtualized immediately, with special low-level software, backed by hardware features, switching between running operating systems without sacrificing performance.
The very idea behind virtualization technologies opens up broad possibilities for their use. In the end, everything is done for the convenience of the user and to simplify familiar tasks. Whether it can also deliver significant savings, time will tell.
What is it for?
Virtualization greatly simplifies the operation of IT infrastructure, increasing productivity by optimizing resource usage and reducing maintenance and management costs. The time to create a typical infrastructure is drastically reduced, and IT resources, both hardware and human, are used rationally.
An important point is the creation of a continuously functioning IT infrastructure that is protected from failures and resilient to disasters. A well-built virtualization environment reduces unplanned downtime and eliminates planned outages for servicing servers or data storage. At the same time, all IT services can avoid lock-in to a specific vendor.
For companies of any size and at any stage of IT infrastructure development, it is possible to automate the processes involved in allocating computing resources to departments within the company or to its customers.
What categories of users is the solution suitable for?
Virtualization suits any company seeking to build a flexible and modern computing infrastructure. Ease of implementation and maintenance, reliability and functionality, and reduced risk for the enterprise make the investment in this technology justified. Given the current level of third-party cloud systems, virtualization opens up broad possibilities for combining these technologies and for development focused on the company's business rather than constant worry about the IT infrastructure.
The benefits include:
- reduced support costs for IT systems,
- reduced cost of introducing new IT services,
- reduced time to introduce new IT services,
- simpler infrastructure maintenance,
- improved reliability of IT systems overall,
- more convenient infrastructure management,
- fewer low-skilled jobs,
- professional development of staff.
The first virtualization systems arose within operating systems and made it possible to create virtual PCs alongside the main workload. The development of this area led to a separate class of software: hypervisors. A hypervisor is installed directly on the hardware platform and presents all available resources (processor megahertz, megabytes of RAM, gigabytes of storage space, network bandwidth) to a large number of virtual machines. The hypervisor not only allocates these resources to each virtual machine but also redistributes them among many consumers and manages the full life cycle of each virtual server.
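The bookkeeping described above can be illustrated with a minimal sketch. This is not the API of any real hypervisor; the class and method names here are invented for illustration only.

```python
# Toy model of hypervisor resource accounting: carving a physical
# host's CPU and RAM into virtual machines and returning resources
# to the shared pool when a VM is destroyed. Illustrative names only.

class Hypervisor:
    def __init__(self, cpu_mhz, ram_mb):
        self.free_cpu = cpu_mhz   # unallocated CPU capacity, MHz
        self.free_ram = ram_mb    # unallocated RAM, MB
        self.vms = {}             # name -> (cpu_mhz, ram_mb)

    def create_vm(self, name, cpu_mhz, ram_mb):
        # Refuse the request if the physical host cannot back it.
        if cpu_mhz > self.free_cpu or ram_mb > self.free_ram:
            raise RuntimeError("insufficient physical resources")
        self.free_cpu -= cpu_mhz
        self.free_ram -= ram_mb
        self.vms[name] = (cpu_mhz, ram_mb)

    def destroy_vm(self, name):
        # Return the VM's resources to the shared pool.
        cpu_mhz, ram_mb = self.vms.pop(name)
        self.free_cpu += cpu_mhz
        self.free_ram += ram_mb

host = Hypervisor(cpu_mhz=24000, ram_mb=65536)
host.create_vm("web01", cpu_mhz=4000, ram_mb=8192)
host.create_vm("db01", cpu_mhz=8000, ram_mb=32768)
```

Real hypervisors go much further (overcommit, scheduling, live migration), but the essential idea is the same: one physical pool, many isolated consumers.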
The main solutions for virtualization of computing resources today:
- VMware vSphere
- Microsoft Hyper-V (included with Windows Server)
- Citrix XenServer
- Oracle VM
- Red Hat Enterprise Virtualization
- Linux KVM
- Huawei FusionSphere
Modern IT architectures necessarily include a storage subsystem. It can be implemented in several ways, from storage on a compute node to devices dedicated exclusively to storage, and on various media, from spindle drives and tape to solid-state drives.
Storage is an integral part of a virtual infrastructure as well. To optimize how the hypervisor works with storage, manufacturers of hardware storage systems equip their products with specialized drivers that offload certain operations to the storage array, saving computing resources. There is another way: virtualized storage, as used in hyper-converged infrastructures. Such storage is built on the same compute nodes, using server disks as parts of a single store. This drastically reduces the cost of construction and maintenance and allows optimized storage resources to be allocated to each VM. In addition, the storage virtualization system itself builds a fault-tolerant, load-balanced storage scheme in accordance with the service policy of each VM. Storage virtualization systems can be used both at data-center scale and for small local tasks.
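The fault-tolerant, load-balanced placement just described boils down to spreading replicas of each VM's data across distinct nodes. The sketch below is a simplified illustration; the function name, node format, and per-VM replica policy are assumptions, not any vendor's actual algorithm.

```python
# Illustrative replica placement for virtualized storage: copies of a
# VM's data go to the least-loaded nodes, each replica on a distinct
# node so that one node failure never loses all copies.

def place_replicas(nodes, replica_count):
    """Pick the least-loaded nodes to hold replicas of a VM's data."""
    if replica_count > len(nodes):
        raise ValueError("not enough nodes for the requested replica count")
    # Sort nodes by used capacity so new data balances the load.
    chosen = sorted(nodes, key=lambda n: n["used_gb"])[:replica_count]
    for n in chosen:
        n["used_gb"] += 1   # account for the new 1 GB replica
    return [n["name"] for n in chosen]

cluster = [
    {"name": "node-a", "used_gb": 120},
    {"name": "node-b", "used_gb": 80},
    {"name": "node-c", "used_gb": 200},
]
# A VM whose service policy demands two replicas:
targets = place_replicas(cluster, replica_count=2)
```

Production systems add rebalancing, failure domains and consistency protocols on top, but per-VM replica policies follow this basic shape.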
The solutions themselves can be add-ons to the hypervisor or be included with it by default. All major manufacturers have such solutions in their arsenal:
- VMware Virtual SAN
- Microsoft Storage Spaces Direct, included with Windows Server
- Virtuozzo Storage
- Red Hat Ceph
- StarWind Virtual SAN
- Huawei FusionStorage
- EMC ScaleIO
- DataCore Virtual SAN
To build a fully software-defined data center, you must be able not only to virtualize standard server devices, but also to flexibly manage network topology and firewall rules. For this there is a separate class of products in the virtualization stack: network virtualization solutions.
Every hypervisor virtualizes the network to some degree, since a few physical network ports can be shared among dozens of virtual machines. But more advanced tools for building a virtual network are needed to manage network parameters flexibly based on security policies and firewall rules, to reduce spurious traffic, and to lower the load on physical network equipment.
At the moment, only a few solutions on the market virtualize all aspects of the network at once:
- VMware NSX
- Microsoft Windows Server 2016 Datacenter with System Center
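The policy-based filtering these platforms enforce in software can be sketched in a few lines. The rule format and function name below are invented for illustration and do not correspond to any product's actual configuration language.

```python
# Toy firewall-policy evaluation of the kind a network virtualization
# layer applies to VM traffic in software: ordered rules, first match
# wins, and anything unmatched is denied by default.

def allowed(rules, src, dst, port):
    """Evaluate ordered firewall rules for a single flow."""
    for rule in rules:
        if ((rule["src"] in (src, "any")) and
                (rule["dst"] in (dst, "any")) and
                (rule["port"] in (port, "any"))):
            return rule["action"] == "allow"
    return False  # default deny

# Example policy: only the web tier may reach the database port.
policy = [
    {"src": "web-tier", "dst": "db-tier", "port": 5432, "action": "allow"},
    {"src": "any", "dst": "db-tier", "port": "any", "action": "deny"},
]
```

Because such rules refer to logical tiers rather than physical ports, they follow a VM wherever it migrates, which is the point of doing this in the virtualization layer.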
An important application of virtualization technologies is creating user workplaces, where the main workload runs on a shared server and the user's access device merely displays an image of what is happening on the virtual PC. This technology is called VDI (Virtual Desktop Infrastructure).
Virtual workstations make it possible to give each user the right tools, distribute software licenses rationally, and provide access to the workspace from desktop and mobile devices, with convenient administration and compliance with security policies.
If the company needs its staff to work remotely from desktop PCs or laptops, VDI technologies can deliver just the work applications hosted on a shared server to employees' devices, without creating full-featured remote desktops.
VDI solutions on the market:
- VMware Horizon
- Citrix XenApp and XenDesktop
- Parallels VDI and RAS
- Huawei FusionAccess
Virtualization management and automation
Further development of virtualization technologies and cloud services has led to new kinds of IT infrastructure, hybrid and hyper-converged. These infrastructures are fully software-defined and deeply integrated with private or public clouds.
Managing such infrastructures requires powerful tools that take into account the specifics of the installed physical equipment, can quickly provision resources for the needs of the business, and remain transparent and secure. This is the role of virtualization management and automation systems.
The main products include:
- VMware vCenter and vRealize
- Microsoft System Center
- Red Hat Enterprise Virtualization Manager
- Citrix Systems XenCenter
- SolarWinds Virtualization Manager
- DELL Foglight
vRealize Suite is a hybrid cloud management platform for VMware solutions.
Under the vRealize brand, VMware integrates all solutions designed to manage hybrid infrastructure, including resource management tools on the side of cloud providers (not only VMware), as well as infrastructure management tools based on various hypervisors.
The VMware vRealize Suite stack meets the Gartner requirements for cloud management tools - Evaluation Criteria for Cloud Management Platforms, namely:
- On-demand delivery of infrastructure applications or resources through a self-service portal or service catalog,
- calculation of the cost of cloud resources and transparent planning of their financial efficiency.
The main tasks that are solved by virtualization
Resource virtualization makes it possible to create fully functional, isolated copies of server systems on one hardware platform: virtual machines. They emulate a complete set of devices on which any operating system can run, letting you run software on virtual resources and reduce investment in hardware infrastructure. The result is independence from any single manufacturer, centralized administration of complex systems, and easy scaling without loss of performance.
Virtualization in business
Opening new offices and branches becomes simpler: with virtualization you spend less on equipment at the new sites, and the IT service can prepare employees' workplaces more easily.
Even at peak load, your online store will cope with the influx of visitors and the call center will handle the entire flow of calls. Scale consumed resources up and down as needed, with no overpaying for idle capacity.
Virtualization in the development environment simplifies and speeds up the creation of IT products and services. Developers work in a realistic environment, so they can fix bugs faster and hand finished products over to the business.
Virtualization ensures a consistently high level of IT across your business's entire footprint. Centralized servicing of all IT resources cuts costs and time, letting your staff spend less effort on routine work and the business get by with fewer resources.
Virtualization greatly simplifies operating the enterprise IT infrastructure and reduces the load on technical support services. Special modules let you evaluate the cost-effectiveness of virtualization and plan enterprise IT development costs more accurately.
CROC offers virtualization systems based on VMware vSphere, Microsoft Hyper-V, SUSE, Red Hat Virtualization, Virtuozzo and Citrix XenServer, as well as products from Russian manufacturers Rosplatforma R-Virtualization, RusBITEh-Astra PC "VIU" and PC SV "Brest", Horizon-Sun Barricades, and open source (free) software.
Network function virtualization
Network Function Virtualization (NFV) is an approach to designing, deploying and managing network services such as NAT, firewalling, intrusion detection and DNS that separates the software from specialized hardware. Network virtualization helps reduce capital and operating costs for IT and lets network functions be deployed quickly to support changing business needs. Customers make their IT more flexible by delivering software-based services on top of server, switch and storage virtualization technologies. This gives organizations simple, flexible scaling of network services and greater independence from network equipment manufacturers.
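NAT, named above, is a good example of a network function that NFV moves from a dedicated appliance into software. The toy source-NAT table below is a sketch of the core idea only; the class name and port range are illustrative assumptions, not any vendor's implementation.

```python
# Toy source NAT as a software network function: internal
# address/port pairs are mapped onto ports of one shared public IP,
# and existing mappings are reused for repeat traffic.

class SourceNat:
    def __init__(self, public_ip, first_port=20000):
        self.public_ip = public_ip
        self.next_port = first_port
        self.table = {}   # (private_ip, private_port) -> public_port

    def translate(self, private_ip, private_port):
        """Return the public (ip, port) this internal endpoint appears as."""
        key = (private_ip, private_port)
        if key not in self.table:
            # Allocate the next free public port for a new flow.
            self.table[key] = self.next_port
            self.next_port += 1
        return (self.public_ip, self.table[key])

nat = SourceNat("203.0.113.7")
mapping = nat.translate("10.0.0.5", 43512)
```

Running such a function as ordinary software on virtualized servers, rather than in a fixed-function box, is exactly what makes it easy to scale and to replace independently of the hardware vendor.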
CROC offers network virtualization systems based on VMware NSX, Nuage Networks Virtualized Services Platform (VSP), Citrix NetScaler, and open source (free) software.
Key Benefits of Switching to Virtualization
- Savings on purchasing new computing resources and faster solution launch
- Rational use of computing resources
- Lower support costs for large information systems
- Rapid scaling of computing resources as needed
- Flexible redistribution of computing resources on the fly
- Continuous application operation with optimal equipment loading
- High availability of information systems
- Continuous monitoring and optimization of application performance