Today's Hottest Topic: "Server Virtualization"

ABSTRACT


We have witnessed low resource utilization of the high-performance graphics workstations in our instructional computer laboratories. The low utilization statistics indicate that workstation consolidation could achieve significant savings in infrastructure, networking, power consumption, and maintenance costs. In addition, we would spend less time on deployment, security, and fault isolation without compromising performance.


Server virtualization opens up a range of new possibilities for autonomic datacenter management through new automation mechanisms that can be exploited to control and monitor tasks running within virtual machines. This facilitates more powerful and flexible autonomic controls, through management software that keeps the system in a desired state in the face of changing workload and demand. Virtualization, a well-established technology in the desktop and server domain, is also being investigated and analyzed with respect to its potential within mobile devices.

What is Server Virtualization?

Server virtualization, also known as hardware virtualization, enables multiple operating systems (which may differ from one another) to run on a single physical machine as virtual machines (VMs). It lets organizations consolidate physical servers into a single system running multiple operating systems and applications, increasing system utilization. The software layer providing the virtualization is called a virtual machine monitor, or hypervisor. A hypervisor can run on bare hardware or on top of an operating system.
Virtualization is an abstraction layer that decouples the physical hardware from the operating system to deliver greater IT resource utilization and flexibility. Virtualization allows multiple virtual machines with heterogeneous operating systems to run in isolation, side-by-side, on the same physical machine. Each virtual machine has its own set of virtual hardware (e.g., RAM, CPU, NIC, etc.) upon which an operating system and applications are loaded. The operating system sees a consistent, normalized set of hardware regardless of the actual physical hardware components.
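As a concrete illustration of the hardware side: hypervisors that run on bare hardware typically rely on CPU virtualization extensions, which Linux exposes as CPU flags (vmx for Intel VT-x, svm for AMD-V). A minimal sketch, assuming input in the format of Linux's /proc/cpuinfo (the function name and the sample text are illustrative, not from any particular tool):

```python
def has_hw_virtualization(cpuinfo_text: str) -> bool:
    """Return True if the CPU flags advertise hardware virtualization.

    'vmx' is Intel VT-x, 'svm' is AMD-V. Expects text in the format of
    Linux's /proc/cpuinfo.
    """
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            flags = line.split(":", 1)[1].split()
            return "vmx" in flags or "svm" in flags
    return False

# Example with a fabricated cpuinfo excerpt:
sample = "processor\t: 0\nflags\t\t: fpu vme de pse msr vmx sse2"
print(has_hw_virtualization(sample))  # True
```

On a real machine you would pass in the contents of /proc/cpuinfo; an absent flag means the hypervisor must fall back to slower, software-only techniques.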

Benefits

With server virtualization, you can consolidate workloads of underutilized server machines onto a smaller number of fully utilized machines. Fewer physical machines can lead to reduced costs through lower hardware, energy, and management overhead, plus the creation of a more dynamic IT infrastructure. Virtualization reduces hardware investment, tremendously enhancing server and application management tasks. It leads to better capacity planning, higher availability, smarter resource sharing, and simpler heterogeneous storage infrastructure management. Another benefit is the ability to securely separate virtual operating systems, and the ability to support legacy software as well as new OS instances on the same computer. Virtualization also has a benefit when working on operating system development: running the new system as a guest avoids the need to reboot the computer whenever a bug is encountered.
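As a back-of-the-envelope sketch of the consolidation argument (all utilization figures here are hypothetical, not measurements):

```python
# Consolidation estimate with assumed numbers: if each physical server
# averages 15% CPU utilization and we are comfortable running
# consolidated hosts at up to 75%, the host count shrinks accordingly.
import math

physical_servers = 20
avg_utilization = 0.15      # assumed average per-server utilization
target_utilization = 0.75   # assumed safe ceiling per consolidated host

total_load = physical_servers * avg_utilization            # 3.0 "server-equivalents"
hosts_needed = math.ceil(total_load / target_utilization)  # 4 hosts

print(f"{physical_servers} servers -> {hosts_needed} virtualization hosts")
# prints: 20 servers -> 4 virtualization hosts
```

The same arithmetic drives the energy and management savings: every physical server removed is one less machine to power, patch, and cable.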

Guru's Software Testing

If software engineering is really an engineering discipline, it is the intelligent application of proven principles, techniques, languages, and tools to the cost-effective creation and maintenance of software that satisfies users' needs.
1. What is Software

"Software is the code, programs, documentation, and data relating to the operation of a computer-based solution or to the use of hardware capabilities."

2. What is Software Testing

"Software testing is an activity (a phase of the Software Development Life Cycle) that aims to check or evaluate the capability of a system, program, or piece of software and to determine whether it meets its required results."
2.1 Testing Objectives

A number of rules that act as testing objectives are:
  • Testing is a process of executing a program with the aim of verifying that the software produces the right results.
  • Testing is a process of executing a program with the aim of finding errors.
  • A good test case will have a good chance of finding an undiscovered error.
  • A successful test case uncovers a new error.
2.2 Test Case Design

The design of software tests can be a challenging process. However, software engineers often see testing as an afterthought, producing test cases that feel right but offer little assurance that they are complete. The objective of testing is to have the highest likelihood of finding the most errors with a minimum amount of time and effort. A large number of test case design methods have been developed that provide the developer with a systematic approach to testing. These methods offer an approach that can ensure the completeness of tests and the highest likelihood of uncovering errors in software.
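One example of such a systematic method is boundary value analysis, which concentrates test cases at the edges of an input range, where errors tend to cluster. A minimal sketch, using a hypothetical clamp function as the unit under test:

```python
def clamp(value, low, high):
    """Restrict value to the inclusive range [low, high]."""
    return max(low, min(value, high))

# Boundary value analysis: test at, just below, and just above each
# boundary, rather than picking inputs that merely "feel right".
cases = [
    (-1, 0),   # just below the lower boundary
    (0, 0),    # on the lower boundary
    (1, 1),    # just above the lower boundary
    (9, 9),    # just below the upper boundary
    (10, 10),  # on the upper boundary
    (11, 10),  # just above the upper boundary
]
for value, expected in cases:
    assert clamp(value, 0, 10) == expected, f"clamp({value}) failed"
print("all boundary cases passed")
```

Six deliberate cases give far more confidence here than a dozen arbitrary mid-range values would.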

3. Software Testing Techniques

The importance of software testing and its impact on software cannot be overstated. Software testing is a fundamental component of software quality assurance and represents a review of specification, design, and coding. The greater visibility of software systems and the cost associated with software failure are motivating factors for planning, thorough testing. It is not uncommon for a software organization to spend 40% of its effort on testing.

3.1 Static Testing Versus Dynamic Testing

Static testing is a form of software testing where the software isn't actually executed.
  • It is generally not detailed testing; it checks mainly the sanity of the code, algorithm, or document. It consists primarily of syntax checking and manual reading of the code or document to find errors.
  • This type of testing can be performed in isolation by the developer who wrote the code.
  • It is the verification portion of Verification and Validation; code reviews, inspections, and walkthroughs are a few of the static testing methodologies.
In dynamic testing the software must actually be compiled and run.
  • Dynamic analysis refers to examining the behavior of the running system as it responds to inputs that vary over time.
  • It is the validation portion of Verification and Validation; unit tests, integration tests, system tests, and acceptance tests are a few of the dynamic testing methodologies.
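The distinction can be sketched in Python: the built-in compile() parses source code without running it (a simple, syntax-level static check), while exec() actually runs the code so its behavior can be validated (a dynamic check). The sample sources here are illustrative:

```python
# Static check: compile() parses the source without executing it,
# so it catches syntax errors only -- a simple form of static testing.
source_ok = "def add(a, b):\n    return a + b\n"
source_bad = "def add(a, b)\n    return a + b\n"  # missing colon

def passes_static_check(source: str) -> bool:
    try:
        compile(source, "<example>", "exec")
        return True
    except SyntaxError:
        return False

print(passes_static_check(source_ok))   # True
print(passes_static_check(source_bad))  # False

# Dynamic check: actually execute the code and validate its behavior,
# as a unit test would.
namespace = {}
exec(compile(source_ok, "<example>", "exec"), namespace)
assert namespace["add"](2, 3) == 5
print("dynamic check passed")
```

Note what the static check cannot see: a function that compiles cleanly may still compute the wrong answer, which is exactly what the dynamic assertion exists to catch.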
