There seems to be a need in the dev and ops worlds to virtualise everything. Everything!
People talk of it as the second coming. We can scale. We can spin up any environment without over-preparing or extra investment or asking permission.
Just as with the cloud, talk of virtual machines solving every problem imaginable is everywhere.
Like all new technologies, they bring benefits, but they are not the answer to everything.
Good engineering and common sense must always be applied, especially in times of wonder.
One thing that you cannot 100% virtualise is testing.
Twice in the last four months, I have had to walk into situations where an app was failing on an updated SOE (Standard Operating Environment) after being tested only in a purely virtualised test environment.
One had driver issues; the other, an out-of-date BIOS version.
When you run in a virtual machine (VM), you are running on emulated hardware. As much as possible is handled inside the VM itself, but part of what makes a VM so portable is that it piggybacks on the host operating system. One thing you can count on is that when it falls back to real hardware calls, it falls back to the host OS and its drivers.
Underneath that sits the BIOS, and although you can mostly count on it, nothing is guaranteed.
In the two cases I saw, the problem appeared under load, pushing boundaries that were never tested because they were impossible to simulate in a non-native environment. The operating systems had been tested, but the bespoke applications were assumed to pass all tests simply because they did so inside the VM.
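Because the host OS, drivers, and BIOS all sit beneath the VM, it helps to record exactly which firmware a test run actually executed against. As a minimal sketch (Linux-only, reading the DMI strings the kernel exposes under sysfs; the function name is my own, not from the incidents above), a test harness could capture a fingerprint like this alongside its results:

```python
import pathlib

# DMI identity strings exposed by the firmware via the Linux kernel.
DMI = pathlib.Path("/sys/class/dmi/id")

def environment_fingerprint() -> dict:
    """Record machine and firmware details alongside test results.

    Missing files (e.g. inside some containers) are reported as "unknown"
    rather than raising, so the fingerprint is always complete.
    """
    fields = ("sys_vendor", "product_name", "bios_vendor", "bios_version", "bios_date")
    fingerprint = {}
    for field in fields:
        path = DMI / field
        fingerprint[field] = path.read_text().strip() if path.exists() else "unknown"
    return fingerprint

if __name__ == "__main__":
    for key, value in environment_fingerprint().items():
        print(f"{key}: {value}")
```

Had the out-of-date BIOS version been logged with every test pass, the mismatch between the test rig and production hardware would have been visible long before the failure under load.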
There is no need to test everything outside of a VM, but you should step outside one at some point in the testing journey. My suggestion is...
Unit test in a VM.
System Integration test in a VM.
Functionally test in a VM.
At least Performance or Hardware Integration test on the native hardware with the native operating system.
Nothing beats reality.