We run a video recording/archiving application called Digital Sentry. Currently we have several physical Windows 2008 R2 servers, each recording video from 30 to 60 cameras. There is no special capture hardware installed; video data is pulled from the cameras over TCP/IP and written to iSCSI volumes. Each physical server has 4 cores (2 dual-core sockets) and 8 GB of memory. Both CPU usage (around 10%) and memory usage are consistently low.
But when we run the same application on a virtual server (VMware ESX 5), even a small number of cameras (4) quickly pushes CPU usage above 50%. This makes no sense to us, given how lightly loaded the physical servers are.