Hi - We have a Dell server with 6 internal disks running VMware ESXi 3.5.
The first two disks are a mirrored pair holding the VMware install and the initial datastore.
The other four disks are a RAID-5 array holding the actual VMs.
One disk of the four in the RAID-5 array failed (position 3) and the server kept on running as intended.
We replaced the disk in position 3, but the new disk failed to light up or show in the BIOS.
We then tried the known-good disk from position 4 in position 3 with the same result, so we concluded that the slot itself was faulty.
Despite the RAID being degraded, we should still be able to boot and run whilst we get a replacement backplane or whatever the faulty part turns out to be ... HOWEVER ... VMware now reports that the second disk (the RAID-5 array) is blank, and the VMs held on it appear to be gone.
Any advice please? We have not initialised the disks, although we have deleted and recreated the array from the BIOS. The BIOS now sees the array as working but degraded, which is correct.