JeanPoL (Albania) asked:

ESX 4.1 host freezes

Hi there,

I have a problem on one of our VMware hosts; it's running ESX 4.1.

It just gets disconnected from vCenter, and I can't even use the console. But there are some errors on the desktop:

Bradley Fox (United States of America):

This appears to be a failed disk or RAID controller. See the VMware KB link.
That's a storage fault.

1. Check that you have a supported storage controller.
2. Boot from the SmartStart CD-ROM, run the Array Configuration Utility, and check the RAID array.
3. Do you have any flashing amber lights on the disks?
4. Update the firmware on the controller and disks.

What's the RAID array: local disks or DAS-attached?
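On an HP ProLiant, steps 2 and 3 above can also be checked without rebooting, from the ESX service console, if HP's `hpacucli` utility (part of the HP ProLiant Support Pack) is installed; the slot number below is just an example:

```shell
# Show the controller, array, and logical drive configuration and status
hpacucli ctrl all show config

# Show the status of each physical disk on the controller in slot 0
# (the slot number is a placeholder -- check your own controller's slot)
hpacucli ctrl slot=0 pd all show status
```

A degraded logical drive or a failed/predictive-failure physical disk would show up here even if no amber LED is visible.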
JeanPoL:
Well, this setup has been working for years. Last year I upgraded to vSphere 4.1, but I never had an issue like this.

1. I think this is OK; I have 4 other hosts with no issues, so I'd guess it's 99.9% supported.
2. Well, I'm waiting for the branches to close, then I can reboot. At the moment the VMs on the host are working; it's just the host that's not responding (but I expect to soon lose connectivity to the VMs residing on that host).
3. No, I don't have any amber lights; otherwise I would have replaced the broken part.
4. I'm downloading firmware for the DL390 G5 (it's 1 GB, so it's going to take 2 more hours; HP limits bandwidth, I guess).

I have RAID 1 on local SAS disks: 2 x 72 GB SAS disks.
Disks do fail; they are mechanical.

You've not removed any shared storage?
JeanPoL:
That's not a problem; I'm expecting problems like this...

There haven't been any changes on the storage side. Actually, one of the storage arrays acts strange, but that would affect the other hosts too; I still think this is a local issue.
This host crashed yesterday too; I didn't expect it to repeat. After a restart, the host kept working for about 20 hours.
It crashed yesterday at 16:00 and today again at 12:30 PM.
JeanPoL:
- Checked disk health status: it's OK.
- RAID status is OK, no failures.
- Did the latest firmware upgrades, restarted, and now the host won't boot:

But I moved the VMs to other hosts, and everything is up and running :) . VM power ...
If I continue the boot (check the image), it gets stuck on "mount root" while booting...

Should I do a clean install again, or keep looking for a hardware failure?
Andrew Hancock (VMware vExpert PRO / EE Fellow / British Beekeeper) (United Kingdom):

And yes, ESX and ESXi 4.1 can co-exist in the same farm.
DR is much easier and quicker with ESXi!

(Just put in another USB flash drive, restore your configuration back, and you're done.)
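That backup/restore workflow can be sketched with the vSphere CLI's `vicfg-cfgbackup` command; the hostname and file name below are placeholders:

```shell
# Save the ESXi host configuration to a local backup file (-s = save)
vicfg-cfgbackup --server esxi-host.example.com --username root -s esxi-config.bak

# After reinstalling ESXi on a fresh USB flash drive, load the
# configuration back onto the host (-l = load)
vicfg-cfgbackup --server esxi-host.example.com --username root -l esxi-config.bak
```

Restoring the configuration reboots the host, so do it during a maintenance window.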

JeanPoL:
Cool, you are a real pro :)

Does it matter if I install ESXi with Update 2 while the others are on Update 1?
No issues there with ESXi 4.1 U1 and U2 in the same farm. We are currently testing U2 in a farm of U1 hosts, before finally upgrading all the U1 hosts to U2.

Go for it!
JeanPoL:
OK, I installed the new ESXi 4.1 Update 2.

On ESX I was using Service Console port groups on my vSwitches; should I replace them with Management traffic port groups, or what? I'm a bit confused.
There is no service console in ESXi 4.1; it has been replaced with a VMkernel port for the Management Network.
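If the management network ever needs to be recreated by hand from the ESXi console (Tech Support Mode), the `esxcfg` tools can do it; the IP address, netmask, and vSwitch name below are example values:

```shell
# Create a port group for management traffic on vSwitch0
esxcfg-vswitch -A "Management Network" vSwitch0

# Create a VMkernel NIC on that port group
# (IP and netmask are placeholders -- use your own management addressing)
esxcfg-vmknic -a -i 192.168.1.10 -n 255.255.255.0 "Management Network"

# Verify the vSwitch layout and the VMkernel interfaces
esxcfg-vswitch -l
esxcfg-vmknic -l
```

Normally the installer sets all of this up for you; this is only needed for repair or scripted rebuilds.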
JeanPoL:
It's OK, I solved the issue.

Thanks for the help!