vmwarun - Arun (India) asked:

ESXi Whiteboxes for Lab Environment

I am planning to procure two whiteboxes for my home lab environment in order to test DRS and HA. The lab would run vCenter v4.1 and ESXi v4.1.

I intend to go with the hardware spec mentioned below:

AMD Phenom II X6 1075T - 3.0 GHz
16 GB DDR3 1333 MHz
Asus M4A89GTD PRO/USB3 motherboard based on the 890GX chipset
350 or 400W SMPS
1 KVA UPS
Antec or Cooler Master chassis
Iomega StorCenter ix2-200 Network Storage - 1 TB - an iSCSI SAN certified by VMware

My question is: would I be able to vMotion my VMs from one ESXi host to another with the config mentioned above?

I chose the 890GX chipset over 890FX since 890GX has onboard DVI, VGA and HDMI ports.

890FX has additional support for IOMMU, which can be used for a VMDirectPath config, although I am not planning to test this functionality.

Budget is not a constraint, so I am willing to go for Opteron CPUs along with supported motherboards, but I am having difficulty finding information about Opteron + motherboard combos on the net.

Should I go for an Intel Xeon CPU & motherboard combo, since I am able to find plenty of info on Intel's website about CPU & board compatibility?

Requirements

Support at least 16 or 24 GB of RAM
A single CPU socket (6 cores needed)

ESXi v4.X would be installed on a USB stick and all VMs would be stored on the Iomega StorCenter.

I have an 8-port Gigabit switch (unmanaged) which I can use for setting up the vSphere lab infrastructure.
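On the vMotion question above: the essentials are a shared datastore visible to both hosts, a vMotion-enabled VMkernel port on each host, and compatible CPUs (or EVC). Below is a rough sketch of how the first two prerequisites could be checked programmatically; it assumes the pyVmomi Python SDK, and the host names and credentials are placeholders rather than details from this thread.

# Sketch: verify that two ESXi hosts share a datastore and have vMotion enabled.
# Assumes the pyVmomi SDK (pip install pyvmomi); host addresses and credentials
# are placeholders, not values from this thread.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

def host_facts(address, user, password):
    ctx = ssl._create_unverified_context()           # lab host with a self-signed cert
    si = SmartConnect(host=address, user=user, pwd=password, sslContext=ctx)
    content = si.RetrieveContent()
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.HostSystem], True)
    host = view.view[0]                               # standalone ESXi: one host object
    facts = {
        "name": host.name,
        "datastores": {ds.name for ds in host.datastore},
        "vmotion_enabled": bool(host.summary.config.vmotionEnabled),
    }
    Disconnect(si)
    return facts

a = host_facts("esxi01.lab.local", "root", "password")
b = host_facts("esxi02.lab.local", "root", "password")

shared = a["datastores"] & b["datastores"]
print("Shared datastores:", shared or "NONE - vMotion will not work")
print("vMotion enabled on both hosts:", a["vmotion_enabled"] and b["vmotion_enabled"])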

ASKER CERTIFIED SOLUTION by IanTh (United Kingdom) (available to Experts Exchange members only)
SOLUTION (available to Experts Exchange members only)
SOLUTION (available to Experts Exchange members only)
vmwarun - Arun (ASKER):

Thanks for replying, IanTh.

I have a desktop (Core 2 Quad with 4 GB RAM) and a laptop (Core i7 CPU with 6 GB RAM) which I intend to use for management.

I would use the 60-day evaluation license for configuring the vSphere environment.

I am a VCP on vSphere and VI3, so I know the requirements for setting up a vSphere infrastructure.

My only concern is which of the four combos below I should go with:

A. Phenom II with a 890GX/890FX motherboard
B. Opteron with a compatible motherboard
C. Core i7 with a compatible motherboard
D. Xeon CPU with a compatible motherboard

16, 24 or 32 GB RAM is acceptable.
Well, it really comes down to RAM; ESX needs as much RAM as you can give it, so the Xeon or Opteron is the way to go. The others are desktop motherboards, whereas Opteron and Xeon boards are server motherboards that take more RAM. I know RAM overcommitment works, but real physical RAM is better. Also, Opterons and Xeons are usually multi-socket, where the i7 and Phenom are not.
SOLUTION by Irwin W. (Canada) (available to Experts Exchange members only)
Yes, but you need a cluster, so you need multiple ESX servers, and running ESX inside of ESX is the correct way to go, as the physical ESX servers can then be different.
SOLUTION (available to Experts Exchange members only)
nappy_d - The number of guests would be a maximum of 6 per ESXi host.

I would be running ESXi on a USB stick (no RAID for the hypervisor :-) )

For storing VMs, I will be using either the Iomega StorCenter ix2-200 (1 TB) or the Iomega ix4-200d (2 TB), as these are certified for both NFS and iSCSI by VMware.

Let me go ahead with a Phenom II X6 + 890GX combo, as I am not planning to upgrade this into a 2-socket system. 16 GB of DDR3 1333 MHz RAM would be accommodated in the 4 RAM slots.

What should my PSU wattage and UPS VA rating be?
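As a rough back-of-the-envelope answer to the PSU/UPS question, one can sum estimated component draws and allow headroom; the wattage figures and power factor in the sketch below are illustrative assumptions, not measurements of this build.

# Rough PSU/UPS sizing sketch; all wattage figures are illustrative assumptions.
components_watts = {
    "CPU (95 W TDP class)": 95,
    "Motherboard + chipset": 40,
    "4 x DDR3 DIMMs": 12,
    "USB boot stick + fans": 15,
    "Onboard graphics / misc": 20,
}

load_w = sum(components_watts.values())
psu_headroom = 1.4            # run the PSU at roughly 70% of its rating
power_factor = 0.7            # typical for a cheap SMPS; UPS VA = W / PF
ups_headroom = 1.3            # spare capacity for the NAS and the switch

print(f"Estimated draw per whitebox: ~{load_w} W")
print(f"Suggested PSU rating:        ~{load_w * psu_headroom:.0f} W")
va_for_two_boxes = 2 * load_w / power_factor * ups_headroom
print(f"UPS sizing for two boxes:    ~{va_for_two_boxes:.0f} VA")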
SOLUTION (available to Experts Exchange members only)
Yes, with that few guests a single quad-core or dual Xeon procs should do you just fine in each whitebox.

As for RAM, the only consideration is that when guests fail over, you just have to make sure you have adequate RAM to accommodate all the guests running concurrently.
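To put a rough number on that headroom, here is a quick sketch for two hosts with six guests each; the per-VM allocation and the overhead figure are assumptions for illustration only.

# HA failover RAM sketch: two hosts, six guests each; all figures are assumed.
hosts = 2
vms_per_host = 6
ram_per_vm_gb = 2.0           # assumed average allocation per guest
hypervisor_overhead_gb = 2.0  # ESXi itself plus per-VM overhead, rough

normal_per_host = vms_per_host * ram_per_vm_gb + hypervisor_overhead_gb
failover_on_survivor = hosts * vms_per_host * ram_per_vm_gb + hypervisor_overhead_gb

print(f"Normal load per host:       {normal_per_host:.0f} GB")
print(f"Survivor after HA failover: {failover_on_survivor:.0f} GB")
print("16 GB per host is", "enough" if failover_on_survivor <= 16 else "not enough",
      "to run every guest concurrently without overcommit")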

In my whitebox, I have a 350W PSU with a "5 in 3" cage and 5 x 160 GB SATA drives attached to a PERC controller. Also, coincidentally, I have the ix2-200. I tried to get iSCSI working and it causes my ESX4 host to hang.

Iomega support is useless. My Win7 box connects just fine to an iSCSI LUN I created.
SOLUTION (available to Experts Exchange members only)
bgoering - I agree with you about the feature set difference, although I am not planning to practice VMDirectPath or enable it in my lab.
I would go down the Xeon route, as when you're finished you can sell it as an ESX 4 server on eBay.
hanccocka - Is there any other NAS/iSCSI box suitable for home use, other than the Iomega?

nappy_d - Your statement leaves me confused. Should I or should I not go for the Iomega?

I appreciate each and every one of you for replying.
I use Openfiler on an old PC.
I need to test both NFS and iSCSI.

Rather than investing in a separate filer, can I use Openfiler as my iSCSI SAN, deployed on a separate physical box?
Openfiler does NFS too, and it can run in a VM or on a physical PC.
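As an aside, once an Openfiler export exists, attaching it to an ESXi host as an NFS datastore can be scripted as well as done through the vSphere Client. The sketch below uses the pyVmomi SDK; the Openfiler address, export path and datastore name are placeholders, not details from this thread.

# Sketch: mount an NFS export from an Openfiler box as a datastore on one ESXi host.
# Assumes pyVmomi; the addresses, export path and datastore name are placeholders.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

ctx = ssl._create_unverified_context()
si = SmartConnect(host="esxi01.lab.local", user="root", pwd="password", sslContext=ctx)
content = si.RetrieveContent()
view = content.viewManager.CreateContainerView(content.rootFolder, [vim.HostSystem], True)
host = view.view[0]                                  # standalone ESXi: one host object

spec = vim.host.NasVolume.Specification(
    remoteHost="openfiler.lab.local",                # the Openfiler box
    remotePath="/mnt/vg0/nfs_export",                # assumed export path
    localPath="openfiler-nfs",                       # datastore name as it will appear
    accessMode="readWrite",
)
host.configManager.datastoreSystem.CreateNasDatastore(spec)
Disconnect(si)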
Personally, I use a Promise m500i for iSCSI.
I am on the fence with my Iomega and their VMware "claims". It works like a charm with my Windows 7 machine. It works GREAT in general, but I cannot get my ESXi 4 box to work with it; it just hangs.

I will keep playing.
Another idea just struck me.

I have an old Pentium IV 3.2 GHz system with 1 GB of DDR RAM.

I am not using it for the time being.

Is it feasible to install an Adaptec 4-port RAID card in this system, connect 3 or 4 SATA hard drives, configure RAID 5 or RAID 10, install Openfiler, and use it as an iSCSI/NFS SAN?
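For what it's worth, the usable capacity for those two layouts is simple arithmetic; the sketch below assumes 1 TB drives, as proposed later in the thread.

# Usable capacity sketch for the proposed Openfiler box; sizes in TB, drives assumed 1 TB.
def raid5_usable(n_drives, size_tb):
    return (n_drives - 1) * size_tb       # one drive's worth of capacity goes to parity

def raid10_usable(n_drives, size_tb):
    return (n_drives // 2) * size_tb      # mirrored pairs, then striped

for drives in (3, 4):
    print(f"{drives} x 1 TB  RAID 5 : {raid5_usable(drives, 1.0):.1f} TB usable")
    if drives % 2 == 0:                   # RAID 10 needs an even number of drives (at least 4)
        print(f"{drives} x 1 TB  RAID 10: {raid10_usable(drives, 1.0):.1f} TB usable")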
Yes, I use an Athlon 64 with 2.5 GB of RAM and a 1.5 TB SATA drive for my Openfiler. It really does need at least one Gigabit NIC, IMHO.
I have an Intel Pro 1000 MT NIC on hand which I have installed on the Pentium box.

Thanks for the tip, IanTh.

There is a PCI Express x16 slot on the P IV box.

Could that support any of the 4-port Adaptec RAID controllers listed at this URL?

http://www.adaptec.com/en-us/products/controllers/hardware/

If it does, then I will scrap the Iomega SAN plan and go for 4 x 1 TB Seagate SATA III disks in a RAID 10 array, configured using the Adaptec RAID controller.
OK, I have a question for arunaju: is this the same person who has the designation of Sage in EE for VMware?

I only ask because these questions all seem so beginner-level for a VCP and a person who has answered so many questions on EE :)
nappy_d - :) I am an expert and a learner too, although I am cautious and don't want to burn my fingers when setting up my home lab, since anything done incorrectly can hinder my progress towards the VCAP-DCA and DCD certs.

Opinions always matter to me, since they help me make a balanced decision.
SOLUTION (available to Experts Exchange members only)
I am going to procure the hardware for the lab tomorrow. I will assemble the hardware and will update the thread.

Intel® Server Board S3420GPLC

http://ark.intel.com/Product.aspx?id=46533

Intel® Xeon® Processor X3430 (8M Cache, 2.40 GHz)
http://ark.intel.com/Product.aspx?id=42927

I have considered the above as well.

It's either the Phenom II X6 or the Xeon CPU mentioned above.
Yes, I got a second-hand Intel S5520HC and a quad-core Xeon with Hyper-Threading.

Like I said, when my test rig is finished I could sell it as a proper ESX 4 server.
I would purchase an HP DL385 or DL585 from eBay (if you can get one). No need to build, and it works from power-on.
If you get an ESX 4 server on eBay, the sellers know it is ESX 4 capable. As for my self-built vSphere server: apart from a fault-tolerant PSU, which most servers have, if you look on eBay for an Intel server with the same board I got for £200 you are looking at well over £2000, and it wouldn't cost £1000 to add a fault-tolerant PSU to my rig. My test vSphere server can host about 8-10 VMs, I reckon, unless I have a Windows server VM, lol.

But my Arima Rioworks board, which I got for £100, is for my two MS server VMs: a 2003 DC and a vCenter Server.
I live in Chennai, India, and I am not sure whether I can get stuff shipped from eBay.
I am sure you can; there are a lot of sellers in the Far East, like China. Just ask the seller if they ship to India; why would they block it? If you need something off eBay, there are also eBay "merchants" who will make the purchase for you and then send it to you as a private shipment.
SOLUTION (available to Experts Exchange members only)
Yes, a motherboard is a fraction of the cost of an HP DL, and you should be able to get an E-ATX case locally.
It seems I was out of luck yesterday since none of the local resellers or distributors in Chennai had the Intel® Server Board S3420GP motherboard series :(

They asked me to wait until Monday, which has spoiled my plans of assembling the boxes for my lab for at least a week.
Well, at least you can get them locally. I saved £180 by getting a second-hand server motherboard and £219 on the Xeon CPU.
The resellers have asked me to wait for a week. I am awarding points to the noble hearts who supported me in this.

I will update the thread once I finish assembling my whiteboxes, so that it can act as an index for folks who want to assemble whiteboxes for a vSphere lab.
It seems the ECC RDIMMs have shot my budget way up. They are costing INR 7500 per 4 GB stick.
I wanted to go with 16 GB of RDIMMs per whitebox. That plan has now been cancelled, and I am going with 16 GB of non-ECC UDIMMs per whitebox, 8 sticks in all (a 4 GB stick costs INR 2300), along with an Intel X3440 CPU and the Intel 3420GPV motherboard, since I am not looking at RDIMMs and none of the other boards support more than 16 GB of RAM without RDIMMs.
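The saving that drove this decision works out roughly as follows, using the per-stick prices quoted above and assuming 4 sticks per whitebox (8 in total across the two boxes); that split per box is my reading of the post, not something stated explicitly.

# RDIMM vs UDIMM cost comparison using the per-stick prices quoted in the thread (INR per 4 GB stick).
rdimm_per_stick = 7500
udimm_per_stick = 2300
sticks_per_box = 4            # 4 x 4 GB = 16 GB per whitebox (assumed split)
boxes = 2

rdimm_total = rdimm_per_stick * sticks_per_box * boxes
udimm_total = udimm_per_stick * sticks_per_box * boxes
print(f"ECC RDIMM, 16 GB x 2 boxes:   INR {rdimm_total}")
print(f"Non-ECC UDIMM, same capacity: INR {udimm_total}")
print(f"Saving:                       INR {rdimm_total - udimm_total}")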
Yes, my Intel 5520 board would do 96 GB with RDIMMs, I think, and 48 GB with non-RDIMMs, but I don't think I will need 96 GB of RAM.
I never thought that procuring Xeon CPUs and compatible motherboards would be so difficult in Chennai, India, as compared to their desktop counterparts.
What, you cannot get server parts like Xeons and Xeon motherboards in Chennai, or in India in general? If that's the case I would be interested in getting something going, as India is a tech country now.
I can't justify spending the $$$ on 96 GB of RDIMM RAM for a home test lab :)

My test lab consists of an AMD X3 processor and 8 GB of RAM on a simple board. Anything more than that, and a company will pay for the equipment.
I got 16 GB of DDR3 RAM for my Xeon whitebox, as it hosts two ESX servers and they can have 7.5 GB each, which is enough for a test lab, as there will not be many VMs inside the virtual ESX servers.
It was around £200, I seem to remember.
I finally managed to install ESXi v4.1 on the whitebox mentioned below

Intel® Xeon® Processor X3450 (8M Cache, 2.66 GHz)
Intel 3420GPV Motherboard
16 GB DDR3 1333 MHz Unbuffered Non-ECC Corsair RAM

I used Cooler Master Elite 360 as my chassis and Cooler Master Extreme Power 350W as my PSU.

I am glad that everything is working as expected except for one of the onboard NICs: one of the Gigabit NICs is an Intel 82574L Gigabit Network Connection, which is on the VMware Compatibility Guide, and the other NIC, an 82578DN, is not (hence it is not detected after the ESXi installation).
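For anyone hitting the same issue, one quick way to confirm which NICs the ESXi host actually detected is to list its physical NICs; a minimal sketch with the pyVmomi SDK follows (the host address and credentials are placeholders, and the same information is visible on the host's Configuration > Network Adapters page in the vSphere Client).

# Sketch: list the physical NICs that an ESXi host detected, to confirm which
# onboard adapter is missing. Assumes pyVmomi; address and credentials are placeholders.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

ctx = ssl._create_unverified_context()
si = SmartConnect(host="esxi01.lab.local", user="root", pwd="password", sslContext=ctx)
content = si.RetrieveContent()
view = content.viewManager.CreateContainerView(content.rootFolder, [vim.HostSystem], True)
host = view.view[0]                                  # standalone ESXi: one host object

for pnic in host.config.network.pnic:
    print(pnic.device, pnic.driver, pnic.mac)        # e.g. "vmnic0 e1000e 00:1b:..."
Disconnect(si)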
Yes, I worked hard to get my first whitebox's NICs working, but then got a couple of Intel Pro CT PCI-E x1 cards for £20 each, as they are on the whitebox HCL.

Anyway, well done. I have just written an article on my vSphere whitebox lab setup.
Please post a link to the article you have published.
IanTh - Thank you.