Solved

ESXi Whiteboxes for Lab Environment

Posted on 2011-03-10
52
2,195 Views
Last Modified: 2012-05-11
I am planning to procure two whiteboxes for my home lab environment in order to test DRS and HA. The lab would run vCenter v4.1 and ESXi v4.1.

I intend to go with the hardware spec below:

AMD Phenom II X6 1075T - 3.0 GHz
16 GB DDR3 1333 MHz
Asus M4A89GTD PRO/USB3 motherboard, based on the 890GX chipset
350 or 400W SMPS
1 KVA UPS
Antec or Cooler Master chassis
Iomega StorCenter ix2-200 Network Storage - 1 TB - iSCSI SAN, certified by VMware

My question is: would I be able to vMotion my VMs from one ESXi host to another with the config mentioned above?

I chose the 890GX chipset over 890FX since 890GX has onboard DVI, VGA and HDMI ports.

890FX has additional support for IOMMU, which can be used for a VMDirectPath config, although I am not planning to test that functionality.

Budget is not a constraint, so I am willing to go for Opteron CPUs along with supported motherboards, but I am having difficulty finding info about Opteron + motherboard combos on the net.

Or should I go for an Intel Xeon CPU and motherboard combo, since I can find plenty of info on Intel's website for CPU and board compatibility?

Requirements

Support at least 16 or 24 GB RAM
A single CPU socket (6 cores needed)

ESXi v4.x would be installed on a USB stick, and all VMs would be stored on the Iomega StorCenter.

I have an 8-port unmanaged Gigabit switch which I can use for setting up the vSphere lab infrastructure.

Question by:vmwarun - Arun
52 Comments
 
LVL 30

Accepted Solution

by:
IanTh earned 300 total points
ID: 35093574
I do the same.

1. vMotion and DRS should have their own network, so your ESX servers will need more than one NIC, so you can dedicate the other NIC to the VMkernel port.

2. I bought an Intel dual-socket 1366 motherboard off eBay for £200 and a quad-core Xeon for £200. VMware uses Hyper-Threading, so the Xeon effectively presents 8 cores, and I could put another CPU in there for 16. The board will also take at least 64 GB RAM (currently got 16 GB). As this is a proper server motherboard, all the Advanced Plus functions work for 60 days; the free license will not allow HA, DRS, vMotion, etc.

3. You will really need vCenter Server, which is not the VI Client.

4. Opterons do work. I bought an Arima Rioworks dual-Opteron system off eBay, as you will also need a DC, and vCenter Server should not be installed on a DC, so the Arima runs a DC and a vCenter Server. That server cost about £100 and is fine with 4 GB RAM.

5. You will need shared storage such as an iSCSI system like Openfiler, as vMotion and DRS depend on it, and you really should have a dedicated iSCSI switch so performance is good enough for the iSCSI system to function correctly.
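Point 1 above can be sketched from the ESXi 4.x Tech Support Mode console using the `esxcfg-*` tools. This is a minimal sketch, not a definitive recipe; the vSwitch name, uplink NIC, port group name, and IP addressing are assumed placeholders:

```shell
# Separate vSwitch for vMotion traffic, uplinked to the spare NIC
esxcfg-vswitch -a vSwitch1            # create the vSwitch (name assumed)
esxcfg-vswitch -L vmnic1 vSwitch1     # attach the second physical NIC
esxcfg-vswitch -A vMotion vSwitch1    # port group for the VMkernel port

# VMkernel interface on that port group (IP/netmask are placeholders)
esxcfg-vmknic -a -i 192.168.10.11 -n 255.255.255.0 vMotion

# Verify the layout
esxcfg-vswitch -l
esxcfg-vmknic -l
```

vMotion itself is then enabled on that VMkernel port in the vSphere Client (Configuration > Networking > port group properties).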
0
 
LVL 30

Assisted Solution

by:IanTh
IanTh earned 300 total points
ID: 35093603
My other rig is a Core 2 Quad with 12 GB RAM, so I have 3 ESX boxes:

1. the Arima
2. ESX 1 with 2 ESX VMs
3. ESX 2 with 2 ESX VMs

So I have 4 ESX servers in total. That way you can set up a cluster, and since the clustered servers are virtual ESX servers they are identical, which matters because the cluster servers need to be exactly the same, and you will need a cluster to do DRS.
0
 
LVL 30

Assisted Solution

by:IanTh
IanTh earned 300 total points
ID: 35093633
According to http://www.vm-help.com/forum/viewtopic.php?f=13&t=2316

this motherboard, like many desktop boards, has a problem in ESX with the NIC. ESX is very picky about NICs; this can be avoided by using an additional NIC such as an Intel PRO/1000 GT.

ESX has a similar issue with SATA controllers, and again this can be avoided by checking the whitebox HCL for SATA controllers, but Intel ICH9 and ICH10 do work.
0
 
LVL 19

Author Comment

by:vmwarun - Arun
ID: 35093635
Thanks for replying IanTh.

I have a desktop (Core 2 Quad with 4 GB RAM) and a laptop (Core i7 CPU with 6 GB RAM) which I intend to use for management.

I would use the 60-day eval license for configuring the vSphere environment.

I am a VCP on vSphere and VI3 so I know the requirements for setting up a vSphere infrastructure.

My only concern is which of the four combos mentioned below I should choose:

A. Phenom II with a 890GX/890FX motherboard
B. Opteron with a compatible motherboard
C. Core i7 with a compatible motherboard
D. Xeon CPU with a compatible motherboard

16, 24 or 32 GB RAM is acceptable.
0
 
LVL 30

Expert Comment

by:IanTh
ID: 35093781
Well, it really comes down to RAM. ESX needs as much RAM as possible, so the Xeon or Opteron is the way to go: the others are desktop motherboards, whereas Opteron and Xeon server motherboards take more RAM. RAM overcommitment works, but real physical RAM is better. Also, Opteron and Xeon boards are usually multi-socket, where i7 and Phenom are not.
0
 
LVL 32

Assisted Solution

by:nappy_d
nappy_d earned 125 total points
ID: 35093798
The only question I have for you, based on your processor choices is that since this is just a home lab, how many guests do you plan to have running?

If it's just for knowledge and small lab tests, 16GB of RAM and the Xeon proc should do just fine, not to mention that it will save you a couple of bucks.

In my current lab, I have 5 guests running on a single Intel dual core processor with 8GB of RAM.
0
 
LVL 30

Expert Comment

by:IanTh
ID: 35093803
Yes, but you need a cluster, so you need multiple ESX servers, and using ESX inside of ESX is the correct way to go, as then the physical ESX servers can be different.
0
 
LVL 32

Assisted Solution

by:nappy_d
nappy_d earned 125 total points
ID: 35093956
One other consideration you have not mentioned for your lab is storage.

If you plan to use the onboard controller for your SATA drive, make sure that they are on the HCL for your whitebox.

If the board you choose does not have a supported controller, you will need to purchase something like a 3ware controller, and maybe a "3-in-2" drive cage for your case. This makes drive swapping easier.
0
 
LVL 19

Author Comment

by:vmwarun - Arun
ID: 35094282
nappy_d - Number of guests would be maximum 6 per ESXi host.

I would be running ESXi on a USB stick (no RAID for the hypervisor :-) )

For storing VMs, I will use either the Iomega StorCenter ix2-200 (1 TB) or the Iomega ix4-200d (2 TB), as these are certified for both NFS and iSCSI by VMware.

Let me go ahead with a Phenom II X6 + 890GX combo, as I am not planning to upgrade this into a two-socket system. 16 GB of DDR3 1333 MHz RAM would be accommodated in the 4 RAM slots.

What should my PSU wattage and UPS VA be?
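As a rough sizing sketch for the question above: add up worst-case component draw, pad it for PSU headroom, then convert watts to VA for the UPS with a typical power factor. All the wattage figures below are assumptions for illustration, not measurements:

```shell
#!/bin/sh
# Rough worst-case draw per whitebox, in watts (assumed figures)
CPU=125        # Phenom II X6 1075T TDP
BOARD_RAM=60   # motherboard, 4 DIMMs, fans (estimate)
USB_MISC=15    # USB boot stick, misc peripherals (estimate)
DRAW=$((CPU + BOARD_RAM + USB_MISC))    # ~200 W

# ~50% headroom for inrush and PSU efficiency
PSU=$((DRAW * 3 / 2))
echo "PSU: at least ${PSU} W per box"

# UPS: two hosts + NAS + switch; VA = W / power factor (~0.7 assumed)
TOTAL_W=$((DRAW * 2 + 60 + 10))         # 60 W NAS, 10 W switch (assumed)
VA=$((TOTAL_W * 10 / 7))
echo "UPS: roughly ${VA} VA for the whole lab"
```

On those assumptions the planned 350-400 W PSU and 1 kVA UPS both check out, with some margin.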
0
 
LVL 117

Assisted Solution

by:Andrew Hancock (VMware vExpert / EE MVE)
Andrew Hancock (VMware vExpert / EE MVE) earned 25 total points
ID: 35094334
Iomega StorCenter units really suffer when running multiple VMs.

Use NFS, not iSCSI, Jumbo Frames if possible.

Don't get the rack mount as it only has one NIC.

oh and use AMD Opterons!
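The NFS-plus-jumbo-frames advice above might look like the following on an ESXi 4.x host. A sketch only: the vSwitch name, addresses, and export path are assumed placeholders, and the physical switch must also pass 9000-byte frames, which the unmanaged 8-port switch from the question may not:

```shell
# Enable jumbo frames on the storage vSwitch and its VMkernel port
esxcfg-vswitch -m 9000 vSwitch1
esxcfg-vmknic -a -i 192.168.20.11 -n 255.255.255.0 -m 9000 NFS

# Mount the Iomega NFS export as a datastore (host and path assumed)
esxcfg-nas -a -o 192.168.20.50 -s /nfs/vmstore iomega01
esxcfg-nas -l   # verify the mount
```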
0
 
LVL 32

Expert Comment

by:nappy_d
ID: 35094354
Yes, with that few guests, a single quad-core or dual Xeon procs should do you just fine in each whitebox.

As for RAM, the only consideration is that when guests fail over, you have to make sure you have adequate RAM to accommodate all the guests running concurrently.

In my whitebox, I have a 350W PSU with a "5-in-3" cage and 5 x 160 GB SATA drives attached to a PERC controller. Coincidentally, I also have the ix2-200. I tried to get iSCSI working and it causes my ESX4 host to hang.

Iomega support is useless. My Win7 box connects just fine to an iSCSI LUN I created.
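The failover-RAM point above reduces to simple arithmetic: with two hosts, one surviving host must be able to hold every powered-on guest. The guest count comes from the author's plan; the per-guest and overhead figures are assumptions:

```shell
#!/bin/sh
GUESTS=12          # 6 per host, per the author's plan, all on one survivor
AVG_GUEST_MB=2048  # assumed average RAM allocation per guest
OVERHEAD_MB=2048   # hypervisor plus per-VM overhead, rough estimate

NEED_MB=$((GUESTS * AVG_GUEST_MB + OVERHEAD_MB))
echo "Surviving host needs ~$((NEED_MB / 1024)) GB to run all guests"
```

At 2 GB per guest that works out to ~26 GB, so with 16 GB hosts you would either keep average allocations nearer 1 GB or accept memory overcommit after a failover.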
0
 
LVL 28

Assisted Solution

by:bgoering
bgoering earned 50 total points
ID: 35094463
The only comment I would add to this discussion is about the difference between the GX and FX motherboards. The DVI and HDMI support would be pretty much wasted on an ESX server, while the IOMMU support may come in handy should you choose to work with VMDirectPath in your lab in the future.

Of course you may be intending those machines for some other purpose later where the HDMI/DVI features are important...

Good Luck with your lab
0
 
LVL 19

Author Comment

by:vmwarun - Arun
ID: 35094531
bgoering - I agree with you about the feature set difference, although I am not planning to practice VMDirectPath or enable it in my lab.
0
 
LVL 30

Expert Comment

by:IanTh
ID: 35095708
I would go down the Xeon route, as when you're finished you can sell it as an ESX 4 server on eBay.
0
 
LVL 19

Author Comment

by:vmwarun - Arun
ID: 35095989
hanccocka - Is there any other NAS/iSCSI box suitable for home use, other than Iomega?

nappy_d - Your statement has me confused. Should I, or should I not, go for the Iomega?

I appreciate each and every one of you for replying.
0
 
LVL 30

Expert Comment

by:IanTh
ID: 35096141
I use Openfiler on an old PC.
0
 
LVL 19

Author Comment

by:vmwarun - Arun
ID: 35096375
I need to test both NFS and iSCSI.

Rather than investing in a separate filer, can I use Openfiler as my iSCSI SAN, deployed on a separate physical box?
0
 
LVL 30

Expert Comment

by:IanTh
ID: 35096393
Openfiler does NFS too, and can run in a VM or on a physical PC.
0
 
LVL 28

Expert Comment

by:bgoering
ID: 35096934
Personally I use a Promise m500i for iSCSI
0
 
LVL 32

Expert Comment

by:nappy_d
ID: 35097338
I am on the fence with my Iomega and their VMware "claims". It works like a charm with my Windows 7 machine, and it works great in general, but I cannot get my ESXi 4 box to work with it; it just hangs.

I will keep playing.
0
 
LVL 19

Author Comment

by:vmwarun - Arun
ID: 35097673
Another idea just struck me.

I have an old Pentium 4 3.2 GHz system with 1 GB of DDR RAM which I am not using for the time being.

Is it feasible to install a 4-port Adaptec RAID card in this system, connect 3 or 4 SATA hard drives, configure RAID 5 or RAID 10, install Openfiler, and use it as an iSCSI/NFS SAN?
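For the RAID 5 vs RAID 10 choice above, the usable capacity differs: RAID 5 keeps n-1 disks' worth, RAID 10 keeps n/2. A quick sketch with four 1 TB disks (the disk count and size are just the figures discussed in the thread):

```shell
#!/bin/sh
DISKS=4
SIZE_TB=1

RAID5=$(( (DISKS - 1) * SIZE_TB ))
RAID10=$(( DISKS / 2 * SIZE_TB ))
echo "RAID 5:  ${RAID5} TB usable, survives one disk failure"
echo "RAID 10: ${RAID10} TB usable, survives one disk per mirror, better write performance"
```

RAID 10 trades a terabyte of capacity for the write performance that iSCSI/NFS VM storage tends to need.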
0
 
LVL 30

Expert Comment

by:IanTh
ID: 35098252
Yes. I use an Athlon 64 with 2.5 GB RAM and a 1.5 Gb SATA drive for my Openfiler; it really does need at least one Gigabit NIC, IMHO.
0
 
LVL 19

Author Comment

by:vmwarun - Arun
ID: 35098572
I have an Intel PRO/1000 MT NIC on hand, which I have installed in the Pentium box.

Thanks for the tip, IanTh.

There is a PCI Express x16 slot on the P IV box.

Could that support any of the 4-port Adaptec RAID controllers listed at this URL?

http://www.adaptec.com/en-us/products/controllers/hardware/

If it does, then I will scrap the Iomega SAN plan and go for 4 x 1 TB Seagate SATA III disks in a RAID 10 array, configured using the Adaptec RAID controller.
0
 
LVL 117
ID: 35101600
I'd go with Openfiler, FreeNAS or Open-E
0
 
LVL 32

Expert Comment

by:nappy_d
ID: 35101835
OK, I have a question for arunaju: is this the same person that has the designation of Sage on EE for VMware?

I only ask because these questions all seem so beginner for a VCP, and for a person who has answered so many questions on EE :)
0
 
LVL 19

Author Comment

by:vmwarun - Arun
ID: 35104158
nappy_d - :) I am an Expert and a learner too, although I am cautious and don't want to burn my fingers when setting up my home lab, since anything done incorrectly can hinder my progress toward the VCAP-DCA and DCD certs.

Opinions always matter to me, since they help me make a balanced decision.
0

 
LVL 30

Assisted Solution

by:IanTh
IanTh earned 300 total points
ID: 35107349
I would try the RAID card in the PCIe x16 slot. I did the same and got mixed results: I had to go into the BIOS and set it to always use the onboard VGA; after a reboot it worked, but if I just powered up the Openfiler box it hung, so IMHO it's down to the motherboard.
0
 
LVL 19

Author Comment

by:vmwarun - Arun
ID: 35108874
I am going to procure the hardware for the lab tomorrow. I will assemble the hardware and will update the thread.

Intel® Server Board S3420GPLC

http://ark.intel.com/Product.aspx?id=46533

Intel® Xeon® Processor X3430 (8M Cache, 2.40 GHz)
http://ark.intel.com/Product.aspx?id=42927

I have considered the above as well.

It's either the Phenom II X6 or the Xeon CPU mentioned above.
0
 
LVL 30

Expert Comment

by:IanTh
ID: 35109392
Yes, I got a second-hand Intel S5520HC

and a quad-core Xeon with Hyper-Threading.

Like I said, when my test rig is finished I could sell it as a proper ESX 4 server.
0
 
LVL 117
ID: 35113109
I would purchase an HP DL385 or DL585 from eBay (if you can get one). No need to build, and it works from power-on.
0
 
LVL 30

Expert Comment

by:IanTh
ID: 35115811
If you get an ESX 4 server on eBay, the sellers know it's ESX 4 capable and price it accordingly. My self-built vSphere server has everything apart from a fault-tolerant PSU, which most servers have; if you look on eBay for an Intel server with the same board I got for £200, you are looking at well over £2000, and it wouldn't cost £1000 to add a redundant PSU to my rig. My test vSphere server can host about 8-10 VMs, I reckon, unless I have a Windows Server VM, lol.

But my Arima Rioworks, which I got for £100, is for my two Microsoft server VMs: a 2003 DC and a vCenter Server.
0
 
LVL 19

Author Comment

by:vmwarun - Arun
ID: 35115994
I live in Chennai, India, and I am not sure whether I can get stuff shipped from eBay.
0
 
LVL 30

Expert Comment

by:IanTh
ID: 35116255
I am sure you can. There are a lot of sellers in the Far East, like China; just ask the seller if they ship to India, as why would they block it? If you need something off eBay, there are also eBay "merchants" who will make the purchase for you and then send it to you as a private shipment.
0
 
LVL 32

Assisted Solution

by:nappy_d
nappy_d earned 125 total points
ID: 35116984
The issue you will face is the potentially high cost to ship, especially for an HP DL-series server. They are not exactly tiny.

A home-brew whitebox would be a better investment.
0
 
LVL 30

Expert Comment

by:IanTh
ID: 35117007
Yes, a motherboard is a fraction of the cost of an HP DL, and you should be able to get an E-ATX case locally.
0
 
LVL 19

Author Comment

by:vmwarun - Arun
ID: 35119736
It seems I was out of luck yesterday, since none of the local resellers or distributors in Chennai had the Intel® Server Board S3420GP motherboard series :(

They asked me to wait till Monday, which has spoiled my plans of assembling the boxes for my lab for at least a week.
0
 
LVL 30

Expert Comment

by:IanTh
ID: 35121063
Well, at least you can get them locally. I saved £180 by getting a second-hand server motherboard, and £219 on the Xeon CPU.
0
 
LVL 19

Author Comment

by:vmwarun - Arun
ID: 35127246
The resellers have asked me to wait for a week. I am awarding points to the noble hearts who supported me in this respect.

I will update the thread once I finish assembling my whiteboxes, so that it can act as an index for folks who want to assemble whiteboxes for a vSphere lab.
0
 
 
LVL 19

Author Comment

by:vmwarun - Arun
ID: 35164950
It seems the ECC RDIMMs have shot my budget way up: they cost INR 7,500 per 4 GB stick.
I wanted to go with 16 GB of RDIMM per whitebox. That has been canceled, and I am going with 16 GB of non-ECC UDIMM per whitebox - 8 sticks in total (a 4 GB stick costs INR 2,300) - and an Intel X3440 CPU with the Intel 3420GPV motherboard, since I am no longer looking at RDIMMs and none of the other boards support more than 16 GB RAM without RDIMMs.
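The budget gap described above works out as follows, using the per-stick prices from the comment and 8 sticks across the two whiteboxes:

```shell
#!/bin/sh
RDIMM=7500   # INR per 4 GB ECC RDIMM (price quoted in the thread)
UDIMM=2300   # INR per 4 GB non-ECC UDIMM (price quoted in the thread)
STICKS=8     # 4 sticks per whitebox, two whiteboxes

echo "RDIMM total: INR $((RDIMM * STICKS))"
echo "UDIMM total: INR $((UDIMM * STICKS))"
echo "Saving:      INR $(( (RDIMM - UDIMM) * STICKS ))"
```

Roughly INR 41,600 saved by giving up ECC, which explains the switch of board and CPU.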
0
 
LVL 30

Expert Comment

by:IanTh
ID: 35166004
Yes, my Intel S5520 board would do 96 GB with RDIMMs, I think, and 48 GB with non-RDIMMs, but I don't think I will need 96 GB of RAM.
0
 
LVL 19

Author Comment

by:vmwarun - Arun
ID: 35166023
I never thought that procuring Xeon CPUs and compatible motherboards would be so difficult in Chennai, India, compared to their desktop counterparts.
0
 
LVL 30

Expert Comment

by:IanTh
ID: 35166477
What, you cannot get server parts like Xeons and Xeon motherboards in Chennai, or India in general? If that's the case, I would be interested in getting something going, as India is a tech country now.
0
 
LVL 32

Expert Comment

by:nappy_d
ID: 35166611
I can't see spending the $$$ on 96 GB of RDIMM RAM for a home test lab :)

My test lab consists of an AMD X3 processor and 8 GB of RAM on a simple board; anything more than that, and a company should pay for the equipment.
0
 
LVL 30

Expert Comment

by:IanTh
ID: 35166824
I got 16 GB of DDR3 RAM for my Xeon whitebox, as it hosts two virtual ESX servers; each can have 7.5 GB, which is enough for a test lab since there won't be many VMs inside the virtual ESX servers.
0
 
LVL 30

Expert Comment

by:IanTh
ID: 35166830
It was around £200 I seem to remember
0
 
LVL 19

Author Comment

by:vmwarun - Arun
ID: 35172545
I finally managed to install ESXi v4.1 on the whitebox mentioned below

Intel® Xeon® Processor X3450 (8M Cache, 2.66 GHz)
Intel 3420GPV Motherboard
16 GB DDR3 1333 MHz Unbuffered Non-ECC Corsair RAM

I used Cooler Master Elite 360 as my chassis and Cooler Master Extreme Power 350W as my PSU.

I am glad that everything is working as expected except one of the onboard NICs: one Gigabit NIC is an Intel 82574L Gigabit Network Connection, which is on the VMware Compatibility Guide, while the other, an 82578DN, is not (hence it is not detected after the ESXi installation).
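A quick way to confirm which onboard NICs ESXi actually claimed, from the Tech Support Mode console: compare what the hardware reports against what the hypervisor has a driver for (on this board, the 82578DN would typically show in the first list but not the second):

```shell
# All PCI Ethernet devices the hardware reports
lspci | grep -i ethernet

# NICs ESXi has bound a driver to (only these are usable as vmnics)
esxcfg-nics -l
```

A device present in `lspci` but absent from `esxcfg-nics -l` has no bundled driver, which is exactly the 82574L vs 82578DN split seen here.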
0
 
 
LVL 30

Expert Comment

by:IanTh
ID: 35172863
Yes, I worked hard to get my first whitebox's NICs working, but then got a couple of Intel PRO/1000 CT PCIe x1 cards for £20 each, as they are on the whitebox HCL.

Anyway, well done. I have just done an article on my vSphere whitebox lab setup.
0
 
LVL 19

Author Comment

by:vmwarun - Arun
ID: 35174197
Please post a link to the article you have published.
0
 
LVL 30

Expert Comment

by:IanTh
ID: 35179305
0
 
LVL 19

Author Comment

by:vmwarun - Arun
ID: 35179390
IanTh - Thank you.
0
