HP LeftHand P4500 G2 or IBM N-series N3400 as backend for virtualization?

Posted on 2010-09-23
Last Modified: 2013-11-14
Hey Experts

I'm looking to buy some storage to use as a shared backend for virtualization and have been looking at HP's LeftHand and IBM's N-series.

The goal is to run 15-20 Windows servers (file servers, domain controllers, an Exchange server, and application servers for 75-85 users) and 30-40 Linux servers (application servers, DNS, and DHCP).

I have been offered the following setups by my vendors:

LeftHand P4500 G2 - 4 nodes, each with 12x450GB 15k drives
Total of 48x450GB 15k drives
Two of the four nodes will be placed at their own site, and we expect to run a "Network RAID 10" across all four nodes.

Features: (need or nice to have)
2-site setup - need to have
Thin provisioning - need to have
Performance from all drives/spindles - nice to have

N-series N3400 with 12x300GB 15k drives and 2 disk enclosures with 12x300GB 15k drives each.
Total of 36x300GB 15k drives
The second site will have its own N3400 (single-controller version) with 12x1TB SATA drives. The second site is only for disaster recovery use.

Features: (need or nice to have)
2-site setup - need to have
Thin provisioning - need to have
Deduplication - very nice to have (if it doesn't cause performance issues on the virtual servers)
Snapshots - nice to have
NFS/file-service access to data - nice to have

The price of the IBM solution is more than double that of the HP solution, but is it worth it?
From what I can see, I will have to pay a lot of money for the added "intelligence" in the IBM solution.
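For a back-of-the-envelope comparison of raw versus usable capacity in the two offers, here is a rough sketch. The overhead assumptions are mine, not vendor figures: Network RAID 10 keeps two copies of every block across the cluster, the N3400 shelves use RAID-DP (two parity drives per 12-drive shelf), and per-node hardware RAID overhead, spares, and filesystem reserves are all ignored.

```python
# Rough usable-capacity sketch for the two offers. Assumptions (mine, not
# the vendors'): Network RAID 10 = 2 copies of every block; NetApp shelves
# use RAID-DP (2 parity drives per 12-drive shelf); spares and per-node
# hardware RAID overhead are ignored.

def tb(drives, gb_each):
    return drives * gb_each / 1000.0

hp_raw = tb(48, 450)                  # 4 nodes x 12 x 450 GB
hp_usable = hp_raw / 2                # Network RAID 10 halves capacity

netapp_raw = tb(36, 300)              # controller + 2 shelves, 12 x 300 GB
netapp_usable = tb(36 - 3 * 2, 300)   # RAID-DP: 2 parity drives per shelf

print(f"HP LeftHand : {hp_raw:.1f} TB raw, ~{hp_usable:.1f} TB usable")
print(f"IBM N3400   : {netapp_raw:.1f} TB raw, ~{netapp_usable:.1f} TB usable")
```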

Does anyone have experience with either (or both) of the above solutions, and if so, what was your experience?

Does anyone have experience with deduplication of virtual servers on a shared storage backend? Does it work without performance drops?

Hints about things I need to be aware of when choosing my solution will be much appreciated.

I have of course asked the vendors the same questions, and each of them naturally says that their own solution is the best and that the other doesn't work at all...

Thanks in advance
Question by:comxit

Accepted Solution

paulsolov earned 1000 total points
ID: 33749806
The N series is basically a rebranded NetApp, so you may want to look at NetApp directly.  NetApp (or the IBM N series) offers CIFS, NFS, iSCSI, and FC all in the same device and can act as your file server as well, while LeftHand is strictly an iSCSI device.  Snapshots are point-in-time pointers and will not take significant space or time on the NetApp.  It also offers, as an option, SnapManager for Virtual Infrastructure to back up your VMs.
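To make the "snapshots are pointers" point concrete, here is a minimal sketch of the general redirect-on-write idea. This is an illustration of the concept only, not NetApp's actual implementation: a snapshot freezes the volume's pointer map, so it costs almost nothing until blocks are overwritten.

```python
# Minimal sketch (NOT NetApp's implementation) of pointer-based snapshots:
# a snapshot records which blocks the volume pointed at; no data is copied.

class Volume:
    def __init__(self):
        self.blocks = {}        # block number -> data
        self.snapshots = []     # each snapshot is a frozen pointer map

    def write(self, blockno, data):
        # New writes simply repoint the live volume at new data.
        self.blocks[blockno] = data

    def snapshot(self):
        # Copies only the pointer map, never the data blocks themselves,
        # which is why it is near-instant and initially consumes no space.
        self.snapshots.append(dict(self.blocks))

    def read_snapshot(self, idx, blockno):
        return self.snapshots[idx][blockno]

vol = Volume()
vol.write(0, "v1")
vol.snapshot()             # snapshot 0 pins the "v1" version of block 0
vol.write(0, "v2")         # live volume moves on; snapshot is unaffected
print(vol.read_snapshot(0, 0), vol.blocks[0])  # -> v1 v2
```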

From experience, most VM and CIFS data deduplicates by roughly 30-75%, so it's usually not about the amount of raw storage, it's how you use it.
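To put that 30-75% range in concrete terms, a tiny bit of arithmetic (the 5 TB logical-data figure here is made up for illustration):

```python
# Illustration of how a 30-75% dedupe savings rate translates into
# physical disk consumed. The 5 TB logical figure is a made-up example.

def stored_after_dedupe(logical_gb, savings_pct):
    """Physical GB consumed after deduplication at a given savings rate."""
    return logical_gb * (1 - savings_pct / 100.0)

logical = 5000  # 5 TB of VM images and CIFS data (hypothetical)

for savings in (30, 50, 75):
    physical = stored_after_dedupe(logical, savings)
    print(f"{savings}% dedupe: {physical:.0f} GB on disk")
```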

Working with several datacenters, the main issue I've seen with LeftHand is that it uses generic hardware (HP-based now, but I've seen it on Dell, etc.) and relies on the hardware's RAID technology.  On top of that it builds a virtual file system that also acts as a RAID set.  When one of the drives goes out, the rebuild has been painful for a few customers I've worked with.  This is why HP bought 3PAR, to better compete.

Dedupe on the NetApp runs as a scheduled job during off-peak hours and normally finishes within a minute or two, with no significant performance penalty.  If you're backing up Exchange, the N series has SnapManager for Exchange, which allows you to restore your Exchange DBs and do single-mailbox restores (some of these may be extra cost).  The provisioning is built into vCenter and also integrates well with Hyper-V.

My $.02


Author Comment

ID: 33753366
Thx for the reply paulsolov.

I can especially use the info about the way LeftHand handles RAID sets, and the potential problems with rebuilds after a drive failure.

I am aware of the reduction in needed space with dedupe, but I'm still concerned about a potential drop in performance. Not when running the dedupe job, but when, say, 40 servers try to access the same deduped data at the same time. If it's all in the cache then I'm not worried, but what if it isn't? And is this only a theoretical problem, or will it happen in everyday use?

Assisted Solution

teledata-consulting earned 1000 total points
ID: 33754132
I've been using the HP LeftHand product for years.  I had been looking for a good product to provide for my SMB and mid-sized customers, and got onboard before the HP acquisition.

It's been a fantastic product and delivered a great value (both IOPs and GB per $).  I encourage you to take a look at the HP Renew as a further way to get into the HP LeftHand product even more cost effectively.  (the 12x450 models have now been replaced with 12x600GB, so the former model will start appearing in HP Renew soon).

The majority of my customers are using LeftHand in VMware environments, and it has exceeded expectations.

The simplicity of the LeftHand design is the SAN/iQ software on standardized HP x86 hardware.   Additional features will continue to be added to the SAN/iQ software stack.  As De-dupe is fast becoming a feature in demand I would be very surprised if we didn't see it added to the SAN/iQ stack in the next year or two.

Expert Comment

ID: 33754546
The dedupe works well, as NetApp uses cache, and a larger cache can be added if needed.  The deduplicated data is static; any new data is simply written to the volume.  I have configured much larger environments and have not seen any significant performance penalty.  Your NetApp vendor should be able to calculate what you need.



Author Closing Comment

ID: 33768075
Thanks for the replies. I now believe I have the info I need to make a more informed choice between the two offered solutions.
