SAN - 450 TB of duplicated data. How do we reduce it to unique data?

We have an HP XP12000 SAN holding 450 TB of data across multiple RAID configurations. We want to reduce the size of the SAN, which we see happening in two ways: a. reduce RAID 5 to a lower but still resilient level, and b. trawl through the data and remove duplicates. Two questions: a. What is an optimal resilience level? b. What tools (software etc.) are available to manage and control the data? FYI - we use SAP R/3 and BW, with CRM coming soon. We also have a very active web site.
Asked by peter2407
Duncan Meyers commented:
> a. What is an optimal resilience level?
It depends on your business requirements. RAID 1/0 gives the best performance and rebuild behaviour but costs half your raw capacity; RAID 5 or RAID 6 trade some write performance and longer rebuilds for much better usable capacity, so they may be perfect for others.
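To put rough numbers on the capacity side of that trade-off, here's a back-of-the-envelope sketch (illustrative only: it assumes a single RAID group of equal-sized disks and ignores hot spares and array overhead):

```python
# Rough usable-capacity comparison for common RAID levels.
# Illustrative only: one RAID group of equal-sized disks, no hot spares,
# no vendor-specific layout overhead.

def usable_tb(raid_level: str, disks: int, disk_tb: float) -> float:
    raw = disks * disk_tb
    if raid_level == "RAID 1/0":
        return raw / 2                  # everything is mirrored
    if raid_level == "RAID 5":
        return (disks - 1) * disk_tb    # one disk's worth of parity
    if raid_level == "RAID 6":
        return (disks - 2) * disk_tb    # two disks' worth of parity
    raise ValueError(f"unknown RAID level: {raid_level}")

if __name__ == "__main__":
    for level in ("RAID 1/0", "RAID 5", "RAID 6"):
        tb = usable_tb(level, disks=8, disk_tb=1.0)
        print(f"{level}: {tb:.1f} TB usable from 8 x 1 TB disks")
```

So on an 8-disk group you'd see roughly 4 TB usable at RAID 1/0, 7 TB at RAID 5 and 6 TB at RAID 6 - which is why the right answer depends on how much resilience and rebuild headroom the business actually needs.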

>b. What tools (SW etc) are available to manage and control data?
DoubleKiller is a nifty piece of software that will find duplicate files: http://www.bigbangenterprises.de/en/doublekiller/
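If you just want a feel for how much file-level duplication you're carrying, the underlying idea is simply to hash every file and group files with identical digests. Here's a rough sketch of that approach (not DoubleKiller itself, and the path is just a placeholder - point it at a test tree or read-only mount, not production LUNs):

```python
# Sketch of file-level duplicate detection: group files by content hash.
# Only an illustration of the approach - try it on a test tree first.
import hashlib
import os
from collections import defaultdict

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash a file in chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(root: str) -> dict[str, list[str]]:
    by_hash: dict[str, list[str]] = defaultdict(list)
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                by_hash[sha256_of(path)].append(path)
            except OSError:
                pass  # skip unreadable files
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}

if __name__ == "__main__":
    for digest, paths in find_duplicates("/mnt/share").items():  # hypothetical mount point
        print(f"{digest[:12]}  x{len(paths)}  {paths}")
```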

But it sounds like Data De-duplication might be what you're after, given that you've got a mix of structured and unstructured data. StorNext from Quantum (http://www.quantum.com/Products/Software/StorNext/Index.aspx) might be just the ticket. It handles HSM tasks as well as de-duplication.
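The reason de-dup pays off on a mix like yours is that it works below the file level: data is split into chunks, each chunk is hashed, and each unique chunk is stored only once, with files kept as lists of chunk references. Here's a toy version of that principle (nothing like the real StorNext or XP implementation, which use variable-size chunking, compression and far smarter indexing):

```python
# Toy block-level de-duplication: keep each unique fixed-size chunk once
# and record a per-file "recipe" of chunk hashes. Principle only.
import hashlib

CHUNK = 64 * 1024  # 64 KiB fixed-size chunks (an assumption for this example)

def dedupe(files: dict[str, bytes]) -> tuple[dict[str, bytes], dict[str, list[str]]]:
    store: dict[str, bytes] = {}          # hash -> unique chunk data
    recipes: dict[str, list[str]] = {}    # file -> ordered list of chunk hashes
    for name, data in files.items():
        hashes = []
        for i in range(0, len(data), CHUNK):
            chunk = data[i:i + CHUNK]
            digest = hashlib.sha256(chunk).hexdigest()
            store.setdefault(digest, chunk)   # only the first copy is kept
            hashes.append(digest)
        recipes[name] = hashes
    return store, recipes

if __name__ == "__main__":
    files = {"a.dat": b"x" * 200_000, "b.dat": b"x" * 200_000}  # identical payloads
    store, recipes = dedupe(files)
    logical = sum(len(d) for d in files.values())
    physical = sum(len(c) for c in store.values())
    print(f"logical {logical} bytes -> physical {physical} bytes stored")
```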

Alternatively, you could wait until HP releases de-dupe for the XP series of arrays.
Duncan Meyers commented:
Thanks! Glad I could help.

Now - to your other question: http://www.experts-exchange.com/Storage/Storage_Technology/Q_23520840.html. I'm a bit concerned about what you've got in mind...