• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 634

Home Server Hardware Suggestion

Hello All -
I have a home "server" that is getting a bit old and needs to be replaced soon. One of its biggest drawbacks now is that it hosts a lot of media and can't transcode higher-definition files fast enough. I plan to build the new one myself - not buy prebuilt - as I already own Windows licenses and don't want to pay for a new one, plus I have a few specific specs I need.

What I need suggestions on are the best solutions and hardware for my requirements that won't be crazy expensive.

- Currently, I have 5 SATA drives, 1 SATA DVD drive, and 1 IDE drive. The hard drives are all 1 TB or 2 TB and are full of the rips I made of my movie disc collection (not downloaded!). Combined they are 90% full, so I need room to grow. Therefore I'd need a motherboard with at least 6 SATA ports, or PCIe slots for expansion cards. Any other storage solution ideas are welcome... but the media is accessed frequently.

- CPU capable of better transcoding (mencoder via PS3 Media Server usually has issues). Files are transcoded for playback on mobile devices or an Xbox. The bad part is I think mencoder can only use 1 core under Windows. I currently have a 2.7 GHz Intel dual-core. Would there be a big difference between an i3, i5, or i7 for this?

- OS - I've been using Win 7 x64 due to compatibility with some apps. Would a different OS improve performance?

Any suggestions for a reasonable motherboard or processor? Thanks!
Asked by: BzowK
1 Solution
 
DavidCommented:
Put together a system based on an open flavor of Solaris and ZFS. Among other things, the file system can automatically do deduplication to free up space and remove duplicate files. You can also have it automatically compress directories, so you can move stuff there that you want to keep but will rarely access.

Plus, it is all RAID-protected. If you want screaming speed you can also get a pair of cheap SSDs and use them in the same RAID-protected pool. No need for a hardware RAID controller either, and you can keep all the hardware you have. (Yes, performance will be much better.)
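The setup above boils down to a handful of ZFS commands. This is a sketch only - the pool name and device names below are placeholders, and you would substitute your actual disks:

```shell
# Hypothetical pool/device names -- substitute your real disks.
# A raidz pool gives single-disk redundancy across the member drives.
zpool create tank raidz c0t1d0 c0t2d0 c0t3d0 c0t4d0 c0t5d0

# A dataset for rarely-accessed archives, with dedup and compression on
zfs create tank/archive
zfs set dedup=on tank/archive
zfs set compression=on tank/archive

# Check how much space dedup and compression are actually saving
zfs get used,compressratio tank/archive
zpool list tank   # the DEDUP column shows the dedup ratio
```

Note that dedup's hash table lives in RAM, so budget memory generously if you enable it pool-wide.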
 
RGRodgersCommented:
The key is to attack the source of the bottleneck.

All of the storage suggestions above presume that the CPU bottleneck has been solved. Faster disks and access paths make no difference if the data is sitting there waiting on the CPU.

Issues like deduplication can also increase the CPU load, which may not be beneficial. In this case, that may not be an issue as long as other cores are idle anyway.

The important objective is either to pick a product that can span CPUs or, at minimum, to get bigger, faster CPUs. To that end, the faster the core, the better. Even better, though, would be to find alternative software that is not bound to one core.

...RG
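Even if each mencoder instance is stuck on one core, you can still saturate a multi-core CPU by running one encoder process per file in parallel. A minimal sketch, assuming a batch of (source, destination) jobs - the mencoder options shown are placeholders, not the exact profile PS3 Media Server uses:

```python
import concurrent.futures
import os
import subprocess

def mencoder_cmd(src, dst):
    # Placeholder mencoder invocation -- substitute the options your
    # PS3 Media Server transcode profile actually uses.
    return ["mencoder", src, "-ovc", "x264", "-oac", "mp3lame", "-o", dst]

def transcode_all(jobs, build_cmd=mencoder_cmd, workers=None):
    """Run one single-threaded encoder process per (src, dst) pair.

    Each encoder only uses one core, so launching cpu_count() of them
    at once keeps the whole CPU busy even though mencoder itself is
    not multi-threaded.
    """
    workers = workers or os.cpu_count() or 2
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(subprocess.run, build_cmd(s, d)) for s, d in jobs]
        return [f.result().returncode for f in futures]
```

The threads here only babysit external processes, so the GIL is not a concern; the real parallelism comes from the OS scheduling one mencoder per core.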
 
DavidCommented:
Deduplication actually lowers CPU overhead in this implementation (and many others). Here is an article by Symantec with benchmarks as an example. The overhead in dedup is additional RAM, but it can easily cut your I/O requirements in half or more.

http://www.symantec.com/business/support/index?page=content&id=HOWTO36374

An easy way to rationalize why: common files stay cached, and writes are eliminated because the data is already there. As far as the users are concerned, there are 25 copies of the same DLL, but in reality there is one DLL and 25 stub files that record hash totals, permissions, and access timestamps.

As long as you have a modern multi-core system, the deduplication is threaded code that uses freely available CPU resources.
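The "one copy plus stubs" idea can be illustrated in a few lines. This is a simplified file-level model - ZFS actually dedups at the block level inside the filesystem - but the accounting is the same: identical content is stored once, and every duplicate write costs nothing:

```python
import hashlib
import os

def dedup_directory(root, store):
    """File-level illustration of deduplication.

    Walks `root`, storing each unique file's content under its SHA-256
    hash in `store`.  Returns the number of bytes that duplicates would
    otherwise have consumed.
    """
    saved = 0
    os.makedirs(store, exist_ok=True)
    for dirpath, _, names in os.walk(root):
        for name in names:
            with open(os.path.join(dirpath, name), "rb") as f:
                data = f.read()
            blob = os.path.join(store, hashlib.sha256(data).hexdigest())
            if os.path.exists(blob):
                saved += len(data)       # duplicate: no new bytes written
            else:
                with open(blob, "wb") as f:
                    f.write(data)        # first copy: store it once
    return saved
```

In a real dedup filesystem the hash lookup happens on every block write, which is why the hash table must fit in RAM for it to stay cheap.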

 
RGRodgersCommented:
That article describes the substantial CPU overhead associated with deduplication. And I did say that the issue didn't really matter as long as you had idle cores.

Anyway, the point was that you have to attack the bottleneck, which in this case is either the speed of the core or the number of cores that can be used for the priority process. Adding other resources without relieving that bottleneck will not improve performance to any reasonable degree.

...RG
 
xmlmagicianCommented:
On the storage front I would suggest a NAS; depending on how your setup is cabled, you can get up to 1 Gb/s transfers. Earlier today I received an offer for a 3 TB drive for £102, which is nothing. Let me know if you want a suggestion on the NAS front.

When it comes to video editing and ripping, don't most people use Macs?
 
CallandorCommented:
I would drop the IDE drive, since that interface is disappearing.

There is a big difference between the i5 and i7 and the Core 2 Duo CPUs. For transcoding and any CPU-intensive task, you want a high-performance CPU so that the system doesn't bog down and is free to do other tasks. If your current CPU is the E8200, it scores 1932 on Passmark, while the i5-3570 scores 7618 and the i7-3770K scores 10384. A good motherboard for these CPUs is the

Asus P8Z77-V Pro
http://www.newegg.ca/Product/Product.aspx?Item=N82E16813131819&nm_mc=KNC-GoogleAdwordsCA&cm_mmc=KNC-GoogleAdwordsCA-_-pla-_-NA-_-NA

or the less expensive ASRock Z77 Extreme4
http://www.newegg.com/Product/Product.aspx?Item=N82E16813157294&nm_mc=OTC-pr1c3grabb3r&cm_mmc=OTC-pr1c3grabb3r-_-Motherboards+-+Intel-_-ASRock-_-13157294

Win7 64-bit is probably the best OS for home server use, offering a good balance between cost, compatibility, and memory access.
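To put those Passmark numbers in perspective: if transcode throughput scales roughly with the benchmark score (a crude but useful first approximation), the quoted figures imply about a 4-5x speedup over the E8200:

```python
# Passmark scores quoted above; throughput assumed roughly
# proportional to score -- an approximation, not a guarantee.
scores = {"E8200": 1932, "i5-3570": 7618, "i7-3770K": 10384}

baseline = scores["E8200"]
for cpu, score in scores.items():
    print(f"{cpu}: ~{score / baseline:.1f}x the E8200's throughput")
```

So a transcode that currently can't keep up in real time would have several times the headroom on either of the newer chips.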
 
RGRodgersCommented:
ASUS support sucks.  I know because I am using an ASUS laptop as we speak.  I won't buy another ASUS product.  YMMV.

Windows 7 will probably serve you as well as any server OS. For the most part, in configurations with few users, server software doesn't make that much difference. The difference at that level tends to be more about functionality than performance and, with fewer than a handful of users, that means little to nothing.

I'd stay pretty close to home, or to what you are doing. Focus on what will make a big difference, like bigger CPUs or software that will use more of them. Change other factors only when they will make a big difference. Otherwise, you are chasing the brass ring with no prize in play.

...RG
