RAID card differences - LSI 9260-16i or LSI 9280-16i

I have a Q30 pod from www.45drives.com.
It holds 30 SATA drives; I'll be buying 6TB NAS-grade drives for it, and I'm debating which RAID card to use.

I'm looking at the LSI 9260-16i or the LSI 9280-16i.  I have just confirmed with the manufacturer that both are compatible.
I know the 9260 is the older card, but it's almost half the price.

I'm planning to do hardware RAID 6 using 16 drives. Will I see a big performance boost with the newer card, or, to put it the other way, will the older card be much slower?

The other question: I will have 30 drives total, so I was planning on buying two of the 9260-16i cards, creating one volume with 16 drives on the first card and a second volume with 12 drives on the second card. The other option I was looking at is the LSI 9280-24i card with 24 drives in one large RAID 6 volume instead. Would I get better performance that way? Someone mentioned I would get better performance from one larger volume with more disks than from two cards with two volumes.

I will be running Windows Server 2016. I'm currently using Windows Storage Spaces for my RAID, am not pleased with it at all, and want to move to hardware RAID.
Dan (Network Engineer) asked:
Casey Weaver (Managed Services Windows Engineer III) commented:
It's not the age difference, it's the design. The 9260 cards are all internal ports; the 9280 cards are all external, or external plus internal. There is no 9280-16i, only a 9280-16i4e, which has 16 internal plus 4 external ports. For your use case it seems you would be better served by 9260 cards. As for two cards vs. one, it depends on whether split volumes work for you. You can't make one super-volume across two of those cards (except software-based, which you're already used to). There is no 9260-24i either, so you'd be looking at a 9280-24i4e. You could also use SAS expanders like the LSISASx12 with a cheaper 9260, such as a 9260-4i. But remember that when you expand a port, your bottleneck is the host port the expander connects to, so all the drives on the expander would be sharing the 6Gb port it's connected to. This is more useful for large cold-storage arrays than for active data.
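To put rough numbers on that expander bottleneck, here is a minimal sketch, assuming a single SFF-8087 uplink (4 lanes at 6Gb/s) between the expander and a 9260-4i, and roughly 1.5Gb/s (~180MB/s) of sequential throughput per 7.2K SATA drive; these figures are assumptions, not measurements.

# Illustration of the shared-uplink bottleneck behind a SAS expander.
# All figures are assumptions for the sketch, not measured numbers.
LANE_GBPS = 6        # SAS2 lane speed, Gb/s
UPLINK_LANES = 4     # one SFF-8087 wide port back to the RAID card
DRIVE_GBPS = 1.5     # assumed sequential throughput of a 7.2K SATA drive

uplink_gbps = LANE_GBPS * UPLINK_LANES             # 24 Gb/s shared by every drive behind the expander
drives_behind_expander = 28
demand_gbps = drives_behind_expander * DRIVE_GBPS  # 42 Gb/s if all drives stream at once

print(f"Shared uplink: {uplink_gbps} Gb/s")
print(f"Aggregate drive demand: {demand_gbps} Gb/s")
print("Uplink is the bottleneck" if demand_gbps > uplink_gbps else "Uplink has headroom")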

Dan (Network Engineer, Author) commented:
So I understand I can't make one big volume across two cards. Will the performance be about the same if I just use the 9280-24i4e card with 24 drives, instead of two 9260-16i cards, each with its own volume?

I don't need any external ports.
I plan to use RAID 6 so I can have some redundancy while getting as much storage as possible.
andyalder (Saggar maker's framemaker) commented:
The older card is only PCIe 2, but then again I think you're using 7.2K drives, so it should have plenty of performance.

Assuming that storage box/server has disk backplanes rather than loose cables, you can't have 16 disks on one card, only 15, as each backplane holds 15 disks (4 connectors).

You do get better performance with more disks, but if you can spread the load fairly evenly across the two controllers it will be even better than having everything in one big array (except that you lose 2 extra disks to parity).
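To put rough numbers on that capacity trade-off, here is a quick sketch assuming nominal 6TB drives and RAID 6's two parity drives per array; real formatted capacity will be somewhat lower.

# Back-of-the-envelope usable capacity for the layouts being discussed,
# assuming nominal 6 TB drives and two parity drives per RAID 6 array.
DRIVE_TB = 6

def raid6_usable_tb(n_drives, drive_tb=DRIVE_TB):
    """Usable capacity of one RAID 6 array: (N - 2) data drives."""
    return (n_drives - 2) * drive_tb

two_arrays = raid6_usable_tb(16) + raid6_usable_tb(12)  # 84 + 60 = 144 TB across two cards
one_array = raid6_usable_tb(24)                         # 132 TB on a single 24-drive card

print(f"Two arrays (16 + 12 drives): {two_arrays} TB usable")
print(f"One array (24 drives):       {one_array} TB usable")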

andyalder (Saggar maker's framemaker) commented:
Didn't mean to repeat what you already said Casey, I just type slow ;)
Casey Weaver (Managed Services Windows Engineer III) commented:
It depends on your use case. Twenty-four 6Gb/s drives are technically capable of 144Gb/s of traffic, while a single 9260/9280 can handle about 80Gb/s of traffic through its PCI-E interface. If you think it's possible you'll need every drive at full speed, you'll want two cards and two volumes. Also keep in mind that with large drives, 4TB and up, RAID 6 loses its safety factor: the larger the array, the longer the rebuild, and the greater the chance of secondary and critical failures of the array during that window. Splitting the array decreases rebuild time and increases volume reliability. Of course, you can achieve the same thing with one card by creating two RAID 6 virtual drives.
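As a rough illustration of that rebuild-window concern, here is a quick estimate; the rebuild rate is purely an assumption, and real rebuilds under production load are commonly far slower.

# Rough rebuild-time estimate to show why rebuild windows grow with drive size.
DRIVE_TB = 6
REBUILD_MB_PER_S = 100   # assumed effective rebuild rate on a lightly loaded array

seconds = (DRIVE_TB * 1e12) / (REBUILD_MB_PER_S * 1e6)
print(f"~{seconds / 3600:.0f} hours to rebuild one {DRIVE_TB} TB drive at {REBUILD_MB_PER_S} MB/s")
# ~17 hours of degraded operation per failed drive; halve the rebuild rate and
# the window roughly doubles.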
Dan (Network Engineer, Author) commented:
Got it. So it's better to go with two cards, if I'm understanding correctly?

I was planning to get these drives:
https://www.amazon.com/HGST-Deskstar-HDN726060ALE610-Certified-Refurbished/dp/B07B6JFSBV/ref=sr_1_1?s=electronics&ie=UTF8&qid=1522177113&sr=1-1&keywords=6TB+sata+nas+drives+refurbished

or

https://www.amazon.com/HGST-Ultrastar-HUH728080ALE604-Enterprise-Refurbished/dp/B079TL4TDJ/ref=sr_1_2?s=electronics&ie=UTF8&qid=1522177187&sr=1-2&keywords=8TB+sata+nas+drives+refurbished

I don't know exactly how the cables are routed, but here are the specs. I think the fan-out cables are 1-to-4, so one connector goes to 4 drives.
So on one card I was going to install 16 drives, and on my second card, 12 drives.
https://www.45drives.com/pdf/StorinatorQ30_Enhanced_TechSpecs.pdf
Casey Weaver (Managed Services Windows Engineer III) commented:
The drives will be fine. I see the specs show the cables as 8087-to-8482 cables; those are the 4-connector breakout cables, so you'll wire it up like you're thinking: 4 sets of cables to one card, 3 sets of cables to the other. You'll want two 9260-16i cards for that. Rebuilds will still be long if each card holds a single RAID 6 volume. You can use http://wintelguy.com/raidmttdl.pl to calculate the array's reliability.
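For anyone curious what a calculator like that is doing under the hood, a minimal sketch of the classic RAID 6 MTTDL approximation looks like the following; it ignores unrecoverable read errors, and the MTTF/MTTR figures are assumptions, so treat the outputs as relative comparisons only.

# Simplified MTTDL (mean time to data loss) estimate for a RAID 6 array.
def raid6_mttdl_hours(n, mttf_h, mttr_h):
    # MTTDL ~ MTTF^3 / (N * (N-1) * (N-2) * MTTR^2) for a double-parity array
    return mttf_h ** 3 / (n * (n - 1) * (n - 2) * mttr_h ** 2)

MTTF_H = 1_000_000   # vendor-quoted per-drive MTTF in hours (assumption)
MTTR_H = 24          # assumed time to replace and rebuild a failed drive, hours

for n in (12, 16, 24):
    years = raid6_mttdl_hours(n, MTTF_H, MTTR_H) / (24 * 365)
    print(f"{n:2d}-drive RAID 6: MTTDL ~ {years:.2e} years (relative comparison only)")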
andyalder (Saggar maker's framemaker) commented:
If it's cabled rather than backplanes, then you can split the drives as you like (as long as the cables are long enough). Have you bought it yet? If not, the 12Gb version would let you use the card I previously suggested for 28 disks without buying new cables.
Dan (Network Engineer, Author) commented:
Great, thanks. How about my boot drives? I currently have 2 SSDs in there using software RAID, but I have no idea if or when a drive fails, as the Intel management tool I installed doesn't recognize the drives.

Should I use the last port on the card that will be running the 12 drives for my boot drives, or just buy a separate card, maybe this one?
https://www.newegg.com/Product/Product.aspx?item=9SIAAEE6R79519
I'm thinking I should just go with this card for my SSDs.
Regarding the cables, I think this will work, but I just wanted to double-check:
https://www.amazon.com/dp/B013G4EOEY/ref=sspa_dk_detail_2?psc=1&pd_rd_i=B013G4EOEY&pd_rd_wg=5rWr6&pd_rd_r=7AWGQWWV49KH5ZNE36S0&pd_rd_w=qY9Vv
Dan (Network Engineer, Author) commented:
My system will not support the 12Gb version; I have asked the manufacturer.
It does have a backplane. Only my 2 SSDs for the OS are not on a backplane, so I need to get cables for those. I'm not sure, but I'm thinking these will work?
https://www.amazon.com/dp/B013G4EOEY/ref=sspa_dk_detail_2?psc=1&pd_rd_i=B013G4EOEY&pd_rd_wg=5rWr6&pd_rd_r=7AWGQWWV49KH5ZNE36S0&pd_rd_w=qY9Vv

Unless someone wants to recommend something better?
Casey Weaver (Managed Services Windows Engineer III) commented:
Depends on how much speed you want from the SSDs. No 9260 is built to keep up with SSDs; a 9271 is, but really just for caching. Most SSDs are built for 12Gb/s ports. That being said, since it's just for boot/OS, I would use the remaining ports on the two cards rather than adding a third. Any time you gain by adding a third card with 12Gb support would be lost in the BIOS boot sequence, which would now have to run through the third card's BIOS as well. Expect each RAID card with drives attached to add 10-30 seconds of boot time.
Casey Weaver (Managed Services Windows Engineer III) commented:
That cable will work fine and will run both of your SSDs. Just remember they'll still need power as well.

I'm guessing your backplane has the 8482 SATA-looking ports rather than the 8087 SAS ports like on the RAID cards. If your backplane is 8087 as well, just get 8087-to-8087 cables; they are cheaper and much tidier.
andyalder (Saggar maker's framemaker) commented:
I think you'll find the SSDs for the OS are connected directly to the motherboard, I may be wrong though as I don't have one of those boxes to pull apart.
Casey Weaver (Managed Services Windows Engineer III) commented:
They should be, but the original poster can't get away from software RAID without moving to a card, as that Supermicro board uses a hybrid RAID setup when using its onboard ports.
andyalder (Saggar maker's framemaker) commented:
The spec sheet says it has 9 SFF-8087 to 4x 8482 cables (6Gb), so they do not need to buy another cable. The non-RAID version of the box (which they already have) has a SATA HBA with 8 connectors, so the ninth cable is connected directly to the motherboard, presumably using software or RST mirroring.

I don't think it has disk backplanes (physically the drives are in two rows of 15); it would be odd to have a backplane with fanout cables, and in the previous question the specs listed the mobo as the backplane.
Dan (Network Engineer, Author) commented:
Yes, my OS SSDs are currently connected to the motherboard.

So I'll get two of the 9260-16i cards and put all the data drives on them. I'll grab one of those cables for my OS drives, as my data drives already have cables.

By the way, do these cards have the ability to email me when a drive fails? Currently I have no idea when a drive fails, so how can I configure the cards to notify me if there is a failure?
Casey Weaver (Managed Services Windows Engineer III) commented:
The cards themselves don't have an out-of-band management port to do that, but you can install the LSI MegaRAID Storage Manager software, which you'll want to do anyway. Install it, it will pick up the cards, and then you can set your alerts, including pop-up windows, sounds, and emails. Broadcom now owns LSI, and getting the software from their dumpster fire of a support/downloads page is like pulling teeth. The software is generic, however, and you can download it from most other OEM websites carrying the MegaRAID software, such as Lenovo, IBM, and Dell. This build is from April 2017, so there might be something newer, but it's the last good link I could get for the software. https://docs.broadcom.com/docs-and-downloads/raid-controllers/raid-controllers-common-files/16.11.00.03_MSM_Windows.zip
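One small suggestion on the email alerts: before relying on MSM's notifications, it can be worth confirming the server can actually hand mail to your relay. A minimal sketch follows; the relay host and addresses are placeholders, not anything MSM-specific.

# Quick check that this box can send mail through your relay before pointing
# MegaRAID Storage Manager's email alert settings at it. The relay host and
# addresses below are placeholders -- substitute your own.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["Subject"] = "Test: RAID alert mail path"
msg["From"] = "storinator@example.com"
msg["To"] = "admin@example.com"
msg.set_content("If this arrives, the server can reach the mail relay.")

with smtplib.SMTP("mailrelay.example.com", 25, timeout=10) as smtp:
    smtp.send_message(msg)

print("Test message handed off to the relay.")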
Casey Weaver (Managed Services Windows Engineer III) commented:
Actually, I just managed to get their download search to work, so I guess I learned something new. I used MSM as the search query and didn't select a product family or anything else; that seemed not to break the search page. This link uses that query: https://www.broadcom.com/support/download-search/?pg=&pf=&pn=&po=&pa=&dk=msm

It's under the Management Software and Tools category. Newest will be at the top.
Dan (Network Engineer, Author) commented:
Great, thank you, I'll do that.