Determining Server Configuration based on specs

Our healthcare application is to be deployed in a hospital, and it is critical in nature. The application essentially writes images and reads them back repeatedly for editing. At least 1.5 TB of images will be generated per year, each image roughly 250 KB in size, and at any point in time there will be about 300 concurrent users manipulating images.

So my questions are:
1. How do we determine the best server configuration for the above specs?
2. Should we go for an on-premise server or the cloud?
3. We have looked at Amazon S3 and Digital Ocean Spaces; which is the better option in terms of cost and performance?
4. We need to decide keeping in mind that the solution should work for at least 6 years.

Bharat Narahari asked:

Mal Osborne (Alpha Geek) commented:
Since you will be moving large numbers of files around, you will probably need an on-site server. Given "at least 1.5 TB per year" and "at least 6 years", that works out to at least 9 TB. Online storage of that much data can work out expensive, and it will be far faster to retrieve data over the local LAN than via the Internet.
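The capacity figure above is simple arithmetic; here is a minimal sketch of the sizing, using the figures from the question plus an assumed 50% headroom factor (the headroom is my planning assumption, not something the question specifies):

```python
# Back-of-envelope storage sizing. Growth rate and horizon come from the
# question; the 50% headroom multiplier is an assumed planning factor.
yearly_growth_tb = 1.5   # "at least 1.5 TB of images per year"
horizon_years = 6        # "should work for at least 6 years"
headroom = 1.5           # assumed: 50% safety margin for overruns

raw_need_tb = yearly_growth_tb * horizon_years
planned_tb = raw_need_tb * headroom

print(f"Minimum over {horizon_years} years: {raw_need_tb:.1f} TB")
print(f"With 50% headroom: {planned_tb:.1f} TB")
```

At the stated minimum this prints 9.0 TB, and with headroom 13.5 TB, which is why the array suggested below is sized well above the bare 9 TB figure.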

I would probably be looking at a server that boots from its own mirrored array, possibly SSDs, plus a 15+ TB RAID10 array. Using sixteen 2.4 TB 10K 2.5" SATA drives would give you around 19 TB of usable space and be very fast.
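The usable-space figure follows from how RAID10 works: every drive is mirrored, so usable capacity is half the raw total. A quick check of the numbers quoted above:

```python
# RAID10 usable capacity: striped mirrors, so usable = raw / 2.
drives = 16
drive_tb = 2.4

raw_tb = drives * drive_tb   # total raw capacity across all drives
usable_tb = raw_tb / 2       # half is consumed by mirroring

print(f"Raw: {raw_tb:.1f} TB, usable in RAID10: {usable_tb:.1f} TB")
```

That gives 38.4 TB raw and 19.2 TB usable, matching the "around 19 TB" estimate, and comfortably covering the 9 TB minimum over six years.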

The next bottleneck you may see is the NIC; a server like that would be slowed down considerably by anything slower than 10 Gb. Common practice is to use 10 Gb for "core" or "distribution" switches and 1 Gb for "access" switches.
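To see why 1 Gb at the server is marginal, here is a rough throughput estimate from the question's figures. The per-user request rate is my assumption; the question only gives 300 concurrent users and roughly 250 KB per image:

```python
# Rough aggregate throughput at the server NIC. Users and image size come
# from the question; the per-user fetch rate is an assumed peak workload.
users = 300
image_kb = 250
images_per_user_per_sec = 2   # assumed: busy editing session

mb_per_sec = users * image_kb * images_per_user_per_sec / 1024
gbit_per_sec = mb_per_sec * 8 / 1024

print(f"Aggregate: {mb_per_sec:.0f} MB/s, about {gbit_per_sec:.2f} Gbit/s")
```

Even at a modest two images per user per second, the aggregate is already above 1 Gbit/s, so a single 1 Gb NIC on the server would saturate at peak.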
I doubt there will be 300 truly concurrent users. A large hospital may have 300 surgeons and radiographers, but they do not spend much of their time working with images; they are too busy cutting people up. I spent some time maintaining imaging servers at several hospital cath* labs, and although the archive was huge, the number of users inconvenienced by downtime was relatively small.

If the application vendor has any experience at all, they should be able to size it for you. You certainly need tiered storage; there is no sense storing 5-year-old images on fast disks. You need a fast database server to find a patient's images quickly, and then another one or two servers to act as file servers. Bear in mind that storage is easily expandable: with a good disk controller, another shelf of disks can be added on the fly.
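The tiering advice above boils down to an age-based rule. A minimal sketch, assuming a 5-year cutoff as mentioned (the function and field names are illustrative, not part of any real product):

```python
from datetime import date, timedelta

# Sketch of an age-based tiering rule: images older than the cutoff belong
# on slow, cheap storage. The 5-year threshold echoes the comment above.
CUTOFF = timedelta(days=5 * 365)

def tier_for(image_date: date, today: date) -> str:
    """Return which storage tier an image belongs on: 'fast' or 'archive'."""
    return "archive" if today - image_date > CUTOFF else "fast"

print(tier_for(date(2013, 1, 1), date(2019, 1, 1)))  # 6 years old
print(tier_for(date(2018, 6, 1), date(2019, 1, 1)))  # a few months old
```

In practice a nightly job would apply a rule like this to move old images to the slow tier, leaving only recent studies on the fast RAID10 array.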

*Catheterization must be the most boring operation to watch: just a patient lying down and a surgeon sitting next to them, apparently twiddling their thumbs.