Bharat Narahari

asked on

Determining Server Configuration based on specs

Hi,
Our healthcare application needs to be implemented in a hospital, and the application is very critical in nature. The entire application essentially writes images and reads them back for repeated editing. At least 1.5 TB of images will be generated per year. At any point in time there will be 300 concurrent users manipulating the images. Each image will be approximately 250 KB.

So my questions are:
1. How do we determine the best server configuration for the above specs?
2. Should we go for an on-premises server or the cloud?
3. We have looked at the Amazon S3 and Digital Ocean Spaces options; which is better based on cost and performance?
4. We need to decide keeping in mind that the solution should work for at least 6 years.

Regards
Bharat
Mal Osborne

Since you will be moving a large volume of image files around, you will probably need an onsite server. Given "at least 1.5 TB per year" and "at least 6 years", that works out to at least 9 TB. Online storage of huge amounts of data can work out expensive, and it will be far faster to retrieve data from the local LAN than over the Internet.
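As a quick back-of-the-envelope check (a sketch only, assuming decimal units and the per-image size quoted in the question):

```python
# Rough capacity estimate from the figures in the question:
# 1.5 TB of new images per year, a 6 year lifetime, ~250 KB per image.
TB = 1000 ** 4   # decimal terabyte in bytes
KB = 1000        # decimal kilobyte in bytes

growth_per_year_tb = 1.5
years = 6
image_size_kb = 250

total_tb = growth_per_year_tb * years
images_per_year = (growth_per_year_tb * TB) / (image_size_kb * KB)

print(f"Total storage after {years} years: {total_tb:.1f} TB")
print(f"Approx. new images per year: {images_per_year:,.0f}")
# -> Total storage after 6 years: 9.0 TB
# -> Approx. new images per year: 6,000,000
```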

I would probably be looking at a server with its own mirror array to boot from, possibly using SSDs, plus a 15+ TB RAID 10 array for the data. Using 16 x 2.4 TB 10K 2.5" SATA drives would give you around 19 TB of usable space and would be very fast.
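A quick sanity check on that RAID 10 figure (sketch only; real usable space will come out a little lower after formatting, and the drive count is just the example above):

```python
# RAID 10 mirrors every drive, so usable capacity is roughly half the raw
# capacity (assuming the 16 x 2.4 TB drives suggested above).
drives = 16
drive_tb = 2.4

raw_tb = drives * drive_tb
usable_tb = raw_tb / 2   # mirrored pairs, striped together

print(f"Raw: {raw_tb:.1f} TB, usable in RAID 10: {usable_tb:.1f} TB")
# -> Raw: 38.4 TB, usable in RAID 10: 19.2 TB
```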

The next bottleneck you may see is the NIC; a server like that would be slowed down considerably by anything slower than 10 Gb. Common practice is to use 10 Gb for "core" or "distribution" switches and 1 Gb for "access" switches.
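To put a rough number on that, assume (purely as an illustration) that each of the 300 concurrent users fetches one 250 KB image per second at peak:

```python
# Rough peak-load estimate for the server NIC.
# Assumption: 300 concurrent users, one 250 KB image fetch per user per second.
users = 300
image_kb = 250
fetches_per_second = 1

throughput_mbit = users * image_kb * fetches_per_second * 8 / 1000
print(f"Peak server throughput: ~{throughput_mbit:.0f} Mbit/s")
# -> ~600 Mbit/s before protocol overhead, i.e. most of a 1 Gb link,
#    which is why a 10 Gb NIC on the server is worth considering.
```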
Member_2_231077

I doubt there will be 300 concurrent users; a large hospital may have 300 surgeons and radiographers, but they don't spend much of their time playing with images, as they are too busy cutting people up. I spent some time at several hospital cath* labs maintaining their imaging servers, and although the archive was huge, the number of users inconvenienced by downtime was relatively small.

If the application vendor has any experience at all, they should be able to size it for you. You certainly need tiered storage; there is no sense storing five-year-old images on fast disks. You need a fast database server to find a patient's images quickly, and then another one or two servers to act as file servers. Bear in mind that storage is easily expandable; adding another shelf of disks can be done on the fly with a good disk controller.
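To make the tiering point concrete, here is a minimal sketch of an age-based migration rule; the mount points, file extension, and five-year cut-off are all assumptions, and in practice the application vendor's own archiving tools would handle this:

```python
# Minimal sketch: move images untouched for ~5 years from fast to bulk storage.
# /mnt/fast, /mnt/archive and the .dcm extension are hypothetical.
from datetime import datetime, timedelta
from pathlib import Path
import shutil

FAST_TIER = Path("/mnt/fast")      # assumed mount point for the fast disks
SLOW_TIER = Path("/mnt/archive")   # assumed mount point for bulk storage
CUTOFF = timedelta(days=5 * 365)

def migrate_old_images():
    """Move images not modified for ~5 years onto the slow tier."""
    now = datetime.now()
    for image in FAST_TIER.rglob("*.dcm"):
        modified = datetime.fromtimestamp(image.stat().st_mtime)
        if now - modified > CUTOFF:
            target = SLOW_TIER / image.relative_to(FAST_TIER)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(image), str(target))

if __name__ == "__main__":
    migrate_old_images()
```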

*Catheterization must be the most boring operation to watch: just a patient lying down and a surgeon sitting next to them, apparently twiddling their thumbs.