The company I work for is rather small: we have about 60 desktops and run 4 servers.
The desktops range from Windows XP on Core 2 Duos to Windows 10 on Core i7s. The newer desktops are used by our engineering staff (architects and civil/structural engineers), the older ones by the admin staff.
Our servers: 1x DC, 1x Exchange 2010, 1x file/SQL server, and 1x backup server. They have been in service since late 2010/early 2011 and are still running fine, barring a few RAID failures in the past.
I am now required to set up a refresh policy for all hardware.
To date the company has worked on a 5-year refresh period for all workstations and a 10-year period for servers; I don't feel that is ideal.
I thought I would start with the hardware that is mission critical.
Our engineering staff carry most of the workload in the company, so I thought it would be a good idea to put those desktops on a 3-year refresh; that way they can stay up to date with software changes.
The admin staff mainly use Office packages; ironically, most of these desktops still run just fine on Windows XP and Core 2 Duos, so sticking to a 5-year refresh should be fine.
Directors and top management would also be on a 3-year refresh; they are basically the heart and brains of the company, and I don't want them breathing down my neck about slow hardware.
Now, the servers are running fine: no problems, performance is good, all is peachy. But at their age they are long past their warranty period, and if a server suffered a main-board failure now it would mean weeks of downtime trying to source a replacement, or having to replace the server entirely.
My gut tells me to work on a 5-year refresh period for servers and also to consider virtualization: one powerful server running 4 VMs, with the old hardware repurposed as failover servers.
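For planning purposes, the tiered policy above could be tracked with a small script that flags when each machine comes due. This is just a sketch; the hostnames, tier names, and purchase dates are hypothetical example data, not our real inventory:

```python
from datetime import date

# Refresh period (years) per tier -- mirrors the policy described above.
REFRESH_YEARS = {"engineering": 3, "management": 3, "admin": 5, "server": 5}

def refresh_due(tier, purchased):
    """Return the date a machine becomes due for replacement under the tiered policy."""
    return purchased.replace(year=purchased.year + REFRESH_YEARS[tier])

# Hypothetical inventory entries: (hostname, tier, purchase date).
inventory = [
    ("ENG-01", "engineering", date(2014, 6, 1)),
    ("ADM-07", "admin", date(2012, 3, 15)),
    ("SRV-DC", "server", date(2011, 1, 10)),
]

for host, tier, bought in inventory:
    print(f"{host}: due for refresh on {refresh_due(tier, bought)}")
```

Feeding it real purchase dates (e.g. from an asset register) would give a rolling replacement schedule to budget against.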
Do you guys have any suggestions for me? What would you do?