I have two servers and was wondering if someone could confirm my findings.
Server 1:
Windows 2008 R2 Enterprise
8 GB DDR3 RAM at 1333 MHz

Server 2:
Windows 2008 R2 Standard
8 GB DDR3 RAM at 533 MHz
1 TB Seagate ST31000524AS (7200 RPM)
We use a single-threaded application that does a lot of calculations and interpolation. The first server is considerably faster than the second: a set of calculations takes about 30 minutes on the first server versus 4 hours on the second.
Is it fair to say the difference comes down mainly to the processor, a $1,200 CPU versus a $150 CPU, and that the second server simply can't keep up? The slower RAM is another likely factor, and finally the hard drives (15K RPM vs 7.2K RPM).
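
To test this, I was thinking of running a minimal single-threaded benchmark like the sketch below on both servers (plain Python 3, assuming it's installed on both boxes; the function name is my own). It never touches the disk and has a tiny working set, so if it shows a similar gap between the two machines, that would point at the CPU rather than the RAM or the drives.

    import time

    def cpu_bench(n=20000000):
        # Pure floating-point loop on one core: no disk I/O and a tiny
        # working set, so the timing isolates raw per-core CPU speed.
        start = time.perf_counter()
        acc = 0.0
        for i in range(1, n):
            acc += (i * 0.5) / (i + 1.0)
        return acc, time.perf_counter() - start

    if __name__ == "__main__":
        checksum, seconds = cpu_bench()
        print("checksum=%.4f elapsed=%.2fs" % (checksum, seconds))

If the elapsed times on the two servers differ by roughly the same ratio as our application (30 minutes vs 4 hours), I'd take that as evidence the processor is the bottleneck.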