midfde asked:

What is the reason for 4 times elapsed time difference?

Please see my image and answer the question that is at its bottom.

In short, an application is running on two Wintel computers with very similar properties, and it shows a big difference in performance. Each instance essentially builds an intricate sequence of queries dynamically and eventually INSERTs something into a linked table. All linked tables (a few dozen) are identical instances (created with a copy command) and are located on the same hard drive as the application itself.
I'd appreciate any ideas about why this might be the case.
The faster one is a server at our datacenter service provider; the slower one is my workstation.
Thanks.
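
For what it's worth, the comparison could be reduced to something like the following minimal VBA sketch (qryAnomalyStep and the repetition count are placeholders, not the actual PACRAT objects):

' Minimal timing harness around an action query (placeholder name).
' Timer returns seconds since midnight, so don't run it across midnight.
Public Sub TimeQuerySequence()
    Dim db As DAO.Database
    Dim t0 As Single
    Dim i As Long

    Set db = CurrentDb
    t0 = Timer
    For i = 1 To 100                                  ' arbitrary repetition count
        db.Execute "qryAnomalyStep", dbFailOnError    ' placeholder action query
    Next i
    Debug.Print "Elapsed: " & Format(Timer - t0, "0.00") & " s"
    Set db = Nothing
End Sub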
SOLUTION by jerryb30 (available to Experts Exchange members only)
SOLUTION by peter57r (available to Experts Exchange members only)
midfde (ASKER):

Correction!
I'm sorry, I've fixed a terrible mistake in the picture. I apologize :-(
The cells in the Output row should have been swapped, and that is what I've just done.
So the computer with the SSD (my workstation) shows lousy performance compared to the HD-equipped one.
midfde (ASKER):

Inspiron 9400 with SSD is my workstation. Sorry for the confusion. Shame on me...
The Xeon is a quad core, rated much higher than the Core Duo T2300. That might easily account for the 4x processing time, aside from the inherent speed of the Xeon (at the cost of power usage).

Any reason you expected things to be different?
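
If you want to double-check what each box actually reports, here is a rough WMI sketch from VBA (NumberOfCores is an assumption; very old XP builds may not expose it):

' Rough sketch: print what Windows reports for the CPU on each machine.
Public Sub ShowCpuInfo()
    Dim wmi As Object, cpu As Object
    Set wmi = GetObject("winmgmts:\\.\root\cimv2")
    ' NumberOfCores may be missing on pre-SP3 Windows XP.
    For Each cpu In wmi.ExecQuery("SELECT Name, NumberOfCores, MaxClockSpeed FROM Win32_Processor")
        Debug.Print cpu.Name & " | cores: " & cpu.NumberOfCores & " | MHz: " & cpu.MaxClockSpeed
    Next cpu
End Sub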
midfde (ASKER):

In my opinion, CPU clock, RAM size, and HD speed are the crucial performance factors on different computers with a nearly identical instruction set and a functionally near-identical operating system.
Although 2.7 GHz is better than 1.7 GHz, it is hard for me to imagine that it could cause a fourfold reduction in elapsed time, particularly for a database-heavy application.

Well, I have a lot to say about all this, but... I'd like to hear something from the Experts of this site that might dispel my astonishment.
C: is local and D: is network mapped drive.
Different versions of Access
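One quick way to rule out the drive question is to list where the linked tables actually point on each machine; a minimal DAO sketch (nothing application-specific assumed):

' List the connect string of every linked table so that local paths and
' mapped-drive paths are visible at a glance.
Public Sub ListLinkedTables()
    Dim tdf As DAO.TableDef
    For Each tdf In CurrentDb.TableDefs
        If Len(tdf.Connect) > 0 Then          ' only linked tables have a connect string
            Debug.Print tdf.Name & " -> " & tdf.Connect
        End If
    Next tdf
End Sub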
What disk system is used on the server? Maybe there is some RAID? RAID 10 with SAS disks can be comparable to an SSD. Another possibility: Windows XP has no native support for SSDs, so your drive may not be properly aligned. Can you run CrystalDiskMark tests on both computers?
http://crystalmark.info/software/CrystalDiskMark/index-e.html
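
Alignment can also be checked without extra tools, e.g. via WMI from VBA; a rough sketch (StartingOffset comes back as a string, hence the CDbl):

' Sketch: flag partitions whose starting offset is not a multiple of 4096 bytes;
' a misaligned partition on an SSD can hurt small random writes.
Public Sub CheckPartitionAlignment()
    Dim wmi As Object, part As Object, offs As Double
    Set wmi = GetObject("winmgmts:\\.\root\cimv2")
    For Each part In wmi.ExecQuery("SELECT Name, StartingOffset FROM Win32_DiskPartition")
        offs = CDbl(part.StartingOffset)      ' uint64 is returned as a string
        Debug.Print part.Name & " offset: " & part.StartingOffset & _
            IIf(offs / 4096 = Int(offs / 4096), "  (4K-aligned)", "  (NOT 4K-aligned)")
    Next part
End Sub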
Other processes running on the system during your test.
Device/channel contention
midfde (ASKER):

>>C: is local and D: is network mapped drive
Both are two partitions on the same SSD.
>>Other processes running on the system during your test.
The server runs a few user processes; my workstation runs the usual stuff, with MS Access taking almost all of its resources.
>>Device/channel contention
Sorry, I have no idea about this one.

In fact, my workstation is a regular developer-oriented Dell laptop that has always been much slower than the server. I had hoped, though, that the recent replacement of the HD with an SSD might fix this, because an SSD is apparently faster than whatever rotating RAID might be.
BTW, the SSD did improve performance, and I can feel it in better responsiveness. On the other hand, the application is more DB data mining than pure computing, so disk speed was expected to be crucial.
<Each instance essentially builds an intricate sequence of queries dynamically and eventually INSERTs something into a linked table. All linked tables (a few dozen) are identical instances (created with a copy command) and are located on the same hard drive as the application itself.>
...Can you also take a moment to explain what this all means and why it is needed?
Perhaps there is a simpler approach...
midfde (ASKER):

>>...why it is needed
PACRAT software holds models of complex HVAC systems in its database. The PACRAT EXPERT application periodically analyses temporal data in order to detect 18 (see above) kinds of anomalies and puts them into a linked table.

>>...what this all means...
The model includes at least hundreds of SQL statements, and Expert applies them to newly available time series.
An example is here

>>Perhaps there is a simpler approach
No doubt about that, but... my question was about the surprising difference in performance, however [in]efficient our approach may be. Remember?
When you observe the operation, can you tell if all CPUs are firing? Are they all enabled?
Is memory/CPU usage for the processes similar?
Are the processes disk-intensive (lots of small I/O ops)?
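
A crude way to compare both machines on exactly that kind of load is to time a burst of small writes; the file count and size below are arbitrary and purely illustrative:

' Crude small-I/O test: write a few thousand tiny files to the temp folder,
' report the elapsed time, then clean up. Run it on both machines and compare.
Public Sub SmallIoTest()
    Dim fso As Object, ts As Object
    Dim folder As String, t0 As Single, i As Long

    Set fso = CreateObject("Scripting.FileSystemObject")
    folder = Environ$("TEMP") & "\io_test"
    If Not fso.FolderExists(folder) Then fso.CreateFolder folder

    t0 = Timer
    For i = 1 To 2000                                 ' arbitrary file count
        Set ts = fso.CreateTextFile(folder & "\f" & i & ".txt", True)
        ts.Write String(512, "x")                     ' 512-byte payload
        ts.Close
    Next i
    Debug.Print "2000 small writes: " & Format(Timer - t0, "0.00") & " s"

    fso.DeleteFolder folder, True                     ' clean up
End Sub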
ASKER CERTIFIED SOLUTION (available to Experts Exchange members only)
SOLUTION (available to Experts Exchange members only)
midfde (ASKER):

The answer is yet to come.