I'm a Delphi 7 programmer and have developed a multi-threaded network application that queues and generates database reports in printed and PDF format. The problem is that the application works perfectly on all segments of the network when the network isn't busy. However, when the network gets busy, it still works fine on the segment on which I developed it, but on other segments it either completely fails to produce the reports, or it intermittently misses reports or drops data from reports.
The situation has become serious: the network team now consider my application, rather than the network it runs on, to be the problem, and I've been asked to consider redeveloping it (about 9 months of work, with all of the reports to rewrite). However, I find this difficult to accept when the application runs fine on my own segment of the network.
Can anybody shed some light on this problem and confirm my suspicion that it is more likely a network issue than an application issue, given that the app runs fine at all times on one segment of the network (busy or not)? Any suggestions as to what may be causing these inconsistencies would be greatly appreciated.
The details are as follows:
We are running a NetWare system over IPX on a single Dell PowerEdge dual-processor server (purchased new last year), with a mix of mainly 100BASE-TX and a few 10BASE-T workstations over a fibre backbone. The network team has recently replaced the hubs with Allied Telesyn switches on the 100 Mb/s segments of the network. At busy times there are probably in excess of 30 workstations running database apps from the server.
The database system:
We are running Extended Systems (now Sybase-owned) Advantage Database Server (ADS) version 7.1. There are a few small distinct databases and one very large DBF/CDX-based, business-critical database. There is also a fairly busy, medium-sized Paradox database running from the same server.
The application details:
My Delphi application queues and generates R&R Xbase reports. The application updates the ADS database before the R&R reports read from the tables. It runs a synchronised thread that waits for each report to complete in the external report writer before submitting the next (it uses an R&R control table, for those who are familiar with it). ADS is client/server based, but R&R uses its own non-client/server Xbase driver to read the tables, which therefore generates extra database traffic between the server and the workstation. The R&R executable called from my app runs locally on each workstation, while the application and all the tables it creates are run from the server. At present around 5 workstations may run the application concurrently.
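To make the threading model clearer, the submission thread works roughly like the sketch below. This is a simplified illustration only, not my actual code; the method and field names (QueueHasReports, SubmitNextReport, ReportComplete) are placeholders for the real queue logic and the status check against the R&R control table.

```delphi
// Simplified sketch of the report-submission thread.
// One report is submitted at a time; the thread polls the R&R
// control table until the external report writer marks the job
// complete, then submits the next queued report.
procedure TReportThread.Execute;
begin
  while (not Terminated) and QueueHasReports do
  begin
    SubmitNextReport;  // writes a job row to the R&R control table
    // Wait for the external R&R runtime to flag the row as finished
    while (not Terminated) and (not ReportComplete) do
      Sleep(500);      // poll interval shown here is arbitrary
  end;
end;
```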
If you require further details in order to answer this question, please don't hesitate to ask.
Thanks in advance,