My web app runs fine when I run it in the debugger on my dev machine, but it fails with an overflow exception in System.Data when I run it on the server.
Is there some way I can get a better error than just the overflow? Can I somehow step into the code while it is executing on the server? Alternatively, is there some way I could reproduce the error on my dev machine?
Detailed Problem Description
I've written a C# web app that filters a DataSet with ADO.NET, using the DataTable.Select method. I build the filterExpression argument dynamically in order to filter one table with data from a related table.
The first table, "cruises", holds data about river boat cruises. Each cruise stops at a number of ports; this is recorded in a "cruiseports" table that relates cruises to ports by cruise id. I need to build each cruise's itinerary of ports from this table. "cruises" itself does not hold any port id values; the schema is normalized, so port ids must be looked up via the cruise id in "cruiseports". In other words, a single cruise relates to multiple records in the cruiseports table, one record for each stop that the cruise makes in a port.
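Simplified, my approach looks roughly like this (the table and column names, and the tiny in-memory data, are made up just to illustrate; this is not my exact code):

```csharp
using System;
using System.Data;
using System.Linq;

class FilterSketch
{
    static void Main()
    {
        // Tiny stand-in for the real DataSet (column names are assumed).
        var cruisePorts = new DataTable("cruiseports");
        cruisePorts.Columns.Add("cruiseid", typeof(int));
        cruisePorts.Columns.Add("portid", typeof(int));
        cruisePorts.Rows.Add(1, 10);
        cruisePorts.Rows.Add(1, 11);
        cruisePorts.Rows.Add(2, 12);

        var ports = new DataTable("ports");
        ports.Columns.Add("portid", typeof(int));
        ports.Columns.Add("name", typeof(string));
        ports.Rows.Add(10, "Vienna");
        ports.Rows.Add(11, "Budapest");
        ports.Rows.Add(12, "Passau");

        int cruiseId = 1;

        // Find the stops for one cruise, then build one long OR filter
        // against the ports table. With real data this string can grow
        // very long, one term per stop.
        DataRow[] stops = cruisePorts.Select("cruiseid = " + cruiseId);
        string filter = string.Join(" OR ",
            stops.Select(r => "portid = " + r["portid"]));

        DataRow[] itinerary = ports.Select(filter);
        Console.WriteLine(itinerary.Length); // 2 ports for cruise 1
    }
}
```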
Perhaps I should have used a DataRelation object, but it didn't seem to offer much more. I am new to ADO.NET.
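For context, this is what I understand the DataRelation alternative to look like; again the names and data are invented for illustration, and I may be missing something about it:

```csharp
using System;
using System.Data;

class RelationSketch
{
    static void Main()
    {
        var ds = new DataSet();

        var cruises = new DataTable("cruises");
        cruises.Columns.Add("cruiseid", typeof(int));
        cruises.PrimaryKey = new[] { cruises.Columns["cruiseid"] };
        cruises.Rows.Add(1);
        ds.Tables.Add(cruises);

        var cruisePorts = new DataTable("cruiseports");
        cruisePorts.Columns.Add("cruiseid", typeof(int));
        cruisePorts.Columns.Add("portid", typeof(int));
        cruisePorts.Rows.Add(1, 10);
        cruisePorts.Rows.Add(1, 11);
        ds.Tables.Add(cruisePorts);

        // Relate the tables once; child rows then come back without
        // building or parsing any filter string at all.
        var rel = new DataRelation("CruiseToPorts",
            cruises.Columns["cruiseid"],
            cruisePorts.Columns["cruiseid"]);
        ds.Relations.Add(rel);

        DataRow cruise = cruises.Rows.Find(1); // Find requires a primary key
        foreach (DataRow stop in cruise.GetChildRows(rel))
            Console.WriteLine(stop["portid"]); // prints 10, then 11
    }
}
```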
Unfortunately, the related table is quite large, and my design has turned out not to be scalable. Things worked fine with test data, but my code failed with an overflow exception in System.Data with real data.
I was able to fix this error on my local machine by breaking the filter expression up into multiple DataTable.Select calls. But this approach still fails when I run the app on the server. I don't understand why, since I assumed that running the app on my dev machine would be the same as running it in a true client and server environment.
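The workaround that fixed it locally is essentially this (again a sketch with invented names and data; the batch size of 50 is arbitrary):

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Linq;

class BatchedSelectSketch
{
    static void Main()
    {
        var ports = new DataTable("ports");
        ports.Columns.Add("portid", typeof(int));
        for (int id = 0; id < 200; id++)
            ports.Rows.Add(id);

        // Suppose these port ids came from the cruiseports lookup.
        List<int> portIds = Enumerable.Range(0, 120).ToList();

        // Issue one Select per batch so that no single filter string
        // gets too long, then concatenate the results.
        var results = new List<DataRow>();
        for (int i = 0; i < portIds.Count; i += 50)
        {
            string filter = string.Join(" OR ",
                portIds.Skip(i).Take(50).Select(id => "portid = " + id));
            results.AddRange(ports.Select(filter));
        }
        Console.WriteLine(results.Count); // 120
    }
}
```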
Very little data is sent in either the request or the response, so I don't think the problem is likely to be a race condition caused by latency. I am wondering whether there is more memory available on my dev machine when the web app is run in debug mode? I don't know enough about server memory. This seems like something that might come up with memory-intensive apps such as search engines.
Do you have any idea why I would get this overflow exception in System.Data? Is there some way I can step through the code when it is failing on the server? Alternatively, is there a way I can make my dev machine fail in the same way? I'm stuck without a line number or even a meaningful error message. Also, is there a known design pattern that deals with this problem? Thanks!