ASP.Net: Need strategy to read from 4 cascading/related data/api feeds using VB.net, and display them all in a single page

Jon Jaques
I need to develop a good/correct strategy in ASP.Net to read from 4 cascading data/api feeds using VB, and display them all in a single page.

The first feed contains a list of clients and their id numbers, the second contains each client's locations and their id numbers, the third contains the assets/equipment at each location, and the fourth contains additional info about each asset. Every feed except the first requires an id number, so if a client has 4 locations, the assets feed has to be queried once for each location id, and so forth.

The URLs are pretty predictable, as are the XML responses. The URLs look like this:

http://xyz.com/abc.php?action=list_clients
http://xyz.com/abc.php?action=list_locations&clientid=1
http://xyz.com/abc.php?action=list_assets&locationid=1000
http://xyz.com/abc.php?action=list_assets&locationid=1001
http://xyz.com/abc.php?action=list_assets&locationid=1002

And for each asset found:
http://xyz.com/abc.php?action=list_properties&assetid=2000
http://xyz.com/abc.php?action=list_properties&assetid=2001
http://xyz.com/abc.php?action=list_properties&assetid=2002
etc.
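The cascade above can be walked with nested loops and LINQ to XML; this is only a rough sketch, and the element names (`client`, `location`, `asset`, `id`) are assumptions — substitute whatever the real responses actually use:

```vb
' Sketch of walking the four cascading feeds with LINQ to XML.
' Element names below are assumptions based on the URL patterns.
Imports System.Xml.Linq

Module FeedWalker
    Const BaseUrl As String = "http://xyz.com/abc.php"

    Sub WalkFeeds()
        Dim clients = XDocument.Load(BaseUrl & "?action=list_clients")
        For Each client In clients.Descendants("client")
            Dim clientId = client.Element("id").Value
            Dim locations = XDocument.Load(BaseUrl & "?action=list_locations&clientid=" & clientId)
            For Each location In locations.Descendants("location")
                Dim locationId = location.Element("id").Value
                Dim assets = XDocument.Load(BaseUrl & "?action=list_assets&locationid=" & locationId)
                For Each asset In assets.Descendants("asset")
                    Dim assetId = asset.Element("id").Value
                    Dim props = XDocument.Load(BaseUrl & "?action=list_properties&assetid=" & assetId)
                    ' Inspect props here and flag any items needing attention.
                Next
            Next
        Next
    End Sub
End Module
```

Note that this makes one HTTP request per id at every level, so for a large client list the number of requests grows quickly — which is what motivates caching the responses.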

I've played around with repeaters and xmldatasources with success, but once you try nesting them 4 layers deep, things get messy.

My need is to report on the 4th level data, flagging and displaying items which are in need of attention so that the appropriate people can act on them.

Some things I've considered:
* Caching XML results to file system
* Caching XML data to database
* Creating a class diagram with tight bound field definitions and relationships

Caching the XML results to the file system seems like it would be easier because of the way the XmlDataSource works, but it would be messy: the whole data set ends up spread across quite a few different cached files.

Both the database and class approaches use tight binding, and I have no control over fields in the returned dataset, so if they change their data (again) I would have to update the data models to match.
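One way to avoid the tight-binding problem is to skip typed classes entirely and read fields by name at query time with LINQ to XML. A minimal sketch, assuming hypothetical `asset`, `id`, and `status` field names in the fourth-level feed:

```vb
' Loose binding: query fields by name instead of mapping the feed
' onto a typed class. If the provider renames a field, only the
' lookup string changes -- no data model to regenerate.
Imports System.Linq
Imports System.Xml.Linq

Module LooseBinding
    Sub Main()
        ' Hypothetical fourth-level response; field names are assumptions.
        Dim propsDoc = XDocument.Parse(
            "<assets>" &
            "<asset><id>2000</id><status>OK</status></asset>" &
            "<asset><id>2001</id><status>ALERT</status></asset>" &
            "</assets>")

        Dim flagged = From a In propsDoc.Descendants("asset")
                      Where a.Element("status").Value = "ALERT"
                      Select a.Element("id").Value

        For Each id In flagged
            Console.WriteLine("Needs attention: " & id)
        Next
    End Sub
End Module
```

The trade-off is that field-name typos surface at runtime rather than compile time, but a schema change on the provider's side only costs a one-line edit.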

Can anybody recommend a clean approach to this?

Your thoughts would be appreciated!
Kyle Santos, Software Test Analyst I at Dassault Systemes

Commented:
Hi,

I am following up on your question.  Do you still need help?

If you solved the problem on your own, would you please post the solution here in case others have the same problem?

Regards,

Kyle Santos
Customer Relations
Jon Jaques, Information Technologist

Author

Commented:
I finally resolved this on my own. What I discovered was that the customer only needed a once-daily report, and the hosts of the API didn't want a lot of traffic either, so I found it easiest to simply connect to each service, download its response to a cache folder, and then process the data from there. Not the most elegant, but it works for now.
Jon Jaques, Information Technologist
Commented:
Cached all data, then processed XML files as needed.
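The cache-then-process approach described above could look something like this; the cache folder path and file-naming scheme are assumptions for illustration:

```vb
' Sketch of the accepted approach: download each feed response to a
' cache folder once per daily run, then process the XML from disk.
Imports System.IO
Imports System.Net

Module FeedCache
    Const CacheDir As String = "C:\FeedCache"

    ' Returns the local path of the cached response, downloading it
    ' first if it is not already present for this run.
    Function GetCached(url As String, cacheKey As String) As String
        Dim localPath = Path.Combine(CacheDir, cacheKey & ".xml")
        If Not File.Exists(localPath) Then
            Using wc As New WebClient()
                wc.DownloadFile(url, localPath)   ' one hit per feed per run
            End Using
        End If
        Return localPath
    End Function
End Module
```

For example, `GetCached("http://xyz.com/abc.php?action=list_assets&locationid=1000", "assets_1000")` would fetch that feed once and hand every later processing step a local file, keeping traffic to the API hosts at one request per feed per day.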
Kyle Santos, Software Test Analyst I at Dassault Systemes

Commented:
Thank you for letting us know!
