We currently have a Visual Basic .NET program, intended to be somewhat generic, which pulls data from our huge database and creates reports.
The general design of the program is:
1. Run a SQL stored procedure to extract data from the database and build a table of the data we want.
2. Run another SQL stored procedure to access a 2nd table.
3. Enter a loop that reads the 2nd table one row at a time, converting each row's columns into parameters that are passed to a 3rd stored procedure. The 3rd stored procedure builds a 3rd table containing the specific data we want for this specific report. Then a report-maker program uses the 3rd table's data to create the report.
4. Repeat for all rows in the 2nd table.
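In VB.NET/ADO.NET terms, the loop above looks roughly like the following sketch. All procedure and column names here (`usp_GetReportList`, `usp_BuildReportData`, `ReportName`, `RegionId`) are hypothetical placeholders, not our real ones:

```vbnet
' Hypothetical sketch of the driver loop (steps 2-4).
Imports System.Data
Imports System.Data.SqlClient

Module ReportDriver
    Sub RunReports(connectionString As String)
        Using conn As New SqlConnection(connectionString)
            conn.Open()

            ' Step 2: load the control table (the "2nd table") into memory,
            ' so the connection is free for the per-row calls below.
            Dim controlTable As New DataTable()
            Using da As New SqlDataAdapter("EXEC dbo.usp_GetReportList", conn)
                da.Fill(controlTable)
            End Using

            ' Steps 3-4: one pass per control-table row.
            For Each row As DataRow In controlTable.Rows
                Using cmd As New SqlCommand("dbo.usp_BuildReportData", conn)
                    cmd.CommandType = CommandType.StoredProcedure
                    cmd.Parameters.AddWithValue("@ReportName", row("ReportName"))
                    cmd.Parameters.AddWithValue("@RegionId", row("RegionId"))
                    cmd.ExecuteNonQuery()
                End Using
                ' ...then invoke the report maker against the table just built.
            Next
        End Using
    End Sub
End Module
```

Filling a `DataTable` first (rather than looping over an open `SqlDataReader`) avoids needing a second connection or MARS for the per-row stored procedure calls.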
This is for creating a large number of very similar reports, varying only in the details specified in the 2nd table.
The stored procedures to call can be specified for each run of the VB.NET program, making it somewhat flexible (at least that's the idea).
A. To make this generic and flexible, the VB.NET program needs to handle many variations of the 2nd table, where both the number of parameters (columns) and the type of each parameter (string or integer) may vary.
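One way to sketch point A, assuming the (made-up) convention that each column name of the 2nd table matches a parameter name of the 3rd stored procedure, is to derive the parameter list from whatever columns the row actually has instead of hard-coding them:

```vbnet
' Sketch: build the SqlCommand parameter list dynamically from the
' columns present in the row. Convention (hypothetical): column name
' "Foo" maps to stored procedure parameter "@Foo".
Imports System.Data
Imports System.Data.SqlClient

Module DynamicParams
    Sub CallBuildProc(conn As SqlConnection, procName As String, row As DataRow)
        Using cmd As New SqlCommand(procName, conn)
            cmd.CommandType = CommandType.StoredProcedure
            For Each col As DataColumn In row.Table.Columns
                ' AddWithValue infers the SQL type (e.g. NVARCHAR, INT)
                ' from the .NET type, covering the string-or-integer cases.
                cmd.Parameters.AddWithValue("@" & col.ColumnName, row(col))
            Next
            cmd.ExecuteNonQuery()
        End Using
    End Sub
End Module
```

This keeps one parameter per column; the procedure still needs matching parameter names, so it works best when each report family has its own 3rd stored procedure.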
B. I see there is a way to read an entire row into an array using SqlDataReader.GetSqlValues in VB.NET, but I'm not sure how to pass this array into a SQL stored procedure and process it there. (Table-Valued Parameters?)
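For reference, a Table-Valued Parameter call from ADO.NET looks roughly like this. It assumes a user-defined table type and a procedure signature like the ones in the comment, which are hypothetical; one row of the 2nd table is pivoted into name/value pairs so the column count can vary from run to run:

```vbnet
' Sketch of passing a whole row as a Table-Valued Parameter.
' Assumed (hypothetical) SQL-side definitions:
'   CREATE TYPE dbo.ReportParamList AS TABLE
'       (Name SYSNAME, Value NVARCHAR(100));
'   CREATE PROCEDURE dbo.usp_BuildReportData
'       @Params dbo.ReportParamList READONLY AS ...
Imports System.Data
Imports System.Data.SqlClient

Module TvpDemo
    Sub CallWithTvp(conn As SqlConnection, row As DataRow)
        ' Pivot the control-table row into (name, value) pairs.
        Dim paramTable As New DataTable()
        paramTable.Columns.Add("Name", GetType(String))
        paramTable.Columns.Add("Value", GetType(String))
        For Each col As DataColumn In row.Table.Columns
            paramTable.Rows.Add(col.ColumnName, Convert.ToString(row(col)))
        Next

        Using cmd As New SqlCommand("dbo.usp_BuildReportData", conn)
            cmd.CommandType = CommandType.StoredProcedure
            Dim p As SqlParameter = cmd.Parameters.AddWithValue("@Params", paramTable)
            p.SqlDbType = SqlDbType.Structured   ' marks it as a TVP
            p.TypeName = "dbo.ReportParamList"
            cmd.ExecuteNonQuery()
        End Using
    End Sub
End Module
```

Inside the procedure, `@Params` can then be queried like any read-only table, e.g. joined or pivoted back into columns.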
C. I'm not sure whether I can use global temp tables for all the temp tables generated. Currently permanent tables are created (with a DROP TABLE if it exists right before each CREATE).
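For context on point C, the scoping rule that matters: a local temp table (`#name`) is private to the session (connection) that created it and is dropped automatically when that session ends, while a global one (`##name`) is visible to all sessions. A minimal sketch, with made-up names:

```vbnet
' Sketch: a local temp table created on a connection survives across
' commands on that SAME connection, then disappears on close.
Imports System.Data.SqlClient

Module TempTableDemo
    Sub Demo(connectionString As String)
        Using conn As New SqlConnection(connectionString)
            conn.Open()
            Using create As New SqlCommand(
                "CREATE TABLE #ReportData (Id INT, Amount MONEY);" &
                "INSERT INTO #ReportData VALUES (1, 10.00);", conn)
                create.ExecuteNonQuery()
            End Using
            ' A later command on the same connection still sees #ReportData.
            Using read As New SqlCommand("SELECT COUNT(*) FROM #ReportData", conn)
                Console.WriteLine(read.ExecuteScalar())
            End Using
        End Using   ' #ReportData is dropped automatically here
    End Sub
End Module
```

One caveat relevant to our design: a `#temp` table created *inside* a stored procedure is dropped when the procedure returns, which may be why permanent tables are used now; a `#temp` created by the caller's session, or a `##global` table, would survive the procedure call.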
I'm also not sure whether there might be a better overall strategy. (Possibly a philosophical question with no real answer, but if there's a commonly used design pattern that would be helpful here, that would be great to know.)
Thank you for any help!
(We might switch to C# instead of VB.NET. I can program in both, so it doesn't really matter to me.)