I have a fairly long SQL statement, around 65KB, whose result set I would like to bring into Excel. I have two main methods for importing data into Excel: Microsoft Query, and the Command Text dialog within the Properties of an externally derived dataset. See the image below.
The problem with both of these is that they limit how much SQL text they will accept, and 65KB turns out to be more SQL code than either can handle. I am curious whether I can instead keep the SQL statement in its own text file, created in Notepad, and have Excel establish a connection to it. The query would then have two sources: the connection string, which establishes the data source, and the text file containing the SQL statement.
The following is the technique I'm currently using:
1. Within VBA or VB.net, create an ADO connection to the desired data source.
2. Import a text file containing my long SQL statement, execute that statement over the ADO connection, and store the results in an ADO recordset.
3. Export the contents of the recordset to a tab-separated text file.
4. Read that text file directly into Excel.
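For reference, here is a rough sketch of steps 1 through 3 as I do them today. The DSN name and file paths are placeholders, and I'm using late-bound ADO rather than a project reference:

```vba
' Sketch of the current round-trip: connect, read the SQL from a file,
' execute it, and dump the recordset to a tab-separated text file.
' "MyDSN", "BigQuery.sql", and "Results.txt" are placeholder names.
Sub RunLongQueryToTsv()
    Dim cn As Object, rs As Object, sql As String

    ' Step 1: ADO connection to the data source
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "DSN=MyDSN;"

    ' Step 2: read the 65KB statement from its text file
    Dim fnum As Integer: fnum = FreeFile
    Open "C:\Queries\BigQuery.sql" For Input As #fnum
    sql = Input$(LOF(fnum), fnum)
    Close #fnum
    Set rs = cn.Execute(sql)

    ' Step 3: write the recordset out tab-separated
    ' (2 = adClipString; Null fields become empty strings)
    Dim outNum As Integer: outNum = FreeFile
    Open "C:\Queries\Results.txt" For Output As #outNum
    If Not rs.EOF Then
        Print #outNum, rs.GetString(2, , vbTab, vbCrLf, "");
    End If
    Close #outNum

    rs.Close: cn.Close
End Sub
```

Step 4 is then just a plain text import of Results.txt into Excel.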
Figuring out how to have Excel execute a SQL statement read directly from a text file would provide the following advantages:
Overcome the code-size limitation inherent in Microsoft Query and the Command Text dialog.
Streamline the process by eliminating step 3, saving the time spent creating a redundant copy of the dataset.
Turn the link to Excel from static into dynamic, enabling analysts to see changes in the underlying data source simply by clicking Refresh.
Simplify maintenance of the SQL statement: edit the text file instead of going into Microsoft Query or the Command Text dialog within Excel.
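To make the idea concrete, here is a sketch of what I'm envisioning, using the same placeholder names as above: read the statement from the text file and write the results straight into a worksheet with CopyFromRecordset, so the intermediate text file disappears and the macro could be wired to a Refresh button. (This still goes through VBA rather than a native Excel connection, which is the part I'm unsure is avoidable.)

```vba
' Sketch of the envisioned dynamic link: no intermediate TSV file.
' "MyDSN", the file path, and the "Results" sheet name are placeholders.
Sub RefreshFromSqlFile()
    Dim cn As Object, rs As Object, sql As String

    ' Read the long SQL statement from its text file
    Dim fnum As Integer: fnum = FreeFile
    Open "C:\Queries\BigQuery.sql" For Input As #fnum
    sql = Input$(LOF(fnum), fnum)
    Close #fnum

    ' Execute it over a late-bound ADO connection
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "DSN=MyDSN;"
    Set rs = cn.Execute(sql)

    ' Write headers, then the whole result set, directly to the sheet
    With ThisWorkbook.Worksheets("Results")
        .Cells.ClearContents
        Dim i As Long
        For i = 0 To rs.Fields.Count - 1
            .Cells(1, i + 1).Value = rs.Fields(i).Name
        Next i
        .Range("A2").CopyFromRecordset rs   ' data in one call
    End With

    rs.Close: cn.Close
End Sub
```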
All of this seems like a good idea to me, and I'm here to ask you experts if the current technology allows for what I'm envisioning. Thanks! ~Peter Ferber