I have a procedure with many steps and lots of complex queries that I am trying to make run faster.
Calculations are made over 100 years of data. The math performed on any particular year's data is based, in part, on the results of the previous year's calculations (so it's sequential).
Since several different scenarios are run, the total number of rows of data is around 12,000. These detail rows are summed and averaged into a few records which are used in a report. The 12,000 records are no longer needed after the summary tables are populated.
So, my question is... would I gain a *significant* amount of performance if I used temporary tables for the 12,000 records of data? Currently, I am using regular (permanent) database tables.
I am thinking this may be helpful, but before doing all the work, I want to see what the experts say. Debugging is harder with temp tables, since the data disappears when the session ends. I also use a CTE (`WITH ...`) to sum sequentially... are there any restrictions on using CTEs with temp tables?
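For context, here is a simplified sketch of the pattern I have in mind. The table and column names are made up for illustration, and I'm assuming SQL Server (so a `#`-prefixed temp table and a windowed `SUM`, which needs SQL Server 2012 or later):

```sql
-- Stage the ~12,000 detail rows in a session-local temp table
-- instead of a permanent table.
CREATE TABLE #YearlyDetail (
    ScenarioId INT          NOT NULL,
    Yr         INT          NOT NULL,
    Amount     DECIMAL(18,2) NOT NULL
);

-- ... populate #YearlyDetail via the sequential year-by-year calculations ...

-- A CTE works on a temp table the same way it does on a permanent table:
WITH RunningTotals AS (
    SELECT ScenarioId,
           Yr,
           Amount,
           SUM(Amount) OVER (PARTITION BY ScenarioId ORDER BY Yr) AS RunningSum
    FROM #YearlyDetail
)
INSERT INTO SummaryTable (ScenarioId, AvgAmount, FinalRunningSum)
SELECT ScenarioId, AVG(Amount), MAX(RunningSum)
FROM RunningTotals
GROUP BY ScenarioId;

DROP TABLE #YearlyDetail;  -- would be dropped automatically at session end anyway
```

Would switching the staging table to something like `#YearlyDetail` above meaningfully reduce logging/contention compared to a permanent table, or is 12,000 rows too small to matter?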