• Status: Solved

Looping with SQL

Hello,

I want to run a very quick query against a table (1). From this table I would like to extract some key records and insert them into a new table (x). I would then like to repeat this process on the next table (2), and so on until I am out of tables.
The tables are all named with the date they were created, as follows:
tbl_Table_2014_12_15
tbl_Table_2014_12_14
tbl_Table_2014_12_13
tbl_Table_2014_12_12

Is this something I am able to do with SQL?

Thanks
Asked by: BananaFury

3 Solutions
 
Vikas Garg, Business Intelligence Developer, commented:
Hi,

You can do this with a WHILE loop:

Store the table names in a temp table, assign the table's row count to a variable, and inside the loop assign each table name in turn to another variable. Then build a dynamic query, passing in the source and destination table names, and execute it.

Hope this helps.
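A minimal sketch of that counter-based approach, assuming SQL Server (the destination table name, the Status filter, and the LIKE pattern are placeholders to adapt to your schema):

```sql
-- Sketch only: collect matching table names, then loop a counter over them,
-- building and executing a dynamic INSERT for each table.
DECLARE @tables TABLE (id INT IDENTITY(1,1), name SYSNAME);
DECLARE @i INT = 1, @n INT, @name SYSNAME, @sql NVARCHAR(MAX);

INSERT INTO @tables (name)
SELECT TABLE_NAME
  FROM INFORMATION_SCHEMA.TABLES
 WHERE TABLE_TYPE = 'BASE TABLE'
   AND TABLE_NAME LIKE 'tbl[_]Table[_]%';   -- [_] escapes the LIKE wildcard

SELECT @n = COUNT(*) FROM @tables;

WHILE @i <= @n
BEGIN
    SELECT @name = name FROM @tables WHERE id = @i;

    -- QUOTENAME guards the dynamically injected table name
    SET @sql = N'INSERT INTO tbl_All_Open_Records '
             + N'SELECT * FROM ' + QUOTENAME(@name)
             + N' WHERE [Status] = ''Open''';
    EXEC (@sql);

    SET @i += 1;
END
```

This is the same pattern Guy's article formalizes; the counter variant here just avoids deleting from the queue table.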
 
BananaFury, Author, commented:
Hello,

Thanks very much for your replies.

I confess, my SQL is modest at best. Could you please provide me with an example of the code required for my particular scenario?

Thanks
 
Guy Hengel [angelIII / a3], Billing Engineer, commented:
My article has sample code; you only need to fill in the SQL that you need.
Your explanation is vague, so providing some "end code" will not be possible.
As a side note, your requirement suggests you may want to step back and check whether, for example, a partitioned table would be the better/faster option for what you need to do.
So, question 1: what is the process actually about (why do you want to implement this)?
Question 2: please clarify exactly what data should go from where to where, and what happens to existing data, etc.
 
BananaFury, Author, commented:
Hello,

Thanks for your help.
I want to go through each of the tables and pull all of the Open records from each one. I would like to place these records in a separate table that contains only the Open records, but from every table (each table being a point in time).

Below is the only way I know how to tackle such a task. We have three years' worth of tables, though, so 1,000+ tables, and I suspect there is an easier way of doing this.


INSERT INTO tbl_All_Open_Records
SELECT *
FROM tbl_Table_2014_12_15 WHERE [Status] = 'Open'

INSERT INTO tbl_All_Open_Records
SELECT *
FROM tbl_Table_2014_12_14 WHERE [Status] = 'Open'

INSERT INTO tbl_All_Open_Records
SELECT *
FROM tbl_Table_2014_12_13 WHERE [Status] = 'Open'

..... etc
 
Simon commented:
Guy's advice to take a step back is good :)

If you're doing this as a one off and do want to get all the relevant table names to loop through with dynamic SQL...

SELECT table_name FROM YourDatabaseName.information_schema.tables
where TABLE_TYPE='base table'
and TABLE_NAME like 'tbl_Table_20[0-9][0-9][_][0-1][0-9][_][0-3][0-9]'



You could use this query to populate the table-valued variable @c that Guy shows in the non-cursor version of the code in his article:

declare @c table (id int identity(1,1),name sysname)

insert into @c ( name) SELECT table_name FROM [Your Database Name Here].INFORMATION_SCHEMA.tables
where TABLE_TYPE='base table'
and TABLE_NAME like 'tbl_Table_20[0-9][0-9][_][0-1][0-9][_][0-3][0-9]'


 
Guy Hengel [angelIII / a3], Billing Engineer, commented:
So, indeed, we take the code template from:
http://www.experts-exchange.com/Database/MS-SQL-Server/A_13640-processing-cursor-vs-temp-table-syntax.html
-- starting with the variables you will need in the processing part
declare @name sysname
declare @l nvarchar(max)

-- here is the table variable definition, which lives only for the duration of the run and is cleaned up automatically
-- for "small" results, it will stay purely in memory; larger sets may be "stored" in the temp database
declare @c table (  name sysname )
-- fill the table with data, update your select as you need it
-- if your process requires some "order" in the processing, still an ORDER BY here will be useless (unless you also use and need a TOP X clause) see below on where you have to put the ORDER BY

set nocount on
insert into @c ( name)
SELECT TABLE_NAME 
  FROM information_schema.tables
 WHERE TABLE_TYPE='base table'
   AND TABLE_CATALOG='YourDatabaseNameHere'
   AND TABLE_NAME like 'tbl_Table_20[0-9][0-9][_][0-1][0-9][_][0-3][0-9]' 

-- process: loop while either the above INSERT inserted at least 1 row, or the below DELETE deleted a row
while @@ROWCOUNT <> 0
begin
  -- fetch 1 record from the table. 
  -- if your process requires some "order" in the processing, put the corresponding ORDER BY here
  select top 1 @name = name from @c

  if @@ROWCOUNT <> 0
  begin
   -- process the data here, exactly the same as with the cursor example.
   set @l =  ' INSERT INTO tbl_All_Open_Records
     SELECT *
	FROM [' + @name + '] WHERE [Status] = ''Open''
          '

   print @l
   exec(@l)
  end

  -- here we delete the row from the temp table , which is nothing else than a processing queue table, hence
  delete @c where name = @name
END 

 
Scott Pletcher, Senior DBA, commented:
Here are some thoughts on this, in case they help.

I'd strongly suggest using transactions per month rather than per day. That many separate transactions will take longer and generate more log usage. Even then, be sure to have the necessary pre-allocated, and thus pre-formatted, log space available before the loads start.

I'd also suggest, on the extremely limited info so far, clustering the final table on date, as I suspect that's how it will primarily be processed (unless, obviously, you've already defined another clustering key that is NOT an identity).

My preferred approach would be to dynamically create multiple jobs, one per affected month, to do the loading rather than using in-line execution. This gives these advantages:
(1) async execution of multiple/all loads: sp_start_job simply starts the job and returns, so multiple jobs could easily be run at once;
(2) you can easily re-run a single month without adjusting code;
(3) monthly transaction size: hopefully not too big, but not too small either.
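A hedged sketch of that job-per-month idea, using the standard msdb Agent procedures (the job name, database name, and the @command body below are illustrative placeholders, not code from this thread; assumes SQL Server Agent is running):

```sql
-- Sketch: create and start one Agent job for one month's load.
-- Repeat (or generate dynamically) per affected month.
DECLARE @jobName SYSNAME = N'Load_Open_Records_2014_12';  -- hypothetical name

EXEC msdb.dbo.sp_add_job
     @job_name = @jobName;

EXEC msdb.dbo.sp_add_jobstep
     @job_name      = @jobName,
     @step_name     = N'Load one month of Open records',
     @subsystem     = N'TSQL',
     @database_name = N'YourDatabaseNameHere',
     @command       = N'/* the dynamic INSERT ... SELECT loop, restricted
                          to tbl_Table_2014_12% tables, goes here */';

-- Target the local server so the job is runnable.
EXEC msdb.dbo.sp_add_jobserver
     @job_name = @jobName, @server_name = N'(local)';

-- sp_start_job returns immediately, so several monthly jobs can run at once.
EXEC msdb.dbo.sp_start_job @job_name = @jobName;
```

Each job carries one month's transaction, which is what gives the async execution and easy single-month re-runs Scott describes.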
 
BananaFury, Author, commented:
Thanks very much for all of your help.
Question has a verified solution.

Are you are experiencing a similar issue? Get a personalized answer when you ask a related question.

Have a better answer? Share it in a comment.

Join & Write a Comment

Featured Post

Free Tool: Site Down Detector

Helpful to verify reports of your own downtime, or to double check a downed website you are trying to access.

One of a set of tools we are providing to everyone as a way of saying thank you for being a part of the community.

Tackle projects and never again get stuck behind a technical roadblock.
Join Now