Solved

Looping with SQL

Posted on 2014-12-30
83 Views
Last Modified: 2014-12-31
Hello

I want to run a very quick query against a table (1)
From this table I would like to extract some key records, and insert them into a new table (x)
I would then like to repeat this process on the next table (2) and repeat until I am out of tables.
The tables are all named with the date they were created as follows:
tbl_Table_2014_12_15
tbl_Table_2014_12_14
tbl_Table_2014_12_13
tbl_Table_2014_12_12

Is this something I am able to do with SQL?

Thanks
Question by:BananaFury
9 Comments
 
LVL 14

Expert Comment

by:Vikas Garg
ID: 40523424
Hi,

You can do this with a WHILE loop.

Store the table names in a temp table, assign the row count of that table to a variable, and inside the loop assign each table name in turn to another variable.

On the query side, build a dynamic query that plugs in the source and destination table names.
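For example, something along these lines (an untested sketch; the destination table tbl_All_Open_Records and the [Status] = 'Open' filter are assumptions here, adjust to your schema):

-- table variable to hold the candidate table names, numbered for the loop
DECLARE @Tables TABLE (id int IDENTITY(1,1), name sysname);
DECLARE @Count int, @i int, @Name sysname, @Sql nvarchar(max);

INSERT INTO @Tables (name)
SELECT TABLE_NAME
  FROM INFORMATION_SCHEMA.TABLES
 WHERE TABLE_TYPE = 'BASE TABLE'
   AND TABLE_NAME LIKE 'tbl\_Table\_%' ESCAPE '\';

-- remember how many tables there are, then walk through them one by one
SET @Count = @@ROWCOUNT;
SET @i = 1;

WHILE @i <= @Count
BEGIN
  SELECT @Name = name FROM @Tables WHERE id = @i;

  -- dynamic query with the source table name plugged in
  SET @Sql = N'INSERT INTO tbl_All_Open_Records SELECT * FROM '
           + QUOTENAME(@Name) + N' WHERE [Status] = ''Open''';
  EXEC (@Sql);

  SET @i = @i + 1;
END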

Hope this helps.
 
LVL 142

Expert Comment

by:Guy Hengel [angelIII / a3]
ID: 40523427
See my article with sample code: http://www.experts-exchange.com/Database/MS-SQL-Server/A_13640-processing-cursor-vs-temp-table-syntax.html
 

Author Comment

by:BananaFury
ID: 40523593
Hello,

Thanks very much for your replies.

I confess my SQL skills are modest at best. Could you please provide me with an example of the code required for my particular scenario?

Thanks
 
LVL 142

Expert Comment

by:Guy Hengel [angelIII / a3]
ID: 40523601
My article has sample code; you only need to fill in the part of the SQL that you need.
Your explanation is vague, so providing a complete "end code" will not be possible...
As a side note, your requirement suggests you may need to step back and check whether, for example, a partitioned table is the better/faster option for what you need to do.
So, question 1: what is the process actually about (why do you want to implement this)?
Question 2: please clarify what data should go from where to where exactly, and what about existing data, etc.?
 

Author Comment

by:BananaFury
ID: 40523608
Hello,

Thanks for your help.
I want to go through each of the tables and pull all of the Open records from them. I would like to place these records in a separate table that contains only the Open records from each table (each table being a point in time).

Below is the only way I know how to tackle such a task. We have 3 years of tables, though (1000+ tables), so I suspect there is an easier way of doing this.


INSERT INTO tbl_All_Open_Records
SELECT *
FROM tbl_Table_2014_12_15 WHERE [Status] = 'Open'

INSERT INTO tbl_All_Open_Records
SELECT *
FROM tbl_Table_2014_12_14 WHERE [Status] = 'Open'

INSERT INTO tbl_All_Open_Records
SELECT *
FROM tbl_Table_2014_12_13 WHERE [Status] = 'Open'

..... etc
 
LVL 18

Assisted Solution

by:SimonAdept
SimonAdept earned 175 total points
ID: 40523670
Guy's advice to take a step back is good :)

If you're doing this as a one-off and do want to get all the relevant table names to loop through with dynamic SQL...

SELECT table_name FROM YourDatabaseName.information_schema.tables
where TABLE_TYPE='base table'
and TABLE_NAME like 'tbl_Table_20[0-9][0-9][_][0-1][0-9][_][0-3][0-9]'

You could use this query to populate the table-valued variable @c that Guy shows in the non-cursor code version in his article:

declare @c table (id int identity(1,1), name sysname)

insert into @c (name)
SELECT table_name
  FROM [Your Database Name Here].INFORMATION_SCHEMA.tables
 WHERE TABLE_TYPE = 'base table'

 
LVL 142

Accepted Solution

by:Guy Hengel [angelIII / a3]
Guy Hengel [angelIII / a3] earned 250 total points
ID: 40523695
so, indeed we take the code template from:
http://www.experts-exchange.com/Database/MS-SQL-Server/A_13640-processing-cursor-vs-temp-table-syntax.html
-- starting with the variables you will need in the processing part
declare @name sysname
declare @l nvarchar(max)

-- here is the table variable definition, which lives only for the duration of the run and is cleaned up automatically
-- for "small" results, it will stay purely in memory; larger sets may be "stored" in the temp database
declare @c table (  name sysname )
-- fill the table with data, update your select as you need it
-- if your process requires some "order" in the processing, an ORDER BY here would be useless (unless you also use and need a TOP X clause); see below for where to put the ORDER BY

set nocount on
insert into @c ( name)
SELECT TABLE_NAME 
  FROM information_schema.tables
 WHERE TABLE_TYPE='base table'
   AND TABLE_CATALOG='YourDatabaseNameHere'
   AND TABLE_NAME like 'tbl_Table_20[0-9][0-9][_][0-1][0-9][_][0-3][0-9]' 

-- loop while the previous statement affected at least one row (the INSERT above on the first pass, the DELETE at the bottom of the loop on later passes)
while @@ROWCOUNT <> 0
begin
  -- fetch 1 record from the table. 
  -- if your process requires some "order" in the processing, put the corresponding ORDER BY here
  select top 1 @name = name from @c

  if @@ROWCOUNT <> 0
  begin
   -- process the data here, exactly the same as with the cursor example.
   set @l = ' INSERT INTO tbl_All_Open_Records
                SELECT *
                  FROM [' + @name + '] WHERE [Status] = ''Open'' '

   print @l
   exec(@l)
  end

  -- delete the processed row from the table variable, which is nothing more than a processing queue table
  delete @c where name = @name
END 

 
LVL 69

Assisted Solution

by:ScottPletcher
ScottPletcher earned 75 total points
ID: 40523916
Here are some thoughts on this, in case they help.

I'd strongly suggest using transactions for each month rather than each day. That many separate transactions will take longer and generate more log usage. Even then, be sure to have the necessary pre-allocated, and thus pre-formatted, log space available before the loads start.

I'd also suggest -- on the extremely limited info so far -- clustering the final table on date, as I suspect that's how it will primarily be processed.  [Unless obviously you've already defined another clustering key that is NOT an identity.]

My preferred approach would be to dynamically create multiple jobs, one per affected month, to do the loading rather than using in-line execution (a rough sketch follows below). This gives these advantages:
(1) async execution of multiple/all loads -- sp_start_job simply starts the job and returns, so multiple jobs could easily be run at once;
(2) you can easily re-run a single month without adjusting code;
(3) monthly transaction size -- hopefully not too big, but not too small either.
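A rough, untested sketch of that job-per-month idea (the month suffix, job name, target database and step command below are placeholders, not working code for your schema):

DECLARE @Month   char(7)       = '2014_12';      -- year/month suffix used in the source table names
DECLARE @JobName sysname       = N'Load_Open_Records_' + @Month;
DECLARE @Cmd     nvarchar(max) =
    N'INSERT INTO tbl_All_Open_Records
      SELECT * FROM [tbl_Table_' + @Month + N'_01] WHERE [Status] = ''Open'';';
      -- a real job step would of course cover every daily table in the month

-- create the job, give it one T-SQL step, attach it to this server and start it
EXEC msdb.dbo.sp_add_job       @job_name = @JobName;
EXEC msdb.dbo.sp_add_jobstep   @job_name = @JobName,
                               @step_name = N'Load month',
                               @subsystem = N'TSQL',
                               @command = @Cmd,
                               @database_name = N'YourDatabaseNameHere';
EXEC msdb.dbo.sp_add_jobserver @job_name = @JobName;   -- defaults to the local server
EXEC msdb.dbo.sp_start_job     @job_name = @JobName;   -- returns immediately, so the loads run asynchronously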
 

Author Closing Comment

by:BananaFury
ID: 40525077
Thanks very much for all of your help
