
  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 370
  • Last Modified:

Merge identical tables into a single table in SQL or Access

I need to merge an ever-growing number of tables into a single table. All tables are identical in structure and their names all begin with TX. For example, I have 20 tables named TX1, TX2, TX3, TX4, TX5, and so on. I would like a way to merge all of the TX tables into one table named TX_ALL. It would be great to have this automated nightly.
allenkent
Asked:
1 Solution
 
Scott McDaniel (Microsoft Access MVP - EE MVE), Infotrakker Software, commented:
You can just do an INSERT INTO statement to move the data from your tables into a Master table:

Dim i As Integer

For i = 1 To 20
    CurrentDb.Execute "INSERT INTO TX_ALL SELECT * FROM TX" & CStr(i)
Next i

To be safe, though, I think you'd be better off explicitly listing the columns:

For i = 1 To 20
    CurrentDb.Execute "INSERT INTO TX_ALL (Col1, Col2, Col3) SELECT Col1, Col2, Col3 FROM TX" & CStr(i)
Next i

In my experience, "identical" tables can end up being less than identical :)
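Since the list of TX tables keeps growing, another option (just a sketch, assuming a DAO back end and that every local table whose name starts with TX, other than TX_ALL itself, should be included) is to walk the TableDefs collection instead of counting the names, so gaps in the numbering never matter:

Dim db As DAO.Database
Dim tdf As DAO.TableDef

Set db = CurrentDb

For Each tdf In db.TableDefs
    'pick up every table whose name starts with TX, except the master table itself
    If tdf.Name Like "TX*" And tdf.Name <> "TX_ALL" Then
        db.Execute "INSERT INTO TX_ALL SELECT * FROM [" & tdf.Name & "]", dbFailOnError
    End If
Next tdf

As with the explicit-column version above, listing the columns in both the INSERT and the SELECT is the safer bet if the "identical" tables ever drift apart.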
 
Dale Fye commented:
What are these tables, and what does their structure look like?

How frequently does an additional TX# table get created?

Generally this is a result of poor database design. Are you prepared to do some significant work to rework the application so that you only have a single table, or do you just want to merge the data from each of the individual tables into a new master table that will be used for some other purpose?
 
Jeffrey Coachman, MIS Liason, commented:
First I agree that what LSM posted does in fact answer your question directly.

I also agree with fyed that this may be better "Normalized".

Creating a system to do this nightly and automatically is also tricky.
(What if a day is missed? What if an error occurs? What about rollbacks, error handling, etc.?)

Just so we are clear, what's the reason this data needs to remain in separate tables...?
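On the rollback and error-handling concern, one pattern in Access VBA (a sketch only, assuming DAO and the fixed TX1-TX20 range from Scott's example; adjust to however the tables are actually enumerated) is to wrap the whole nightly append in a workspace transaction, so a failed run leaves TX_ALL untouched:

Public Sub MergeTXNightly()
    Dim ws As DAO.Workspace
    Dim db As DAO.Database
    Dim i As Integer

    Set ws = DBEngine.Workspaces(0)
    Set db = CurrentDb

    On Error GoTo MergeFailed

    ws.BeginTrans                                    'everything below commits or rolls back as one unit
    db.Execute "DELETE FROM TX_ALL", dbFailOnError   'start from an empty master table
    For i = 1 To 20
        db.Execute "INSERT INTO TX_ALL SELECT * FROM TX" & CStr(i), dbFailOnError
    Next i
    ws.CommitTrans
    Exit Sub

MergeFailed:
    ws.Rollback                                      'a failed run leaves TX_ALL exactly as it was
    MsgBox "Nightly merge failed: " & Err.Description
End Sub

Because everything is inside one transaction, a run that dies halfway never leaves TX_ALL half-emptied or half-filled.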

 
Scott Pletcher, Senior DBA, commented:
Something roughly like below should do it.

IF the table names don't have to be contiguous -- that is, TX9 and TX11 could exist without TX10 -- change the BREAK to a CONTINUE and add another check that breaks out once you've reached the maximum number of tables you want to check for ... for example:

IF @table_counter > 50
    BREAK

--TRUNCATE TABLE TX_ALL

DECLARE @table_counter smallint
DECLARE @table_name nvarchar(128)

SET @table_counter = 0

WHILE 1 = 1

BEGIN
    SET @table_counter = 1
    SET @table_name = 'dbo.' + 'TX' + CAST(@table_counter AS varchar(5))
    IF OBJECT_ID(@table_name, 'U') IS NULL
        BREAK
    EXEC(N'INSERT INTO TX_ALL SELECT * FROM ' + @table_name)
END

 
allenkent (Author) commented:
Yes, I agree that all of this should be in the same table with an identifier. The designer was afraid the table would be too big since it's Access. We are creating 6,000 lines of data a day, which would be about 2 million lines of data a year. The concern was that the table would get too big and the system would not be able to write the data quickly enough. The designer decided to create a new table for each run (6 tables a day, 1,000 lines each).

I was thinking of deleting the TX_ALL table every day and recreating it with some type of MERGE TX* approach. Possible?
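For reference, if the back end stays in Access, the nightly part could be as small as the sketch below, assuming a MergeTXTables routine along the lines of the loops above exists (the routine name is only illustrative) and that Windows Task Scheduler opens the database each night to run it:

Public Function NightlyRebuild()
    'empty the master table, then re-append every TX table
    CurrentDb.Execute "DELETE FROM TX_ALL", dbFailOnError
    MergeTXTables          'hypothetical routine that appends each TX* table to TX_ALL
    Application.Quit       'close Access so the scheduled task ends cleanly
End Function

Emptying and refilling the existing TX_ALL avoids having to drop and recreate the table structure each night; Task Scheduler can launch MSACCESS.EXE with the /x switch to run a macro that kicks this off unattended.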

 
Dale Fye commented:
If you are concerned about size (2,000,000 records may not be very large, depending on the number of fields and their data types), then you might need to consider moving your back end to SQL Server or SQL Server Express.
 
allenkent (Author) commented:
This almost works for me. I see that if I run it as is with TX1 and TX2 as tables, it goes into an infinite loop.
QUESTION: How do I set it to start at the TX57 table, and what if they are not in order?

--TRUNCATE TABLE TX_ALL

DECLARE @table_counter smallint
DECLARE @table_name nvarchar(128)

SET @table_counter = 0

WHILE 1 = 1

BEGIN
    SET @table_counter = 1
    SET @table_name = 'dbo.' + 'TX' + CAST(@table_counter AS varchar(5))
    IF OBJECT_ID(@table_name, 'U') IS NULL
        BREAK
    EXEC(N'INSERT INTO TX_ALL SELECT * FROM ' + @table_name)
END
 
Jeffrey Coachman, MIS Liason, commented:
Why is this arbitrary "2 million" number important?

The number you may be thinking of is the theoretical 2 billion limit of an AutoNumber field.

Even at 6,000 records per day,
it would take roughly 333,000 days to reach 2 billion records, ...that's over 900 years of data...

Finally, it is the "Size" of the DB that really matters, not the number of records in any one table...

I can't see creating a system this complex because someone "thinks" it might be an issue.
Why not normalize the data (do it right), then see what happens...?
If nothing happens, ...then you are all good.

If the data grows to the point where it is "too big" for Access, then simply upgrade at that point...
At least you are upgrading a normalized system,
and you are not relying on a system that combines tables every night...

Besides, even if the limit were reached, combining the data into one table would still involve manipulating the same amount of "too big" data...
In fact, technically it would be double:
the original/separate data + the combined data (in the same DB)


Again, I have no issue with combining the data being the solution here, ...only that a system like this may end up being hard to manage, and be more work than simply normalizing the design.


JeffCoachman
 
allenkent (Author) commented:
Great point, thank you. My concern with the 2 million was with Access. We are writing live data from a machine at 1 record every second for 10 minutes at a time. I was under the impression that with Access, once the table got into the millions of records, it would have a hard time writing to that table quickly enough.

 
Scott McDaniel (Microsoft Access MVP - EE MVE), Infotrakker Software, commented:
Where are you running the code referred to in comment #37046341? That's TSQL code, not Access code.
 
Scott Pletcher, Senior DBA, commented:
>> I see that if I run it as is with TX1 and TX2 as tables, it goes into an infinite loop. <<
D'OH, sorry, miscoding.  Corrected below.

>> QUESTION: How do I set it to start at the TX57 table, and what if they are not in order? <<
If there can be gaps, you will have to set an arbitrary ending value and BREAK when you reach that value:

--TRUNCATE TABLE TX_ALL

DECLARE @table_counter smallint
DECLARE @table_name nvarchar(128)

--SET @table_counter = 1 --normal start at 1
SET @table_counter = 57 --<<--override to start at 57

WHILE 1 = 1
BEGIN
    IF @table_counter > 200 --<<-- change to whatever max limit you need
        BREAK
    SET @table_name = 'dbo.' + 'TX' + CAST(@table_counter AS varchar(5))
    IF OBJECT_ID(@table_name, 'U') IS NULL
    BEGIN
        SET @table_counter = @table_counter + 1
        CONTINUE
    END
    EXEC(N'INSERT INTO TX_ALL SELECT * FROM ' + @table_name)
    SET @table_counter = @table_counter + 1
END

