Transaction Log FULL due to ACTIVE_TRANSACTION error?

I was running a simple TRUNCATE TABLE and INSERT INTO script for a very large table that has over 100 million rows. I'm transferring this table's data from our PRODUCTION server to our DEVELOPMENT server, and about 3 hours into the transfer I received this error:

(1 row(s) affected)
Msg 9002, Level 17, State 4, Line 2
The transaction log for database 'DB1' is full due to 'ACTIVE_TRANSACTION'.

I think the process completed, but I'm checking into this now.

Can someone please assist me in understanding what this error means and what (if anything) is the issue?
Asked by MIKE, Software Solutions Consultant

Guy Hengel [angelIII / a3], Billing Engineer, commented:
In short, it means the transaction log file was full and could not be extended (either because a maximum size is set, the disk is full, or the file could not grow fast enough), so the transaction failed and will be rolled back.

For such large transfers, use export/import instead, and/or do the load in several parts.
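If you want to confirm what is holding the log before deciding, something like this will show it (a minimal sketch; 'DB1' is taken from the error message, the rest is standard DBCC/DMV usage):

DBCC SQLPERF (LOGSPACE);          -- percent of log space used per database

SELECT name, log_reuse_wait_desc  -- should report ACTIVE_TRANSACTION for DB1
FROM sys.databases
WHERE name = 'DB1';

DBCC OPENTRAN ('DB1');            -- shows the oldest open transaction pinning the log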
MIKE, Software Solutions Consultant (Author), commented:
My original table was TRUNCATED and not reloaded... so what do I need to do now?

If the transaction log is full, do I need to TRUNCATE it, or...?
MIKE, Software Solutions Consultant (Author), commented:
BTW, this is our DEV server, so I'm not too worried about the fix; I just need to be able to move the data into this TABLE somehow.
Guy Hengel [angelIII / a3], Billing Engineer, commented:
To clear the tlog, perform a transaction log backup. Then copy the data in parts (for example, one year of data at a time).
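A rough sketch of that approach, assuming the copy runs over a linked server named PRODLINK and the table has a date column to split on (the table, column, and path names here are placeholders only):

BACKUP LOG DB1 TO DISK = N'D:\Backups\DB1_log.trn';   -- frees reusable log space
-- (only meaningful in FULL/BULK_LOGGED recovery; in SIMPLE the log clears itself at checkpoints)

DECLARE @yr int = 2005;
WHILE @yr <= 2015
BEGIN
    INSERT INTO dbo.BigTable (Col1, Col2, LoadDate)
    SELECT Col1, Col2, LoadDate
    FROM PRODLINK.DB1.dbo.BigTable          -- hypothetical linked server to PROD
    WHERE YEAR(LoadDate) = @yr;

    -- back up the log between batches so it never has to hold the whole load at once
    BACKUP LOG DB1 TO DISK = N'D:\Backups\DB1_log.trn';
    SET @yr += 1;
END;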
Scott Pletcher, Senior DBA, commented:
Sadly, SQL Server does not smoothly handle the restore of a single table.

You can't prevent logging on an INSERT no matter what you do, and 100M rows will always generate a huge amount of log in total, even if the load is done in chunks.

Particularly if this is something you might have to do again in the future, I'd consider a workaround. Restore the database or filegroup to another db name, then use a synonym in the original db to point that table to the new db (as long as foreign keys don't prevent this approach). You can then drop the original table.
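Roughly, the workaround looks like this (a sketch only; the backup path, logical file names, and the name DB1_Copy are made-up placeholders):

RESTORE DATABASE DB1_Copy
FROM DISK = N'D:\Backups\DB1_full.bak'
WITH MOVE 'DB1_Data' TO N'D:\SQLData\DB1_Copy.mdf',
     MOVE 'DB1_Log'  TO N'D:\SQLData\DB1_Copy_log.ldf',
     RECOVERY;

USE DB1;
DROP TABLE dbo.BigTable;                                -- the already-truncated copy
CREATE SYNONYM dbo.BigTable FOR DB1_Copy.dbo.BigTable;  -- original name now resolves to the restored copy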