
Solved

Best Way to Archive Gigantic SQL Table

Posted on 2012-04-06
5
Medium Priority
258 Views
Last Modified: 2012-06-21
We have an audit table in our ERP database that is 51 GB in size. I need to archive the data in this table and shrink it significantly. I tried selecting all the records out to a file, but that took HOURS, filled up the drive space, and brought the system to a crawl; I eventually ended up killing that process. I need to know the best way to archive this data, and how to do it on a regular basis. Once it's down to a manageable size, I could write it to a table on another drive and truncate this one, or at least delete records older than six months or so.

I do have a full system backup that runs every night, so everything is backed up, but what I'd really like is to get this single table onto a Blu-ray disc or just onto another network drive, because the odds of someone actually requesting any of this data are truthfully slim to none. But as soon as I truncate the table, someone will want to know something. We tried truncating in our development environment and it was really fast, and irreversible - LOL.

Any suggestions?
Question by:Carla Romere
5 Comments
 
LVL 15

Accepted Solution

by:
Deepak Chauhan earned 1000 total points
ID: 37816991
In your scenario, first create a secondary filegroup on another drive (or a shared drive) and partition this table across that new filegroup, splitting on data older than six months.

To do this, first create the filegroup with at least one file, then create a partition function and a partition scheme, and finally move the table onto the filegroup according to the partition function.
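A minimal sketch of those steps, assuming SQL Server 2005+ (partitioning requires Enterprise Edition on these versions), a hypothetical database `ERPDB`, audit table `dbo.AuditLog` with a `datetime` column `AuditDate`, and an example file path and boundary date:

```sql
-- 1. Create a secondary filegroup with one file on another drive.
ALTER DATABASE ERPDB ADD FILEGROUP ArchiveFG;

ALTER DATABASE ERPDB
ADD FILE (
    NAME = ArchiveData,
    FILENAME = 'E:\SQLData\ArchiveData.ndf',  -- path on the secondary drive
    SIZE = 10GB
) TO FILEGROUP ArchiveFG;

-- 2. Partition function: one boundary at the six-month cutoff.
--    RANGE RIGHT puts rows older than the boundary in the first partition.
CREATE PARTITION FUNCTION pfAuditByDate (datetime)
AS RANGE RIGHT FOR VALUES ('2011-10-06');

-- 3. Partition scheme: old partition on ArchiveFG, current data on PRIMARY.
CREATE PARTITION SCHEME psAuditByDate
AS PARTITION pfAuditByDate TO (ArchiveFG, [PRIMARY]);

-- 4. Rebuilding the clustered index on the scheme moves the table onto it.
CREATE CLUSTERED INDEX cix_AuditLog_AuditDate
ON dbo.AuditLog (AuditDate)
WITH (DROP_EXISTING = ON)
ON psAuditByDate (AuditDate);
```

The index rebuild in step 4 is itself a large, logged operation on a 51 GB table, so it is best run in a maintenance window.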
 
LVL 25

Assisted Solution

by:jogos
jogos earned 1000 total points
ID: 37817554
For the future:
Can you move that audit table to its own database? Then when it gets big you just back it up, save the backup for the archives, truncate the table, and start again for a new period.
This also has other advantages:
- security: fewer people need permissions on the whole production database
- the large table doesn't put stress on maintenance (backup/restore/index/statistics) of your production DB

For now:
If you want to keep that data usable, the best place for it is a database. Create a new database with the SIMPLE recovery model and copy the records over in manageable batches (one commit per batch, for the sake of the log file). Then back that database up for your archive and drop it. You can do this in database-sized units, for example one database per six months of audit logs.

Once all the current data has been copied, you can indeed truncate the table on production.
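A rough sketch of that batched copy, assuming hypothetical names (`ArchiveDB` in SIMPLE recovery, source table `ERPDB.dbo.AuditLog` with an identity key `AuditID`, and an example batch size and backup path):

```sql
DECLARE @batch int = 50000;
DECLARE @lastID int = 0;

-- Copy rows in batches; each INSERT is its own transaction, so the
-- log in ArchiveDB never has to hold more than one batch at a time.
WHILE 1 = 1
BEGIN
    INSERT INTO ArchiveDB.dbo.AuditLog
    SELECT TOP (@batch) a.*
    FROM ERPDB.dbo.AuditLog AS a
    WHERE a.AuditID > @lastID
    ORDER BY a.AuditID;

    IF @@ROWCOUNT = 0 BREAK;

    SELECT @lastID = MAX(AuditID) FROM ArchiveDB.dbo.AuditLog;

    CHECKPOINT;  -- lets SIMPLE recovery reuse log space between batches
END

-- Archive the copy, then (only after verifying the backup) clear production.
BACKUP DATABASE ArchiveDB TO DISK = 'E:\Backups\AuditArchive.bak';
-- TRUNCATE TABLE ERPDB.dbo.AuditLog;
```

Keyset paging on `AuditID` keeps each batch cheap; a `WHERE NOT EXISTS` check would also work but gets slower as the archive grows.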
 

Author Comment

by:Carla Romere
ID: 37817572
Unfortunately, I can't move the audit table to another database; it's built into our ERP system. One of the problems was that we were auditing too many fields on one of the tables, which created hundreds of rows for a single change. We've modified the tables being audited so it won't grow out of control again; I just have to get this data backed up somewhere I can get to it, and then I can truncate and start fresh. I'll be digging into this on Tuesday of next week to see if I can get somewhere.
 
LVL 25

Expert Comment

by:jogos
ID: 37817616
<<One of the problems was we were auditing too many fields >>
The classic ... though I've seen worse: auditing everything on the initial load of a huge ERP. "Why is it so slow, and why does it take so much space?" Yeah, right.
 

Author Closing Comment

by:Carla Romere
ID: 37833981
As always, thanks for the help.
