SQL Backup skipping a few tables

Dustin Saunders asked:

I've got a Microsoft CRM database that I need to work with in order to prep for a data conversion to our new internal software.

However, this database is 40GB, and it's problematic for me to get fresh copies of it from production.  That said, there are two tables, about 30GB in size, that are just blobs I don't need.  Can I run a backup into a .bak and exclude those from being exported?  (This is SQL 2005, FYI.)
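For context: SQL Server's native BACKUP can't exclude individual tables; the smallest unit it can skip is a file or filegroup. If the two blob tables lived on their own filegroup, a filegroup backup of PRIMARY alone would leave them out. A rough sketch, assuming a hypothetical BLOBS filegroup already holds those tables (all names are placeholders):

-- Back up only the PRIMARY filegroup; the BLOBS filegroup is skipped
BACKUP DATABASE CRM_Prod
    FILEGROUP = 'PRIMARY'
    TO DISK = 'D:\Backups\CRM_Primary.bak'
    WITH INIT;

Restoring that requires a piecemeal restore (RESTORE ... WITH PARTIAL), and tables on the skipped filegroup stay unavailable, so it only helps when the dev copy genuinely never touches them.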
SOLUTION by Vitor Montalvão (content available to Experts Exchange members only)
Do you need an up-to-the-minute copy, or can you get the backup that (hopefully) runs on a schedule (nightly / hourly)?
Dustin Saunders (ASKER):

I just need to grab a copy every now and then, but waiting for the 40GB file to move to my dev environment takes a lot of time.

I'll need to write and test the SQL queries that will be used when we do the go-live on the new software (so I will need to pull a fresh copy on that day too).  I'm thinking at this point my best bet is going to be to copy this database to a new one, skipping those tables, and then grab that .bak.
ASKER CERTIFIED SOLUTION (content available to Experts Exchange members only)
I ended up creating a new database with just the tables I needed, then backing that up -- but both of these posts contain good information on how to accomplish this in another way.
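A minimal sketch of that approach, with placeholder database and table names (note that SELECT INTO copies structure and data but not keys, indexes, or constraints):

-- Create the slimmed-down copy
CREATE DATABASE CRM_Dev;
GO

-- Copy each table you need, simply never copying the big blob tables
SELECT * INTO CRM_Dev.dbo.Account FROM CRM_Prod.dbo.Account;
SELECT * INTO CRM_Dev.dbo.Contact FROM CRM_Prod.dbo.Contact;
-- ...repeat for the rest of the needed tables
GO

-- Back up the much smaller database for the dev environment
BACKUP DATABASE CRM_Dev TO DISK = 'D:\Backups\CRM_Dev.bak' WITH INIT;
GO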
How do you keep those tables up to date?
I don't, but I don't have a need to.

As mentioned, I'm doing a data conversion from our existing database into our new system, so I just need a dummy copy of the database to write the data conversion SQL against: enough to play with the data structure, test for conversion errors, etc.

On go-live day, I'll run the data conversion against the actual production database.