Solved

Create a database of only selected objects with a script

Posted on 2010-09-03
285 Views
Last Modified: 2012-06-21
Hello, I want to create a new database from a pre-existing database. However, I want to pick and choose the objects I need and the data contained therein. I don't want to mirror or back up the database. I want a script that creates the database and only the applicable objects, and that can generate the chosen objects and some of their actual data on command.

The database is a SQL Server 2005 Enterprise database. I can peruse the existing database to get my objects; it's not down, it just needs to be trimmed. I would like to move metadata based on a config table that picks and chooses what to move (I am not set on this, and if you have a better idea I'm all ears). I would prefer a script so I can alter it later as the data changes, for new schemas, tables, functions, etc.

Below is the code I have started. I would like to know whether there is a built-in way to move objects and data with the least possible effort, but one I can extend with a WHERE clause to pick and choose what to move.

Any help is much appreciated.
use TempLoc
GO

if object_ID('FTG.TablesInMain') is not null
	drop table FTG.TablesInMain
GO

create table FTG.TablesInMain
	(
		DatabaseName	varchar(64)
	,	SchemaName	varchar(32)
	,	TableName	varchar(64)
	,	TableType	varchar(16)
	,	Remarks		varchar(64)  NULL
	,	Remove_Keep	bit
	)

Use Main
GO

insert into TempLoc.FTG.TablesInMain
	(
		DatabaseName
	,	SchemaName
	,	TableName
	,	TableType
	,	Remarks
	)
exec sp_tables
GO

update FTG.TablesInMain
set Remove_Keep = 1  -- 1 is keep in my example
GO

/*
------------------------------------
Where it gets interesting: actually setting up data to be
excluded
------------------------------------

update FTG.TablesInMain
set Remove_Keep = 0
where TableName in (specifics will be added later to hunt for unnecessary tables)

exec (some sproc to go off this table to move data based on the bit in the Remove_Keep column)

------------------------------------
Need more help on possible functions that could actually move
the data, and on other built-ins that find the triggers and such.
If I could use the Object_ID for most of this I would be fine
with that; I just need to know the syntax for finding the objects
and moving them.
------------------------------------
*/
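By way of illustration only (the name patterns here are hypothetical placeholders, not my real criteria), the exclusion update I have in mind would look something like:

```sql
-- Sketch: flag tables to exclude from the new database.
-- The LIKE patterns are hypothetical examples of "unnecessary" tables.
update FTG.TablesInMain
set Remove_Keep = 0
where TableName like 'tmp%'
   or TableName like 'bak%'
```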

Question by:CNCDeveloper

6 Comments

Expert Comment

by:ewangoya
ID: 33598159
Why not just use the database scripting already provided in Management Studio, then transfer your data with DTS?

In SQL Management Studio:
Right-click on the database - Tasks - Generate Scripts

This will let you select which objects you want to script, then save the script to a file.
You can then use the Import/Export wizard to transfer your data.
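If you want a preview of which user objects the wizard will offer, a quick catalog query can list them first (a sketch only; `sys.objects` and `sys.schemas` are the standard SQL Server 2005 catalog views, and the filtering is up to you):

```sql
-- Sketch: list the user objects (tables, procs, views, functions)
-- that Generate Scripts can script, so you know what you'll be ticking.
select
	s.name as SchemaName
,	o.name as ObjectName
,	o.type_desc
from sys.objects o
	join sys.schemas s on o.schema_id = s.schema_id
where o.is_ms_shipped = 0          -- skip system-shipped objects
order by o.type_desc, s.name, o.name
```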
 

Author Comment

by:CNCDeveloper
ID: 33600857
I did not know that part of SSMS existed, thanks.

I will go with this method, but is there any way to make it run based on object sets I define saying what to copy and what not to? The database I am copying has close to 380 tables, 456 stored procedures, and numerous other things. Going through a wizard may take a while.

For the data transfer I was hoping to keep it to a single script but still pick and choose the data to transfer. You can fashion your own query with DTS instead of just bulk copying a table, correct?

I know the wizards can do a lot, but they need a person unchecking a lot of stuff too.

Accepted Solution

by:
Anthony Perkins earned 250 total points
ID: 33601306
>>The database I am copying has close to 380 tables, 456 stored procedures, and numerous other things.  Going through a wizard may take a while.<<
The good news: you can do this with SSMS by first selecting all the tables and then outputting the script.
The bad news: you will have to do it for each object type, and worse than that, you will have to do it once per table in order to get the indexes.

What you really need is a third-party tool to do this.  Red Gate's SQL Compare comes to mind.

Author Comment

by:CNCDeveloper
ID: 33843584
Well, unfortunately what I really needed was a dynamic data-scripting tool. I understand that with complex procedures it is not always easy to explain what you want. The scripting tool in SSMS is very limited in that it may generate errors when copying massive tables (more than 3 million rows). I wanted to create a tool to generate scripts. What I eventually did is below. It is basically three parts, with a fourth left up to the user: generating a list of tables from their own database.

1st part: get your column names from the sys.columns catalog view so you can build a select list into a single string. I chose to make a view that gives me something like 'select cast(column1 as varchar(max)) + ', ' + cast(column2 as varchar(max)) ...' etc.

2nd part: dynamically select this data from the view in a procedure. I build the statement to run as one long string, then run it dynamically in SQL.

3rd part: Part 2 only produces a single script, so to move quickly across hundreds of tables I need to loop it. I set up a looping sproc to run it from.

4th part: the table that Part 3 runs from. You need a master list of tables plus a criteria column recording which tables to keep.

There are certain issues this will of course run into:

1. Varchar(max) does not deal well with varbinary. You could go a level deeper than I did and add case logic to your column listings in the view. I thought that required too much effort, as my database, although massive, had only 6 tables storing varbinary; I just did those manually.

2. Constraints may upset this process. I would find someone online who has a dynamic way to remove all constraints and print them out for reapplying once data insertion is done.

3. Not all tables have an identity column, so I could have set a bit somewhere indicating whether a table has one. I did not do that. It's pretty simple, though, to just remove the two lines that reference it.

4. You should not run this script on tables over 5 million rows except one at a time, and you may get an ASP.NET error if you try to copy and paste out of SSMS. On extremely large tables I would recommend running Part 2 by itself, with the table listed explicitly, and outputting to a file.
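For issue 2 above, one possible sketch (using only the standard `sys.foreign_keys`, `sys.tables`, and `sys.schemas` catalog views; adapt to your database) is to generate the disable and re-enable statements for every foreign key, so they can be printed before and after the load:

```sql
-- Sketch: emit statements to disable all FK constraints before the
-- data load and re-enable them (with re-validation) afterwards.
select
	'alter table ' + quotename(s.name) + '.' + quotename(t.name)
	+ ' nocheck constraint ' + quotename(fk.name) as DisableStmt
,	'alter table ' + quotename(s.name) + '.' + quotename(t.name)
	+ ' with check check constraint ' + quotename(fk.name) as EnableStmt
from sys.foreign_keys fk
	join sys.tables t on fk.parent_object_id = t.object_id
	join sys.schemas s on t.schema_id = s.schema_id
```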

This tool actually helps a lot, though, as the SSMS wizard is slow and clunky and you have to set options. There may be better tools I don't have, as stated, like Red Gate's, or there may be SSIS ways to automate the existing wizard faster. I did not see much, though, short of reinventing the wheel each time for a dataset I wanted to copy. I could have saved multiple wizard runs, but you would have had to point and click through each one and save it. I chose my method mainly because I could run it FAST.

Sorry I did not reply earlier, and I know Red Gate probably has a publishing tool that puts this to shame.


-- Part 1 is to create a view that can take the columns apart.  I
-- need to create a string to build a script from.

create view {MySchema}.vIdentityInsert as

with obj as
	(
	select
		s.name + '.' + o.name as TableName
	,	o.name as tbl
	,	o.object_id
	,	o.create_date
	,	o.modify_date
	from {MyDatabase}.sys.objects o
		join {MyDatabase}.sys.schemas s on o.schema_id = s.schema_id
		and o.type_desc = 'USER_TABLE'
	)
, cols as
	(
	Select
		TableName
	,	obj.object_id
	,	obj.tbl
	,	create_date
	,	modify_date
	,	(
		select
			name as n
		from {MyDatabase}.sys.columns c
		where c.object_id = obj.object_id
		for xml auto
		) as cols
	from obj
	where TableName in (select TableName from DataStaging.FTG.CreditnetTables)
	)
, report as
	(
	select
		TableName
	,	object_id
	,	tbl
	--,	create_date
	--,	modify_date
	,	replace
		(
			left(right(cols,len(cols) -6),len(cols)-9)
		,	'"/><c n="', '{|}'
		) as ColumnsData
	from cols
	)
select
	*
,	'Insert into ' + TableName + ' ( ' + replace(ColumnsData, '{|}',',') + ' ) Values' as insertstmt
,	'isnull(cast(' + replace(ColumnsData, '{|}', ' as varchar(max)),''NULL'') + ''{|}'' + isnull(cast(') +
	' as varchar(max)), ''NULL'') as Data' as TableInsert
from report
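A quick usage sketch of the view (the table name here is hypothetical): per table it returns the ready-made INSERT header and the expression that stringifies one row.

```sql
-- Sketch: inspect what the view produced for one table before
-- feeding it to the scripting procedure in Part 2.
select TableName, insertstmt, TableInsert
from {MySchema}.vIdentityInsert
where TableName = 'dbo.SomeTable'   -- hypothetical table name
```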



-- Part 2 is to create a procedure that can script this data but
-- be dependent on the table you put in, to get its sys object
-- values from

if object_id('{MySchema}.spScripter') is not null
	drop proc {MySchema}.spScripter
GO

create proc {MySchema}.spScripter
	(
		@Table	varchar(128)
	)
as

declare
	@Cols	varchar(max)
,	@Ins	varchar(256)
,	@SQL	varchar(max)

Select
	@Cols = TableInsert
,	@Ins = insertstmt
from {MySchema}.vIdentityInsert
where TableName = @Table

Set @SQL =
'
declare @Temp table ( Data	varchar(max) )

insert into @Temp values (''Use {NewDatabaseName}'')
;

insert into @Temp values (''set identity_insert ' + @Table + ' on'')
;

insert into @Temp
select ' + @Cols + '
from {MyDatabase}.' + @Table + '
;

update @Temp
set Data = '''''''' + replace(Data,''{|}'','''''','''''') + ''''''''
where Data != ''set identity_insert ' + @Table + ' on''
and Data != ''Use {NewDatabaseName}''
;

update @Temp
set Data = replace(Data,''''''NULL'''''',''NULL'')
where Data like ''%NULL%''
;

update @Temp
set Data = ''' + @Ins + ' ('' + Data + '')''
where Data != ''set identity_insert ' + @Table + ' on''
and Data != ''Use {NewDatabaseName}''
;

insert into @Temp values (''set identity_insert ' + @Table + ' off'')
;

insert into @Temp values (''GO'')
;

insert into @Temp values ('''')
;

Select * From @Temp
'

exec (@SQL)
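A calling sketch for Part 2 (the table name is hypothetical): run it with SSMS set to Results to Text, or via sqlcmd with -o, so the generated INSERT statements land in a file rather than a grid.

```sql
-- Sketch: script out a single table's data.  On very large tables
-- (issue 4 above) run this one table at a time, output to file.
exec {MySchema}.spScripter @Table = 'dbo.SomeTable'   -- hypothetical name
```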



-- Part 3 is to automate this on multiple tables with a query that
-- loops the Part 2 stored procedure

use DataStaging

if object_ID('{MySchema}.spTablesValidater') is not null
	drop proc {MySchema}.spTablesValidater
GO

create proc {MySchema}.spTablesValidater
	(
		@Type		varchar(16)
	,	@ResultType	varchar(16)
	)
as

declare @Temp table
	(
		TableID		int		identity(1,1)
	,	TableName	varchar(128)
	)

declare
	@Name	varchar(128)
,	@Start	int

insert into @Temp
select TableName
from {MySchema}.{TablesToRunTable}
where Keep_Remove = @Type

Set @Start = 1

while @Start <= (select count(*) from {MySchema}.{TablesToRunTable} where Keep_Remove = @Type)
begin

	Select @Name = TableName from @Temp where TableID = @Start

	if @ResultType = 'Script'
	begin
		exec {MySchema}.spScripter @Name
	End

	set @Start = @Start + 1

End

-- 4th part is a table that is a listing of tables to go off of.
-- I chose to use the built-in procedure 'sp_tables' after my
-- 'use {Database to choose from}'.  Then I added a column for
-- identification of what to move or not (Keep_Remove) and added
-- another sproc to populate its criteria.
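Calling sketch for the looping proc, assuming the Keep_Remove convention from my original question ('1' = keep; the parameter values here are illustrative):

```sql
-- Sketch: script out every table flagged as 'keep' in the master list.
exec {MySchema}.spTablesValidater @Type = '1', @ResultType = 'Script'
```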

 

Expert Comment

by:Anthony Perkins
ID: 33971631
>>I know Redgate probably has a publishing tool that puts this to shame.<<
That is correct.

Author Closing Comment

by:CNCDeveloper
ID: 34021702
This was a difficult thing to ask for, so I did not expect a complete answer, but rather a list of possibilities, which I did receive.
