Using Source Safe with SQL Server

Posted on 2009-05-07
Medium Priority
Last Modified: 2013-11-25

We have just installed a Visual SourceSafe solution on our server and we are trying to figure out whether it's possible to place database objects under source control.  Initial searches suggest only stored procedures, but it would be nice if we could also keep track of DB structure changes.  Does anyone know if this is possible?


Question by:MClarkCCR
LVL 43

Expert Comment

ID: 24326319
Instead of the db objects themselves you should store the scripts that create/update those objects. The point of keeping this in VSS is that you can recreate the objects from scratch up to the latest state, or to any given point, which is exactly what SQL scripts make feasible.

It means you have to create a script for every database structure/definition change, run the script, and store it in VSS. Even some lookup and application configuration data may be created by script.
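To make this concrete, a change script checked into VSS might look like the following. (The table and column names are invented for illustration; the existence check makes the script safe to re-run.)

-- 2009-05-07  Add a Phone column to the Customers table
IF NOT EXISTS (SELECT * FROM INFORMATION_SCHEMA.COLUMNS
               WHERE TABLE_NAME = 'Customers' AND COLUMN_NAME = 'Phone')
   ALTER TABLE Customers ADD Phone varchar(20) NULL;

You check this file in, run it against the database, and never edit it again; the next change gets its own script.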
LVL 22

Expert Comment

by:Tapan Pattanaik
ID: 24326393
hi MClarkCCR,
                         Open Microsoft SQL Server Management Studio; in the File menu click Source Control, then click Launch Microsoft Visual SourceSafe, and add whatever files you want.

i.e. Microsoft SQL Server Management Studio --> File --> Source Control --> Launch Microsoft Visual SourceSafe --> (after Visual SourceSafe opens you can add whatever files you want)


Author Comment

ID: 24326827

We would like it to act in a similar way to how we work with websites in Visual Studio.

We check out files to a local copy of the project on our machines and test changes on a local IIS (not ASP.NET development server).  When we're happy with the changes we check in our files back to the VSS DB.  

We also have a public development server that we use so that clients can view the changes/progress on their site without affecting the live production version.  We deploy from VSS to the public dev server only when we've got everything working as expected, so the client doesn't call us in a panic every time the dev site throws an exception :).

We want the same to hold true for our DB, so that we can check database data and structure changes in and out of VSS without actually modifying the DB that is wired to our public dev version of the site.  Then, when we feel it's ready, we will push the DB stored in VSS to the version that the public dev site uses.  We essentially want to be able to change/break whatever we want without the client knowing, and only update what they can see when we're ready.

As far as scripting goes, can the process be automated so that any changes I make to the data/structure of the DB are put into a script, like an iteration?  Or will I just have to rescript the entire database creation every time there's a release and then store that script in VSS?


After I open VSS and create a new project (i.e. MyNewDevDB) and choose to add files, do I have to go to the files on the DB server and add the .mdf and log files directly?

Thanks so much for your responses,

LVL 51

Expert Comment

by:Mark Wills
ID: 24329079
You really should be treating each change to the database as a program. Write the program first, test it, put it into SourceSafe, then apply it to the real database.

With my site there are NO changes to the database without a script, and each and every script is in SourceSafe. As each script is run, it updates a central table, placing an entry with what it is, the version, etc...

Sounds a bit stifling, but it works like a charm and we can always find any script and any version... did have a bit of pushback from the developers at first, but now everyone does it and there are no complaints.
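A minimal sketch of such a central table and the entry each script would add (the table and column names here are my own invention, not a standard):

-- run once per database
CREATE TABLE ScriptLog (
   ScriptName  varchar(200)  NOT NULL,
   Version     varchar(20)   NOT NULL,
   AppliedOn   datetime      NOT NULL DEFAULT GETDATE(),
   AppliedBy   nvarchar(100) NOT NULL DEFAULT CURRENT_USER
);
-- last statement of every change script
INSERT INTO ScriptLog (ScriptName, Version)
VALUES ('0042_add_phone_column.sql', '1.3.0');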
LVL 43

Accepted Solution

pcelba earned 1000 total points
ID: 24331153
I agree that each change must be scripted. Then the question is whether it is necessary to store the scripts in VSS, because once a script is finalised and executed it should not be changed any more, and all additional changes require another script... OK, scripts stored in VSS are at least better versioned.

Development work on database objects is rather different from application development. To allow purely local development you would also need a local copy of the database. I don't know what data you are storing/processing in your database, but it is obviously not suitable for versioning in VSS. A database backup is the most common method for versioning and for preserving the latest structures. Scripts or bulk exports of lookup and initial data are the only exceptions worth versioning in VSS.

Scripts are obviously "incremental", and they should always check whether they have been executed already, to avoid being applied twice. Scripting the whole database is not necessary, because the same result can be achieved by applying all the incremental scripts (in the proper order, of course).
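One common way to implement that "already executed" check, assuming a central log table like the one Mark described above (the table and script names below are illustrative only):

-- guard at the top of every incremental script
IF EXISTS (SELECT * FROM ScriptLog
           WHERE ScriptName = '0042_add_phone_column.sql')
BEGIN
   PRINT 'Script 0042 already applied - skipping.';
   RETURN;   -- exits the current batch
END
-- ...the actual change and the ScriptLog INSERT follow here...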

Your public dev server is the obvious solution for large applications. You then need an internal development database where you apply finished changes from all team members, plus local (temporary) databases for long tasks which could otherwise block others from working on the internal development database.

Automated scripting is a very good idea, and you might program it yourself for creating incremental scripts. But it is not necessary, because each change in your development should start with script creation, so all scripts are prepared by developers in advance. Scripting the whole database is easier, but you obviously don't need such scripts; they are useless against production data because they would destroy the existing data.
LVL 51

Assisted Solution

by:Mark Wills
Mark Wills earned 1000 total points
ID: 24339626
Versioning, date/time and user are all important aspects and reasons why VSS matters, and being able to roll back / retrieve / inspect a specific version is important.

If you allow changes directly to the database, it can get cumbersome to control. Some procedures and changes can involve a few attempts and really should be regarded as a development task like any other. Just because the database facilitates instant and immediate change does not mean it can bypass all the controls we know and trust in a classic development task.

We insist on specific formatting of any script. It must contain a specific layout and documentation up the top. The database developers can test and change on a local instance of the database, promote to a common dev database for more integrated testing, and it is then ready for QA. The QA process includes checking the script's compliance, running the script (we have written a stored procedure to facilitate, check and log the script), and testing the results. When handed over to QA, it essentially stops being a development project per se and becomes part of the QA and acceptance-testing space.
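Purely as an illustration of what the documented layout "up the top" of a script might look like (this is a sketch of the idea, not Mark's actual format; all names are hypothetical):

/*
   Script   : 0042_add_phone_column.sql
   Author   : jsmith (hypothetical)
   Date     : 2009-05-07
   Purpose  : Add a nullable Phone column to Customers
   Depends  : 0041_create_customers.sql
   Expected : Customers gains a varchar(20) column named Phone
*/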

Part of the development plan is to know why the script is being generated and what the expected results are. Part of the QA process is to ensure it runs and then to allow a testing stage to begin. Part of the QA environment is having an appropriately representative, reliable and predictable database as a basis. Once approved in QA, it can then be 'packaged' or scheduled for release to the "live" databases.

Sounds like a lot of work, but set up the environment and the processes and it will work, with surprises few and far between.

Having said that, you can use DDL triggers; while they are more automated, they are harder to track and trace...

-- create a log table to record all DDL events
CREATE TABLE ddl_log (ChangedOn datetime, ChangedBy nvarchar(100), Event nvarchar(100), TSQL nvarchar(2000));
GO
-- create a DDL trigger on this database that logs every DDL event
CREATE TRIGGER trg_ddl_log
ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
BEGIN
   DECLARE @data XML;
   SET @data = EVENTDATA();
   INSERT ddl_log (ChangedOn, ChangedBy, Event, TSQL)
   VALUES (GETDATE(), CONVERT(nvarchar(100), CURRENT_USER), @data.value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(100)'), @data.value('(/EVENT_INSTANCE/TSQLCommand)[1]', 'nvarchar(2000)'));
END;
GO
-- test the trigger with a few DDL statements
CREATE TABLE TestTable (a int);
GO
ALTER TABLE TestTable ADD b varchar(100);
GO
CREATE VIEW vw_testtable AS
SELECT * FROM TestTable;
GO
ALTER VIEW vw_testtable AS
SELECT a, b FROM TestTable;
GO
CREATE PROCEDURE usp_get_testtable (@a int)
AS
SELECT * FROM TestTable WHERE a = ISNULL(@a, a);
GO
-- now tidy up the example DDL objects
DROP VIEW vw_testtable;
DROP PROCEDURE usp_get_testtable;
DROP TABLE TestTable;
GO
-- now let us look at what was logged...
SELECT * FROM ddl_log;
GO
-- finish tidying up our testing (drop the trigger first, or dropping ddl_log would fire a trigger that inserts into it)
DROP TRIGGER trg_ddl_log ON DATABASE;
DROP TABLE ddl_log;

LVL 51

Expert Comment

by:Mark Wills
ID: 24339688
Oh, forgot one important thing... DDL triggers capture stuff AFTER they have happened - so can sometimes be a case of "yep, that is what killed the machine" rather than preventing nasties before they happen....
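One mitigating point worth knowing, though: because the DDL statement is still inside the trigger's transaction when the trigger fires, a DDL trigger can issue a ROLLBACK and undo the change, so it can prevent some nasties rather than merely record them. A small sketch (the trigger name is my own):

-- block ad-hoc DROP TABLE in this database; such changes must go through a script
CREATE TRIGGER trg_no_drop_table
ON DATABASE
FOR DROP_TABLE
AS
BEGIN
   PRINT 'DROP TABLE is not allowed here - submit a change script instead.';
   ROLLBACK;
END;

(DROP TRIGGER trg_no_drop_table ON DATABASE removes it again.)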

Author Comment

ID: 24339798

That is certainly a lot of information to take in.  I really appreciate you guys' insight into this situation.  As I'm not exactly a DBA by trade, a lot of this stuff seems a bit over my head, but I've definitely got some good starting points for researching a solution that could work for our development process.  Our actual process is still in its infancy though, as we've previously relied solely on cowboy coding and all the 'perks' that go along with it.

Thanks for all your help,

LVL 51

Expert Comment

by:Mark Wills
ID: 24339963
Mike, it has been a pleasure, and it's good to be thinking like this while the process is still in its infancy. Easier to start with a few new rules than to break old bad habits :)
