Microsoft SQL Server 2008
49K Solutions | 17K Contributors

Microsoft SQL Server 2008 is a suite of relational database management system (RDBMS) products providing multi-user database access functionality. Component services include integration (SSIS), reporting (SSRS), analysis (SSAS), data quality, master data, T-SQL and performance tuning. Major improvements include the Always On technologies and support for unstructured data types.


Ever needed a SQL 2008 database replicated, mirrored, or log shipped to another server, but you can't take the downtime inflicted by the initial snapshot, or the disconnect while transaction logs are restored or the mirror is applied?
You can use SQL Server's "initialize from backup" transactional replication, as described below.
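As a rough sketch of how that looks in T-SQL (the publication, server, database, and backup file names below are placeholders, not values from a real setup), you allow the publication to be initialized from a backup and then create the subscription pointing at an existing full backup instead of a snapshot:

-- On the publisher: allow subscriptions to be initialized from a backup
EXEC sp_changepublication
    @publication = N'MyPublication',
    @property = N'allow_initialize_from_backup',
    @value = N'true';

-- Create the subscription against the full backup restored on the subscriber
EXEC sp_addsubscription
    @publication = N'MyPublication',
    @subscriber = N'SubscriberServer',
    @destination_db = N'MyDatabase',
    @sync_type = N'initialize with backup',
    @backupdevicetype = N'disk',
    @backupdevicename = N'\\BackupShare\MyDatabase_full.bak';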
Author Comment by:lcohan (LVL 40)
Yes, there are some SQL code samples under each "dot"/item number and in my opinion they should stay in that place for the step by step instructions to be helpful.
Author Comment by:lcohan (LVL 40)
Updated now and thanks for all the details/guidance.
0

This article explains how to reset the password of the sa account on a Microsoft SQL Server.  The steps in this article work in SQL 2005, 2008, 2008 R2, 2012, 2014 and 2016.
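The reset itself is a one-liner once you are connected with sysadmin rights (for example after starting the instance in single-user mode); the password below is just a placeholder:

ALTER LOGIN sa ENABLE;
ALTER LOGIN sa WITH PASSWORD = N'NewStrongPassword!1';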
Expert Comment by:Mark Wills (LVL 51)
Good Article and very well presented. Gets my YES vote :)
Expert Comment by:Abrienne Jonethan
Very good and informative article.
I have also gone through other blogs, as there are other ways to reset a lost SA password in Microsoft SQL Server.
Apart from the manual approach, you can also use free software such as SQL Password Recovery.
0
In this article we will learn how to recover data that has been deleted accidentally. We really can recover deleted rows, provided we know the time the data was deleted, by using the transaction log.
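The usual point-in-time approach (the database name, file paths, and timestamp below are placeholders) is to restore the last full backup WITH NORECOVERY and then roll the transaction log forward to just before the delete:

RESTORE DATABASE MyDB
    FROM DISK = N'C:\Backups\MyDB_full.bak'
    WITH NORECOVERY, REPLACE;

RESTORE LOG MyDB
    FROM DISK = N'C:\Backups\MyDB_log.trn'
    WITH STOPAT = N'2017-10-31 14:32:00', RECOVERY;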
Expert Comment by:Jason clark (LVL 10)
Thanks for sharing this helpful article. In my scenario, though, I didn't have any recent backup of my deleted records, and the recovery model was set to simple. I then tried SysTools SQL Log Analyzer and successfully recovered the deleted records via the SQL transaction log file. It's really helpful if you don't have any backup of your database.
Expert Comment by:parag Deshmukh
Hi Yashwant,

One thing I did not get: since we are taking a full backup of the database before the rows are deleted, restoring that backup should directly give us all the rows. Then why do we need the transaction log backup?
0
Use this article to create a batch file to back up a Microsoft SQL Server database to a Windows folder. The folder can be on the local hard drive or on a network share. This batch file will query the SQL server to get the current date and time and will then append that to the file name.
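The core of such a script is a BACKUP DATABASE statement whose file name is built from GETDATE(); the database name and folder below are placeholders, and a batch file would typically pass something like this to sqlcmd:

DECLARE @stamp varchar(20);
DECLARE @file nvarchar(260);

-- e.g. 2017-10-31_143000
SET @stamp = REPLACE(REPLACE(CONVERT(varchar(19), GETDATE(), 120), ':', ''), ' ', '_');
SET @file = N'D:\SQLBackups\MyDB_' + @stamp + N'.bak';

BACKUP DATABASE MyDB TO DISK = @file WITH INIT;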
Expert Comment by:Shaad Parihar
The above code is for one database.
If I have multiple databases and I want to use a single batch file, how do I add multiple databases to the code?
0
How to leverage one TLS certificate to encrypt Microsoft SQL traffic and Remote Desktop Services, versus creating multiple tickets for the same server.
0
In this article I will describe the Copy Database Wizard method as one possible migration process, and I will add the extra tasks needed for an upgrade where they apply, so that both scenarios are covered.
Expert Comment by:Yashwant Vishwakarma (LVL 7)
Nice & informative article Vitor !!!
Voted as Good Article :)
Author Comment by:Vitor Montalvão (LVL 52)
Thank you Yashwant
0
In this article I will describe the Backup & Restore method as one possible migration process, and I will add the extra tasks needed for an upgrade where they apply, so that both scenarios are covered.
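In outline (the paths, logical file names and database name below are placeholders), the move is a backup on the source instance followed by a restore WITH MOVE on the destination, after which you would raise the compatibility level:

-- On the source instance
BACKUP DATABASE MyDB TO DISK = N'\\FileShare\MyDB.bak' WITH INIT;

-- On the destination instance
RESTORE DATABASE MyDB
    FROM DISK = N'\\FileShare\MyDB.bak'
    WITH MOVE N'MyDB_Data' TO N'D:\Data\MyDB.mdf',
         MOVE N'MyDB_Log' TO N'E:\Logs\MyDB_log.ldf';

ALTER DATABASE MyDB SET COMPATIBILITY_LEVEL = 100;  -- 100 = SQL Server 2008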
Expert Comment by:Jim Horn (LVL 66)
Voted Yes.
Expert Comment by:Yashwant Vishwakarma (LVL 7)
Nice Explanation !!!
0
In this article I will describe the Detach & Attach method as one possible migration process, and I will add the extra tasks needed for an upgrade where they apply, so that both scenarios are covered.
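A minimal sketch of the mechanics (the database name and file paths are placeholders): detach on the source instance, copy the files, then attach on the destination instance:

-- On the source instance
EXEC sp_detach_db @dbname = N'MyDB';

-- Copy MyDB.mdf and MyDB_log.ldf to the new server, then on the destination instance:
CREATE DATABASE MyDB
    ON (FILENAME = N'D:\Data\MyDB.mdf'),
       (FILENAME = N'E:\Logs\MyDB_log.ldf')
    FOR ATTACH;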
Expert Comment by:Jim Horn (LVL 66)
Nicely done.  Voted Yes.
0
Occasionally there is a need to clean table columns, especially if you have inherited legacy data. There are obviously many ways to accomplish that, including elaborate UPDATE queries with anywhere from one to numerous REPLACE functions (even nested within REPLACE functions). Over time this can become quite unreadable. It can also become a maintenance chore when it is not a single one-off effort.

This happened in a recent question here on EE. I'd like to share the solution that came out of that thread. I have written a function called udf_CleanDataFn. It can be invoked within any query as follows:
 
SELECT dbo.udf_CleanDataFn(YourColumn, @Options)

@Options is a varchar(255) field, which can accept any of the following "options":
 
  • ltrim - This will left-trim the value in YourColumn.
  • rtrim - This will right-trim the value in YourColumn.
  • trim - This will both left- and right-trim the value.
  • nocomma - This will remove any and all commas.
  • nospace - This will remove any and all spaces.
  • nopunc - This will remove all standard punctuation characters (!@#$%, etc.).
  • nonum - This will remove all numbers (0-9).
  • noalpha - This will remove all alpha characters (A-Z and a-z).
  • alphaonly - This will remove everything except alpha characters.
  • numonly - This will remove everything except numbers.

Following are a few working examples that demonstrate the different results. This first example takes out only the commas:
 
SELECT dbo.udf_CleanDataFn('This !  is *  a  13,21  @  test', 'nocomma')
Result: This !  is *  a  1321  @  test
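This is not the author's udf_CleanDataFn, but a minimal sketch of the idea behind it, handling just three of the options above:

CREATE FUNCTION dbo.udf_CleanDataSketch (@Value varchar(8000), @Options varchar(255))
RETURNS varchar(8000)
AS
BEGIN
    -- each option is simply looked for inside the @Options string
    -- (a fuller implementation would distinguish trim from ltrim/rtrim)
    IF @Options LIKE '%nocomma%' SET @Value = REPLACE(@Value, ',', '')
    IF @Options LIKE '%nospace%' SET @Value = REPLACE(@Value, ' ', '')
    IF @Options LIKE '%trim%'    SET @Value = LTRIM(RTRIM(@Value))
    RETURN @Value
END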
Expert Comment by:Nick Sawe
ok thanks, i will give it a go. thanks for sharing this by the way !
Author Comment by:dsacker (LVL 20)
You're welcome. If it proves helpful, please click the "Helpful" button. :)
0
If you have heard of RFC822 date formats, they can be quite a challenge in SQL Server. RFC822 is an Internet standard format for email message headers, including all dates within those headers. The RFC822 protocols are available in detail at:
 

The problem is that RFC822 date formats do not always fit within SQL Server's valid date formats. The CONVERT/CAST functions within SQL Server will only convert a limited variety of date formats. When trying to convert RFC822 date formats, some of the RFC822 accepted formats will result in the well-known T-SQL error:
 
Msg 241, Level 16, State 1, Line 1 
Conversion failed when converting date and/or time from character string.

According to the RFC822: Standard for ARPA Internet Text Messages, as long as a date adheres to the constraints in section 5 of the RFC822 protocols, it is a valid RFC822 date. Many RFC822 Dates fail to convert into a T-SQL datetime format.

For instance, the following is a valid RFC822 date:
 
Wed, 27 Sep 2006 21:49:19 GMT

This format (and a number of other RFC822 formats) will not convert to a datetime type. In addition, SQL Server does not inherently recognize or convert time zone codes. (See footnote 1 at the end of this article for a list of valid RFC822 date formats.)
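To see the difference with the example above, stripping the day name and the trailing time zone token is usually enough for CONVERT to accept this particular shape of RFC822 date (the remaining value is still expressed in GMT, not local time); the function described below generalizes this idea:

-- Fails with Msg 241:
-- SELECT CONVERT(datetime, 'Wed, 27 Sep 2006 21:49:19 GMT')

-- Succeeds once the day name and time zone token are removed:
SELECT CONVERT(datetime, '27 Sep 2006 21:49:19')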

I have written a function called …
Author Comment by:dsacker (LVL 20)
Appreciate that. And thanks to Eric. He gave me some good pointers along the way.
Expert Comment by:Jim Horn (LVL 66)
I ran the code blocks and the result was excellent.  Nice job.
0

The SQL Server engine lets you use either a Windows account or a SQL Server account to connect to a SQL Server instance. This can be configured during the SQL Server installation, or afterwards in the Server Authentication section of the Server Properties screen, Security tab.
By default, Windows Authentication mode is selected, since it is the one recommended by Microsoft and also the more secure option: it uses Windows security tokens, so no passwords are passed over the network when connecting to the SQL Server instance. It is also the easier one to use, since the user doesn't need to remember one more login name and password, and the easier one to maintain, since logins are managed by Active Directory (AD) and so respect the user's AD policies.
The creation of a Windows authentication login is as simple as providing the domain and user name as the login name. The same can be performed using the following T-SQL command:
CREATE LOGIN [MyDomain\DomainUserName] FROM WINDOWS


In Windows authentication mode the sa login is disabled so ensure that someone else has the sysadmin role.


SQL Server logins are managed inside the SQL Server engine. The passwords are stored in the SQL Server master database, and when connecting to the SQL Server instance the user credentials pass over the network. So, when creating a SQL Server authentication login, a login name and password need to be provided. The same can be performed using the following T-SQL command:


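A minimal example (the login name and password are placeholders):

CREATE LOGIN MySqlLogin
    WITH PASSWORD = N'StrongPassword!1',
         CHECK_POLICY = ON;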

Author Comment by:Vitor Montalvão (LVL 52)
Submitted
0
Tracking database activity

There are a number of ways to view SQL Server activity; SQL Profiler is one of the most popular, and it allows you to see the number and type of transactions being executed. However, what if you want to see the level of activity on each table in your database? How many rows are being queried or changed during a period of time, or every second? A tool like SQL Profiler will give you the type and volume of transactions in motion at any given time (and the T-SQL actually being executed), but you can't use it to directly tell which database tables are being touched and how many rows are affected.

Using sys.dm_db_index_usage_stats
 
However, as of SQL Server 2008, there's a handy dynamic management view called sys.dm_db_index_usage_stats that shows you usage counts (seeks, scans, lookups and updates) for all the tables and indexes in your database, either since the object was created or since the database instance was last restarted:
 
SELECT *
  FROM sys.dm_db_index_usage_stats



However, it uses object and index ID numbers and isn't very friendly, as well as displaying usage numbers for system-based processes, which aren't of much use when you're comparing application activity. That said, here's a query for the same view, but using object/index names:
 

SELECT OBJECT_NAME(ius.object_id) as TableName,
       si.name as IndexName,
       si.index_id as IndexID,
       ius.user_seeks,
       ius.user_scans,
       ius.user_lookups,
       ius.user_updates
  FROM sys.dm_db_index_usage_stats ius
  JOIN sys.indexes si ON si.object_id = ius.object_id AND si.index_id = ius.index_id
 WHERE ius.database_id = DB_ID()

Expert Comment by:Jim Horn (LVL 66)
Voted yes.  This article was very useful for me as I've been asked to come up with a list of unused databases / tables in preparation for an AWS migration.  Thanks for writing..
Expert Comment by:Andrew Red
Good info!
But there is a problem in the query: you missed inserting the "index_id" into the table and then using it to do the join!

-- Do actual table comparison and give results
      SELECT a.TableName,
               a.updatedrows - isnull(b.UpdatedRows,0) as RowsUpdated,
              CONVERT(INT, (a.updatedrows - isnull(b.UpdatedRows,0)) / @SecondsSince) as RowsPerSecond
       FROM #TableActivity_After a
       LEFT
       JOIN #TableActivity_Before b
         ON b.TableName = a.TableName AND b.index_id = a.index_id   -- the missing join condition
    WHERE a.updatedrows - isnull(b.UpdatedRows,0) > 0
      ORDER BY RowsUpdated DESC
0
A couple of weeks ago, I encountered an extremely difficult problem while deploying 2008 SSIS packages to a new environment (SQL Server 2014 Standard).

My scenario is: we have one C# application that calls 2008 R2 SSIS packages to load text files from C# code, not via the dtexec tool. One of my clients upgraded their system to SQL Server 2014 while other clients have kept the 2008 version, but are using the same 2008 SSIS packages. We don't want to duplicate packages because it would take more time to manage (one set for SQL 2008 and another for SQL 2014). My problem is how to run them in both environments successfully. Some components of 2008 SSIS refer to the 10.0.0.0 version of .dll files, such as the Script Task component, which do not exist in the new environment, so they fail at run time. We will see this error message (or something like it):

System.Runtime.InteropServices.COMException: Retrieving the COM class factory for component with CLSID {BA785E28-3D7B-47AE-A4F9-4784F61B598A} failed due to the following error: 80040154 Class not registered (Exception from HRESULT: 0x80040154 (REGDB_E_CLASSNOTREG)).

After researching, I found a solution to this problem: inserting some configuration entries into the .config file of our application.
<runtime>
    <gcServer enabled="true"/>
    <disableCommitThreadStack enabled="true"/>
    <generatePublisherEvidence enabled="false"/>
    <assemblyBinding 


Expert Comment by:Andrei Fomitchev (LVL 8)
I had similar issues with DTS/SSIS and 32-bit/64-bit.
I re-implemented SSIS with C# and stored procedures - it no longer has any dependency on the MS SQL Server version.
Author Comment by:Dung Dinh (LVL 8)
Hi Andrei,

Your approach is the simplest, but it only works if you have just a few SSIS packages.

How long would you spend if you had several hundred packages?

Thanks,
0
A Guide to installing Microsoft Failover Clustering and SQL Failover Clustering on VMware vSphere on 2 Windows VMs across 2 VMware hosts

VMware vSphere supports clustering using MSCS across virtual machines. Clustering virtual machines can reduce the hardware costs of traditional high-availability clusters.

Clustering Configuration Overview

Several applications use clustering, including stateless applications such as Web servers, and applications with built-in recovery features such as Database servers. You can set up MSCS clusters in several configurations, depending on your environment.

A typical clustering setup includes:
 
  • Disks that are shared between nodes. A shared disk is required as a quorum disk. In a cluster of virtual machines across physical hosts, the shared disk must be on a Fibre Channel (FC) SAN.
  • A private heartbeat network between nodes
 
Pre-Requisites
 
  • Failover Clustering feature is available with Windows Server 2008/R2 and above Enterprise/Datacenter editions. You don't have this feature with the Standard edition of Windows Server 2008/R2
  • You also need a form of Shared Storage (FC)
  • To use the native disk support included in failover clustering, use basic disks, not dynamic disks and format as NTFS
  • 1 or more Windows 2008 R2 Domain Controller Virtual Machines with Active Directory Services and a Domain
  • 1 x Windows Server 2008 R2 Virtual Machine for Node 1 of the Windows Cluster with 2 NICs
0
 
Author Comment by:rhiancohen (LVL 1)
Hi LHerrou,

Thank you for your comments.

Can you expand on your two points, as I would really like to know where the article can be improved and, especially, how to make sure it is not dangerous to anyone.

1. Would it be better to base this article on Windows Server 2012 and SQL 2012? Did you mean that using Windows Server 2008 R2 is dated? The VMware version can be 4 or 5 upwards for this article. What do you mean when you say much of the content is not original? Do you mean other people have written similar articles?

2. Could you tell me what advice could be construed as dangerous and I will amend it asap. Any feedback is welcome, as I didn't want to put anyone at risk with any of this article's content.

Many Thanks

Rhian
0
Hi all,

It is important, and often overlooked, to understand "Database properties". We often see questions about "log files" or "where is the database", and one of the easiest ways to get general information about your database is to use "Database properties". I am writing here a very brief explanation of each property of a SQL database and will discuss the details in future articles.

To show the properties, use SSMS, right-click on the database name and select Properties. A pop-up window will be displayed with options (or pages) on the left and the relevant information on the right. So, once you have it displayed, let's start discussing the properties one by one.


1.  General Properties
On the General page you can see the following properties (a T-SQL equivalent is shown after the list):

Last Database Backup: shows the timestamp of the last backup.
Last Database Log Backup: shows the timestamp of the last log backup.
Name: shows the name of the database.
Status: shows the status of the database (Online, Offline, etc.).
Owner: shows the owner of the database (sa, dbo or any other user).
Date Created: shows the creation date of the database.
Size: shows the space allocated to the database.
Space Available: shows the space still available to the database (allocated space - used space).
Number of Users: shows the number of users who have permission on the database.
Collation: shows the current collation of the database. Collation is a big topic in itself; we will cover it later in this series.
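Most of these values can also be read with T-SQL instead of the SSMS dialog; for example (the database name is a placeholder):

SELECT name,
       state_desc,                      -- Status
       SUSER_SNAME(owner_sid) AS owner_name,
       create_date,
       collation_name
FROM sys.databases
WHERE name = 'MyDB';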


2.  Files
Expert Comment by:Ramesh Babu Vavilla (LVL 10)
Good one
Expert Comment by:Yashwant Vishwakarma (LVL 7)
Nice information shared dude !!!
0
Naughty me. While I was changing the database name from DB1 to DB_PROD1 (yep, it's not the real database name ^v^), I changed the database name and notified my application fellows that I had done it. They turned on the application, and everything was working. A few minutes later, alerting emails started to flood my phone - "Hi there, Log Shipping is broken........." I had made the wrong assumption that log shipping would be just intelligent enough to work with the changes. Think about it again -- with the way log shipping was configured, how would it know the database name was changed and reconfigure itself? I should have turned off log shipping first. Shame on me.

So, what's next? The application is back online so there is no turning back (or the manager would be very upset). Here is what can be done to rectify the messy situation. I researched what to do, and this MSDN article came up:

How to: Remove Log Shipping (Transact-SQL)

So firstly, connect to the primary database server to break the relationship between the primary and secondary database. This is done by running the built-in stored procedure named sp_delete_log_shipping_primary_secondary.  Here's the example:

EXEC master.dbo.sp_delete_log_shipping_primary_secondary
@primary_database = N'DB1'
,@secondary_server = N'PrimaryServer'
,@secondary_database = N'DB1_DR'



Easily done without hassle. Then, connect to the secondary server and remove the log shipping job information with the stored procedure named sp_delete_log_shipping_secondary_database, again here is the code:

sp_delete_log_shipping_secondary_database
@secondary_database = 'DB1_DR'


Expert Comment by:Angel Vyang
Thank you so much. Very nice article. It's worked for me.
0
A long way back, we had to rely on third-party tools in order to encrypt and decrypt data. Gradually Microsoft understood the need for this feature and started to implement it by building the functionality into SQL Server. Finally, with SQL 2008, we have improved, built-in facilities for encryption and decryption within SQL Server.

Encrypting and decrypting data is a crucial task: one mistake and your data is out of your reach. It can become more difficult when you encrypt data in a database on one server and try to restore that database onto a different server.

In this article, I am going to show you a script which can encrypt data in a database on one server, take a backup of it, and restore that encrypted database anywhere else, and you will still get your data back securely and without data loss.

We will now create one database which is going to be used throughout this example.

 
--CREATE First Database for encryption and decryption testing
USE master;
GO
CREATE DATABASE Encry1
ON 
( NAME = Encry1_dat,
    FILENAME = 'C:\Encry1Data.mdf',
    SIZE = 3,
    MAXSIZE = 5,
    FILEGROWTH = 1 )
LOG ON
( NAME = Encry1_log,
    FILENAME = 'C:\Encry1Log.ldf',
    SIZE = 1MB,
    MAXSIZE = 5MB,
    FILEGROWTH = 1MB ) ;
GO



Once the database is ready, let us move on to create a table; we will encrypt data in that table after inserting some records.

 
--create one table which will use encrypted and decrypted data in it
Use Encry1
GO
Create Table emps
(
      name varchar(20),
      dept varchar(20),
      EncryptedName Varbinary(256),
      DecryptedName Varchar(20)
)

Insert into emps (Name,Dept)
Select 'Ritesh','MIS' union all
Select 'Rajan','Acct'
union all
Select 'Bhaumik','IT'

Select * from emps
go



OK. Now we have the database and table ready for encryption. Before we actually encrypt the data, we should be armed with some weapons. Let us prepare them.

 
--create one Database Master Key
Create master key
Encryption by Password ='$qlhub1234'



To know more about the database master key, see the Microsoft documentation for CREATE MASTER KEY.

Now we will need a certificate based on the database master key we generated above.

 

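A minimal example of such a certificate (the name and subject are placeholders):

--create a certificate protected by the database master key
CREATE CERTIFICATE Encry1Cert
    WITH SUBJECT = 'Certificate for encrypting the emps table';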

Expert Comment by:Jim Horn (LVL 66)
Very well written.  Voted Yes.
Expert Comment by:Crazy_SQL
Well Written
0
There have been several questions about Large Transaction Log Files in SQL Server 2008, and how to get rid of them when disk space has become critical. This article will explain how to disable full recovery and implement simple recovery that carries much smaller transaction log files.

Here is a recommended MS document showing proper procedures for truncating log size. I recommend you review this article before proceeding: http://msdn.microsoft.com/en-us/library/ms189085(v=SQL.100).aspx

The recovery model in SQL Server is set by default for all databases to "FULL". In the full recovery model, transactions are kept in the transaction log file, which grows alongside the database file and, if log backups are not taken, can reach outrageous sizes. There are methods of shrinking these transaction log files, but that doesn't always provide a long-term solution. Full recovery is recommended so that in the event of corruption or failure the database can be recovered to the very minute the database went down. Simple recovery will only allow you to recover the database to the last backup copy of the database.

If you are okay with losing the ability to restore to a point in time after the last backup, you can switch to the simple recovery model and free up a large amount of space on your server by following these simple steps, which worked for me:

Perform a Full Backup of the Database

Open SQL Management Studio

Browse to the database with the large transaction log file and right click the database, go…
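The T-SQL equivalent of those steps (the database and log file logical names below are placeholders) is roughly:

ALTER DATABASE MyDB SET RECOVERY SIMPLE;

USE MyDB;
DBCC SHRINKFILE (MyDB_log, 1);   -- shrink the log file; it will regrow as needed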
Expert Comment by:chapmandew (LVL 60)
I would never recommend this route to shrink a transaction log.
Expert Comment by:Mark Wills (LVL 51)
The article makes more sense if you were to follow the recommendations of the Author in the second paragraph :

I recommend you review this article before proceeding: http://msdn.microsoft.com/en-us/library/ms189085(v=SQL.100).aspx

It covers some of the concerns mentioned above and provides far greater / more in depth account of what the Transaction log is and does along with the links on how to manage in various conditions.

I agree by and large with some of the concerns, and the Article should be considered as a last resort, and there are other methods as described in the MSDN article.

Truncating the log and reducing the physical disk consumption are quite different. Going from FULL recovery to SIMPLE would be "if all else fails" and life is fairly critical (like transaction log has filled the disk).

Still, the first approach if you must shrink the physical size of the transaction log would be DBCC SHRINKFILE as per : http://msdn.microsoft.com/en-us/library/ms178037(v=sql.100).aspx
0
by Mark Wills

PIVOT is a great facility. Well that was the opening line in my previous similarly named article : http://www.experts-exchange.com/Microsoft/Development/MS-SQL-Server/SQL-Server-2005/A_653-Dynamic-Pivot-Procedure-for-SQL-Server.html

But what if you want to pivot multiple columns, or maybe multiple aggregates, or both, or don't have the PIVOT statement?  That's exactly what happened recently in a question where the SQL 2008 server was actually pointing to a SQL 2000 database.

So how do we solve such a problem?  Simple: we go back to basics and write our own pivot routine and that is exactly what the solution was for the above situation (seen here: http://www.experts-exchange.com/Microsoft/Development/MS-SQL-Server/SQL-Server-2005/Q_26675089.html).

Basically, we need to get the different values which will become the new columns, aggregate something for those columns, and then do a group by and case statement to check for those columns. Sounds easy, right? Well, not quite.  For the specific problem it can be easier than the approach actually taken, but when you start playing with dates (such as month names) then you have to be able to make sure that the correct sequence is achieved and that adds a few challenges when trying to resolve the problem using data to help build the dynamic SQL.
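In manual form, that pattern looks roughly like this (the table and column names here are invented purely for illustration):

select CustomerName,
       sum(case when datename(month, OrderDate) = 'January'  then Amount else 0 end) as January,
       sum(case when datename(month, OrderDate) = 'February' then Amount else 0 end) as February
from Orders
group by CustomerName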

So let's get started.  But first we need some data...

-- first our customer masterfile (yeah I know it is pretty ordinary)

CREATE TABLE 


Author Comment by:Mark Wills (LVL 51)
Hi keithfearnley,

Thanks for the feedback and for the additional insights.

Apologies for not acknowledging your comment earlier, was looking at extending the routines for a MTD and YTD requirement and only just noticed your comment.

Cheers,
Mark
Expert Comment by:Leo Torres (LVL 8)
Hello Mark,
I noticed that you used master..spt_values.
Why does it work with Month, Day and Year, but not Quarter?
I have added some commented code, but it was only for analysis.

USE [AdventureWorks2008R2]
GO
/****** Object:  StoredProcedure [dbo].[uPivot]    Script Date: 02/28/2012 13:56:48 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO

ALTER PROCEDURE [dbo].[uPivot] (
                 @sourcedata varchar(8000),
                 @Pivot_Over_Source_Column varchar(2000), 
                 @pivot_value_column varchar(2000), 
                 @Pivot_Value_Aggregates varchar(2000),    
                 @pivot_value_format varchar(500) = NULL)
AS

/*
@sourcedata is any legitimate select statement
@Pivot_Over_Source_Column is the column(s) that define the row
@pivot_value_column is the source of data giving the new column names
@Pivot_Value_Aggregates is what the individual cells will contain NB mandatory square brackets immediately after each aggregate
@pivot_value_format is used as the convert style for @pivot_value_column with the additions of 'day' and 'month' and 'year'

-----------------------------+-----------------------+-----------------------+
Pivot_Over_Source_Column     |Pivot_Value_Column 1   |Pivot_Value_Column 2   |
-----------------------------+-----------------------+-----------------------+
row1                         |Pivot_Value_Aggregates |Pivot_Value_Aggregates |
-----------------------------+-----------------------+-----------------------+
row1                         |Pivot_Value_Aggregates |Pivot_Value_Aggregates |
-----------------------------+-----------------------+-----------------------+

Now those aggregates in @Pivot_Value_Aggregates (yes there can be more than one)
must be expressed as a legit aggregate followed by [] or [suffix] (no spaces)

e.g.	1) sum(my_column)[]       
	2) count(my_column)[_count], max(my_column)[_max]

The first example will simply have the headings as defined by @pivot_value_column
The second will have two columns the first as per @pivot_value_column plus the suffix '_count' 
the second column as per @pivot_value_column plus the suffix '_max'

*/

Set nocount on

declare @sql varchar(8000)
declare @columns varchar(8000)
declare @id int

-- we need to use a few temp tables to unpack our delimited list of columns.
-- while these temp tables will survive just within this stored procedure, 
-- I have an old habit of dropping first - mainly because I write these things initially just as T-SQL code then turn it into a SP

If object_id('tempdb..#pivot_column_name','U') is not null 
	drop table #pivot_column_name
If object_id('tempdb..#pivot_column_style','U') is not null 
	drop table #pivot_column_style
If object_id('tempdb..#pivot_column_data','U') is not null 
	drop table #pivot_column_data

create table #pivot_column_name (id int identity,pivot_column_name varchar(255))
create table #pivot_column_data (id int identity,pivot_column_name varchar(255),pivot_column_data varchar(255),pivot_column_donor varchar(255))
create table #pivot_column_style (id int identity,pivot_column_style varchar(5))

-- now unpack our delimited list of pivot columns

insert into #pivot_column_name (pivot_column_name)
select substring(pivot_value_column,number,charindex(',',pivot_value_column+',',number)-number) 
from (select @pivot_value_column as pivot_value_column) p
cross join (select number from master..spt_values where type = 'p')  n
where substring(','+pivot_value_column,number,1) = ','
and number <= len(pivot_value_column)


--Set @sql =
--'insert into #pivot_column_name (pivot_column_name)
--select substring(pivot_value_column,number,charindex('','',pivot_value_column+'','',number)-number) 
--from (select '+@pivot_value_column+' as pivot_value_column) p
--cross join (select number from master..spt_values where type = ''p'')  n
--where substring('',''+pivot_value_column,number,1) = '',''
--and number <= len(pivot_value_column)'

--Print @sql

--exec @sql

--insert into #pivot_column_style (pivot_column_style)
Select a.* Into #temp from (
select substring(pivot_column_style,number,charindex(',',pivot_column_style+',',number)-number) as id,n.high,n.low,n.number,n.status,n.type
from (select isnull(@pivot_value_format,'Quarter') as pivot_column_style) p
cross join (select s.high,s.low,s.number,s.status,s.type from master..spt_values s /*where type = 'p'*/)  n
where substring(','+pivot_column_style,number,1) = ','
and number <= len(pivot_column_style)
) a
Select * from #temp

insert into #pivot_column_style (pivot_column_style)
select substring(pivot_column_style,number,charindex(',',pivot_column_style+',',number)-number)
from (select isnull(@pivot_value_format,'') as pivot_column_style) p
cross join (select number from master..spt_values where type = 'p')  n
where substring(','+pivot_column_style,number,1) = ','
and number <= len(pivot_column_style)
 --and unpack out delimited list of formats needed for those columns

--Set @sql =
--'insert into #pivot_column_style (pivot_column_style)
--select substring(pivot_column_style,number,charindex('','',pivot_column_style+'','',number)-number)
--from (select isnull('+@pivot_value_format+','''') as pivot_column_style) p
--cross join (select number from master..spt_values where type = ''p'')  n
--where substring('',''+pivot_column_style,number,1) = '',''
--and number <= len(pivot_column_style)'

--Print @sql

--exec @sql

Select * from #pivot_column_name
Select * from #pivot_column_style

-- now we can examine our source data to get the unique values for the new columns

Select @id=min(id) from #pivot_column_name

While @id>0
Begin
    Select @pivot_value_column=pivot_column_name from #pivot_column_name where id = @id
    set @pivot_value_format = isnull((select pivot_column_style from #pivot_column_style where id = @id),'')
    set @sql = 'select distinct ''' + -- next column is the definition we need to use when comparing the actual data in the final query
               case when isnumeric(isnull(@pivot_value_format,'')) = 1 then 'case when convert(varchar(255),' + @pivot_value_column +','+@pivot_value_format +')'
                    when isnull(@pivot_value_format,'') = 'Day' then 'case when datename(Day,' + @pivot_value_column +')'
                    when isnull(@pivot_value_format,'') = 'Month' then 'case when datename(Month,' + @pivot_value_column +')'
                    when isnull(@pivot_value_format,'') = 'Year' then 'case when datename(Year,' + @pivot_value_column +')'
                    when isnull(@pivot_value_format,'') = 'Quarter' then 'case when datename(Quarter,' + @pivot_value_column +')'

                    else 'case when convert(varchar(255),' + @pivot_value_column + ')'
               end
               +''','+    -- next column is the data we need to use when comparing the actual data in the final query
               case when isnumeric(isnull(@pivot_value_format,'')) = 1 then ''', convert(varchar(255),' + @pivot_value_column +','+@pivot_value_format +')'
                    when isnull(@pivot_value_format,'') = 'Day' then ' datename(Day,' + @pivot_value_column +')'
                    when isnull(@pivot_value_format,'') = 'Month' then ' datename(Month,' + @pivot_value_column +')'
                    when isnull(@pivot_value_format,'') = 'Year' then ' datename(Year,' + @pivot_value_column +')'
                    when isnull(@pivot_value_format,'') = 'Quarter' then ' datename(Quarter,' + @pivot_value_column +')'

                    else ' convert(varchar(255),' + @pivot_value_column + ')'
               end
               +','+ @pivot_value_column 
               +' from ('+ @sourcedata +') srce order by '+@pivot_value_column
    Print @sql
    insert into #pivot_column_data(pivot_column_name,pivot_column_data,pivot_column_donor)
    exec (@sql)
    Select @id=min(id) from #pivot_column_name where id > @id
end 
Select * from #pivot_column_data
-- Because of dates we need to keep the original datetime to get correct chronology, but that could cause dupes, so no must remove any dupes from our column list

delete #pivot_column_data -- kill any dupes
where exists (select * from #pivot_column_data d2 where d2.id > #pivot_column_data.id and d2.Pivot_column_name = #pivot_column_data.pivot_column_name and d2.pivot_column_data = #pivot_column_data.pivot_column_data)

-- with a distinct list of new columns, we can now construct the aggregate values for each of the columns

select @columns = isnull(@columns+',',',') + replace(replace( @pivot_value_aggregates,'(','(' + Pivot_Column_name +' =''' + pivot_column_data + ''' THEN ' ),')[', ' ELSE '''' END) as [' + pivot_column_data )
from #pivot_column_data

-- with the columns fully defined, it becomes a fairly simple group by 

set @sql = 'select ' + @pivot_over_source_column+@columns + ' from ('+ @sourcedata +') srce GROUP BY ' + @pivot_over_source_column
exec (@sql)

-- now clean up - unnecessary inside a stored procedure, but again that old habit...

if object_id('tempdb..#pivot_column_name','U') is not null drop table #pivot_column_name
if object_id('tempdb..#pivot_column_data','U') is not null drop table #pivot_column_data
If object_id('tempdb..#pivot_column_style','U') is not null drop table #pivot_column_style

-- and that is that.


0

After restoring a Microsoft SQL Server database (.bak) from backup or attaching .mdf file, you may run into "Error '15023' User or role already exists in the current database" when you use the "User Mapping" SQL Management Studio functionality to allow necessary user permissions to the new database. This is puzzling enough since a database backup should bring us back to the original state. This is caused by Security Identification numbers (SID) that are mismatched or 'orphaned' in the sysusers table.

To resolve this problem, we can make use of the SQL Server stored procedure sp_change_users_login.

From MSDN, the syntax of sp_change_users_login is:

sp_change_users_login [ @Action= ] 'action'
    [ , [ @UserNamePattern= ] 'user' ]
    [ , [ @LoginName= ] 'login' ]
    [ , [ @Password= ] 'password' ]
[;]

1. Switch to the target database


use mydatabasename



2. List problematic SID


Lists the users and corresponding security identifiers (SID) in the current database that are not linked to any login. user, login, and password must be NULL or not specified. This step is informative and thus optional.

exec sp_change_users_login @Action='Report'



3. Fix problematic SID


Links the specified user in the current database to an existing SQL Server login. User and login must be specified. password must be NULL or not specified. The login name(s) to provide is/are from the result reported in step 2.

exec sp_change_users_login @Action='Update_One', @UserNamePattern='MyLoginID', @LoginName='MyLoginID'



Repeat the 'Update_One' statement for all affected login names.

4. Verify all is okay now
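Verification is simply re-running the report from step 2; it should now return no rows:

exec sp_change_users_login @Action='Report'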

2
 
Expert Comment by:Yashwant Vishwakarma (LVL 7)
Very helpful article
Voted as Good Article :)
0
When we have to pass multiple rows of data to SQL Server, developers either have to send one row at a time or come up with workarounds such as using XML to pass the data, which is complex and tedious. There is a SqlBulkCopy object available in .NET to send multiple rows of data to SQL Server at once, but its output still cannot be passed to a stored procedure. SQL Server 2008 provides a new feature called "Table-Valued Parameters". This gives us the ability to easily pass a table to a stored procedure from T-SQL code or from an application as a parameter.

Table-Valued Parameters

SQL Server 2008 provides a new feature called Table-Valued Parameters, which lets us easily pass a table to a stored procedure from T-SQL code or from an application as a parameter.
We first need to create a user-defined table type:

Database Node > Programmability > Types > User-Defined Table Types


Script to create a User-Defined Table type
--Create User-defined Table Type
 CREATE TYPE dbo.MessageQueue AS TABLE 
 (
    id int PRIMARY KEY, 
    MessageType varchar(20) NOT NULL,  
    MessageContent varchar(1000) NOT NULL,
    PushDate datetime NOT NULL DEFAULT GETDATE()
 )
 GO
 
 --Using the User-Defined Table Type
 DECLARE @MyMessageQueue MessageQueue
 
 INSERT INTO @MyMessageQueue(id ,MessageType,MessageContent,PushDate)
 VALUES (1,'SMS','Hello World','2009-09-30 10:00:00'),
          

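The payoff is passing that type into a stored procedure; the procedure name below is a placeholder (not one from the article), and the parameter must be declared READONLY:

--A sketch of passing the table type to a stored procedure
CREATE PROCEDURE dbo.usp_PushMessages
    @Messages dbo.MessageQueue READONLY
AS
BEGIN
    SELECT * FROM @Messages   -- in real code: insert into a queue table, etc.
END
GO

DECLARE @MyMessageQueue dbo.MessageQueue

INSERT INTO @MyMessageQueue (id, MessageType, MessageContent)
VALUES (1, 'SMS', 'Hello World')

EXEC dbo.usp_PushMessages @Messages = @MyMessageQueue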

1
This is basically a blog post I wrote recently. I've found that SARGability is poorly understood, and since many people don't read blogs, I figured I'd post it here as an article.

SARGable is an adjective in SQL that means that an item can be found using an index (assuming one exists). Understanding SARGability can really impact your ability to have well-performing queries.

Incidentally – SARGable is short for Search ARGument Able.

If you have an index on phone numbers using LastName, followed by FirstName, including the suburb and address fields, you have something akin to the phone book. Obviously it becomes very easy to find people with the surname “Farley”, with the first name “Rob”, but often you want to search for people with the surname “Farley” with the first name beginning in ‘R’. I might be listed as “R Farley”, “R J Farley”, “Rob Farley”, “Robert Farley”, “Robert J. Farley”, or a few other variations. It complicates things even more if you need to find someone with a name that shortens a different way, like John/Jack, or Elizabeth/Betty. This is where SARGability comes into play.

Let’s just think about the First names for a minute.

If you want to find all the names that start with R, that’s easy. They’re all together and you can get to them very quickly. This is comparable to a query in SQL Server like this, (taking advantage of the index on the Name column in Production.Product)
    select Name, ProductID
    from Production.Product
    where Name like N'R%'


Expert Comment by:Kevin Cross (LVL 60)
Very nicely done, Rob!
I finally got off my tail to put my official "Yes" vote above, but you won me over on the first reading.

Cheers,
Kevin
Expert Comment by:Jim Horn (LVL 66)
Nice article. Useful for me as I'm studying for the 70-433 exam.  

btw msmvps.com linky no worky
0
I have written a PowerShell script to "walk" the security structure of each SQL instance to find:  

      Each Login (Windows or SQL)
            * Its Server Roles
            * Every database to which the login is mapped
            * The associated "Database User" for this login
                  * Its Database roles
                  * Any explicitly assigned permissions

This is not an "all encompassing" look at every piece of security, but it does give a pretty comprehensive look at who has access to what.  I wrote a script because gathering this information manually would be very, very time consuming.  As far as I know, there is no function in SQL Server to collect all this information into one place.  I guess I could have used a series of T-SQL scripts to pull the info out by other means, but I am on a PowerShell kick, and I wanted to do it this way.

Besides, using my method, you can automate the process and audit multiple instances at once. Sound like fun?

I wrote this quick-and-dirty so there is no error handling. There is, of course, no warranty expressed or implied and you should use caution when running this (or any other script you find on the Internet). That said, all of my calls to SQL are read-only, so I don't think you'll have any problems.  


The code (explanation to follow below):
# ======================================================================================================================
#
# NAME: InventorySQLUserSecurity_DB.ps1
#
# Comment: This script is designed


Expert Comment by:Kevin Cross (LVL 60)
poortatey:

Thanks for the tool!

I ran this on Windows XP after ensuring that I had enabled the scripts to run per:
http://technet.microsoft.com/en-us/library/ee176949.aspx

Very pleased to get my report after.  Will come in handy at some point I am sure, plus gave me some cool PowerShell to play with.  So for the practical and the fun, YES vote from me.

Regards,
Kevin
0
Audit has been really one of the more interesting, most useful, yet difficult to maintain topics in the history of SQL Server.

In earlier versions of SQL Server, people had very few options for auditing. It typically meant using SQL Trace, which is not the most intuitive tool. However, in the latest editions we are given quite a lot of power for auditing with some purpose-built tools. SQL 2008 Audit is meant to replace SQL Trace; it is faster, more granular and easier to interact with, both from SQL Server Management Studio and with code. In this article, I am using a T-SQL code approach.

Since this is one of the big topics in SQL Server 2008 (and bound to become more popular), it is not possible to cover it completely here. However, I will try to give you one detailed but simple example as an introduction to SQL Audit.

First of all when you want to use Audit in SQL Server 2008, you have to create a SQL Server AUDIT object. The Server AUDIT object is nothing more than just a container for defining the Target of the Audit specifications at the Server level and/or at the database level. You can store Audits in the Application Event Log, Security Event Log or Filesystem.

The general steps for creating and using an audit are (and you do need some privileges) as follows; a minimal T-SQL sketch follows the list:

Create a SQL Server Audit and define the target.
Enable the audit.
Create an Audit Specification at either the database or server level that maps to the audit.
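Here the audit name, file path, database and audited action are placeholders chosen just for the example:

-- 1. Create the server audit and define its target
USE master
CREATE SERVER AUDIT Audit_Example
    TO FILE (FILEPATH = N'C:\SQLAudit\')
GO

-- 2. Enable the audit
ALTER SERVER AUDIT Audit_Example WITH (STATE = ON)
GO

-- 3. Create an audit specification (here at the database level) that maps to it
USE MyDB
CREATE DATABASE AUDIT SPECIFICATION Audit_Example_Spec
    FOR SERVER AUDIT Audit_Example
    ADD (SELECT ON SCHEMA::dbo BY public)
    WITH (STATE = ON)
GO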
Expert Comment by:vora_bhaumik (LVL 13)
nice and really useful article. thanks.
0
By Mark Wills

Not so long ago, there was a fairly tricky question on Experts Exchange to do with unstringing a field, and some very good answers. Not wanting to be left out of the "fray", I submitted one of my old tricks. Despite the code being around for a while in various guises, my posting was met with intrigue and various superlatives. Naturally, I felt rather EElated at the response, though I cannot take all the credit.

So, I thought I would share some of those tricks that center around the use of a rather special table in MS SQL Server known as spt_values in the master database.

First, what is spt_values ?

It has been around since the beginning; it is not a system table, but it is very much a part of SQL Server. It is rather simple: it is Microsoft's very own "look up" table for all those system-type things (determined by the TYPE column) where a name is needed. More importantly, it is probably always resident in memory.

Let's have a quick peek, and please scroll through the list after you have run the first select command, and you will see quite a variety of "names" which are actually held as numbers within the MS systems...

select * from master..spt_values

-- Now, the ones we are interested in are :

select number from master..spt_values where type = 'p' 

-- which gives us all the numbers from 0 to 2047 inclusive. 



So, what can we do with a list of numbers ?

Well, in this article, we can unstring a delimited field, and we can …
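As a small taste of the unstringing trick (the variable and its contents are made up for the example), the numbers drive a SUBSTRING/CHARINDEX split of a comma-delimited value:

declare @list varchar(200)
set @list = 'red,green,blue'

select substring(@list, number, charindex(',', @list + ',', number) - number) as item
from master..spt_values
where type = 'p'
  and number <= len(@list)
  and substring(',' + @list, number, 1) = ','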
Expert Comment by:Sharath (LVL 41)
Mark, nice article.

Why don't you provide examples on splitting the string, displaying vertically etc. here?

As you have already answered such questions, you could add links to those questions for reference, for anyone who wants to check out the magic of spt_values.

-Sharath
Expert Comment by:Kevin Cross (LVL 60)
Here is a nice follow-up tip from chapmandew on how the spt_values trick from Mark can be utilized even when a number higher than 2048 is needed.

http://sqlservernation.com/blogs/tipweek/archive/2010/06/27/quickly-generate-a-number-sequence-in-sql-server.aspx

Again this isn't always the optimal method over say a persisted numbers table, but boy have I had fun with this so just wanted to give kudos to Mark again despite the fact I already relinquished my vote above! :)
0
