SQL Server Integration Services (SSIS) is a component of the Microsoft SQL Server database software that can be used to perform a broad range of data migration tasks. SSIS is a platform for data integration and workflow applications. It features a fast and flexible data warehousing tool used for data extraction, transformation, and loading (ETL). The tool may also be used to automate maintenance of SQL Server databases and updates to multidimensional cube data. SSIS replaced Data Transformation Services, which had been a feature of SQL Server since Version 7.0.

I’m loading data from a CSV file using SSIS (SSDT 2015). I first load the data into a stage table, then convert the column data to the target table's data types using a Data Conversion transformation. Next, a Lookup transformation in partial cache mode compares the values in one column (which may or may not be null) plus the load date (the current datetime, which I convert to the date data type during the comparison) in the stage table against those in the destination. Both of these columns are non-primary-key columns. Any row that matches between the stage and final destination tables should not be loaded, but the non-matching rows should be loaded to the final destination.

I do have a clustered columnstore index on the destination table. There is no primary key on the table.

In the Lookup transformation, I’m using the following query:

SELECT DISTINCT A.colA, A.colB
FROM Stgtbl A
LEFT JOIN destbl B
    ON A.colA = B.colA AND A.colB = B.colB
WHERE B.colA IS NULL AND B.colB IS NULL

I tried to avoid using the Lookup and instead used a LEFT JOIN and a WHERE NOT EXISTS in the OLE DB source,
but it was running really slow and sometimes even got stuck after loading a few rows.
I thought the Lookup above might resolve the issue, but I'm seeing the same behavior.

Can someone please help me resolve this ASAP?

Thanks a million in advance!
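To spell out the matching rule I'm after, here is the logic sketched in Python (illustration only; SSIS/SQL is the real implementation). One thing the sketch highlights: None == None is True in Python, whereas NULL = NULL is not true in T-SQL, which may matter for the nullable column.

```python
# Illustration of the intended anti-join: keep only stage rows whose
# (colA, load_date) pair is not already in the destination table.
# Unlike T-SQL, None == None is True here; in SQL, NULL = NULL is not,
# so rows with a NULL colA can slip past a plain JOIN-based comparison.
def rows_to_load(stage_rows, dest_rows):
    """stage_rows / dest_rows: iterables of (colA, load_date) tuples."""
    existing = set(dest_rows)
    return [row for row in stage_rows if row not in existing]
```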

SQL Server SSIS packages that interact with Microsoft Access 2016 files have been very inconsistent. I have tried ODBC connections as well as the ACE driver. Some work all the time; other Access files show as locked all the time, even when the lock file is not present. Is there a resource that documents how to work with these issues? It would also be nice to know whether you really can kill the lock file, but I have looked quite a bit for solutions and found nothing that works in the environment we are working with.

These Access files are also being connected to by sources other than Access front-end files; sometimes even Excel is used to pull data. Again, I have some files that are being connected to in a variety of ways yet work in SSIS all the time. We are using a UNC path as the pointer to the Access file itself in the package connection. Then we could be pulling from a table in the Access file, and also pushing data from SQL Server to a specific table in Access, while the file is being worked through the Access front end by multiple users at the same time. This is a tough situation; no one wants to do this, but unfortunately right now there is no alternative. The other factor is that some files have passwords, and the ones that don't seem to work fine, so the encryption may be part of it. Thank you.
Hi, I have an SSIS package with no parameters. It has a connection manager using SQL Server authentication. The "Save my password" option is ticked, but it clears each time you open the package in Visual Studio.

We have scheduled this package in a SQL Server Agent job. It runs most of the time, but it failed last night, saying it failed to acquire a connection to the connection manager used in the package.

Can someone suggest a solution for how to overcome this?

Many Thanks
I have an SSRS pick list report for my shipping department. It is time-consuming to print one at a time, so I was tasked with creating a batch that generates a single PDF the user only needs to send to the printer once. I successfully created this data-driven report. The only issue is that I get 100 (for example) emails with one pick report in each. I need one email with one PDF containing all the pages. Is it possible to get this in one file instead of individual ones? I'm using SSRS 2017.
I am setting up my laptop to use SQL Server Management Studio, Microsoft Visual Studio, and SSRS/SSIS. Can someone suggest the best combination of these tools for me to download and install? I am NOT looking for an enterprise/paid/trial version, only something like an Express edition. Thanks.

I want to monitor a text LOG file for the frequency of new LOGIN entries by importing the event entries (searching for the string "-----------------") into SQL SERVER as new records using SSIS.

QUESTION: what is the script code to accomplish this?

EXAMPLE of a LOG file:

copy file
delete file

I want to import the LOG file into a SQL SERVER table such as (1st row is the header, the 2nd row on are the events from the LOG file):

08:00, LOGIN
09:00, logoff
09:15, copy file, delete file
09:30, logoff
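To make the parsing rule concrete, here is a sketch of the import logic in Python (the separator string is the one mentioned above; the "HH:MM, event" entry layout is my assumption based on the table rows, not an actual log specification):

```python
# Sketch: count LOGIN events in a text log where entries are separated by
# a "-----------------" line. The "HH:MM, EVENT" layout is an assumption
# based on the target table, not a real log format.
from collections import Counter

SEPARATOR = "-----------------"

def parse_entries(text):
    """Split the log on separator lines; return (time, event) tuples."""
    entries = []
    for chunk in text.split(SEPARATOR):
        for line in chunk.splitlines():
            line = line.strip()
            if "," in line:
                time_part, event = line.split(",", 1)
                entries.append((time_part.strip(), event.strip()))
    return entries

def login_frequency(text):
    """Number of entries whose event is LOGIN (case-insensitive)."""
    return Counter(e.upper() for _, e in parse_entries(text))["LOGIN"]
```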
Currently working on an SSIS ETL package. A Slowly Changing Dimension is configured as a Type 1 with one of the columns as a fixed attribute. I have set the package to continue when it detects changes to the fixed attribute and I would like to capture the data when this occurs. How could I capture this data?
Any help would be appreciated.
I have created a data migration project which takes '|'-delimited flat files, imports them into SQL Server, processes the data, and then outputs to an Excel template for loading into a new system. I have no responsibility for either the flat files or the Excel template, as they are handled by third parties.
I am running on Visual Studio 15.9.5 for the SSIS development, and the package will be used only by myself and will not be distributed to anybody else. Connection is to SQL Server Azure.

I have had no issues with connecting importing the flat files (beyond the usual bad data problems) or connecting to SQL Server and running scripts. The issue I'm having is with the final stage in the process which involves transferring the contents of SQL tables to Worksheets within an Excel Workbook. I am running Excel on Office 365 and all processing of files is done on my laptop.

When I ran an initial dump to Excel I had some errors with the version, so I changed it to XLS to resolve any compatibility issues (which I have done before successfully). However, on this particular project the XLS row limit was breached, as there are almost 70,000 rows of data to export. When I changed to XLSX I got another error, so I installed the Microsoft Access Database Engine 2010, which seemed to work on some test runs.

Now I have finished all of the mapping, so a little more about the spreadsheet. There are multiple worksheets to which I have to export data; I have created Data Flow tasks which …
Hi, I have to loop through a folder and its subfolders to find a folder named TEST. If it is present, rename it to "Test archive". I have to do this using a File System Task and a Foreach Loop container. Please help.
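For reference, here is the search-and-rename logic expressed outside SSIS (a Python sketch; in the package itself this would be the Foreach Loop plus the File System Task's Rename operation):

```python
# Sketch: walk a root folder and its subfolders, and rename any
# directory named TEST to "Test archive".
import os

def rename_test_folders(root, old_name="TEST", new_name="Test archive"):
    renamed = []
    # topdown=False: rename deepest folders first so shallower renames
    # cannot invalidate paths we still need to visit
    for dirpath, dirnames, _ in os.walk(root, topdown=False):
        for d in dirnames:
            if d == old_name:
                dst = os.path.join(dirpath, new_name)
                os.rename(os.path.join(dirpath, d), dst)
                renamed.append(dst)
    return renamed
```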
I want to create an Excel spreadsheet directly from an SSIS Script Task, with the data pulled from a stored procedure. I'm able to create a single-sheet workbook from one stored procedure, but I'd like to add another sheet to the same workbook from a different stored procedure. I'm working from an example from TechBrothersIT.

using System;
using System.Data;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;
using System.IO;
using System.Data.OleDb;
using System.Data.SqlClient;

namespace ST_563e77770d454d11b11087ae4d41268c
{
	public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTAScriptObjectModelBase
	{
		public void Main()
		{
			string datetime = DateTime.Now.ToString("yyyy_MM_dd_HHmmss");
			//Declare Variables
			string ExcelFileName = Dts.Variables["User::ExcelFileName"].Value.ToString();
			string FolderPath = Dts.Variables["User::FolderPath"].Value.ToString();
			string StoredProcedureName = Dts.Variables["User::StoredProcedureName"].Value.ToString();
			string StoredProcedureName2 = Dts.Variables["User::StoredProcedureName2"].Value.ToString();
			string SheetName = Dts.Variables["User::SheetName"].Value.ToString();
			string SheetName2 = Dts.Variables["User::SheetName2"].Value.ToString();




I am upgrading a SQL SERVER 2008R2 to SQL SERVER 2016.  I have the DEV SQL SERVER 2016 running.

There are a lot of apps (SSIS, VBScript, SSRS, etc.) that call the stored procedures and use FQDN file paths.

1. How do I test these stored procedures in the DEV SQL SERVER 2016 because they specify FQDN to text files in PROD?

Example:   a stored procedure is executed
Prod imports text file \\prod\import_file1.txt
Prod exports text file \\prod\export_file1.txt
Prod archives the text file by moving   \\prod\import_file1.txt to \\prod\Archive\import_file1.txt

2. an SSIS calls a stored procedure that does DML/CRUD (delete, update, insert).

How do I test the SSIS packages without going through and changing the connection managers (for text files) and the Script Tasks (for any references to a text file in the PROD folder)?
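One idea, sketched below in Python for illustration: keep the environment root in a single variable/configuration and derive every file path from it, so nothing hard-codes \\prod. The \\dev share name here is made up.

```python
# Sketch: remap PROD UNC paths to a DEV share via a single setting.
# \\prod follows the example above; \\dev is an assumed DEV share name.
ROOTS = {"PROD": r"\\prod", "DEV": r"\\dev"}

def map_path(path, env):
    """Swap the \\prod root prefix for the current environment's root."""
    prod_root = ROOTS["PROD"]
    if path.startswith(prod_root):
        return ROOTS[env] + path[len(prod_root):]
    return path
```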
I have a Windows service running under a service/domain account on server A. The service calls a stored procedure (via SqlCommand, etc.) to launch an SSIS package (SQL Server 2014) on server B using the following commands:

exec @status = ssisdb.catalog.create_execution
	@folder_name = 'ExportUploadPackages',
	@project_name = 'ExportUploadPackages',
	@package_name = 'DataExport2.dtsx',
	@use32bitruntime = True,
	@execution_id = @execution_id output

DECLARE @loggingLevel smallint = 3 -- verbose = 3
exec ssisdb.catalog.set_execution_parameter_value @execution_id, @object_type = 50,
	@parameter_name = N'LOGGING_LEVEL', @parameter_value = @loggingLevel

exec @status = ssisdb.catalog.set_execution_parameter_value @execution_id, @object_type = 30,
	@parameter_name = N'packageParmDocumentNumber', @parameter_value = @DocumentNumber

exec @status = ssisdb.catalog.set_execution_parameter_value @execution_id, @object_type = 30,
	@parameter_name = N'packageParmDestinationFilePathName', @parameter_value = @DestinationFilePathName

exec @status = ssisdb.catalog.start_execution @execution_id

select @status = status from SSISDB.catalog.executions where execution_id = @execution_id
while @status = 1 or @status = 2 or @status = 5
begin
	waitfor delay '00:00:05' -- avoid a tight polling loop
	select @status = status from SSISDB.catalog.executions where execution_id = @execution_id
end

if (isnull(@status, 0) <> 7)
	-- log error and throw exception


There is an existing SSIS package with a PowerShell script that calls a batch file, GetSalesData.bat, on a remote server named "RemoteServer".
The SSIS package does not currently pass the date parameters needed to filter down to only the necessary time span.

Batch File sample below:

@echo off
set OrderDt=%~1
set ShipDt=%~2
cd /D F:\"Sales History"
mllmn -new-sls-file "F:\Sales History\Final\Bikes.txt" -ord-dts %OrderDt% -ship-dts %ShipDt% 


*** Running this works fine while on the server when passing the date params.....   GetSalesData.bat 2018-12-01 2018-12-31

On the SSIS server I have a PowerShell script that works without the params.
PowerShell script below:

Invoke-Command -ComputerName RemoteServer -ErrorAction Stop -ScriptBlock {Invoke-Expression -Command:"cmd.exe /c 'F:\Sales History\GetSalesData.bat'"}


I have been trying many things to alter this script to take the two parameters and then call it within SSIS. I don't know if the issue is the PowerShell or how I am setting up the Execute Process Task in SSIS.

In the "Executable" field I have: PowerShell.exe  (which works when I don't use parameters)
I have been trying this in the "Arguments" field:    -File J:\Sales\GetAllSales.ps1 "2018-12-01" "2018-12-31"

Note: the date format is important.

Any advice?
Thank you.

I have created an SSIS package with 3 input parameters. 2 parameters are mandatory (Required = True) and 1 parameter is NOT mandatory (Required = False).

I have deployed the package to SSISDB and am running it as a SQL Server Agent job. My question is, how do I set the non-mandatory parameter to blank\Null\Empty?

If I set it to nothing I get the following error message when I click OK.
I am running SQL Server 2017 Developer Edition with CU8 installed with SSMS 17.9.1.

I am running a script that goes through all the backup files on a network share and collects the header data for each file using the "RESTORE HEADERONLY" command.

I have run into some files that might be corrupt, and they take over an hour to return an error, whereas a successful run of the command is completed in 1-2 seconds.

Is there a way I can limit the execution time of the "RESTORE HEADERONLY..." command so that it runs for a maximum of 60 seconds?

I am reading files from network shares on other servers.

I have already tried limiting execution time to 60 seconds with this:

use master
exec sp_configure 'remote query timeout', 60;
reconfigure;  -- required for the new value to take effect


But the RESTORE HEADERONLY command still kept running for a long time on the suspected corrupt file.

Any help would be greatly appreciated!
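If the command can be issued from client code rather than pure T-SQL, the generic fallback is a worker-with-deadline pattern: run the slow call on a worker and give up waiting after N seconds. A Python sketch of that pattern follows (the slow call here is a stand-in, not an actual RESTORE HEADERONLY; note the abandoned work may still keep running on the server side):

```python
# General client-side timeout pattern: run a slow call on a worker thread
# and abandon the wait after a deadline. The worker is abandoned, not
# killed, so the underlying work may continue in the background.
import time
from concurrent.futures import ThreadPoolExecutor
from concurrent.futures import TimeoutError as FutureTimeout

def run_with_timeout(fn, timeout_seconds, *args):
    pool = ThreadPoolExecutor(max_workers=1)
    future = pool.submit(fn, *args)
    try:
        return ("ok", future.result(timeout=timeout_seconds))
    except FutureTimeout:
        return ("timeout", None)
    finally:
        pool.shutdown(wait=False)  # do not block on the abandoned worker
```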

I have run into an issue I cannot figure out. I am running a 2014 SQL Express server that is connected to, and importing data from, a 2008 R2 SQL Server. Each server is hosted within the same virtual machine host, using the same data storage but on individual VMs. When trying an import from the Express machine, the script runs well but hangs because of some type of temp disk space utilization that fills the drive with over 40 GB. On reboot the disk space returns to normal. The script is pulling only about 6,000 records from the other database, and when complete the database file is less than 50 MB.

We copied the data from the remote server, placed it on the 2014 Express server, changed the script to point at the local copy, and the import happened as it should.

My questions are: why and how does this failure happen? Why would a large temp file of some type be written when importing between VMs? How do I overcome this?

I will provide any other information you may need and thank you for your help.

I have found the following code to render an SSRS report via an SSIS script. The script works as intended, i.e., it saves the rendered report to disk.

But I would like to send the rendered report as an email attachment, without saving it to disk. Please could someone help? I have tried to render it to a memory stream but I can't quite get it to work.

Thanks, Greg

 Protected Sub SaveFile(ByVal url As String, ByVal localpath As String)
        Dim loRequest As System.Net.HttpWebRequest
        Dim loResponse As System.Net.HttpWebResponse
        Dim loResponseStream As System.IO.Stream
        Dim loFileStream As New System.IO.FileStream(localpath, System.IO.FileMode.Create, System.IO.FileAccess.Write)
        Dim laBytes(256) As Byte
        Dim liCount As Integer = 1
        Try
            loRequest = CType(System.Net.WebRequest.Create(url), System.Net.HttpWebRequest)
            loRequest.Credentials = System.Net.CredentialCache.DefaultCredentials
            loRequest.Timeout = 600000
            loRequest.Method = "GET"
            loResponse = CType(loRequest.GetResponse, System.Net.HttpWebResponse)
            loResponseStream = loResponse.GetResponseStream
            Do While liCount > 0
                liCount = loResponseStream.Read(laBytes, 0, 256)
                loFileStream.Write(laBytes, 0, liCount)
            Loop
            loFileStream.Close()
        Catch ex As Exception
        End Try
    End Sub


I am trying to run a DTSX package from a SQL Agent job scheduler. I created the package using Visual Studio 2015, and it works if I run it from my machine. I then copied Package.dtsx to the SQL Server, and it errors whenever I run it from a SQL Server Agent job. I can run it locally from my Windows machine, but I get a weird error from the SQL 2016 server.

My folder has read/write permission. I do not have the file open.

I have a folder with a bunch of files that come in a set of 2 files (e.g., Name.pdf, NameMD.pdf).  I need to loop through this source folder (call it FOLDER1) and move only each set of files to another folder (call it FOLDER2).   (NOTE: Then, I have another process that compares the contents of each set of PDF files.)

I need to search for a file that has a *MD at the end of the filename, then search for the 2nd file that does NOT have 'MD' at the end of the filename, then move both files to FOLDER2.  Then I have another process that processes files in FOLDER2.  Then repeat the loop in FOLDER1.

What is the code to do this as a vb script, a standalone console C#.Net or C#.Net inside of SSIS?
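The pairing logic I'm describing, sketched in Python for illustration (the real implementation would be the VBScript or C# asked about above):

```python
# Sketch: find Name.pdf / NameMD.pdf pairs in folder1 and move each
# complete pair to folder2; files without a partner stay behind.
import os
import shutil

def move_pairs(folder1, folder2):
    files = set(os.listdir(folder1))
    moved = []
    for name in sorted(files):
        base, ext = os.path.splitext(name)
        if ext.lower() == ".pdf" and base.endswith("MD"):
            partner = base[:-2] + ext  # NameMD.pdf -> Name.pdf
            if partner in files:
                for f in (partner, name):
                    shutil.move(os.path.join(folder1, f),
                                os.path.join(folder2, f))
                moved.append((partner, name))
    return moved
```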





I have an SSIS package which transfers data from a local MySQL server to an Azure database. The package runs fine inside Visual Studio and the data gets transferred. I created a SQL job on the local SQL 2012 server by importing this package. When I run the job, it throws the error below and fails:

sqlsvrjob_medicsteam_azure Connection manager ""     Description: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80040E4D.  An OLE DB record is available.  Source: "Microsoft SQL Server Native Client 11.0"  Hresult: 0x80040E4D  Description: "Login failed for user 'khsql'.".
Hi All,

I'm new to SSIS, and I want to do an incremental data load for 80 tables from one database to another database. I want to do it using the CDC components of SSIS. Can anyone help me out with this?

Alternatively, please let me know if there is a way to get only the delta from all 80 tables in one database and load it into the target tables in a different database every day, without using the CDC components of SSIS.
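For context on the non-CDC alternative: a common pattern is to compare a per-row hash of the source against the target and load only the keys that are new or changed. A rough sketch of that delta check (table layout is hypothetical):

```python
# Sketch of hash-based delta detection as a CDC alternative: rows whose
# key is new, or whose non-key content hash changed, form the delta.
import hashlib

def row_hash(row):
    """Stable hash of the non-key columns (row is a tuple of values)."""
    joined = "|".join("" if v is None else str(v) for v in row)
    return hashlib.sha256(joined.encode("utf-8")).hexdigest()

def delta(source, target):
    """source/target: dicts mapping key -> tuple of non-key columns."""
    changed = []
    for key, cols in source.items():
        if key not in target or row_hash(cols) != row_hash(target[key]):
            changed.append(key)
    return changed
```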

I tried to install Visual Studio 2017 Professional but it failed due to lack of disk space.

Now, when I open my Visual Studio 2008 SSIS project, I get this error:

Class not registered (Exception from HRESULT: 0x80040154 (REGDB_E_CLASSNOTREG)) (System.Windows.Forms)

I have a DTS job that uploads some CSV data into a SQL table.

I wanted to schedule the DTS using a SQL job.

I have setup the job in SQL but when it is executed through the SQL job it fails (see error message below).

If I run the DTS directly it works ok.

I think it is something to do with user permissions, but I don't know why it fails, as the SQL user I have set up (web_admin) has SA rights.

Any ideas?

DTS fail
Hi, I would like to load data from 3 different SQL commands (3 different queries) into 3 different Excel files in one folder, each Excel file stamped with the date and time, all in one package.
I tried this, but I'm hitting an issue: it won't open the mapping window for the 2nd Excel destination. However, 1 Excel file is loaded successfully. Your help is appreciated.

Error is:

Excel destination: Opening a rowset failed. I checked that the objects exist in the database.
How do I resolve a culture problem with the decimal and thousands separators of numbers in Excel using C#?
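A rough sketch of the normalization involved (pure string handling for illustration; real C# code would use CultureInfo and number styles instead):

```python
# Sketch: normalize a number string written with "," as the decimal
# separator and "." as the thousands separator (e.g. "1.234,56")
# into a machine-readable float.
def parse_european_number(text):
    return float(text.replace(".", "").replace(",", "."))
```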

