SQL Server Integration Services (SSIS) is a component of the Microsoft SQL Server database software that can be used to perform a broad range of data migration tasks. SSIS is a platform for data integration and workflow applications. It features a fast and flexible data warehousing tool used for data extraction, transformation, and loading (ETL). The tool may also be used to automate maintenance of SQL Server databases and updates to multidimensional cube data. SSIS replaced Data Transformation Services, which had been a feature of SQL Server since Version 7.0.


When this stored proc executes, it doesn't seem to be able to evaluate whether the variable @SQLAgentJobType contains the text 'Incremental' or 'Manual'. Right now the value is set in the SSIS environment as 'Manual', but it executed the steps for 'Incremental'. If I reverse the order of the IF blocks, it will execute the steps for 'Manual' first when the value is set to 'Incremental'. As an aside, before anyone complains about the start and end dates being nvarchar(20): don't. They are set this way because they are concatenated with another set of SSIS environment variables for an OPENQUERY statement needed elsewhere in the SSIS package. Please review and advise.

ALTER PROCEDURE [etl].[ResetProjectDaterangeVariables]
	-- Add the parameters for the stored procedure here
( @SQLAgentJobType varchar(20),
  @FolderName nvarchar(128),
  @EnvironmentName nvarchar(128),
  @startDate nvarchar(20),
  @endDate nvarchar(20)
)
AS
BEGIN
	-- SET NOCOUNT ON added to prevent extra result sets from
	-- interfering with SELECT statements.
	SET NOCOUNT ON;

	-- Note: The string dates are created in order to deal with cobbling together an Informix SQL statement as part of a
	--       concatenated text string built by variables used in source dataflows
	DECLARE
	 @tempstartDate nvarchar(20)
	,@tempendDate nvarchar(20)
	-- leave room for a leading space
	,@fixedEndTime nvarchar(9)
	,@fixedStartTime nvarchar(9)
	,@var sql_variant;

   -- DEBUG [uncomment]
   --  declare



Hi, I'm using SSIS 2008 and SQL Server 2008 R2.

I have a table that has several months of data (see attached fep_ssis_upload.JPG).
How do I make SSIS prompt the user for a string such as 'JUN 2018' to pass to a stored procedure, query, or view that gets the data?

Currently, my SSIS package pulls the whole table. Please see the two attached print screens.
I need to get just one month of data from this.
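SSIS doesn't prompt interactively; the usual pattern is a package variable or parameter, set by the caller or an SSIS environment, that is passed to a parameterized query (e.g. EXEC with a ? placeholder in an OLE DB Source). The core of the problem is turning a label like 'JUN 2018' into a date range the stored procedure or view can filter on. A minimal sketch of that logic in Python (the function name and the 'MON YYYY' label format are assumptions based on the example in the question):

```python
from datetime import date, timedelta

def month_range(label):
    """Turn a label like 'JUN 2018' into (first day, last day) of that month."""
    months = ["JAN", "FEB", "MAR", "APR", "MAY", "JUN",
              "JUL", "AUG", "SEP", "OCT", "NOV", "DEC"]
    mon, year = label.upper().split()
    m = months.index(mon) + 1
    y = int(year)
    start = date(y, m, 1)
    # First day of the next month, minus one day = last day of this month.
    end = date(y + (m == 12), m % 12 + 1, 1) - timedelta(days=1)
    return start, end
```

The stored procedure would then take the two dates as parameters and filter with a BETWEEN, rather than receiving the raw string.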
Can someone point me to a few good links that illustrate how to use SSIS to read an Excel .xls file and then insert that data into a SQL table? First I want to check for the existence of the .xls file and, if it is there, proceed with the import. I have started by creating a Script Task, but I cannot see where the connection from that task is made to the FileExists variable I defined so I can determine whether or not to proceed. I have both the Script Task and the OLE DB Destination task started, but I do not get the second arrow indicating failure. I have been scouring the internet for examples but have not found any that provide a solution.
I have an SSIS Data Flow Task that reads data from an OData service, and I'm able to connect to it successfully via the OData Connection Manager. But when I run the task within SSDT, I get the following error:

[OData Source [2]] Error: The OData Source was unable to process the data. The remote server returned an error: (401) Unauthorized.

[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on OData Source returned error code 0xC02090F5.  The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.

I'm using basic authentication to connect, and I can view data in the preview section of the OData Source editor. I don't know why I'm getting the error; the user has permission to extract the data.
I've made some changes to some SQL Server objects (tables, views, stored procedures, functions, etc.) in my test environment which I now want to move to the production environment.
I want to make backups of the existing objects in production, e.g. Production_Table_1 as Production_Table_1_backUp.
I'm looking for the easiest way to make a copy of a table in its existing database, with the copy including all associated indexes, constraints, data, etc.
I looked at the Transfer SQL Server Objects Task in SSIS, but it doesn't seem to allow altering the object name.
I'm now looking at using the "Script Table as" option in Management Studio, but it doesn't seem to copy indexes.
Any guidance appreciated.
Is it possible to attach a file to the email notification of an SSIS job being completed? I have several SSIS jobs that run in the morning and produce csv files. For one of the jobs, the client not only wants email notification of completion but also wants the csv file attached to the email. Is this possible? I already have the email notification set up, but how do you attach the file?
I was creating another SSIS project by copying an existing project. The flat file source changed and the target SQL table remained the same. I am getting this message:

[Flat File Source [2]] Error: Data conversion failed. The data conversion for column ""MyField"" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".

and upon looking at the data I cannot see anything wrong. How do I put in an error trap that will give more detailed information?
The last time I used SSIS was with SQL Server 2008, and pretty straightforward ETL tasks with some conditional branching, etc. I am getting into SQL Server 2017/Visual Studio 2017 and maintaining some existing packages.
I have come across some code in Script Tasks and am having difficulty finding online documentation. I am finding the methods LockOneForRead() and LockOneForWrite() on the VariableDispenser, LockForRead() and LockForWrite(), and an Unlock() method on the Variables collection. I finally did find something that gives me a little insight, but I'm still a little unsure, and hopefully an expert can give me more.
  1. If the idea is to lock a variable so competing processes within the package aren't accessing it at the same time, what happens if process A locks a variable and process B tries to lock the same variable?
  2. What happens if I forget to unlock the Variables collection in my task? Will it be unlocked when the script task goes out of scope?
I've read that you can do implicit locking by using the syntax Dts.Variables["User::var_name"].Value = "some value"; so I guess these questions refer to explicit locking and unlocking.
It seems to me that unless you lock a variable in a very long-running Script Task (most of the code I've seen locks, gets a value, sets another string value, etc., or updates a table with a parameter value, etc., then unlocks), the likelihood of two processes accessing the same variable(s) is …
I am trying to schedule a refresh of my data source in Tableau Online and I am getting a connection failure error. I talked with Tableau support to see if it was anything I did wrong in Tableau, but they said it's a SQL Server issue. I can connect to SQL Server using Windows Authentication, create reports, and publish them; when I publish them, the server name comes through as localhost. So I tried SQL Server Authentication, logged in to Tableau Desktop with username sa and its password, and published the data source, but the scheduled refresh still fails with a connection error. I then tried editing the connection in Tableau Online, using localhost and username sa, and when I press Test Connection it reports that it couldn't find the server.
I'll start by confessing that I am unfortunately VERY new to SQL and a bit out of my depth, so your help would be massively appreciated.

I'm trying to run an SSIS job which up until recently has been working fine.

After about 15 mins the job fails and I receive

"Product Sales With View Fact: Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error Code: 0x80004005. An OLE DB record is available. Source: "Microsoft SQL Server Native Client 11.0" Hresult: 0x80004005 Description: "Transaction (Process ID 72) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction."


So I rerun the job and the same thing happens over and over.

I have had a look for any running jobs (I assume I checked correctly...) and cannot find anything.

I assume something is stuck in a locked state from where the job was previously running and had to be stopped (via job activity monitor) but I haven't got a Scooby where to go from here.

We are using MS SQL 2014.

I have been researching deadlocks, but as I mentioned I'm new to SQL and I'm not getting very far.
If anyone has any specific ideas or advice, that would be amazing.


I have an SSIS project that works fine.
It pulls in a specific file from a folder.

How do I modify the data source so that it pulls in any file that ends in .xls, regardless of the actual name?

There will only ever be one file in the folder.
My SSIS package needs to run on the last working day of the month, so I have created a lookup table that returns a row when the system date matches the calculated month-end date. When it is not month end, 0 rows are returned.

With this I have created a variable (on success) to run the other packages.
However, I need suggestions on how to end the execution without errors when the condition is not met (as the job will be scheduled to run daily).

See attachment for info.
Thank you
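One common answer to the question above is to skip the lookup-table row count entirely: put the date check in an expression on a precedence constraint, so that on non-month-end days the constraint simply evaluates false and the package ends successfully with nothing to do. The "last working day" calculation itself can be sketched like this (Python for illustration; it ignores public holidays, which the question's lookup table may also need to handle):

```python
from datetime import date, timedelta

def last_working_day(year, month):
    """Last weekday (Mon-Fri) of the given month."""
    # Last calendar day of the month: first day of next month minus one day.
    d = date(year + (month == 12), month % 12 + 1, 1) - timedelta(days=1)
    while d.weekday() >= 5:   # 5 = Saturday, 6 = Sunday
        d -= timedelta(days=1)
    return d
```

The package would compare today's date against this value and only take the "run the other packages" branch on a match.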
Hi, I am executing a stored procedure using an OLE DB Source inside a Data Flow Task.
The SP contains 3 global temp tables (e.g., ##temptable1), but the task returns an error because it cannot find the target table in the database.
As a test, I changed the SP to use CTEs, but the query runs too long, so I need the temp tables.

Please advise how can I overcome this by using global temp tables?
Let me know if further information is needed.

Thank you
I am working on an SSIS package where we are using a 3rd-party tool to connect to a secured (banking) FTP site and download one file at a time by passing an input variable. I am generating the variable value through a SQL task. As long as I keep getting a value other than 0, I have to keep looping and executing the 3rd-party component.

How can I accomplish this? What's the best way: should I use a For Loop or a Foreach Loop? Can you please let me know how I can accomplish this. FYI, I am not good at Script Tasks; it would be good if I could use transformations other than Script Tasks.
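Since the number of iterations isn't known in advance and the loop is governed by a value ("keep going until the SQL task returns 0"), this maps to a For Loop container rather than a Foreach Loop: set EvalExpression to something like @[User::FileId] != 0 and put the SQL task and the 3rd-party component inside. No Script Task is needed. The control flow being described is, in plain terms (Python sketch; the names are illustrative):

```python
def download_all(next_value, download):
    """Keep fetching a new value (the SQL task) and running the download
    (the 3rd-party component) until the value comes back as 0.
    Returns how many downloads were performed."""
    count = 0
    value = next_value()          # initial SQL task before the loop
    while value != 0:             # the For Loop's EvalExpression
        download(value)           # the 3rd-party component
        count += 1
        value = next_value()      # SQL task inside the loop refreshes the variable
    return count
```

The key detail is that the SQL task must also run inside the loop to refresh the variable, otherwise the EvalExpression never changes.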
I have several SSIS projects that have run for a while with no issues. They read a csv file and insert data into SQL tables. Recently I have noticed lost records. Digging into the issue I see the users have started putting quotation marks within quotation marks.  For example: "The Rain in Spain stays mainly in the plains" has become "The rain in Spain stays mainly in the "plains"". What is the best way to handle this? Do I eliminate all quotation marks?
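Per the CSV convention (RFC 4180), a quote inside a quoted field should be escaped by doubling it, so "The rain in the ""plains""" is one field; raw user-typed quotes make the row ambiguous, which is why records are being lost. One pragmatic repair, and it is only one possible policy (doubling interior quotes instead would preserve them), is to keep a quote only when it sits at a field boundary and drop the rest before parsing:

```python
import csv
import io

def repair_and_parse(line, delimiter=","):
    """Drop quotes that are not at a field boundary (start/end of line or
    adjacent to the delimiter), then parse the repaired line as CSV."""
    out = []
    for i, ch in enumerate(line):
        if ch == '"':
            at_start = i == 0 or line[i - 1] == delimiter
            at_end = i == len(line) - 1 or line[i + 1] == delimiter
            if not (at_start or at_end):
                continue   # interior quote typed by a user: discard it
        out.append(ch)
    return next(csv.reader(io.StringIO("".join(out))))
```

This sketch assumes the delimiter never appears inside a field next to a legitimate quote; if the producers can be fixed to emit properly doubled quotes, that is the cleaner solution.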
I have a SQL Server DB called DBX_REV where we have a lot of tables and other objects, etc. I created SSIS jobs to do stuff with the tables of that DB.
Now we are transitioning to a new company. I definitely want to save all the table definitions, objects, and SSIS jobs that I created within that server. How do I do it?
In other words, how do I script a copy of the "skeleton" of that entire DB in such a way that I can re-create all the objects (tables, etc.) on my local desktop SQL Server instance?
Hi Experts,
Using an SSIS Lookup, I need to match against the target table on:
(See attached screenshot)
01. Invoice Number
02. Invoice Sequence
03. Invoice Line
04. Order Number
05. Order Line
06. Order Release Number
07. SiteID

Then, for records that do not match, I need to add them to the target table (INSERT the records unmatched on all 7 lookup fields above).

Can someone let me know, looking at the screenshot, whether this is the correct way to match? I have used Partial cache mode.
Hello community,

I have Googled this topic to death and I cannot find the solution to my problem.  Here is my setup:

Database Server 1:
SQL Server 2014 Developer
Windows Server 2012 R2 Std

Backup Server 1:
SQL Server 2014 Developer
Windows Server 2012 R2 Std

Backup Server 2:
SQL Server 2014 Developer
Windows Server 2016 Std Eval

On Database Server 1, I have 3 databases: DB1, DB2, DB3.

I have created the following script to back each one up (one at a time) to one of the backup servers:

Mapping Drives:
exec xp_cmdshell 'net use P: /delete'
exec xp_cmdshell 'net use Q: /delete'
exec xp_cmdshell 'net use P: \\\Backup_F'
exec xp_cmdshell 'net use Q: \\\Backup_G'
exec xp_cmdshell 'dir P:'
exec xp_cmdshell 'dir Q:'


Actual Backup:
DECLARE
    @today nvarchar(20),
    @DB_to_backup nvarchar(128) = 'DB1',
    @PATH_FILE1 nvarchar(128) = 'P:\SS_BKP1',
    @PATH_FILE2 nvarchar(128) = 'Q:\SS_BKP2',
    @backup_type nvarchar(128) = 'diff'; /* Specify 'full' or 'diff' */

SELECT @today = SUBSTRING(REPLACE(REPLACE(REPLACE(CONVERT(datetime2, GETDATE(), 120), '-', ''), ':', ''), ' ', '-'), 0, 16);

DECLARE
    @DISK1 nvarchar(256) = @PATH_FILE1 + '\' + @DB_to_backup + '\' + @DB_to_backup + '_' + @today + '_' + @backup_type + '_MTS4Mb_BufferCount22_file1_of_2.bak',
    @DISK2 nvarchar(256) = @PATH_FILE2 + '\' + @DB_to_backup + '\' + @DB_to_backup + '_' + @today + '_' + @backup_type + '_MTS4Mb_BufferCount22_file2_of_2.bak';
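The nested REPLACE/SUBSTRING expression that builds @today is easier to see in plain terms: style 120 of CONVERT yields 'yyyy-mm-dd hh:mi:ss'; removing '-' and ':' and turning the space into '-' leaves 'yyyymmdd-hhmmss', and SUBSTRING(..., 0, 16) keeps its first 15 characters. A Python rendering of the same stamp, for illustration:

```python
from datetime import datetime

def backup_stamp(now=None):
    """Equivalent of the T-SQL @today expression: 'yyyymmdd-hhmmss' (15 chars)."""
    now = now or datetime.now()
    return now.strftime("%Y%m%d-%H%M%S")
```

Embedding the stamp in the file name keeps successive full and diff backups from overwriting each other.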




I have an SSIS package that has a single step to run a PowerShell script.

When I execute the SSIS package via Visual Studio there are no issues. But when I run the SSIS package via a job on the SQL instance, it completes successfully (no errors), yet no file is output by the script.

Any assistance is welcome, see the attached screenshot for the SQL job configuration.

I am thinking it could be a permissions issue, but I checked the SQL Agent service account and it has permissions on the folder the PowerShell script is writing to.

Thank you.

Hello Experts,
I am trying to run an SSIS package using Visual Studio 2015. When I execute it with Start Debugging, the package runs just fine. But when run with Start Without Debugging, it fails with the error message below. Any idea what is going on? Please try to help. Thank you in advance.

Error Message:  The task has failed to load. The contact information for this task is ""

Thank you!
Running a Data Flow Task in a package through SSIS debug mode (Visual Studio SSDT 2017) retrieves more rows than calling the package from a SQL Server 2016 Agent job.
It's not a rights issue, as both run the exact same query and yet return different results.
The data retrieved by the query is in the same local SQL Server database.
Anybody have a clue how this is possible? I ran it multiple times with the exact same discrepancy.
If I run the query in a SQL Server Management Studio query window, it gets the same results as the query run through debug within VS.
Somehow the SQL Server Agent job calling the package runs the same query but returns fewer rows.

I have an SSIS package in which the tasks are:

1. TASK 1: log to a file that the package started.
2. TASK 2: email task
4. TASK 4: email a report

Problem is, if the email server gateway is down, then TASK 2 fails. However, I want TASK 3 to run regardless, and I can do this by creating a precedence constraint arrow from TASK 1 to TASK 3. Problem is, how would I be notified that the email server is down, so that when email is working again I'd know when I can run TASK 4 (the report)?

I have a bunch of files that I need to copy, but the issue is that the file name is the same while the extensions are something like .txt0, .txt1, .txt2, etc. So how can I create an SSIS package that could loop through these files and copy them?
In my quest to create a poor man's address correction routine, I created about 25 stored procedures similar to the one below. The only thing that changes is the strings used in the search. I have wrapped all these SPs into a VS program plus an SSIS job. What I just discovered is that the only changes that stick are the ones in the last SP that is called. If I run them individually, the changes stick. I have a COMMIT TRAN at the end of each one. What must I change to get the changes to stick as each SP is called? It must have something to do with all these SPs being called inside one session.

CREATE PROCEDURE [dbo].[apd_UpdateAddressesCASESB02]
AS

DECLARE @TranCount int
DECLARE @ErrMsg nvarchar(4000), @ErrSeverity int

SET @TranCount = @@TRANCOUNT

IF @TranCount = 0

 FROM [dbo].[cases] mt
 ) AS chg01
 ) AS chg02

I'll be loading hundreds of XML files into SQL Server and I am wondering what the best solution out there is. I have attached a sample XML file. Is SSIS a better solution to automate this process, or are there any third-party tools we can use to complete this project?
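For this kind of load the built-in routes are SSIS's XML Source inside a Foreach File loop, or T-SQL with OPENROWSET(BULK ...) plus the nodes() method, before reaching for a third-party tool. Whatever the vehicle, the shape of the work is the same: flatten each document into rows for a bulk insert. A minimal sketch of that flattening (the row tag name is an assumption; the sample file attached to the question defines the real structure):

```python
import xml.etree.ElementTree as ET

def rows_from_xml(xml_text, row_tag):
    """Flatten one XML document into a list of {tag: text} dicts,
    one dict per repeating row element."""
    root = ET.fromstring(xml_text)
    return [{child.tag: child.text for child in row}
            for row in root.iter(row_tag)]
```

Run per file over the whole folder, this produces the row sets a bulk insert (or an SSIS OLE DB Destination) would consume.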
