
Solved

SQL Job used to work - now fails

Posted on 2011-05-05
Medium Priority
364 Views
Last Modified: 2012-05-11
I have a job that has been working for a year now. It imports a file into a table, then runs an UPDATE statement that goes against MULTIPLE tables...

The import table has now grown (>100K records), and now I'm getting "The package execution returned DTSER_FAILURE (1)" after about 8 hours...

Any ideas?
Question by:quigleyryan
7 Comments
 
LVL 143

Expert Comment

by:Guy Hengel [angelIII / a3]
ID: 35699165
That usually means you are missing indexes for the UPDATEs...
Having an index on the joining fields is the way to go.
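To illustrate the point about indexing the joining fields, here is a sketch. The table and column names are placeholders, since the original job's schema isn't shown; substitute the actual column(s) the UPDATE joins on:

```sql
-- Hypothetical names: assume the UPDATE joins the import table to the
-- target tables on CustomerID. Without an index, every joined row can
-- force a scan of the 100K-row import table.
CREATE NONCLUSTERED INDEX IX_ImportStaging_CustomerID
    ON dbo.ImportStaging (CustomerID);
```

Compare the execution plan before and after; the join should switch from a table scan to an index seek.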
 
LVL 5

Expert Comment

by:simonpaul64
ID: 35699504
If the process used to be fast but has become (gradually or suddenly) slow, then yes, indexes could be a problem (because the updates could run so slowly that something somewhere times out). However, you can perform updates without indexes just fine, and 100K rows isn't huge.

You could try splitting the imported data into 3 or 4 different files and loading each individually to see if the volume is causing the problem or whether you have some other issue.
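An alternative to splitting the source file is to batch the UPDATE itself, which similarly isolates whether volume is the issue. A sketch, with illustrative table/column names (the real ones aren't in the thread):

```sql
-- Process the update in chunks so the transaction log and lock footprint
-- stay small; if small batches succeed where the single big UPDATE hangs,
-- volume (or log growth/locking) is the likely culprit.
DECLARE @rows int = 1;
WHILE @rows > 0
BEGIN
    UPDATE TOP (10000) t
    SET    t.SomeColumn = s.SomeColumn
    FROM   dbo.TargetTable t
    JOIN   dbo.ImportStaging s ON s.KeyCol = t.KeyCol
    WHERE  t.SomeColumn <> s.SomeColumn;   -- skip rows already updated

    SET @rows = @@ROWCOUNT;
END
```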
 
LVL 9

Expert Comment

by:radcaesar
ID: 35699920
Check your Windows event log for error details.
 
LVL 35

Expert Comment

by:David Todd
ID: 35700890
 

Author Comment

by:quigleyryan
ID: 35701334
radcaesar - the event log only says the job failed: EventID = 12291, Qualifiers = 16385.
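Event 12291 is just SQL Server Agent's generic "job failed" entry; the per-step error text lives in msdb. A query along these lines should surface it (the job name below is a placeholder):

```sql
-- Pull the step-level history for the failing job; the message column
-- usually contains the actual package/step error, unlike the event log.
SELECT  j.name, h.step_id, h.step_name, h.run_status,
        h.run_date, h.run_time, h.message
FROM    msdb.dbo.sysjobhistory AS h
JOIN    msdb.dbo.sysjobs       AS j ON j.job_id = h.job_id
WHERE   j.name = 'YourImportJob'        -- placeholder name
ORDER BY h.instance_id DESC;
```

run_status 0 = failed; look at the most recent rows for the step that reported DTSER_FAILURE.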
 
LVL 35

Expert Comment

by:David Todd
ID: 35704276
Hi,

Maybe you need to test the package - DTS or SSIS - and see if you can find which step fails.

Given you are on SQL 2008, why not wrap the ETL procedures in a BEGIN TRY ... END TRY / BEGIN CATCH ... END CATCH pair, and simply log everything that goes through the CATCH block? I call a procedure to do this for me, passing the name of the procedure I'm calling from and a note, which is generally empty.
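A minimal sketch of that TRY/CATCH logging pattern; the step and logging procedure names are illustrative, not from the original job:

```sql
BEGIN TRY
    EXEC dbo.usp_ImportStep1;           -- one of the ETL steps (placeholder)
END TRY
BEGIN CATCH
    -- Hypothetical logging proc: records which procedure failed and why.
    EXEC dbo.usp_LogError
         @ProcName = 'usp_ImportStep1',
         @Note     = '',
         @Message  = ERROR_MESSAGE();   -- ERROR_NUMBER(), ERROR_LINE() also useful

    -- Re-raise so the SSIS step / Agent job still reports failure
    -- (RAISERROR, since THROW is not available on SQL 2008).
    DECLARE @msg nvarchar(2048) = ERROR_MESSAGE();
    RAISERROR(@msg, 16, 1);
END CATCH;
```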

Then it's just a matter of looking at the log to see what went wrong and where.

HTH
  David
 
LVL 21

Accepted Solution

by:
Alpesh Patel earned 2000 total points
ID: 35704431
Please optimize the SQL code in the SSIS package.
