Solved

Troubleshoot Job that continues to run into the next day

Posted on 2011-02-24
193 Views
Last Modified: 2012-05-11
I have a job that references an SSIS package which inserts a batch of data from a file into a table with over 100 million rows.  I've performance-tuned the insert statement and watched it take less than 2 seconds to run per file (sometimes as much as 10 seconds for a file with a large amount of data).

I run the job at 1:00 AM, and for the past week I've come in the next day to find the job still running.  But when I run the job manually DURING THE DAY WHEN DB USE IS HIGHER, I watch it complete in about 1-2 hours.

I don't really have any other jobs running at night (that aren't also running during the day), and none of them hit that table.  The only exception I can think of is that I back up the transaction logs every 2 hours.  I also have a scheduled task that moves a backup of the DB and all the transaction log backups to an external drive at 2 AM.  Perhaps all the disk I/O is preventing the job from running successfully and it just gets bogged down until I manually restart it?

Question 1:  Should I stop the transaction log backups at night?
Question 2:  Where would I begin trying to figure out what is happening to this job at night?  Is there any history stored somewhere that would show what delays it faces?
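For question 2: assuming the nightly run goes through SQL Server Agent, Agent keeps per-run history in msdb. A query along these lines (the job name is a placeholder) shows start times, durations, and step messages for recent executions, which should reveal whether the job starts late, stalls on one step, or simply runs slowly end to end:

```sql
-- Recent run history for a job; 'MyNightlyImport' is a hypothetical name.
-- run_duration is an int packed as HHMMSS (e.g. 13542 = 1h 35m 42s).
SELECT j.name,
       h.step_id,
       h.step_name,
       h.run_date,
       h.run_time,
       h.run_duration,
       h.run_status,      -- 1 = succeeded, 0 = failed
       h.message
FROM msdb.dbo.sysjobhistory AS h
JOIN msdb.dbo.sysjobs AS j
  ON j.job_id = h.job_id
WHERE j.name = 'MyNightlyImport'
ORDER BY h.run_date DESC, h.run_time DESC;
```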

Question by:davidcahan
8 Comments
 
LVL 39

Accepted Solution

by:
lcohan earned 350 total points
ID: 34970470
Here's what I would do:

1. Don't touch the backup jobs until you've exhausted all other options.
2. Check the physical location of the job's input file, the database T-log, the mdf/ndf files, and tempdb. Ideally they should all be on different drives for best I/O.
3. Check the batch size of your import; it should be somewhere in the SSIS step properties. Sometimes a lower batch size makes it faster, sometimes a higher one, but do not go beyond 10,000 rows per batch. All of this depends on the constraints/clustered indexes on the destination table.
4. You should add multiple data files to tempdb (one per socket, but no more than 4) to improve SQL performance. I'll try to find the Microsoft link to support this statement.
5. Check/make sure no virus scan is running against your SQL files and input file, assuming all data going into your DB is already scanned at the app level and the input file was checked.
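Point 4 above can be sketched in T-SQL. The file names, paths, and sizes here are placeholders to adapt to the server; the one rule worth keeping is equal sizes so proportional fill spreads work evenly across the files:

```sql
-- Add extra tempdb data files (names, paths, and sizes are assumptions).
-- Keep all tempdb data files the same size and growth increment.
ALTER DATABASE tempdb
ADD FILE (NAME = tempdev2,
          FILENAME = 'D:\SQLData\tempdb2.ndf',
          SIZE = 1024MB, FILEGROWTH = 256MB);

ALTER DATABASE tempdb
ADD FILE (NAME = tempdev3,
          FILENAME = 'D:\SQLData\tempdb3.ndf',
          SIZE = 1024MB, FILEGROWTH = 256MB);
```

The new files are used immediately; a restart is only needed if you later want to shrink or remove one that holds active pages.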
 
LVL 6

Expert Comment

by:Rajesh_mj
ID: 34977454
It could be a message box popping up from the SSIS package during the night run. It might be accessing some files and hitting an error, and a script task may be throwing the error as a message box; or a message box could be appearing for some other reason. Since nobody is at the console at night, the package would sit waiting on it.
 

Author Comment

by:davidcahan
ID: 34984774
@lcohan

1,2,3,5 are good to go

4 -- I would have to double check

But the strange thing is that when I run it either manually (by right-clicking the job and starting it) or via the debugger in VS, it runs without any issues.  Granted, I have not tried running it at the same time the job is normally scheduled.

 

Author Comment

by:davidcahan
ID: 34984825
@lcohan

also, I only have 2 "drives" to split between.  Data is on D:, logs on C:.  Does it actually make sense to reverse that for tempdb?  That way the primary DB data stays on D: except for tempdb, which goes on C:, so operations on the primary DB that need tempdb aren't hitting the same drive?
 
LVL 75

Assisted Solution

by:Anthony Perkins
Anthony Perkins earned 150 total points
ID: 34985623
>>I'll try find the Microsoft link to support this statement.<<
I have got it:
Myth #12: tempdb should always have one data file per processor core.
http://www.sqlskills.com/BLOGS/PAUL/post/A-SQL-Server-DBA-myth-a-day-(1230)-tempdb-should-always-have-one-data-file-per-processor-core.aspx
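If it helps, a quick way to compare the current layout against that guidance (a read-only sketch; it changes nothing) is to put tempdb's data-file count next to the logical CPU count:

```sql
-- Compare tempdb data-file count with the number of logical CPUs.
SELECT
    (SELECT COUNT(*)
     FROM tempdb.sys.database_files
     WHERE type_desc = 'ROWS') AS tempdb_data_files,
    (SELECT cpu_count
     FROM sys.dm_os_sys_info) AS logical_cpus;
```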
 

Author Comment

by:davidcahan
ID: 34992824
@lcohan and @acperkins -- does my theory make sense, that I should put tempdb and my primary DB (the one I'm inserting data into) on different drive letters?  Currently all the mdf's are on one drive and all the ldf's are on another.  I'm thinking that for tempdb only I should switch that.
 
LVL 75

Assisted Solution

by:Anthony Perkins
Anthony Perkins earned 150 total points
ID: 34994122
>> currently all mdf's on are one drive and all ldf's are on another. <<
If you are talking about physical drives and not logical drives, I would leave it that way.
 
LVL 39

Assisted Solution

by:lcohan
lcohan earned 350 total points
ID: 34997503
The only thing I would add: as much as possible, do NOT put tempdb and the T-logs on the same RAID or physical disk. If C:\ and D:\ are just logical volumes on the same physical disk or RAID, then obviously there is not much you can do.
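Before moving anything, it's worth listing where every file actually lives. sys.master_files shows the physical path of each data and log file on the instance; keep in mind the drive letters only identify logical volumes, so you'd still map them to physical disks in Disk Management:

```sql
-- Physical path of every data and log file on the instance.
SELECT DB_NAME(database_id) AS database_name,
       type_desc,                -- ROWS (data) or LOG
       name AS logical_name,
       physical_name
FROM sys.master_files
ORDER BY database_id, type_desc;
```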
