Solved

Question regarding designing ETL process with DTS

Posted on 2008-10-08
7
Medium Priority
501 Views
Last Modified: 2013-11-30
    We have two ideas about how a DTS package should be built for an ETL process. One methodology is to separate the process into multiple packages and then run them via jobs. The other methodology is to create the process in one master package and run that single package as a job.
    Based on experience, which is the more logical design? If you answer, please do not simply choose A or B; identify which approach is best and give supporting reasons (pros and cons) for why one is more efficient than the other.
    Once again, thank you for your ongoing support!
Question by:Kevin_Maillet
7 Comments
 
LVL 4

Accepted Solution

by:
randy_knight earned 668 total points
ID: 22675533
With DTS, I always went with your first option (multiple packages run as jobs), for the following reasons:

1. The DTS ExecutePackage task is a pain to work with.
2. It is much easier to restart a failed job than to go into your master package and disable the steps that have already completed.
3. If you build some metadata tables to handle prerequisites, completion of batches, etc., you can go massively parallel this way.

I could go on and on. SSIS is a different world, but with DTS this is the only way to do it, IMO.
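The metadata-table idea in point 3 can be sketched roughly as follows. This is an illustrative Python/SQLite mock-up of the concept, not DTS code, and the table and column names (BatchStep, StepPrereq, Status values) are hypothetical; in practice the equivalent check would be T-SQL run at the start of each job step.

```python
import sqlite3

# Hypothetical batch-control metadata: which steps exist in a batch,
# their status, and which steps each one depends on.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE BatchStep (
    BatchId  INTEGER,
    StepName TEXT,
    Status   TEXT      -- 'Pending', 'Running', 'Complete', 'Failed'
);
CREATE TABLE StepPrereq (
    StepName   TEXT,
    PrereqName TEXT
);
INSERT INTO BatchStep VALUES (1, 'LoadStaging',     'Complete');
INSERT INTO BatchStep VALUES (1, 'ValidateStaging', 'Pending');
INSERT INTO StepPrereq VALUES ('ValidateStaging', 'LoadStaging');
""")

def ready_to_run(step, batch_id):
    """A step may start only when every prerequisite step in the
    same batch has Status = 'Complete'."""
    (blocked,) = conn.execute("""
        SELECT COUNT(*)
        FROM StepPrereq p
        JOIN BatchStep b
          ON b.StepName = p.PrereqName AND b.BatchId = ?
        WHERE p.StepName = ? AND b.Status <> 'Complete'
    """, (batch_id, step)).fetchone()
    return blocked == 0

print(ready_to_run("ValidateStaging", 1))  # True: LoadStaging is Complete
```

Each job step runs this kind of check before executing its package; independent branches of the dependency graph can then be scheduled in parallel without a master package coordinating them.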

 
LVL 75

Expert Comment

by:Anthony Perkins
ID: 22675597
It really depends on whether you are passing global variables from the parent package to the child and back. If the answer is yes, then you have no choice.
 

Author Comment

by:Kevin_Maillet
ID: 22689823
Are there any performance reasons why one is more effective than the other?
 
LVL 75

Assisted Solution

by:Anthony Perkins
Anthony Perkins earned 664 total points
ID: 22691865
You should not be focusing on performance as the key criterion.  If there are differences, they would be negligible.  Instead, focus on maintenance and on whether the steps are independent of each other.  If they are, then go with separate jobs; if not, then use a master/child DTS package approach.  That is the correct litmus test.
 
LVL 51

Assisted Solution

by:Mark Wills
Mark Wills earned 668 total points
ID: 22696759
Definitely look towards operability and scalability. I do like a modular approach, with each module being its own self-contained package. For that reason, you often see a few packages brought into an ETL process, but in such a way that they can largely run independently of the previous package.

For example: load staging, validate staging, report exceptions, put away staging. These can be four distinct packages; each has a specific job to do and works with whatever data may exist (or does not exist, because of a failure). The scheduler is essentially the glue that coordinates the packages, and that is where there is a sometimes mistaken belief that they all need to know about each other. Certainly there is no point validating data that didn't load, but then the validation step doesn't have to do anything anyway.

Some purists will totally disagree, but the basis for my reasoning is that there is very little performance overhead in running a task that has nothing to do, and there is maximum flexibility within a discrete module. I can change the validation package as much or as little as needed without impact on anything else. More so, I can farm out different jobs to different developers based on skills, experience, and knowledge of the data.

So take an object-oriented perspective of the ETL process and manage each object (package). It flows on from acperkins' statement, "Instead focus on maintenance and if the steps are independent of each other," except that I take it one step further and suggest making the steps independent. It also fits with randy_knight's suggestions, in so much as each individual step can be isolated, worked on, etc.

You do have to design it pretty well, and in the validation process flag errors and/or approved data so that any ensuing steps can take appropriate action. ETL is data driven, so let the data drive the process.
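The data-driven flagging described above can be sketched like this. Again an illustrative Python/SQLite mock-up rather than DTS code: the Staging table, the ValidStatus column, and the "amount must be non-negative" rule are all hypothetical stand-ins.

```python
import sqlite3

# Hypothetical staging table: the validate step flags rows instead of
# deleting them, and later steps are driven entirely by those flags.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Staging (Id INTEGER, Amount REAL, ValidStatus TEXT)")
conn.executemany("INSERT INTO Staging VALUES (?, ?, NULL)",
                 [(1, 10.0), (2, -5.0), (3, 7.5)])

def validate_staging():
    # Flag each row as Approved or Rejected, so exceptions can be
    # reported from the same data later.
    conn.execute("UPDATE Staging SET ValidStatus = CASE "
                 "WHEN Amount >= 0 THEN 'Approved' ELSE 'Rejected' END")

def put_away_staging():
    # Data-driven: if validation never ran (or the load brought in
    # nothing), this finds no 'Approved' rows and is simply a no-op.
    return conn.execute(
        "SELECT Id FROM Staging WHERE ValidStatus = 'Approved'").fetchall()

print(put_away_staging())  # [] -- nothing approved yet, the step does nothing
validate_staging()
print(put_away_staging())  # [(1,), (3,)] -- only approved rows move on
```

Because each step only acts on rows flagged by the step before it, the packages never need to signal each other directly; running a step against empty or unvalidated data costs almost nothing.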
 

Author Comment

by:Kevin_Maillet
ID: 22713570
Thank you for your help, gentlemen.
 
LVL 51

Expert Comment

by:Mark Wills
ID: 22713603
A pleasure. What did you decide to do?
