Solved

Debate on best methods for implementing ETL, data warehousing and n-tier

Posted on 2013-02-05
Last Modified: 2014-05-12
My firm is in the middle of a debate.

We ETL millions of rows from thousands of flat files daily. Some of our databases are quite large (100-500 GB); most are around 100 GB or less.

We update and validate portfolio data for ~30M securities portfolios stored in approximately 50 databases on 5+ servers.

We also have other servers tasked with FTP and ETL, plus our application web servers, all back-ended by SQL Server 2008 R2.

A separate database is implemented for each client, but many clients are on a single database server.

No consolidation of shared data is implemented, so there is some replication of data between databases.

No typical data-warehouse or BI technologies are in place.

There are several layers:
1-2. Flat files are typically downloaded via FTP and read using VB into SQL Server, then processed into a first-stage DB. Text files are then exported and re-imported into a second-stage database for validation logic. Stages 1 and 2 should soon be merged. (A rough T-SQL sketch of this kind of staging load follows after this list.)

3. Data is then posted from stages 1-2 to the client database on a server. Currently stored procedures do a lot of the work, but increasingly the transform/validation/final-posting operations are being moved to VB objects utilizing the .NET cache and lots of RAM. Data is read from the DB, manipulated, and written back, so the DB is just concerned with simple SELECT/UPDATE/INSERT/DELETE SQL.

4. Data maintenance/utility scripts for cleansing are mainly SPs and SQL scripts that are scheduled outside of SQL Server via a home-grown VB app.

5. The application-layer/business (data) layer logic is mixed, but there is movement toward an n-tier architecture and moving business logic out of the database. A lot of complex calculation and query-intensive activity is taking place here.
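
For context, the stage-1/stage-2 load above is done today through VB code. Purely as an illustration of the kind of work involved (the database, table, file, and column names below are made up), an equivalent set-based load expressed in T-SQL would look roughly like this:

    -- Hypothetical sketch only: pull one flat file into a stage-1 table,
    -- then move rows that pass a basic check into the stage-2 database.
    BULK INSERT Stage1.dbo.RawPositions               -- made-up staging table
    FROM '\\ftphost\incoming\positions_20130205.txt'  -- made-up file path
    WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n', FIRSTROW = 2, TABLOCK);

    INSERT INTO Stage2.dbo.PositionsForValidation (AccountId, SecurityId, Quantity, Price)
    SELECT r.AccountId, r.SecurityId, r.Quantity, r.Price
    FROM Stage1.dbo.RawPositions AS r
    WHERE r.SecurityId IS NOT NULL        -- example validation rule
      AND r.Quantity IS NOT NULL;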

I see no problem moving the application to n-tier, but what about the ETL and transform logic?
What is the current thinking on this?

One side says to move the ETL transform and validation logic to this VB middle tier and add lots of RAM to support the .NET caching and the .NET SqlData classes supporting the VB objects, and also to move the business logic to this middle tier.

The other side argues to let the application move to three (or n) tiers BUT push the ETL to the BI stack and add a data-warehouse implementation, so that cubing (to ease calculation in the business layer) and validation could go on during warehouse loading.

In essence: use a BI stack (IBM, Oracle, or MS) for loading, and a three-tier system for the application based on current warehousing technology.
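
To make the warehouse option a bit more concrete (the names below are purely illustrative, not an existing design): "adding a data-warehouse implementation" would mean loading into dimensional structures that a cube can be built over, along the lines of:

    -- Hypothetical star-schema fragment for portfolio positions.
    CREATE TABLE dbo.DimSecurity (
        SecurityKey int IDENTITY(1,1) PRIMARY KEY,
        SecurityId  varchar(20) NOT NULL,   -- business key from the source feeds
        AssetClass  varchar(30) NULL
    );

    CREATE TABLE dbo.DimDate (
        DateKey      int PRIMARY KEY,       -- e.g. 20130205
        CalendarDate date NOT NULL
    );

    CREATE TABLE dbo.FactPosition (
        DateKey     int NOT NULL REFERENCES dbo.DimDate (DateKey),
        SecurityKey int NOT NULL REFERENCES dbo.DimSecurity (SecurityKey),
        ClientId    int NOT NULL,
        Quantity    decimal(18,4) NOT NULL,
        MarketValue decimal(18,2) NOT NULL
    );

Under this option, validation would happen as part of the dimension/fact loads, and the cube would take over the aggregation that the business layer does today.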

Also note that very little insert/update activity is generated by the app.

Your thoughts?
Question by: dthoms000
5 Comments
 
Accepted Solution
by: Aaron Tomosky (LVL 38, earned 168 total points)
If you are dealing with flat files and FTP, getting that stuff into a DB will always be step one. At that point it really depends on your data and your use for it as to how it's organized, but you will probably have transform and load section(s). These could have different hardware requirements than your large accessible DBs.

Creating small SSD local storage or an NFS/iSCSI SAN can be done on the cheap; look at Intel S3700 drives as a starting place. Go with RAID 10 (or mirrored pairs if you do ZFS).

I don't have much personal experience with cubes and BI, but I ETL disparate data into common aggregated databases constantly. My problem is that it's always a little different for different clients, so steps 1-2 are transforming and loading into a common format that the rest of the system can aggregate and work with.

Sorry for the "it depends" answer but it really does depend on your data and use of that data.
 
Assisted Solution
by: AlexPace (LVL 16, earned 166 total points)
I think the development environment, error handling, and logic branching are easier in VB.

However, the "generic programmer logic" method of munging large data sets can be MUCH less efficient (read: slower) than native SQL Server methods.  Also the optimization will be easier on the SQL Server.  Also it is easier to alter a stored procedure than a VB program... you don't need to recompile and redeploy it.

If you are trying to get close to real time, I've had good luck with a Robo-FTP service that constantly polls a remote FTP server and downloads a source file as soon as it becomes available. When the download is complete, it executes a SQL statement that kicks off the bulk import job. If the transfer fails, it will retry automatically and resume where it left off if the remote server supports that.
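
For what it's worth, the SQL it fires off can be as simple as starting an existing SQL Server Agent job (the job name below is a placeholder):

    EXEC msdb.dbo.sp_start_job @job_name = N'Import portfolio flat files';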
 
Assisted Solution
by: ScottPletcher (LVL 69, earned 166 total points)
SSIS has many built-in functions/tools to help with ETL.  To me it would make sense to take advantage of those tools.
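
As a rough sketch of how that could plug into scheduling (the package path and job name below are placeholders, not an actual setup), an SSIS package can be run from a SQL Server Agent job step, so these loads would not need the home-grown VB scheduler:

    -- Hypothetical: an Agent job whose single step runs an SSIS package
    -- (a schedule would be attached separately with sp_add_jobschedule).
    EXEC msdb.dbo.sp_add_job       @job_name = N'Nightly stage load';
    EXEC msdb.dbo.sp_add_jobstep   @job_name = N'Nightly stage load',
                                   @step_name = N'Run staging package',
                                   @subsystem = N'SSIS',
                                   @command   = N'/FILE "D:\ETL\LoadStaging.dtsx"';
    EXEC msdb.dbo.sp_add_jobserver @job_name = N'Nightly stage load',
                                   @server_name = N'(LOCAL)';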
