Solved

12. Brief Summary Description on HOW TO Load Data into a STAR SCHEMA Data Warehouse..then PROCESS Cube?

Posted on 2014-10-16
119 Views
Last Modified: 2016-02-18
Experts:

You have an incremental data set that needs to be loaded into a Star Schema, after which the corresponding SSAS cube needs to be processed. The STAR contains 1 fact, 4 Type 2 dimensions, 1 Time dimension, 3 static dimensions, and 1 junk dimension. You need to be able to account for all incremental data in the STAR.

Walk through the process of loading the data and the steps of processing the SSAS cube.

Please only provide a BRIEF summary of how you would load this Star Schema and then process the Cube.

Thanks!
Question by: MIKE
1 Comment
 
LVL 25

Accepted Solution

by:
jogos earned 500 total points
ID: 40386022
With "incremental load" you must think good about what you realy need. Just add the new data or also be able to update/delete data that has already been loaded. Lookup/conditionals split are the way to branch that path.

If you only add new data and the new rows can be identified (ID > last ID, createddatetime, ...), then it is really good for performance to test for that so you don't evaluate all the old data. But be aware that if corrections can still be made to old data, those corrections won't be picked up anymore.
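As a sketch of that watermark test, assuming a small control table that remembers the last value loaded (all names are illustrative):

-- Read the last watermark that was successfully loaded.
DECLARE @LastLoaded datetime2 =
       (SELECT MAX(LastLoadedDateTime)
        FROM   etl.LoadControl
        WHERE  TableName = 'FactSales');

-- Pull only the rows created after that point, so old data is never re-scanned.
SELECT  *
FROM    src.Sales
WHERE   CreatedDateTime > @LastLoaded;

-- After a successful load, advance the watermark (same package/transaction).
UPDATE  etl.LoadControl
SET     LastLoadedDateTime = (SELECT MAX(CreatedDateTime) FROM src.Sales)
WHERE   TableName = 'FactSales';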

For the dimensions you must evaluate whether the member already exists: if not, insert it; if it does, maybe something has changed (the address of a customer), which for your Type 2 dimensions means expiring the current row and inserting a new version.
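For the Type 2 dimensions that could look roughly like this in T-SQL (two steps: expire the current row whose tracked attribute changed, then insert the new versions; again, placeholder names):

-- Step 1: expire the current row where a tracked attribute (e.g. Address) changed.
UPDATE  d
SET     d.IsCurrent = 0,
        d.ValidTo   = SYSDATETIME()
FROM    dim.Customer AS d
JOIN    stg.Customer AS s
       ON s.CustomerBK = d.CustomerBK
WHERE   d.IsCurrent = 1
  AND   s.Address <> d.Address;            -- add every attribute you track as Type 2

-- Step 2: insert a new current version for brand-new and just-expired members.
INSERT  dim.Customer (CustomerBK, CustomerName, Address, ValidFrom, ValidTo, IsCurrent)
SELECT  s.CustomerBK, s.CustomerName, s.Address, SYSDATETIME(), '9999-12-31', 1
FROM    stg.Customer AS s
LEFT JOIN dim.Customer AS d
       ON d.CustomerBK = s.CustomerBK
      AND d.IsCurrent  = 1
WHERE   d.CustomerSK IS NULL;              -- no current row exists for this business key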

An example of an incremental load
http://vsteamsystemcentral.com/cs21/blogs/applied_business_intelligence/archive/2007/05/21/ssis-design-pattern-incremental-loads.aspx
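
Once the dimensions are up to date, the fact rows from the incremental extract can be loaded by resolving the surrogate keys against them; a minimal sketch, with purely illustrative names:

-- Load the fact with surrogate keys looked up from the current dimension rows.
INSERT  fact.Sales (CustomerSK, DateSK, ProductSK, SalesAmount)
SELECT  dc.CustomerSK,
        dd.DateSK,
        dp.ProductSK,
        s.SalesAmount
FROM    stg.Sales AS s
JOIN    dim.Customer AS dc ON dc.CustomerBK   = s.CustomerBK AND dc.IsCurrent = 1
JOIN    dim.Date     AS dd ON dd.CalendarDate = CAST(s.CreatedDateTime AS date)
JOIN    dim.Product  AS dp ON dp.ProductBK    = s.ProductBK;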
