Incremental load after first initial Load

Posted on 2013-06-05
Medium Priority
Last Modified: 2016-02-11
Hi experts,

Here's my concern. I initially loaded a huge amount of data from the source, which took about 2 hours. I also did a lookup on some columns to compare against the source so that only new rows are loaded. My concern is that the lookup will still fetch all the records from the source, take the same 2 hours, and then load the new records. Is there a way to bypass the previously loaded rows and check just the new incoming rows in the source, so that the incremental load is much faster? Also, there's no unique identifier like CreatedDate or ModifiedDate on the source. Attached is my data as well.

Thanks in advance.
Question by:Josh2442
LVL 35

Expert Comment

ID: 39225317
does the data come from a SQL Server database? or is it a .csv or .xlsx file you read?

a csv should be much smaller (and therefore faster) and could be compressed.

also, you could read it in big chunks rather than record by record.

you could make a csv out of the xlsx by automation, or out of the database, and you could add some kind of load ID / load timestamp that could help with the incremental load.
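A minimal sketch of the chunked-read-plus-load-timestamp idea above, in Python (the file layout and column names are made up for illustration; in SSIS you'd tune the data flow buffer sizes instead):

```python
import csv
from datetime import datetime, timezone
from itertools import islice

def load_in_chunks(path, chunk_size=10_000):
    """Read a CSV in fixed-size chunks, stamping every row with the
    same load timestamp so each incremental batch is identifiable later."""
    load_ts = datetime.now(timezone.utc).isoformat()
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        while True:
            chunk = list(islice(reader, chunk_size))  # big chunks, not row by row
            if not chunk:
                break
            for row in chunk:
                row["load_ts"] = load_ts
            yield chunk
```

Each yielded chunk can then be bulk-inserted into a staging table in one round trip.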

LVL 21

Expert Comment

by:Jason Yousef, MS
ID: 39225329
so your source is an Excel file? or a DB?
and how many records did you load in those 2 hours?

typically you would use a unique ID or a date column from the source and compare it against your load. Also, if your source is SQL Server you could use Change Tracking, or, if you're on SQL Server Enterprise Edition, CDC (Change Data Capture).

if none of these is applicable to you, then optimize the package that loads from the Excel file: load the rows into a staging table, apply CDC or Change Tracking there, or use a Merge Join, Lookup, Slowly Changing Dimension, or even straight T-SQL to do the upserts.

let me know if you need more info
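To make the staging-table upsert idea concrete, here is a hedged Python sketch of the logic (in practice this would be a T-SQL MERGE or the SSIS components listed above; the key and row shapes are made up):

```python
def upsert(target: dict, staging: list[dict], key: str = "id") -> dict:
    """Apply staged rows to the target: update rows whose key already
    exists, insert the rest. Mimics what a MERGE between a staging
    table and the destination table does."""
    inserted = updated = 0
    for row in staging:
        k = row[key]
        if k in target:
            if target[k] != row:     # only count real changes as updates
                target[k] = row
                updated += 1
        else:
            target[k] = row
            inserted += 1
    return {"inserted": inserted, "updated": updated}
```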
LVL 19

Expert Comment

ID: 39225342
Hi Josh2442,

it would be difficult to do without a primary key. Are you sure there is none? Can you ask for additional identifiers to be included in the load?

You could try the CHECKSUM function. If you create an indexed computed column, you can compare the checksum of incoming data against what has already been inserted (note that CHECKSUM can produce collisions, so the match is approximate). Check this script for illustration:
--Create sample table
CREATE TABLE dbo.YourTable (
	SomeNumber INT,
	SomeString VARCHAR(MAX),
	RowCheckSum AS CHECKSUM(SomeNumber, SomeString)
)

--Create index on checksum
CREATE UNIQUE NONCLUSTERED INDEX [IX_RowCheckSum] ON [dbo].[YourTable]([RowCheckSum] ASC)

--Insert some test data (column list required because of the computed column)
INSERT INTO YourTable (SomeNumber, SomeString) VALUES (1,'A'),(2,'B'),(3,NULL)

--Prepare some more
CREATE TABLE #new_data (
	SomeNumber INT,
	SomeString VARCHAR(MAX)
)
INSERT INTO #new_data VALUES (3,NULL),(4,'New Data')

--Try to insert it. Only one row should be inserted
INSERT INTO YourTable (SomeNumber, SomeString)
SELECT SomeNumber, SomeString
FROM #new_data
WHERE CHECKSUM(SomeNumber, SomeString) NOT IN (SELECT RowCheckSum FROM YourTable)




Author Comment

ID: 39225373
The source is another database, and I cannot do any DML on that source (such as turning identity insert on). Have you heard of Cache Monitor? It uses CSQL, which is somewhat similar to T-SQL but not entirely; that's where the source is coming from. I'm loading it into regular SQL Server, which I work with through SSMS.
LVL 19

Expert Comment

ID: 39225587
Can you try the CHECKSUM approach then?

Author Comment

ID: 39226135
Hi expert, sorry, I didn't analyze the data properly. When I went through the data again, I found that the RowID is incrementing, but in a delimited format. So let me know how to do the insert based on that RowID. My approach was to compare the max RowID in the destination against the source and load the rows greater than that max. But it's really hard to figure out how the data increments in that delimited format. Here's how it increments: I have listed how the first three numbers change from 209 to 210 in the RowID.
LVL 21

Accepted Solution

Jason Yousef, MS earned 2000 total points
ID: 39226159
Yeah, I was also wondering about that when I saw the column! Anyway, you can write the latest RowID on your destination, along with the load date, to a control table.
Then in your ETL, read that RowID into a variable and load only rows greater than that RowID.

Or use the Lookup transformation based on the RowID.
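A minimal sketch of that control-table / high-water-mark pattern in Python (names are made up; in SSIS this would be an Execute SQL Task that reads the watermark into a variable plus a source query filtered on it):

```python
def incremental_load(source_rows, destination, control):
    """Load only rows whose row_id is above the stored watermark.

    `control` plays the role of the control table (last loaded row_id);
    `destination` stands in for the target table."""
    watermark = control.get("last_row_id", 0)
    new_rows = [r for r in source_rows if r["row_id"] > watermark]
    destination.extend(new_rows)
    if new_rows:
        # advance the watermark so the next run skips these rows
        control["last_row_id"] = max(r["row_id"] for r in new_rows)
    return len(new_rows)
```

Running it twice against the same source loads the new rows once and then nothing, which is exactly the incremental behavior wanted here.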

Author Comment

ID: 39229297
Hi expert,
I was able to do the incremental load using that RowID, but I have a question. What if someone changes a particular column within the source? My incremental load looks only for the max RowID and inserts the new rows, but I'd need to do an update or delete based on the changed column, which will obviously be in a row below that max RowID. Does it have to fetch the whole record set?

Author Comment

ID: 39233460
Hi expert,
I really need a big favor. I'm stuck at the ADO.NET SQL command. Since the source is ADO.NET, I can't directly pass a parameter like in an OLE DB SQL command. I have created two variables, MaxDate and MinDate; MinDate is 60 days less than MaxDate. How do I write a query where a column is between these two variables? I need this query as an ADO.NET SQL command in the SSIS Expression Builder.
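The usual workaround for the ADO.NET source is to build the whole SQL statement as a string in an expression (set on the Data Flow Task's [ADO NET Source].[SqlCommand] property), concatenating the date variables into it. A hedged Python sketch of the equivalent string construction (table and column names are made up; the SSIS expression would do the same concatenation with (DT_WSTR, ...) casts):

```python
from datetime import date, timedelta

def build_query(max_date: date, window_days: int = 60) -> str:
    """Compose the date-range query string from MaxDate and a
    MinDate derived as 60 days earlier, the way an SSIS expression
    on the ADO.NET source's SqlCommand property would."""
    min_date = max_date - timedelta(days=window_days)
    return (
        "SELECT * FROM SourceTable "
        f"WHERE SomeDateColumn BETWEEN '{min_date:%Y-%m-%d}' "
        f"AND '{max_date:%Y-%m-%d}'"
    )
```

Note this is plain string concatenation, so it should only be used with trusted, typed variables (dates here), never raw user input.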


Question has a verified solution.
