Ali Shah (United Kingdom) asked:
How to select top(N) records automatically from a huge data set

Hi guys,

I need to do ETL on 29.5 million records via SSIS. Is it possible to break the load into batches: select the top (N) records, process them, and then automatically select the next top (N) until all records are transferred?

Regards,
Pawan Kumar (India):

Try the block below. It fetches the number of records you want from each page.

--You can use below code

DECLARE @NoOfRecords AS INT = 200;
DECLARE @WhichPage   AS INT = 1;

;WITH CTE AS
(
	SELECT *, ROW_NUMBER() OVER (ORDER BY id DESC) AS rnk
	FROM [yourTableName]
)
SELECT [cols...]
FROM CTE
WHERE rnk >= ((@WhichPage - 1) * @NoOfRecords) + 1
  AND rnk <=   @WhichPage      * @NoOfRecords;

--



Hope it helps !!
Ali Shah (Asker):
Thank you. Can you please explain how I implement this in SSIS? How do I change the @NoOfRecords value dynamically? Could you give an example? Sorry, I am relatively new to SSIS.
Create a SQL procedure with the parameters below:

DECLARE @NoOfRecords AS INT = 200
DECLARE @WhichPage AS INT = 1

Then create a For Loop, pass dynamic values to these parameters, fetch the data, and process it.

Hope it helps !
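A minimal sketch of the procedure described above, wrapping the paging query. The procedure name, table name, and [id] column are placeholders; substitute your own schema:

```sql
-- Sketch only: usp_GetBatch, [yourTableName] and [id] are placeholder names.
CREATE PROCEDURE usp_GetBatch
    @NoOfRecords INT,
    @WhichPage   INT
AS
BEGIN
    SET NOCOUNT ON;

    ;WITH CTE AS
    (
        SELECT *, ROW_NUMBER() OVER (ORDER BY id DESC) AS rnk
        FROM [yourTableName]
    )
    SELECT *
    FROM CTE
    -- page 1 returns rows 1..@NoOfRecords, page 2 the next batch, and so on
    WHERE rnk >= ((@WhichPage - 1) * @NoOfRecords) + 1
      AND rnk <=   @WhichPage      * @NoOfRecords;
END
```

In SSIS you would call this from an Execute SQL Task or OLE DB Source (e.g. `EXEC usp_GetBatch ?, ?`), map package variables to the two parameters, and increment the page variable inside the For Loop container.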
Right, sorry to be a pain, but as I said I am relatively new to SSIS. As far as I have read, the For Loop iterates a known number of times.
Also, I forgot to mention that I am using a Script Task to do lots of operations on each row returned by the stored procedure.
How do I configure the For Loop so that, for example, the Script Task brings back the first 1,000 rows, processes them, and then brings back the next 1,000 rows until all 29.5 million records are processed?
ASKER CERTIFIED SOLUTION
Pawan Kumar (India)
(Solution body available to Experts Exchange members only.)
Vitor Montalvão:
Can you explain more about the process?
Are you trying to import data from a file or from another database?
Is there any transformation task during the import process?
Why do you think you need this to be performed in batches? And why so small batches (1000)?
Consider using bcp; you can then set the batch_size parameter to control the number of rows committed per batch.
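For illustration, the in-database equivalent of bcp's batch size (the `-b` switch on the command line) is the BATCHSIZE option of BULK INSERT. The file path, delimiters, and staging table below are invented placeholders:

```sql
-- Sketch only: loading a flat-file export in 10,000-row batches.
-- dbo.StagingAddresses and the file path are placeholder names.
BULK INSERT dbo.StagingAddresses
FROM 'C:\exports\addresses.dat'
WITH
(
    FIELDTERMINATOR = '|',
    ROWTERMINATOR   = '\n',
    BATCHSIZE       = 10000,  -- commit every 10,000 rows, like bcp -b 10000
    TABLOCK
);
```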
Hi shah36,

Any update on this?

Regards,
Pawan
Hi Vitor,

I had to process each single record to break it down into different normalized tables, and there were also some calculations that needed to be performed for each record.

All the records are imported from a database, and there are 29.77 million records, so that's why I wanted to bring back a chunk at a time. I agree 1,000 is small, but our server resources are not great, so I limited it to 10,000 records instead.
Hi Pawan,

Thanks a lot. I have used your idea to build the loop and the counter, and I also stored the last id of each batch in a table so that the next iteration of the loop selects the next 10,000 records.

regards
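The last-id approach described above (keyset paging) can be sketched as below. The control table dbo.BatchProgress, its column, and [yourTableName] are placeholder names:

```sql
-- Sketch of keyset paging: remember the last id processed so the next
-- iteration starts where the previous one stopped.
DECLARE @LastId INT;
SELECT @LastId = COALESCE(MAX(LastProcessedId), 0)
FROM dbo.BatchProgress;

SELECT TOP (10000) *
FROM [yourTableName]
WHERE id > @LastId
ORDER BY id ASC;

-- After the Script Task finishes the batch, record the new high-water mark:
-- UPDATE dbo.BatchProgress SET LastProcessedId = @NewLastId;
```

Unlike ROW_NUMBER paging, this does not rescan the rows of earlier pages, so later batches stay fast.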
shah36, how long did it take for the 29.77 million rows?
I just have the impression that you reinvented the wheel, since tools like bcp (mentioned by Anthony Perkins) already do that, and with better performance.
Ah, it took over 30 hours to process the records. I am quite new to SSIS, so still learning. I am going to see how I could use bcp in the future.

Thanks a lot for your help.
bcp uses Bulk Insert, so it should be much faster. And when I say much faster, it's not like it will reduce the time by just one or two hours. I'm almost sure it would finish in 10%-20% of the time you spent.
Hi Vitor,

I have just looked at how to use BCP. I might be misunderstanding, but it seems to apply only when there are no transformations involved.
In my case I get an address result like:

Halloway Ltd. The Kinder Gardens 14 Macfarren Street, Aston, Birmingham B12 4AG, England



The requirement is to break this string into normalised tables: Halloway Ltd goes into a separate table, The Kinder Gardens into another, 14 into another, Macfarren Street into another, Aston into another, and the same for Birmingham, B12 4AG and England.
Obviously it first checks whether a record already exists, then updates, else inserts.
We also need to find all the postcodes within a 3-mile radius of B12 4AG and create a point in the database for mileage calculations later.
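The "update if it exists, else insert" check can be expressed as a single MERGE, sketched here for one of the normalized tables. The table and column names (dbo.Town, TownName, UpdatedAt) are invented for illustration:

```sql
-- Sketch only: upsert one town value; dbo.Town and its columns are
-- placeholder names for one of the normalized target tables.
MERGE dbo.Town AS t
USING (SELECT 'Aston' AS TownName) AS s
    ON t.TownName = s.TownName
WHEN MATCHED THEN
    UPDATE SET t.UpdatedAt = SYSUTCDATETIME()
WHEN NOT MATCHED THEN
    INSERT (TownName) VALUES (s.TownName);
```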

Is this all possible using BCP?

Regards,

Ali
BCP is only for bulk operations, so when you have transformations, those tasks need to be performed afterwards. If you can't perform them afterwards, then of course BCP can't be used.
If you have multiple things to validate/process, or multiple transformations, you can use SSIS or create a SQL procedure and call it from SSIS.