I have a script component that I'm passing about 60,000 records through. One of the fields is a DT_TEXT field; when I hand that field off to the script, I convert it to Unicode, as you can see in the script below.
My problem is that after around 50,000 records have passed through the pipeline, I get this error:
at ScriptComponent_f389ae70f7634d74ab65be5acee6a369.ScriptMain.Input0_ProcessInputRow(Input0Buffer Row)
at ScriptComponent_f389ae70f7634d74ab65be5acee6a369.UserComponent.Input0_ProcessInput(Input0Buffer Buffer)
at ScriptComponent_f389ae70f7634d74ab65be5acee6a369.UserComponent.ProcessInput(Int32 InputID, PipelineBuffer Buffer)
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32 inputID, PipelineBuffer buffer)
I am not very familiar with script components, but this one seems to be working, and my best guess is that the problem is a buffer issue.
I've played with the DefaultBufferMaxRows and DefaultBufferSize options but can't seem to get past this.
Any advice is much appreciated. I'm pasting my script below for reference.
' Microsoft SQL Server Integration Services user script component
' This is your new script component in Microsoft Visual Basic .NET
' ScriptMain is the entrypoint class for script components
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper
Imports System.Text

Public Class ScriptMain
    Inherits UserComponent

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        'Dim blobData() As Byte = Row.FolioLegalDescription.GetBlobData(0, CInt(Row.FolioLegalDescription.Length))
        Dim inLegal As String = System.Text.Encoding.Unicode.GetString(Row.FolioLegalDescription.GetBlobData(0, CInt(Row.FolioLegalDescription.Length)))
        Dim inLegal2, inLegal3, inLegal4, inLegal5, inLegal6 As String
        inLegal = inLegal.Substring(1, inLegal.Length - 2)
        inLegal = Replace(inLegal, Chr(10), "")
        inLegal = Replace(inLegal, Chr(13), "")
        inLegal = Replace(inLegal, Chr(9), "")
        If inLegal.Length > 20000 Then
            inLegal6 = inLegal.Substring(20000, inLegal.Length - 4000)
            inLegal = inLegal.Substring(0, 20000)
        End If
        If inLegal.Length > 16000 Then
            inLegal5 = inLegal.Substring(16000, inLegal.Length - 4000)
            inLegal = inLegal.Substring(0, 16000)
        End If
        If inLegal.Length > 12000 Then
            inLegal4 = inLegal.Substring(12000, inLegal.Length - 4000)
            inLegal = inLegal.Substring(0, 12000)
        End If
        If inLegal.Length > 8000 Then
            inLegal3 = inLegal.Substring(8000, inLegal.Length - 4000)
            inLegal = inLegal.Substring(0, 8000)
        End If
        If inLegal.Length > 4000 Then
            inLegal2 = inLegal.Substring(4000, inLegal.Length - 4000)
            inLegal = inLegal.Substring(0, 4000)
        End If
        Row.Legal1 = inLegal
        Row.Legal2 = inLegal2
        Row.Legal3 = inLegal3
        Row.Legal4 = inLegal4
        Row.Legal5 = inLegal5
        Row.Legal6 = inLegal6
    End Sub
End Class
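One detail worth noting about Substring as used above: String.Substring(startIndex, length) throws an ArgumentOutOfRangeException whenever startIndex + length runs past the end of the string, and the inLegal.Length - 4000 length argument in the upper branches does exactly that once a row exceeds 8,000 characters. A minimal standalone sketch of that .NET behavior, using a hypothetical 10,000-character row (not data from the package):

    Module SubstringOverrunDemo
        Sub Main()
            ' A 10,000-character row takes the "> 8000" branch of the script,
            ' which computes Substring(8000, 10000 - 4000).
            ' 8000 + 6000 = 14000 > 10000, so Substring throws.
            Dim s As New String("x"c, 10000)
            Try
                Dim tail As String = s.Substring(8000, s.Length - 4000)
            Catch ex As ArgumentOutOfRangeException
                Console.WriteLine("Substring overran the string: " & ex.Message)
            End Try
        End Sub
    End Module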
Tags: Microsoft SQL Server 2005, VB Script, Visual Basic .NET
PedroCGD
But why use a script at all?
Why don't you use other components like a Conditional Split or a Derived Column?
You should only use script components in exceptional cases; the stock components are more optimized. It depends on what you need, but in this case I think you could replace the script with other components!
Cheers!
lostboyinsofla
ASKER
I am not the original author of the package, so I'm not entirely sure why a script was used. That said, I think they probably found it easier to split the blob up that way.
How else can I do it? I still need to take a DT_TEXT value, which can be up to 24,000 characters, and break it out into six 4,000-character DT_STR columns.
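For reference, here is a minimal sketch of that split done with a simple loop, reusing the column names and Unicode decoding from the pasted script; it caps every Substring length at the characters actually remaining, so it cannot overrun a row of any length:

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        ' Decode the DT_TEXT blob into a string, as the original script does.
        Dim bytes() As Byte = Row.FolioLegalDescription.GetBlobData(0, CInt(Row.FolioLegalDescription.Length))
        Dim legal As String = System.Text.Encoding.Unicode.GetString(bytes)

        ' The original trims the first and last characters; guard it so
        ' rows shorter than two characters don't fault.
        If legal.Length >= 2 Then legal = legal.Substring(1, legal.Length - 2)

        ' Strip line feeds, carriage returns, and tabs.
        legal = legal.Replace(vbLf, "").Replace(vbCr, "").Replace(vbTab, "")

        ' Cut into six chunks of at most 4,000 characters each.
        Dim chunks(5) As String
        For i As Integer = 0 To 5
            Dim pos As Integer = i * 4000
            If pos < legal.Length Then
                ' Math.Min keeps the length argument inside the string.
                chunks(i) = legal.Substring(pos, Math.Min(4000, legal.Length - pos))
            Else
                chunks(i) = String.Empty
            End If
        Next

        Row.Legal1 = chunks(0)
        Row.Legal2 = chunks(1)
        Row.Legal3 = chunks(2)
        Row.Legal4 = chunks(3)
        Row.Legal5 = chunks(4)
        Row.Legal6 = chunks(5)
    End Sub

Anything past 24,000 characters is silently dropped here, matching the six-column layout; whether to truncate or fail on longer rows is a design choice for the package.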
lostboyinsofla
ASKER
Do you know what is actually causing the error I posted, though? Is it the script or the buffers on the data flow task? I'm not sure the script is causing it, and I don't want to go rewriting things if that isn't going to solve the issue.
PedroCGD
Try what frog suggests, and for each output add its own OLE DB Destination with the table lock option unchecked.
Avoid using Union All and Aggregates.
Did that help?
Cheers!
PFrog
The problem with having multiple copies of the same destination is that you can't use fast load: a fast load has to take an exclusive lock on the table.
Even though unions are best avoided where possible, I think in this case they will improve performance, especially if there are other transformations further along the pipeline that need to be applied to each output. Otherwise you would have to duplicate every transform six times, which is less than ideal for both development and performance.
However, like most things with SQL/SSIS, there is always more than one way to do it; the right way depends heavily on the specifics of each project.
PedroCGD
I'm with you, frog... and nothing matters more than testing the performance of the several ways to do the job! :-)
In SQL 2008, the performance is greatly improved!!
Cheers