Reading and writing a large SQL table in MS Access, getting '3183' error on large files

I have been using Access for a very long time and have some, but not a lot of, experience working with SQL Server as a backend.

I am working on a conversion of legacy system data using Access 2013.  I had all of my conversion logic written and tested using a subset of the data provided by the client.  However, when I got the full set of data, problems began.  I am reading a fixed-length .txt file in my Access application and writing the output to a SQL table.  The input file has over 200k records, but my logic never gets past processing 35,000 or so.  Each input record is over 14,000 characters long; one of the fields alone is 14,000 characters.

After I got the 3183 I tried including BeginTrans and CommitTrans in the loop but the 3183 is still thrown after about 35,000 records.

The exact message I am getting is:
[screenshot of the error 3183 message]
My system has 32GB of RAM and over 80GB of free disk space.  My .accdb is only about 60MB when I get this message.

In all my many years of working with Access I have never seen this error before and don't know how to resolve this situation to be able to process the entire file.

Here is my code:
Public Sub readFixedLengthTextFile()

'On Error Resume Next

clearSQLTable_Sproc "SQCS_CostFile_Import"
' output file
Set db = getCurrentDbC
Dim rsOut As DAO.Recordset
Set rsOut = db.OpenRecordset("SQCS_CostFile_Import", dbOpenDynaset, dbSeeChanges)


Const adOpenStatic = 3
Const adLockOptimistic = 3
Const adCmdText = &H1
'
recCount = 0
wkStartTime = Now

Dim objConnection As Object
Dim objRecordset  As Object
Dim strPathtoTextFile  As String

Set objConnection = CreateObject("ADODB.Connection")
Set objRecordset = CreateObject("ADODB.Recordset")

strPathtoTextFile = "I:\My Documents\Access_Databases\JordanDelinq\TestDirectory\CostSheet\"
          
objConnection.Open "Provider=Microsoft.ACE.OLEDB.12.0;" & _
          "Data Source=" & strPathtoTextFile & ";" & _
          "Extended Properties=""text;HDR=NO;FMT=FixedLength"""
          
objRecordset.Open "SELECT * FROM TQCSFILE.txt", _
          objConnection, adOpenStatic, adLockOptimistic, adCmdText

Do Until objRecordset.EOF

    recCount = recCount + 1
    totRecs = totRecs + 1
    
    
    '
    If recCount = 10000 Then
      '  Stop
        recCount = 0
        wkCurrTime = Now
        Debug.Print totRecs & ", " & RunTime(wkStartTime, wkCurrTime)
        DoEvents
    End If
    
    
   With rsOut
      .AddNew
            ![CostName] = Nz(objRecordset.Fields.Item("CostName"))
            ![MuniCode] = Nz(objRecordset.Fields.Item("MUniCode"))
            ![LotBlock] = Nz(objRecordset.Fields.Item("LotBlock"))
            ![TaxType] = Nz(objRecordset.Fields.Item("TaxType"))
            ![MuniCode02] = Nz(objRecordset.Fields.Item("MuniCode02"))
            ![ControlNumber] = Nz(objRecordset.Fields.Item("ControlNumber"))
            ![TaxType02] = Nz(objRecordset.Fields.Item("TaxType02"))
            ![CostDetail01_100] = Nz(objRecordset.Fields.Item("CostDetail"))
            ![GRBNumnber] = Nz(objRecordset.Fields.Item("GRBNumber"))
            ![GDNumber] = Nz(objRecordset.Fields.Item("GDNNumber"))
            ![Remarks] = Nz(objRecordset.Fields.Item("Remarks"))
            '![xxx] = Nz(objRecordset.Fields.Item("Filler"))
      .Update
   End With
    objRecordset.MoveNext
Loop

rsOut.Close
Set rsOut = Nothing

Debug.Print "AllRecs: "; totRecs & ", " & RunTime(wkStartTime, wkCurrTime)
MsgBox "Complete"

End Sub



If I comment out the write to the output table and just loop through the input, the logic reads to the end of the input file.  As soon as I include the .AddNew ... .Update, I can only get through about 35,000 records before getting the above message.

Does anyone know how to resolve this situation so I can read and write all of the records in the input?
mlcktmguyAsked:

PatHartmanCommented:
Did you use the transaction appropriately?  Access accumulates the updates between the start trans and the commit and keeps them in memory or in temp areas until the commit is executed.  You're on the right track with the transaction but you have to commit the data fairly frequently and start a new transaction.  I would do no more than 20,000 records before issuing the commit.
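A minimal sketch of that pattern (reusing the rsOut and objRecordset names from the code above, with the rest of the loop body assumed unchanged) might look like:

```vba
' Sketch only: commit every 20,000 rows, then open a new transaction.
' rsOut / objRecordset are the recordsets from the original routine.
Dim ws As DAO.Workspace
Dim lngInTrans As Long
Set ws = DBEngine.Workspaces(0)

ws.BeginTrans
Do Until objRecordset.EOF
    ' ... .AddNew / field assignments / .Update as in the original loop ...
    lngInTrans = lngInTrans + 1
    If lngInTrans >= 20000 Then
        ws.CommitTrans      ' flush the pending inserts
        ws.BeginTrans       ' start the next batch
        lngInTrans = 0
    End If
    objRecordset.MoveNext
Loop
ws.CommitTrans              ' commit the final, partial batch
```

The key point is that every CommitTrans is paired with a fresh BeginTrans, so no batch ever holds more than 20,000 pending writes.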
mlcktmguyAuthor Commented:
Thanks Pat: I revised it down to 100 records between BeginTrans and CommitTrans, but it still threw the message at around 35,000.

I've never used the transaction concept before.  Do I have it coded correctly?
Gustav BrockCIOCommented:
Well, 200k * 14k is about 2.8 GB, so you need another database engine as the backend, like SQL Server (the free Express edition will hold 10 GB) or MySQL/MariaDB, which both offer a LoadFile method for extremely fast import of clean data.

Alternatively, just link the text file and then run a simple append query using the linked table as the source.
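As a rough illustration of that append approach (the folder path and the two-column field list are placeholders, and a matching schema.ini must sit in the same folder for FMT=FixedLength to work):

```vba
' Sketch only: append straight from the text file via Jet's IN-clause
' syntax, so no row-by-row VBA loop is involved. Extend the field list
' to cover all columns defined in schema.ini.
Dim strSql As String
strSql = "INSERT INTO SQCS_CostFile_Import ( CostName, MuniCode ) " & _
         "SELECT CostName, MUniCode " & _
         "FROM [Text;HDR=NO;FMT=FixedLength;DATABASE=C:\SomeFolder].[TQCSFILE.txt]"
CurrentDb.Execute strSql, dbFailOnError
```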
mlcktmguyAuthor Commented:
Gustav: SQL Server is the back end.

Some more background is in this prior question: Link to Previous Question
I've tried everything I can think of, and all attempts end up with the '3183' error shown above.

1.  I tried importing a csv file directly to a SQL table with TransferText.

2. I tried linking to the input txt file using the Linked Table Manager, but that was unsuccessful.  I got a message that the input txt file was too big.

3. I tried dynamically linking the input csv.  The linking part worked, but after reading about 70,000 records from the input file I got the same 3183 error.  This is the logic I used when dynamically linking:
Public Sub readFixedLengthTextFile()

'On Error Resume Next

clearSQLTable_Sproc "SQCS_CostFile_Import"
' output file
Set db = getCurrentDbC
Dim rsOut As DAO.Recordset
Set rsOut = db.OpenRecordset("SQCS_CostFile_Import", dbOpenDynaset, dbSeeChanges)


Const adOpenStatic = 3
Const adLockOptimistic = 3
Const adCmdText = &H1
'
recCount = 0
wkStartTime = Now

Dim objConnection As Object
Dim objRecordset  As Object
Dim strPathtoTextFile  As String

Set objConnection = CreateObject("ADODB.Connection")
Set objRecordset = CreateObject("ADODB.Recordset")

strPathtoTextFile = "I:\My Documents\Access_Databases\JordanDelinq\TestDirectory\CostSheet\"
         
objConnection.Open "Provider=Microsoft.ACE.OLEDB.12.0;" & _
          "Data Source=" & strPathtoTextFile & ";" & _
          "Extended Properties=""text;HDR=NO;FMT=FixedLength"""
         
objRecordset.Open "SELECT * FROM TQCSFILE.txt", _
          objConnection, adOpenStatic, adLockOptimistic, adCmdText

Do Until objRecordset.EOF

    recCount = recCount + 1
    totRecs = totRecs + 1
   
   
    '
    If recCount = 10000 Then
      '  Stop
        recCount = 0
        wkCurrTime = Now
        Debug.Print totRecs & ", " & RunTime(wkStartTime, wkCurrTime)
        DoEvents
    End If
   
   
   With rsOut
      .AddNew
            ![CostName] = Nz(objRecordset.Fields.Item("CostName"))
            ![MuniCode] = Nz(objRecordset.Fields.Item("MUniCode"))
            ![LotBlock] = Nz(objRecordset.Fields.Item("LotBlock"))
            ![TaxType] = Nz(objRecordset.Fields.Item("TaxType"))
            ![MuniCode02] = Nz(objRecordset.Fields.Item("MuniCode02"))
            ![ControlNumber] = Nz(objRecordset.Fields.Item("ControlNumber"))
            ![TaxType02] = Nz(objRecordset.Fields.Item("TaxType02"))
            ![CostDetail01_100] = Nz(objRecordset.Fields.Item("CostDetail"))
            ![GRBNumnber] = Nz(objRecordset.Fields.Item("GRBNumber"))
            ![GDNumber] = Nz(objRecordset.Fields.Item("GDNNumber"))
            ![Remarks] = Nz(objRecordset.Fields.Item("Remarks"))
            '![xxx] = Nz(objRecordset.Fields.Item("Filler"))
      .Update
   End With
    objRecordset.MoveNext
Loop

rsOut.Close
Set rsOut = Nothing

Debug.Print "AllRecs: "; totRecs & ", " & RunTime(wkStartTime, wkCurrTime)
MsgBox "Complete"

End Sub

4.  In SSMS, I bulk loaded the input csv file into a SQL table and wrote the current logic to read the bulk table and reformat it.  I got the same error after about 70,000 records.  I revised the code to use transactions, which didn't help.  The current version of my code, shown below, actually closes and re-opens the output recordset every 10 records.  It is painfully slow and I still ended up with the same error.

Public Sub readBigLumpOfData()

Dim wkCostName As String
Dim wkMuniCode As String
Dim wkLotBlock As String
Dim wkTaxType As String
Dim wkMuniCode02 As String
Dim wkControlNumber As String
Dim wkTaxType02 As String
Dim wkCostDetail01_100 As String
Dim wkGRBNumnber As String
Dim wkGDNumber As String
Dim wkRemarks As String
'
Dim wkCurrent As DAO.Workspace
Set wkCurrent = DBEngine.Workspaces(0)

'On Error Resume Next
wkStartTime = Now

clearSQLTable_Sproc "SQCS_CostFile_Import"
' output file
Set db = getCurrentDbC
Dim rsOut As DAO.Recordset
Set rsOut = db.OpenRecordset("SQCS_CostFile_Import", dbOpenDynaset, dbSeeChanges)
'
'selectString = "Select *  from SQCS_EntireRecAs1Field_Import1 "   'Where [BRT] = 0 "
'Dim rsIn2 As ADODB.Recordset
'Set rsIn2 = New ADODB.Recordset
'rsIn2.Open selectString, CurrentProject.Connection, adOpenKeyset, adLockOptimistic

Set db = getCurrentDbC
Dim rsin2 As DAO.Recordset
'Set rsin2 = db.OpenRecordset("SQCS_EntireRecAs1Field_Import1", dbOpenSnapshot, dbReadOnly)   ' opened Readonly  //////////////// Read Only  //////////////////
Set rsin2 = db.OpenRecordset("SQCS_EntireRecAs1Field_Import1", dbOpenSnapshot, dbReadOnly)   ' opened Readonly  //////////////// Read Only  //////////////////

'
'wkCurrent.BeginTrans   '///////////////////////////////////////////

If rsin2.EOF Then
    '
Else
 '   If rsIn2.RecordCount > 0 Then
        '
        While Not rsin2.EOF
            '
             recCount = recCount + 1
             totRecs = totRecs + 1
             wkCloseCOunt = wkCloseCOunt + 1
             '
             If wkCloseCOunt = 10 Then
                 wkCloseCOunt = 0
                 rsOut.Close
                 Set rsOut = db.OpenRecordset("SQCS_CostFile_Import", dbOpenDynaset, dbSeeChanges)
             End If
             '
             If recCount = 10000 Then
              '   Stop
                 recCount = 0
                 wkCurrTime = Now
                 Debug.Print totRecs & ", " & RunTime(wkStartTime, wkCurrTime)
                 DoEvents
                 '
                 'wkCurrent.CommitTrans '////////////////////////////////////////
                 'DBEngine.Idle dbRefreshCache
                 'wkCurrent.BeginTrans  '///////////////////////////////////////
                 'rsOut.Close
                 'Set rsOut = db.OpenRecordset("SQCS_CostFile_Import", dbOpenDynaset, dbSeeChanges)
             End If
             
             
            wkCostName = Mid(rsin2!EntireRec, 1, 30)
            wkMuniCode = Mid(rsin2!EntireRec, 31, 4)
            wkLotBlock = Mid(rsin2!EntireRec, 35, 17)
            wkTaxType = Mid(rsin2!EntireRec, 52, 1)
            wkMuniCode02 = Mid(rsin2!EntireRec, 53, 4)
            wkControlNumber = Mid(rsin2!EntireRec, 57, 7)
            wkTaxType02 = Mid(rsin2!EntireRec, 64, 1)
            wkCostDetail01_100 = Mid(rsin2!EntireRec, 65, 14000)
            wkGRBNumnber = Mid(rsin2!EntireRec, 14065, 30)
            wkGDNumber = Mid(rsin2!EntireRec, 14095, 30)
            wkRemarks = Mid(rsin2!EntireRec, 14125, 76)
             
     ' Debug.Print "CostName: " & wkCostName; ", Remarks: " & wkRemarks
             
             
            With rsOut
            
               .AddNew
                     ![CostName] = wkCostName
                     ![MuniCode] = wkMuniCode
                     ![LotBlock] = wkLotBlock
                     ![TaxType] = wkTaxType
                     ![MuniCode02] = wkMuniCode02
                     ![ControlNumber] = wkControlNumber
                     ![TaxType02] = wkTaxType02
                     ![CostDetail01_100] = wkCostDetail01_100
                     ![GRBNumnber] = wkGRBNumnber
                     ![GDNumber] = wkGDNumber
                     ![Remarks] = wkRemarks
               .Update
            
            End With
    
            rsin2.MoveNext
        Wend
End If

wkCurrent.CommitTrans  '//////////////////////////////////////

rsOut.Close
Set rsOut = Nothing
'
rsin2.Close
Set rsin2 = Nothing

Debug.Print "AllRecs: "; totRecs & ", " & RunTime(wkStartTime, wkCurrTime)
MsgBox "Complete"

End Sub



I am out of ideas.  Any help would be appreciated.
Gustav BrockCIOCommented:
Go back to the roots:

  1. Read the text file line by line
  2. Split/decode/convert each line to obtain the field values to import
  3. Add a record with the sanitised field values

Here is some basic code for this:

    Dim dbs As DAO.Database
    Dim rst As DAO.Recordset
    Dim strRecordSource As String       'Source for recordset, can be SQL, table, or saved query
    Dim intFileDesc As Integer      	'File descriptor for output file
    Dim strSourceFile As String     	'Full path of source file
    Dim strTextLine As String       	'Input buffer
    Dim strField1 As String         	'Extracted Field1 from buffer
    Dim strField2 As String         	'Extracted Field2 from buffer
    Dim strField3 As String         	'Extracted Field3 from buffer

    Set dbs = CurrentDb
    Set rst = dbs.OpenRecordset(strRecordSource, dbOpenDynaset)
    
    intFileDesc = FreeFile
    Open strSourceFile For Input As #intFileDesc
    Do While Not EOF(intFileDesc) 	' Loop until end of file.
        Line Input #intFileDesc, strTextLine                  'Read line into buffer
	' <  use string handling to extract the fields from strTextLine
	'    and place them in strField1, strField2, & strField3         >
	rst.AddNew			'depending on your situation, you may want to find an existing record and Edit instead
	rst!Field1 = strField1 ' or converted value, numeric, date/time, boolean
	rst!Field2 = strField2
	rst!Field3 = strField3
	rst.Update
    Loop
    Close #intFileDesc    'Close file.
    rst.Close		  'Close the recordset
    Set rst = Nothing
    Set dbs = Nothing     'Garbage handling before we exit the function


It will probably run faster than you might expect, and at extremely low resource consumption.
Jim Dettman (Microsoft MVP/ EE MVE)President / OwnerCommented:
<<Read the text file line by line
Split/decode/convert each line to obtain the field values to import
Add a record with the sanitised field values>>

 That's what I'd do as well.

And while you're probably running into the DB limit (you can look for temp JET files in your temp directory and watch them grow), I'd also give two things a shot:

1. Use SetOption to increase the maxlocksperfile setting to 500,000

2. Don't use a transaction.   Transactions queue everything up and then commit.  That's the last thing you want to do when you're doing bulk operations.  I always write a process to be repeatable in case it fails, and then don't use transactions.

Jim.
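For the first suggestion, the call is a one-liner (the new value lasts only for the current session, so it can go at the top of the import routine):

```vba
' Raise the lock limit before the bulk loop; resets when Access restarts.
DBEngine.SetOption dbMaxLocksPerFile, 500000
```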
mlcktmguyAuthor Commented:
Thanks Gustav for the code and Jim for the confirmation, but I am confused as I look through the code.  I've never worked with this kind of string input before.

I understand the concept that I am going to read the .txt file a full line at a time and then break the full line down into the individual fields.  In this case that would work very well because all of the individual fields are strings.

I don't see how to break the input fields apart.

In your example, is 'strSourceFile' = "I:\My Documents\Access_Databases\JordanDelinq\TestDirectory\CostSheet\"
and 'strRecordSource' = "TQCSFILE.txt"?

I don't know what to load into 'intFileDesc' because it's an integer

Your code has "Set rst = dbs.OpenRecordset(strRecordSource, dbOpenDynaset)", which looks like an input file, because above it says:
Dim strRecordSource As String       'Source for recordset, can be SQL, table, or saved query

but later there is an rst.AddNew, which would usually be there for an output file.

I don't see where to specify my output SQL table.

If possible could you please revise this a bit with my actual input and output? Perhaps then it would be obvious.

My output is being written to a linked SQL Server table called 'SQCS_CostFile_Import'.

My input .txt file is: I:\My Documents\Access_Databases\JordanDelinq\TestDirectory\CostSheet\TQCSFILE.txt.

This is the definition of the input file I have in a Schema.ini file from earlier attempts.
[TQCSFILE.txt]
ColNameHeader=False
Format=FixedLength 
Col1=CostName Text Width 30
Col2=MUniCode Text Width 4
Col3=LotBlock Text Width 17
Col4=TaxType Text Width 1
Col5=MuniCode02 Text Width 4
Col6=ControlNumber Text Width 7
Col7=TaxType02 Text Width 1
Col8=CostDetail Memo Width 14000
Col9=GRBNumber Text Width 30
Col10=GDNNumber Text Width 30
Col11=Remarks Text Width 76
Col12=Filler Text Width 100



This is the layout of the linked SQL Output table
CREATE TABLE [dbo].[SQCS_CostFile_Import](
	[ID] [int] IDENTITY(1,1) NOT NULL,
	[CostName] [nvarchar](30) NULL,
	[MuniCode] [nvarchar](4) NULL,
	[LotBlock] [nvarchar](17) NULL,
	[TaxType] [nvarchar](1) NULL,
	[MuniCode02] [nvarchar](4) NULL,
	[ControlNumber] [nvarchar](7) NULL,
	[TaxType02] [nvarchar](1) NULL,
	[CostDetail01_100] [ntext] NULL,
	[GRBNumnber] [nvarchar](30) NULL,
	[GDNumber] [nvarchar](30) NULL,
	[Remarks] [nvarchar](76) NULL,
	[upsize_ts] [timestamp] NULL,
 CONSTRAINT [aaaaaSQCS_CostFile_Import_PK] PRIMARY KEY NONCLUSTERED 
(
	[ID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]



This would be the logic I envision to break the record apart into individual fields and write to the output file, but I don't think it's valid in this construct.  I'm including it to show the field lengths and names.
             
            wkCostName = Mid(rsin2!EntireRec, 1, 30)
            wkMuniCode = Mid(rsin2!EntireRec, 31, 4)
            wkLotBlock = Mid(rsin2!EntireRec, 35, 17)
            wkTaxType = Mid(rsin2!EntireRec, 52, 1)
            wkMuniCode02 = Mid(rsin2!EntireRec, 53, 4)
            wkControlNumber = Mid(rsin2!EntireRec, 57, 7)
            wkTaxType02 = Mid(rsin2!EntireRec, 64, 1)
            wkCostDetail01_100 = Mid(rsin2!EntireRec, 65, 14000)
            wkGRBNumnber = Mid(rsin2!EntireRec, 14065, 30)
            wkGDNumber = Mid(rsin2!EntireRec, 14095, 30)
            wkRemarks = Mid(rsin2!EntireRec, 14125, 76)
             
     ' Debug.Print "CostName: " & wkCostName; ", Remarks: " & wkRemarks
             
             
            With rsOut
            
               .AddNew
                     ![CostName] = wkCostName
                     ![MuniCode] = wkMuniCode
                     ![LotBlock] = wkLotBlock
                     ![TaxType] = wkTaxType
                     ![MuniCode02] = wkMuniCode02
                     ![ControlNumber] = wkControlNumber
                     ![TaxType02] = wkTaxType02
                     ![CostDetail01_100] = wkCostDetail01_100
                     ![GRBNumnber] = wkGRBNumnber
                     ![GDNumber] = wkGDNumber
                     ![Remarks] = wkRemarks
               .Update
            





If you could provide a little more insight I would appreciate it.
mlcktmguyAuthor Commented:
Jim Dettman:  I'm curious why the JET file would be growing.
and while your probably running into the DB limit (you can look for temp JET files in your temp directory and watch them grow), I'd also give two things a shot:

The logic throwing the error in my original post is reading from one linked SQL table and writing directly to another linked SQL table.  Why would that make the Jet DB grow?  It obviously is growing; I just don't understand why.

Would it be better (as in not causing the Jet DB to grow) to have a stored procedure retrieve all the data and then read it into Access as a recordset returned from the stored procedure?
On the output side I could also write the data to SQL through a stored procedure.
PatHartmanCommented:
The logic throwing the error in my original post is reading from one linked SQL table and writing directly to another linked SQL table.  Why would that make the JetDb grow?  It obviously is growing, I just don't understand why.
Because Access acquires work space to perform the operation.  Once work space is acquired, making the database size increase, the only way to purge it when it is no longer needed is to compact the database.  Access cannot recapture space that was used but is no longer needed any other way.

I believe that others have already suggested that you use SQL Server itself to import the data.  Do not involve Access at all.
mlcktmguyAuthor Commented:
Thanks Scott: I have tried loading the txt directly into a SQL table and was successful using a SQL bulk load. However, once the large txt file is a SQL table I still have to read, reformat and write the records within Access.  The Access routine below is doing nothing more than reading the bulk loaded SQL input table and writing a SQL output table but it still blows with the same error 3183 message.

Would it be helpful to write the records using a stored procedure instead of a SQL table linked to the Access front end?  Would writing through an SP require fewer Access resources, so the logic could get through all 200k records without exceeding the limit?  If so, I can do that.

I could also retrieve the input records through a stored procedure, if that would keep Access from exceeding the storage limit.

I realize that reading and writing this many records in Access is very unusual.  This is a conversion, a one-time thing that will never have to run again after we're in production.  I wrote and tested the conversion with a smaller set of data supplied by the client.  It worked perfectly.  The problems didn't come until the full-size files were given to me.  For conversion purposes I still have to come up with a way to read through all 200k legacy records so I can convert them to the new format.  There are several files this large that need to be converted, and everything I've tried so far in Access fails.

I've been using Access for a very long time and never even seen the 3183 message.

I appreciate all of the help and suggestions given but so far none of them have ended up with a routine that could read and write all of the records in this file.  Interestingly, if I comment out the writing of the output file all 200k records are read very quickly with no issue.  Re-adding the write of the output table brings back the 3183.





Public Sub readBigLumpOfData()

Dim wkCostName As String
Dim wkMuniCode As String
Dim wkLotBlock As String
Dim wkTaxType As String
Dim wkMuniCode02 As String
Dim wkControlNumber As String
Dim wkTaxType02 As String
Dim wkCostDetail01_100 As String
Dim wkGRBNumnber As String
Dim wkGDNumber As String
Dim wkRemarks As String
'
Dim wkCurrent As DAO.Workspace
Set wkCurrent = DBEngine.Workspaces(0)

'On Error Resume Next
wkStartTime = Now

clearSQLTable_Sproc "SQCS_CostFile_Import"
' output file
Set db = getCurrentDbC
Dim rsOut As DAO.Recordset
Set rsOut = db.OpenRecordset("SQCS_CostFile_Import", dbOpenDynaset, dbSeeChanges)
'
'selectString = "Select *  from SQCS_EntireRecAs1Field_Import1 "   'Where [BRT] = 0 "
'Dim rsIn2 As ADODB.Recordset
'Set rsIn2 = New ADODB.Recordset
'rsIn2.Open selectString, CurrentProject.Connection, adOpenKeyset, adLockOptimistic

Set db = getCurrentDbC
Dim rsin2 As DAO.Recordset
'Set rsin2 = db.OpenRecordset("SQCS_EntireRecAs1Field_Import1", dbOpenSnapshot, dbReadOnly)   ' opened Readonly  //////////////// Read Only  //////////////////
Set rsin2 = db.OpenRecordset("SQCS_EntireRecAs1Field_Import1", dbOpenSnapshot, dbReadOnly)   ' opened Readonly  //////////////// Read Only  //////////////////

'
'wkCurrent.BeginTrans   '///////////////////////////////////////////

If rsin2.EOF Then
    '
Else
 '   If rsIn2.RecordCount > 0 Then
        '
        While Not rsin2.EOF
            '
             recCount = recCount + 1
             totRecs = totRecs + 1
             wkCloseCOunt = wkCloseCOunt + 1
             '
             If wkCloseCOunt = 10 Then
                 wkCloseCOunt = 0
                 rsOut.Close
                 Set rsOut = db.OpenRecordset("SQCS_CostFile_Import", dbOpenDynaset, dbSeeChanges)
             End If
             '
             If recCount = 10000 Then
              '   Stop
                 recCount = 0
                 wkCurrTime = Now
                 Debug.Print totRecs & ", " & RunTime(wkStartTime, wkCurrTime)
                 DoEvents
                 '
                 'wkCurrent.CommitTrans '////////////////////////////////////////
                 'DBEngine.Idle dbRefreshCache
                 'wkCurrent.BeginTrans  '///////////////////////////////////////
                 'rsOut.Close
                 'Set rsOut = db.OpenRecordset("SQCS_CostFile_Import", dbOpenDynaset, dbSeeChanges)
             End If
             
             
            wkCostName = Mid(rsin2!EntireRec, 1, 30)
            wkMuniCode = Mid(rsin2!EntireRec, 31, 4)
            wkLotBlock = Mid(rsin2!EntireRec, 35, 17)
            wkTaxType = Mid(rsin2!EntireRec, 52, 1)
            wkMuniCode02 = Mid(rsin2!EntireRec, 53, 4)
            wkControlNumber = Mid(rsin2!EntireRec, 57, 7)
            wkTaxType02 = Mid(rsin2!EntireRec, 64, 1)
            wkCostDetail01_100 = Mid(rsin2!EntireRec, 65, 14000)
            wkGRBNumnber = Mid(rsin2!EntireRec, 14065, 30)
            wkGDNumber = Mid(rsin2!EntireRec, 14095, 30)
            wkRemarks = Mid(rsin2!EntireRec, 14125, 76)
             
     ' Debug.Print "CostName: " & wkCostName; ", Remarks: " & wkRemarks
             
             
            With rsOut
            
               .AddNew
                     ![CostName] = wkCostName
                     ![MuniCode] = wkMuniCode
                     ![LotBlock] = wkLotBlock
                     ![TaxType] = wkTaxType
                     ![MuniCode02] = wkMuniCode02
                     ![ControlNumber] = wkControlNumber
                     ![TaxType02] = wkTaxType02
                     ![CostDetail01_100] = wkCostDetail01_100
                     ![GRBNumnber] = wkGRBNumnber
                     ![GDNumber] = wkGDNumber
                     ![Remarks] = wkRemarks
               .Update
            
            End With
    
            rsin2.MoveNext
        Wend
End If

wkCurrent.CommitTrans  '//////////////////////////////////////

rsOut.Close
Set rsOut = Nothing
'
rsin2.Close
Set rsin2 = Nothing

Debug.Print "AllRecs: "; totRecs & ", " & RunTime(wkStartTime, wkCurrTime)
MsgBox "Complete"

End Sub


Gustav BrockCIOCommented:
Dim strRecordSource As String       'Source for recordset, can be SQL, table, or saved query


That's the source for the recordset - the table you wish to import to: "SQCS_CostFile_Import".  

Dim strSourceFile As String     	'Full path of source file


That's the text file:
"I:\My Documents\Access_Databases\JordanDelinq\TestDirectory\CostSheet\TQCSFILE.txt"

The file number is assigned here:
intFileDesc = FreeFile



Your method for chopping up the field values is perfectly OK, but there is no rsin2; it is the read line that you chop:
             
            wkCostName = Mid(strTextLine, 1, 30)
            wkMuniCode = Mid(strTextLine, 31, 4)
            wkLotBlock = Mid(strTextLine, 35, 17)
            ' etc.
            wkRemarks = Mid(strTextLine, 14125, 76)
             
            ' Debug.Print "CostName: " & wkCostName; ", Remarks: " & wkRemarks             
             
            With rst            
               .AddNew
                     ![CostName] = wkCostName
                     ![MuniCode] = wkMuniCode
                ' etc.
                     ![Remarks] = wkRemarks
               .Update


If any field is not text, do convert the string to the data type of the field, at least to document for yourself that you are aware of this. For example:
![ControlNumber] = Val(wkControlNumber)


Jim Dettman (Microsoft MVP/ EE MVE)President / OwnerCommented:
<<The logic throwing the error in my original post is reading from one linked SQL table and writing directly to another linked SQL table.  Why would that make the JetDb grow?  It obviously is growing, I just don't understand why.>>

  JET will process some things locally.   Unless you are using a pass-through query, it's up in the air whether the SQL operation will get pushed server-side or not.  Joining to local tables, using VBA expressions, or JET-specific SQL will all cause local processing to take place, even if it is temporary.

 Outside of that issue, JET's query optimizer may decide that the best way to execute a query is to build a temp index or intermediate table.  On a large remote table, that may exceed the 2 GB database size limit.

Jim.
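A pass-through query is one way to guarantee the work stays server-side. A sketch, with a made-up connect string and using the single-field staging table from the thread:

```vba
' Sketch only: Jet hands the SQL to SQL Server verbatim, so nothing is
' processed locally. Server/database names and the field list are
' hypothetical; extend the SELECT with the remaining SUBSTRING/LEFT slices.
Dim qdf As DAO.QueryDef
Set qdf = CurrentDb.CreateQueryDef("")   ' temporary, unsaved QueryDef
qdf.Connect = "ODBC;DRIVER={SQL Server};SERVER=MyServer;DATABASE=MyDb;Trusted_Connection=Yes;"
qdf.SQL = "INSERT INTO dbo.SQCS_CostFile_Import (CostName) " & _
          "SELECT LEFT(EntireRec, 30) FROM dbo.SQCS_EntireRecAs1Field_Import1"
qdf.ReturnsRecords = False
qdf.Execute dbFailOnError
```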
PatHartmanCommented:
Since you know the code works fine with a smaller recordset, why not process the records in groups?  Add an outer loop with a counter.  I'm assuming you added an identity column, so you can use that to control the groups.   Rather than opening the table directly, open a query that selects identity 1 - 20,000, update the loop, and open the query with 20,001 - 40,000.  You may need to log the groups to a table in case you actually have to compact Access mid-stream.
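The outer loop Pat describes could be sketched like this (batch size and total record count are placeholders; the inner processing is the existing Mid/AddNew loop):

```vba
' Sketch only: process the staging table in ID ranges of 20,000.
Dim lngStart As Long
Dim strSql As String
Dim rsIn As DAO.Recordset

For lngStart = 1 To 200000 Step 20000
    strSql = "SELECT * FROM SQCS_EntireRecAs1Field_Import1 " & _
             "WHERE ID BETWEEN " & lngStart & " AND " & (lngStart + 19999)
    Set rsIn = CurrentDb.OpenRecordset(strSql, dbOpenSnapshot)
    ' ... run the existing chop-and-append loop over rsIn ...
    rsIn.Close
Next lngStart
```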

mlcktmguyAuthor Commented:
Thanks Pat:  I'm going to start heading in that direction.  Prior to the new full-size data set, I had already told the client that the conversion logic was complete and tested.  Now I'm telling them there is an issue and things will be delayed.  I've lost several days hoping some EE'r would have a way to process these records in one batch.  That way I wouldn't have to revise the conversion, but that is looking less and less likely.  I have to get moving and get this done by whatever means necessary.

I'm going to leave the question open in case anyone has a method within Access that will process all of these records in one pass, but that is looking less and less likely.  I'm starting to think it may not be possible.  Either way, I have to move forward.
PatHartmanCommented:
I think the recordset is simply too large to process at once.
Gustav BrockCIOCommented:
But why all that trouble?
Reading the file line by line directly, adding a record as shown, requires about zero resources and could go on "forever".  200k records is close to nothing; it could be 2 or 20 million.
PatHartmanCommented:
Reading the file line by line directly, adding a record as shown,
That is what mlcktmguy did.  Apparently it is writing the records that sucks up all the resources.
mlcktmguyAuthor Commented:
Thanks for all of the help and suggestions.  I implemented Pat's suggestion and processed the file in chunks of 50,000 records.  I also revised the input of the procedure to be a recordset returned from a stored procedure.  I added a small table to the application where the LastIDProcessed is stored; subsequent executions begin after that ID.

In case it helps anyone else, here is the final Access routine:
Private Sub btnstartCostSheetProcess_Click()


Dim wkCostName As String
Dim wkMuniCode As String
Dim wkLotBlock As String
Dim wkTaxType As String
Dim wkMuniCode02 As String
Dim wkControlNumber As String
Dim wkTaxType02 As String
Dim wkCostDetail01_100 As String
Dim wkGRBNumnber As String
Dim wkGDNumber As String
Dim wkRemarks As String
'
Dim wkString As String
Dim wkLastIDProcessed As Long
Dim wkNumToRead As Long
Dim wkNewLastIDProcessed As Long
'
wkLastIDProcessed = getLastCostSheetIDProcessed_DAO()
wkNumToRead = 50000  '///////////////////// Change number to Read
'
If wkLastIDProcessed = 0 Then
    clearSQLTable_Sproc "SQCS_CostFile_Import"
End If
'
DoCmd.Hourglass True
DBEngine.SetOption dbMaxLocksPerFile, 500000

'On Error Resume Next
wkStartTime = Now

' output file
Set db = getCurrentDbC
Dim rsOut As DAO.Recordset
Set rsOut = db.OpenRecordset("SQCS_CostFile_Import", dbOpenDynaset, dbSeeChanges)
'
Dim rs As ADODB.Recordset
Dim cmd As ADODB.Command
Set cmd = New ADODB.Command

With cmd
    .CommandText = "spReadCostSheetImport"
    .CommandType = adCmdStoredProc
    setSQLConnection
    .ActiveConnection = gConnection
    '
    ' Input Params to filter tax year recs
    '
    .Parameters.Append .CreateParameter("wkLastIDProcessed", adBigInt, adParamInput, , wkLastIDProcessed)
    .Parameters.Append .CreateParameter("wkNumToRead", adBigInt, adParamInput, , wkNumToRead)
    
     
    Set rs = New ADODB.Recordset
        With rs
            .CursorLocation = adUseServer
            .CursorType = adOpenKeyset
            .LockType = adLockOptimistic
            .Open cmd
        End With
        '
        ' Processed Returned Records
        '
        If rs.EOF Then
            MsgBox "No Matching Records Found"
        Else
            '
            While Not rs.EOF
                '
                wkString = rs!EntireRec
                '
                wkCostName = Mid(wkString, 1, 30)
                wkMuniCode = Mid(wkString, 31, 4)
                wkLotBlock = Mid(wkString, 35, 17)
                wkTaxType = Mid(wkString, 52, 1)
                wkMuniCode02 = Mid(wkString, 53, 4)
                wkControlNumber = Mid(wkString, 57, 7)
                wkTaxType02 = Mid(wkString, 64, 1)
                wkCostDetail01_100 = Mid(wkString, 65, 14000)
                wkGRBNumnber = Mid(wkString, 14065, 30)
                wkGDNumber = Mid(wkString, 14095, 30)
                wkRemarks = Mid(wkString, 14125, 76)
                '
                wkNewLastIDProcessed = rs!ID   ' this will be the new lastID processed
                '
                With rsOut
                    .AddNew
                          ![CostName] = wkCostName
                          ![MuniCode] = wkMuniCode
                          ![LotBlock] = wkLotBlock
                          ![TaxType] = wkTaxType
                          ![MuniCode02] = wkMuniCode02
                          ![ControlNumber] = wkControlNumber
                          ![TaxType02] = wkTaxType02
                          ![CostDetail01_100] = wkCostDetail01_100
                          ![GRBNumnber] = wkGRBNumnber
                          ![GDNumber] = wkGDNumber
                          ![Remarks] = wkRemarks
                    .Update
            
                End With

                '
                rs.MoveNext
            Wend
        End If
        '
        rs.Close
        Set rs = Nothing
        '
        rsOut.Close
        Set rsOut = Nothing
        
    Set .ActiveConnection = Nothing
End With

UpdateLastCostIDProcessed wkNewLastIDProcessed
wkCurrTime = Now
DoCmd.Hourglass False
Debug.Print "Processed Records with IDs From: " & Format(wkLastIDProcessed, cMJOIntFormat) & _
            " Thru " & Format(wkNewLastIDProcessed, cMJOIntFormat) & ", " & RunTime(wkStartTime, wkCurrTime),
MsgBox "Complete"

End Sub


This is the stored procedure:
USE [JTSConversion]
GO
/****** Object:  StoredProcedure [dbo].[spReadCostSheetImport]    Script Date: 1/19/2018 9:23:44 PM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
-- =============================================
-- 
-- =============================================
ALTER PROCEDURE [dbo].[spReadCostSheetImport] 
	-- Add the parameters for the stored procedure here
	@LastIDProcessed  int      = 0,
	@NumToRead        int      = 50000
AS
BEGIN

Declare @EndingID as int

set @EndingID = @LastIDProcessed + @NumToRead

	-- SET NOCOUNT ON added to prevent extra result sets from
	-- interfering with SELECT statements.
	SET NOCOUNT ON;

	-- Return the next batch of rows: IDs in (@LastIDProcessed, @EndingID]
	SELECT *
	FROM aView_SQCS_EntireRecAs1Field_Import
	WHERE ID > @LastIDProcessed AND ID <= @EndingID
	ORDER BY ID



END

