I have been using Access for a very long time and have some, but not a lot of, experience working with SQL Server as a back end.

I am working on a conversion of legacy system data using Access 2013. I had all of my conversion logic written and tested using a subset of the data provided by the client, but when I got the full set of data, problems began. I am reading a fixed-length .txt file in my Access application and writing the output to a SQL Server table. The input file has over 200k records, but my logic never gets past processing 35,000 or so. Each input record is over 14,000 characters long; one of the fields alone is 14,000 characters.
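In case it matters, the fixed-length layout is described to the ACE text driver by a schema.ini file sitting next to the .txt file. A minimal sketch of what such a file looks like (column names and widths here are made up for illustration, not my real layout) is:

```ini
[TQCSFILE.txt]
Format=FixedLength
CharacterSet=ANSI
; illustrative columns only -- the real file has many more,
; including one 14,000-character field
Col1=CostName Text Width 30
Col2=MuniCode Text Width 5
Col3=CostDetail Memo Width 14000
```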
After I got the first 3183 error, I tried including BeginTrans and CommitTrans in the loop, but the 3183 is still thrown after about 35,000 records.
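The transaction attempt followed roughly this pattern (a sketch of the idea, not my exact code), committing a batch every 1,000 rows through the default DAO workspace:

```vba
Dim ws As DAO.Workspace
Set ws = DBEngine.Workspaces(0)

ws.BeginTrans
Do Until objRecordset.EOF
    ' ... the .AddNew / field assignments / .Update shown below ...
    totRecs = totRecs + 1
    If totRecs Mod 1000 = 0 Then
        ws.CommitTrans   ' flush the current batch
        ws.BeginTrans    ' start the next one
    End If
    objRecordset.MoveNext
Loop
ws.CommitTrans           ' commit the final partial batch
```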
The exact message I am getting is:
My system has 32GB of RAM and over 80GB of free disk space. My .accdb is only about 60MB when I get this message.
In all my years of working with Access I have never seen this error before, and I don't know how to resolve it so I can process the entire file.
Here is my code:
Public Sub readFixedLengthTextFile()
'On Error Resume Next

    Dim db As DAO.Database
    Dim rsOut As DAO.Recordset
    Dim objConnection As Object
    Dim objRecordset As Object
    Dim strPathtoTextFile As String
    Dim recCount As Long
    Dim totRecs As Long
    Dim wkStartTime As Date
    Dim wkCurrTime As Date

    Const adOpenStatic = 3
    Const adLockOptimistic = 3
    Const adCmdText = &H1

    ' output table (linked SQL Server table)
    Set db = getCurrentDbC
    Set rsOut = db.OpenRecordset("SQCS_CostFile_Import", dbOpenDynaset, dbSeeChanges)

    recCount = 0
    wkStartTime = Now

    ' input file, read through the ACE text driver
    Set objConnection = CreateObject("ADODB.Connection")
    Set objRecordset = CreateObject("ADODB.Recordset")
    strPathtoTextFile = "I:\My Documents\Access_Databases\JordanDelinq\TestDirectory\CostSheet\"

    ' Extended Properties line reconstructed (it was cut off when I posted);
    ' FMT=FixedLength requires a schema.ini in the same folder describing the columns
    objConnection.Open "Provider=Microsoft.ACE.OLEDB.12.0;" & _
        "Data Source=" & strPathtoTextFile & ";" & _
        "Extended Properties=""text;HDR=No;FMT=FixedLength"";"

    objRecordset.Open "SELECT * FROM TQCSFILE.txt", _
        objConnection, adOpenStatic, adLockOptimistic, adCmdText

    Do Until objRecordset.EOF
        recCount = recCount + 1
        totRecs = totRecs + 1
        If recCount = 10000 Then
            recCount = 0
            wkCurrTime = Now
            Debug.Print totRecs & ", " & RunTime(wkStartTime, wkCurrTime)
        End If

        With rsOut
            .AddNew
            ![CostName] = Nz(objRecordset.Fields.Item("CostName"))
            ![MuniCode] = Nz(objRecordset.Fields.Item("MUniCode"))
            ![LotBlock] = Nz(objRecordset.Fields.Item("LotBlock"))
            ![TaxType] = Nz(objRecordset.Fields.Item("TaxType"))
            ![MuniCode02] = Nz(objRecordset.Fields.Item("MuniCode02"))
            ![ControlNumber] = Nz(objRecordset.Fields.Item("ControlNumber"))
            ![TaxType02] = Nz(objRecordset.Fields.Item("TaxType02"))
            ![CostDetail01_100] = Nz(objRecordset.Fields.Item("CostDetail"))
            ![GRBNumnber] = Nz(objRecordset.Fields.Item("GRBNumber"))
            ![GDNumber] = Nz(objRecordset.Fields.Item("GDNNumber"))
            ![Remarks] = Nz(objRecordset.Fields.Item("Remarks"))
            '![xxx] = Nz(objRecordset.Fields.Item("Filler"))
            .Update
        End With

        objRecordset.MoveNext
    Loop

    objRecordset.Close
    objConnection.Close
    rsOut.Close
    Set rsOut = Nothing

    wkCurrTime = Now
    Debug.Print "AllRecs: " & totRecs & ", " & RunTime(wkStartTime, wkCurrTime)
End Sub
If I comment out the write to the output table and just loop through the input, the logic reads to the end of the input file. As soon as I include the .AddNew ... .Update, I only get through about 35,000 records before the message above is thrown.

Does anyone know how to resolve this so I can read and write all of the records in the input file?