I am developing an MS Access 2013 application with a SQL Server backend for a client. As part of the project I have to write conversion logic to transform their legacy data into the new format, resident in SQL Server tables.
I wrote and tested my conversion routines with a subset of the legacy data provided by the client. Everything worked fine.
The client is supplying the data as .txt files. Within each txt file the fields are fixed length. I created an Import Spec for each data set being converted and then imported each file using this routine:
```vba
Public Sub importThisOne(passedImportFIle As String, _
                         passedImportTable As String, _
                         passedImportSpec As String, _
                         Optional passedClearFileBeforeImporting As Boolean = True)

    Dim deleteString As String

    If passedClearFileBeforeImporting = True Then
        deleteString = "delete * from " & passedImportTable
        DoCmd.RunSQL deleteString
    End If

    DoCmd.TransferText acImportFixed, passedImportSpec, _
        passedImportTable, passedImportFIle
End Sub
```

- passedImportFIle – the complete path, including the file name, to the .txt file to be imported
- passedImportTable – the linked SQL Server table where the imported data will go
- passedImportSpec – the name of the Import Spec for the table being imported
- passedClearFileBeforeImporting – Boolean flag; if True, the SQL table is cleared of data before the import
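For reference, a typical call to the routine looks like this (the file path, table name, and spec name here are made-up examples, not my real ones):

```vba
' Example call only – path, table, and spec names are illustrative
importThisOne "C:\Conversion\Customers.txt", "dbo_Customers", "CustomersImportSpec"
```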
Everything worked perfectly with the subset of the data provided by the client for initial testing. The client just provided a complete set of legacy data to test against my import, and one very large input file is causing issues.
When the logic executes the `DoCmd.TransferText` call above with the large file as input, I get this error.
My accdb is 1GB and my disk is not close to full.
It appears that I will have to change my technique for moving the large input txt files to their related SQL tables. I think txt files can be linked to an application, but I've never done that.
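From what I can tell, linking instead of importing would use the same `TransferText` call with `acLinkFixed` and then an append query through the ODBC link. This is only a sketch of what I'm considering, not tested code, and the linked-table and target names are placeholders:

```vba
' Sketch only – "lnkLegacy" and "dbo_BigTable" are placeholder names.
Public Sub linkThenAppend(passedImportFIle As String, passedImportSpec As String)
    ' Link the fixed-width txt file instead of importing it,
    ' so no rows are stored inside the accdb itself
    DoCmd.TransferText acLinkFixed, passedImportSpec, "lnkLegacy", passedImportFIle

    ' Append from the linked text file to the linked SQL Server table
    CurrentDb.Execute "INSERT INTO dbo_BigTable SELECT * FROM lnkLegacy", dbFailOnError

    ' Remove the link when done
    DoCmd.DeleteObject acTable, "lnkLegacy"
End Sub
```

I don't know whether that avoids the problem with a file this size, which is part of my question.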
What method would you use to get this large, fixed-length .txt input file into the SQL table?