3349 (Numeric Overflow) error when importing to split database table

rjproulx asked
Medium Priority
Last Modified: 2013-11-27
I have a database that pulls in .txt files and appends them all to a table.  I used an import specification, called by a module, to accomplish this.  The database works perfectly until I split it.  Once split, I am getting a "3349 - Numeric Overflow" error.  

I understand this has to do with the data types.  Originally, I was pulling the data in as text, date/time, and currency.  I've since tried a few different combinations (replacing currency with long, replacing text with memo, etc.) with no luck.  What is most frustrating is that this functions correctly until I split the DB.

I have confirmed that the data types are the same between the import spec and the destination table.  What else should I look for?  I've attached an image that shows the text file, most of the import spec, and the design view of the destination table.

You might try (1) deleting and relinking the table, and (2) recreating the import specification, on a COPY of your database.


Thanks cwood.  Can you explain more about deleting and relinking the table?  You want me to delete the table on the back-end, then re-run the import?  That'll give me a local (front-end) table...can I then import that into the back-end again?  Or am I confused here?

And I'll go ahead and try rebuilding the import spec.

And yes, I'm doing ALL of this on a copy of my db :)
You can delete the link in the front-end (FE) by selecting the linked table and pressing Delete.
You can relink the table by (in the FE) selecting Tables => New => Link Tables => navigating to and selecting the back-end (BE) database => selecting the table in the BE => clicking OK.
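If you end up deleting and relinking often, the same steps can be scripted. Here is a minimal VBA sketch of that idea; the back-end path and table name are hypothetical placeholders, so adjust both to your database:

Public Sub RelinkTable()
    'Hypothetical names -- change the back-end path and table name
    Const BE_PATH As String = "C:\MyFolder\Backend.mdb"
    Const TBL As String = "MyTableName"

    'Drop the existing link in the front end (ignore the error if it is absent)
    On Error Resume Next
    DoCmd.DeleteObject acTable, TBL
    On Error GoTo 0

    'Recreate the link to the back-end table
    DoCmd.TransferDatabase acLink, "Microsoft Access", BE_PATH, acTable, TBL, TBL
End Sub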

Are you importing the exact same text file in the combined and split databases?


To answer your last question, yes, it is the same exact file in both instances.

Wanted to give everyone a bit of a breakthrough I just had, maybe this will help.  Knowing it was a data type issue, I went ahead and changed every field in the import spec to "text".  Then I changed every field in the table to "text".  It worked!  My plan was then to change the fields back to their original data type, one by one.  Instantly I got the error as soon as I changed "Dep Date" to 'date/time'.  From here, though, I'm stuck.  I kinda need that to come in as a date, or at least be able to change from text to date somewhere early in the process.  Would it help if I posted the text file I'm importing, so you can see that the first 9 digits from the left are trying to import as 'date/time'?
Most Valuable Expert 2014

Is there any reason you are using Memo fields instead of Text?

A Text field can hold up to 255 characters.


Sorry jimpen, using memo was just me testing new datatypes.  I should have re-done the screenshot with my original attempt, which was having everything as 'text'.  
It looks from the graphic as though the first *eight* digits from the left are the date/time and the ninth digit is a space. Could that be the problem?


Cwood: To test that, I edited the import script to put a field called 'blank1' that starts at position 9 and continues for 1 space.  I then changed the 'Date Dep' field to reflect positions 1 through 8.  This resulted in getting the same error.  I got the error whether or not I checked "skip" on 'blank1'.

Good idea though, I thought maybe you had it.

Uncheck the Four Digit Years check box.


Also just tried re-linking the tables between front and back ends with no resolution.

Taking a step back here...if I can import everything successfully as 'text', is there a way to perhaps do an update query that then changes the data types?  I need to be able to identify the currency (to be able to sum them) and the dates (to do some before/after logic later in the process), so I can't keep everything as text.  But if I can import as text and then somehow run a query to change the field types, that might work.  Just trying to come at this from another angle...
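That staging approach can work: import everything into a text-only staging table, then run an append query that converts each field with CDate/CCur on its way into the properly typed table. A minimal VBA sketch of the idea, where tblStaging, tblFinal, and the field names are all hypothetical placeholders:

Public Sub ConvertStagingData()
    'Assumes a staging table tblStaging (all Text fields) and a final
    'table tblFinal whose fields are Date/Time and Currency --
    'table and field names here are hypothetical, adjust to yours
    Dim DB As Database
    Set DB = CurrentDb()
    DB.Execute "INSERT INTO tblFinal ([Dep Date], [Amount]) " & _
               "SELECT CDate([Dep Date]), CCur([Amount]) " & _
               "FROM tblStaging;", dbFailOnError
    Set DB = Nothing
End Sub

With dbFailOnError, a row whose text can't be converted raises a trappable error instead of being silently dropped, which makes bad source rows easy to spot.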
I think jimpen has the answer.


jimpen: That had no effect.  I appreciate the help thus far everyone.

Turn off the four-digit-year option and define the fields as Text.

Is there a place to say the first line is column names?

You know what: I've never been a fan of import specs anyway.

Take a look at the following code, and see if it makes any sense.
Public Function Import_Fixed_Width_Q_24534037()
Dim FileNum As Integer
Dim InputFile As String
Dim InputString As String
Dim DB As Database
Dim RS As Recordset

'Open the table to write to
Set DB = CurrentDb()
Set RS = DB.OpenRecordset("MyTableName")    '<-- Change to your table name

'Open the file to read from
FileNum = FreeFile()
InputFile = "C:\MyFolder\MyTextFile"        '<-- Change to your folder and file
Open InputFile For Input Access Read Shared As #FileNum

'Dump the header line
Line Input #FileNum, InputString

'Read and write the data lines
Do Until EOF(FileNum) = True
    Line Input #FileNum, InputString
    If Len(Trim(InputString)) > 0 Then      'Unless the line is empty, write it
        With RS
            .AddNew
            ![Dep Date] = Mid(InputString, 1, 8)
            ![Policy Number] = Mid(InputString, 10, 19)
            .Update
        End With
    End If
Loop

Close #FileNum
RS.Close
Set RS = Nothing
Set DB = Nothing
End Function



I had another asker run into a very similar problem just last week:

Error 3349 'Numeric field overflow' when importing text file to linked table


Thanks for the info cwood.  I was hoping to use my existing queries and figure out what was behind this error, but in the end, the extra three tables and six queries required to go this route weren't very painful at all.  And to my users, it's seamless.  Best of all, it works!  Thanks Jim for the ideas, they were just a bit over my head.