Solved

Compare tab-delimited text file with table field size

Posted on 2011-03-20
300 Views
Last Modified: 2012-05-11
Hello,

I need to compare the field length in a text file with the required field size in its destination table.  If any of the fields are too long, I would like to create a report (it can be an export to Excel) that details which row the error occurred on.  If there are no errors, I append the data to the table.  All the fields are text data type.

EXAMPLE - Field1 has a value of "A01E36" but the field size limit is 5; this record would be on the report.

I am creating a staging table in DB1, but the destination table would be in a remote DB.

Any ideas??????

Thanks,

T


Question by:Halia
8 Comments
 
LVL 21
ID: 35177908
The way I handle this is to use a "staging table" with the field size set to 255 for all the text fields.

In a query you can then check the length ( Len() ) of the data, looking for any records with fields whose data is too long.

... WHERE Len([FieldName1]) > 5 OR Len([FieldName2]) > 10 ...


If the query returns any records, then you know there is an issue.
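
As a minimal sketch (the table name tblStaging, the field names, and the limits 5 and 10 below are only placeholders -- substitute your own), that check could look like:

Private Sub CheckFieldLengths()
'Count staging records whose data is too long for the destination fields.
'tblStaging, FieldName1/FieldName2 and the limits 5 and 10 are placeholders.
Dim lngBad As Long

lngBad = DCount("*", "tblStaging", "Len([FieldName1]) > 5 OR Len([FieldName2]) > 10")
If lngBad > 0 Then
    MsgBox lngBad & " record(s) have data that is too long."
End If
End Sub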
 

Author Comment

by:Halia
ID: 35177940
Thank you.  I need to make this a bit more dynamic, though.  I have about 60 tables, and they have anywhere from 10 to 150 columns.

I have my staging table with the field size set to 255 - is there any way to check the staging table field lengths against the destination table field sizes?

Thanks!
 
LVL 34

Expert Comment

by:jefftwilley
ID: 35180461
How many fields are in these text files? Is that dynamic as well, or do they match the intended destination table?

What kind of data is it? All text, varied?

You can begin by creating a table of your field definitions by table.

As you open and begin importing your data, you would have to compare length to the known field length from that table.

You can do it on the fly, but it would be very slow.
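
A rough sketch of how such a definitions table might be populated from a destination table's design -- tblFieldDefs (TableName, FieldName, FieldSize) is a made-up lookup table here, so substitute whatever definitions table you create:

Private Sub LoadFieldDefs(strTable As String)
'Record each field's name and defined size for one destination table.
'tblFieldDefs (TableName, FieldName, FieldSize) is a placeholder lookup table.
Dim db As Database
Dim fld As Field
Dim rs As Recordset

Set db = CurrentDb
Set rs = db.OpenRecordset("Select * from tblFieldDefs where 1=2;", dbOpenDynaset)

For Each fld In db.TableDefs(strTable).Fields
    rs.AddNew
    rs!TableName = strTable
    rs!FieldName = fld.Name
    rs!FieldSize = fld.Size 'defined size (characters, for text fields)
    rs.Update
Next fld

rs.Close
End Sub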

What kind of text file is it, and what import method are you using?
J
 

Author Comment

by:Halia
ID: 35180611
They are all tab delimited text files. All the fields are being imported as a text data type.

We have approximately 60 different files that are imported 3-4 times a month.  The destination table for each text file is already created, so they SHOULD match.

At the moment, I am bringing the data into a staging table, then appending it to the destination table.  I'm not concerned about speed at this point; we are just getting some very crappy data coming in!

Does that explain things better?
 
LVL 26

Expert Comment

by:Nick67
ID: 35183295
You said you are using a staging table.
I would make all of the fields in the staging table memo.
They'll then hold all the data regardless.

Then you need to check each field for length greater than 255.
Create a table, tblErrors:
ErrorID as AutoNumber/Long, Primary Key
RowNumber as Long
FieldName as Text

Find somewhere to run this code.
It parsed a 53-field table with 28,000 entries -- granted, only one was a memo field -- in 40 seconds, generating 12,000 entries in tblErrors.

Private Sub ParseForTooLong()
Dim db As Database
Dim rs1 As Recordset
Dim rs2 As Recordset
Dim NumFields As Integer
Dim NumRows As Long
Dim x As Long ' to count rows (Long so a large file can't overflow an Integer)
Dim myfield As Field 'each field

Set db = CurrentDb
Set rs1 = db.OpenRecordset("Select * from tblStaging;", dbOpenDynaset, dbSeeChanges)
Set rs2 = db.OpenRecordset("Select * from tblErrors where 1=2;", dbOpenDynaset, dbSeeChanges)

NumFields = rs1.Fields.count

rs1.MoveLast ' populate the recordset fully
rs1.MoveFirst
NumRows = rs1.RecordCount

x = 0

Do Until rs1.EOF = True ' loop through the staging table one record at a time
    x = x + 1 'bump the row counter
    For Each myfield In rs1.Fields 'check each field
        If Len(myfield.Value) > 254 Then 'longer than 254, so it isn't going to work as text
            rs2.AddNew ' write an entry to tblErrors
            rs2!rownumber = x ' the row
            rs2!FieldName = myfield.Name 'the field
            rs2.Update
        End If
    Next myfield ' next field please
    rs1.MoveNext ' move ahead in the recordset
Loop ' next record!

MsgBox "Done!"

End Sub
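
If you want the error list as an Excel export (as mentioned in the question), one option is to push tblErrors out to a workbook once the parse finishes -- the output path here is only an example:

'Export the error report to Excel; the path is a placeholder.
DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel9, _
    "tblErrors", "C:\Temp\ImportErrors.xls"

That line could sit just before the final MsgBox in ParseForTooLong.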


 
LVL 26

Accepted Solution

by:
Nick67 earned 500 total points
ID: 35183412
Now, instead of just a hard limit of 254, you want a dynamic comparison to another table.
Add some code!
Add another recordset for the destination table.
Add another counter for fields -- because the field names may not match.

Voilà!

 
Private Sub ParseForTooLong()
Dim db As Database
Dim rs1 As Recordset
Dim rs2 As Recordset
Dim rs3 As Recordset
Dim NumFields As Integer
Dim NumRows As Long
Dim x As Long 'row counter (Long so a large file can't overflow an Integer)
Dim y As Integer
Dim myfield As Field

Set db = CurrentDb
Set rs1 = db.OpenRecordset("Select * from tblStaging;", dbOpenDynaset, dbSeeChanges)
Set rs2 = db.OpenRecordset("Select * from tblErrors where 1=2;", dbOpenDynaset, dbSeeChanges)
Set rs3 = db.OpenRecordset("Select * from tblDestination where 1=2;", dbOpenDynaset, dbSeeChanges)

NumFields = rs1.Fields.count

rs1.MoveLast
rs1.MoveFirst
NumRows = rs1.RecordCount

x = 0

Do Until rs1.EOF = True
    y = 0 'field ordinal reset to 0
    x = x + 1
    For Each myfield In rs1.Fields
        If Len(myfield.Value) > rs3.Fields(y).size Then 'fields may not be named the same so refer by index
            rs2.AddNew
            rs2!rownumber = x
            rs2!FieldName = myfield.Name
            rs2.Update
        End If
        y = y + 1 ' increment the field
    Next myfield
    rs1.MoveNext
Loop

MsgBox "Done!"

End Sub
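
One more piece, since the destination table lives in a remote database and the data should only be appended when the check comes back clean. A minimal sketch, assuming tblDestination is a linked table pointing at the remote back end and tblStaging/tblErrors are the tables used above:

Private Sub AppendIfClean()
'If the parse found no oversized values, push the staging rows across.
'tblDestination is assumed to be a linked table to the remote database;
'if the column names or order differ, list the fields explicitly.
Dim db As Database

Set db = CurrentDb
If DCount("*", "tblErrors") = 0 Then
    db.Execute "INSERT INTO tblDestination SELECT * FROM tblStaging;", dbFailOnError
    MsgBox "No length errors -- data appended."
Else
    MsgBox "Length errors found -- see tblErrors."
End If
End Sub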


 

Author Comment

by:Halia
ID: 35183423
THANK YOU!  I will try this tonight and let you know how it goes....

T
 

Author Comment

by:Halia
ID: 35190424
WORKS GREAT!!!!!!    Thank you so much!
