C# using FoxPro - Performance Concern

I have an application that constantly needs to read from and write to FoxPro databases. The dataset can be large, but it has around 12-15 at any given time. I am using OleDbConnection, OleDbCommand and OleDbDataReader, and it is VERY slow. Any suggestions on how to speed up the reading and inserting?


You need the correct indexes in place so that they can be used to optimize the queries.
If your dataset has 12-15 records, then you cannot make it faster.
If your dataset has 12-15 thousand records, then an index could help, but it depends on how you are reading your data. Could you post a sample query?

No index can speed up inserts.

Did you compare the speed against other data sources?
tgidirect (Author) commented:
I meant to say 12-15 fields...Sorry.

tgidirect (Author) commented:
Sample Code below...

List&lt;Customer&gt; customers = new List&lt;Customer&gt;();

using (OleDbCommand dbComm = new OleDbCommand("SELECT * FROM [Table];", dbConn))
using (OleDbDataReader dbRead = dbComm.ExecuteReader())
{
    while (dbRead.Read())
    {
        Customer customer = new Customer();
        customer.First = dbRead["First"].ToString().Trim();
        customer.Middle = dbRead["Middle"].ToString().Trim();
        customer.Last = dbRead["Last"].ToString().Trim();
        customer.Address1 = dbRead["Addr"].ToString().Trim();
        customer.Address2 = dbRead["Altaddr1"].ToString().Trim();
        customer.City = dbRead["City"].ToString().Trim();
        customer.State = dbRead["State"].ToString().Trim();
        customer.Zip = dbRead["Zip"].ToString().Trim();
        customers.Add(customer);
    }
}

Thank you.
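(An editorial aside on the loop above: indexing the reader by column name repeats a name lookup on every row. A hedged micro-optimization is to resolve the ordinals once with GetOrdinal before the loop. The helper name and the reduced column set below are assumptions for illustration; the Customer type is the asker's own.)

```csharp
using System.Collections.Generic;
using System.Data.OleDb;

static class FastRead
{
    // Hypothetical variant of the loop above: resolve column ordinals once
    // instead of doing a string lookup per row. Only two columns shown.
    public static List<Customer> ReadCustomers(OleDbDataReader dbRead)
    {
        int iFirst = dbRead.GetOrdinal("First");
        int iLast  = dbRead.GetOrdinal("Last");

        var customers = new List<Customer>();
        while (dbRead.Read())
        {
            Customer customer = new Customer();
            customer.First = dbRead[iFirst].ToString().Trim();
            customer.Last  = dbRead[iLast].ToString().Trim();
            // ... remaining columns handled the same way ...
            customers.Add(customer);
        }
        return customers;
    }
}
```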
SELECT * FROM [Table] is not optimizable because it retrieves all table records.

If you change it to SELECT * FROM [Table] WHERE CustomerID = 12345, then an index on the CustomerID column can speed it up significantly. (Assuming thousands of records in the table, of course.)
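For illustration, a hedged sketch of such a filtered lookup in the asker's setup. The helper name and the Last column are assumptions; note that OLE DB uses positional ? placeholders, so parameter names are only labels.

```csharp
using System.Data.OleDb;

static class CustomerLookup
{
    // Hypothetical helper: fetch one value by key so the FoxPro engine can
    // use an index on CustomerID instead of scanning the whole table.
    public static string GetLastName(OleDbConnection dbConn, int customerId)
    {
        using (OleDbCommand cmd = new OleDbCommand(
            "SELECT Last FROM [Table] WHERE CustomerID = ?", dbConn))
        {
            // OLE DB parameters are positional; the name "@id" is ignored.
            cmd.Parameters.AddWithValue("@id", customerId);
            object result = cmd.ExecuteScalar();
            return result == null ? null : result.ToString().Trim();
        }
    }
}
```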

The most visible speed improvement can be achieved by rewriting the app in the Visual FoxPro language.
tgidirect (Author) commented:
I need to read everything from the file to run through code and processing handled in the C# application.
Then you are limited by C# speed in conjunction with OLE DB data access...

Speeding it up is still possible, though. A DBF is more or less a flat data file with a fixed record length and a small header, so you may read the DBF file directly as a stream. The DBF file structure is described e.g. here: http://www.dbf2002.com/dbf-file-format.html
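A minimal sketch of that idea, assuming the standard dBASE/FoxPro layout: record count at bytes 4-7, header length at bytes 8-9, record length at bytes 10-11, all little-endian. This reads only the fixed 32-byte prefix, not the field descriptors that follow it.

```csharp
using System;
using System.IO;

// Sketch: parse the fixed 32-byte DBF file header directly from a stream,
// bypassing OLE DB entirely. Record data starts at HeaderLength and each
// record occupies exactly RecordLength bytes.
class DbfHeader
{
    public uint RecordCount;     // bytes 4-7: number of records (little-endian)
    public ushort HeaderLength;  // bytes 8-9: offset where record data begins
    public ushort RecordLength;  // bytes 10-11: fixed size of one record

    public static DbfHeader Read(Stream s)
    {
        byte[] buf = new byte[32];
        if (s.Read(buf, 0, 32) != 32)
            throw new InvalidDataException("Truncated DBF header");
        // BitConverter is little-endian on x86/x64, matching the DBF format.
        return new DbfHeader
        {
            RecordCount  = BitConverter.ToUInt32(buf, 4),
            HeaderLength = BitConverter.ToUInt16(buf, 8),
            RecordLength = BitConverter.ToUInt16(buf, 10)
        };
    }
}
```

From there you can seek to HeaderLength and read the file in RecordLength-sized chunks; a record whose first byte is 0x2A is marked deleted. Character fields are fixed-width and space-padded, so a plain Trim() recovers the values.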

In your OLE DB connection string, add TABLEVALIDATE=0 to the trailing parameters and see whether letting the environment be less strict (needlessly strict, in most cases) gives you a boost.
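For example, the connection string might look like the sketch below. The provider name VFPOLEDB.1 and the data path are placeholder assumptions, and the effect of the setting is unverified here.

```csharp
using System.Data.OleDb;

// Hypothetical connection string for the Visual FoxPro OLE DB provider.
// The path is a placeholder; TABLEVALIDATE=0 relaxes table-header checking.
string connStr =
    "Provider=VFPOLEDB.1;" +
    "Data Source=C:\\Data\\;" +   // folder containing the free .dbf tables
    "TABLEVALIDATE=0;";

using (OleDbConnection dbConn = new OleDbConnection(connStr))
{
    dbConn.Open();
    // ... run queries as before ...
}
```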