
Solved

How to repair a corrupt FPT file

Posted on 2009-01-16
Medium Priority
5,119 Views
Last Modified: 2012-05-06
I have a corrupt FPT file that I cannot repair. The DBF associated with it has 52 records of information. I've pinpointed where the corrupted fields are. I have a memo field named mplacement, and there are exactly 4 so-called corrupted records in the mplacement field. The corruption starts at record 18: records 18-21 seem to be the corrupted ones, while records 1-17 display their information correctly. When I export the information from this DBF into an XLS, it only goes to record 17 and then quits, leaving the rest of the records out. When I open the FPT with a hex editor, the data displayed ends with record 17, as if there is no more data.

Here's the tricky part. When I open the application, go into 4 records (other than the 4 corrupted ones), and enter data into that mplacement memo field, it magically fixes the FPT and I don't get Error 41, "Memo file is missing or invalid", anymore. When I open the FPT back up in a hex editor, it shows me records 1-17 and my newly added data for the 4 entries at the bottom. I still can't seem to recover the memo field data from records 18-52. I've got backups (from the time of corruption) of the DBF and FPT, so I can try anything. Any help is much appreciated.
Question by:HbugProject
10 Comments
 

Author Comment

by:HbugProject
ID: 23395310
I'm writing records 18-52 off as a loss. Since they do not show up in the hex editor, they're gone. I have another question for you guys.

Is there a way I can write the data of the FPT to another format (TXT, XLS, CSV, etc...) programmatically? Maybe include a trigger somehow, so that it writes the data out to another format every time the FPT is updated? Then if the FPT gets corrupted, I can restore the information from the most recent backed-up ghost file, no data will be lost, and a restore from backup won't be necessary.
 
LVL 30

Expert Comment

by:Olaf Doschke
ID: 23395899
In DBCs you can define Insert/Update/Delete triggers that run beforehand, and you can already access the new values there, so this could be a solution. But it's a bit of an overreaction to double-store memo texts for fear of file corruption. You may instead migrate the data to a more solid server database.

Actually, I'm working on some kind of replication via triggers, which would help store everything twice, but writes then take twice as long. I opt for creating a log table without indexes and having a secondary process read from the log into replication DBFs. But that's going too far here. This sample code will write a memo field to a text file and can be put into a table's insert and update triggers as logmemo("tablename","memofieldname",tablename.ID).

tablename.ID should be the primary key value; it is used in the file name to identify which record the .txt backup file belongs to. You might change the Strtofile() call to append to a single file instead of generating a new one for every insert/update.
#Define ccBackupPath "\\server\data\backup\"

* Writes the current value of one memo field to its own text file.
* Call from a table's Insert/Update trigger, e.g.:
*   logmemo("tablename", "memofieldname", tablename.ID)
Procedure logmemo
   Lparameters tcTable, tcMemo, tuID

   Local lcBackupFilename
   * File name: primary key + table + field + date + unique suffix
   lcBackupFilename = ccBackupPath + Transform(tuID) + " " + tcTable + ;
       " " + tcMemo + Dtos(Datetime()) + Sys(2015) + ".txt"

   * .f. = overwrite; pass .t. instead to append to one growing file
   Strtofile(Evaluate(tcTable + "." + tcMemo), lcBackupFilename, .f.)
Endproc

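For example, wiring this up could look like the following. This is a hypothetical sketch: "mydata", "mytable", "mplacement", and "ID" stand in for your actual database, table, memo field, and primary key. A VFP procedure returns .T. by default, so the trigger will not veto the insert or update:

Open Database mydata
Create Trigger On mytable For Insert As logmemo("mytable", "mplacement", mytable.ID)
Create Trigger On mytable For Update As logmemo("mytable", "mplacement", mytable.ID)

Since each event allows only one trigger expression per table, logging several memo fields means chaining the calls, e.g. logmemo("mytable","memo1",mytable.ID) And logmemo("mytable","memo2",mytable.ID).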

 
LVL 14

Expert Comment

by:tusharkanvinde
ID: 23395902
To repair files, you can check out CMRepair or FoxFix (http://www.xitech-europe.co.uk/foxfix.php).

You can create triggers on a VFP database: use MODIFY PROCEDURE to create the stored procedure, and in MODIFY STRUCTURE you can set the triggers.

 

Author Comment

by:HbugProject
ID: 23450825
Olaf,

Thank you for the code snippet. We have tested this code, and as written it appears to only work for ONE memo field in ONE record. Is there a way to dump the entire contents of the memo file to another format (text, csv, xml, etc...)?

Thanks again!
 
LVL 30

Accepted Solution

by:Olaf Doschke (earned 1500 total points)
ID: 23453572
Not only that, it writes a single memo content to a single file. It's meant that way: this kind of dump has no structure for writing many memo values and distinguishing between them. The simplest file format that would do that is the FPT format itself, or something very similar, because it consists of the concatenated memo values, each with a length; within the DBF, all that's stored is an offset from the beginning of the file.

So, as you want a more stable file format, that's what I'd recommend. To get all memo fields, you'd add a call for each memo field. To cover all records, it's sufficient to do this in the insert and update triggers, as each such table operation will then create the file for that specific memo.

If you have repeated and reproducible problems with FPT files, you may want to investigate your network hardware and performance. If this is a one-time case that makes you think about FPT files this way, I can't help you other than by saying that in my 9 years of experience with VFP since VFP6 I have had no corrupt FPT file; I have had corrupt CDXs and DBFs, but no FPT. And even if some records are corrupt, you can read the intact values out of the FPT; such a file stores little more than the concatenated values.
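As an illustration of that last point, a low-level scan could look like the sketch below. This is a rough, untested sketch with placeholder file names; it assumes the standard FPT layout (a 512-byte header with the block size stored at offset 6, then blocks whose first 4 bytes are a big-endian type, 1 for text, followed by a 4-byte big-endian length):

#Define ccFptFile  "backup.fpt"
#Define ccDumpPath "c:\recovered\"

* Scan an FPT copy for intact text blocks and dump each one
* to its own .txt file.
Procedure ScanFpt
   Local lnHandle, lnBlockSize, lnFileSize, lnPos, lnType, lnLen, lnBlock
   lnHandle = Fopen(ccFptFile, 0)          && 0 = read-only
   If lnHandle < 0
      ? "Cannot open " + ccFptFile
      Return
   Endif
   lnFileSize = Fseek(lnHandle, 0, 2)      && seek to end of file = size
   =Fseek(lnHandle, 6, 0)                  && header offset 6: block size
   lnBlockSize = BE2N(Fread(lnHandle, 2))
   If lnBlockSize < 1
      lnBlockSize = 64                     && VFP's default memo block size
   Endif
   lnPos = 512                             && memo data starts after header
   lnBlock = 0
   Do While lnPos + 8 <= lnFileSize
      =Fseek(lnHandle, lnPos, 0)
      lnType = BE2N(Fread(lnHandle, 4))    && block type, 1 = text
      lnLen  = BE2N(Fread(lnHandle, 4))    && length of the memo value
      If lnType = 1 And Between(lnLen, 1, lnFileSize - lnPos - 8)
         lnBlock = lnBlock + 1
         * Fread() reads at most 65,535 bytes per call;
         * loop over it for longer memo values.
         Strtofile(Fread(lnHandle, Min(lnLen, 65535)), ;
            ccDumpPath + "memo" + Transform(lnBlock) + ".txt", .f.)
         * jump to the next block boundary
         lnPos = lnPos + Ceiling((8 + lnLen) / lnBlockSize) * lnBlockSize
      Else
         lnPos = lnPos + lnBlockSize       && step over a damaged block
      Endif
   Enddo
   =Fclose(lnHandle)
Endproc

* Convert a big-endian byte string to a number
Function BE2N(tcBytes)
   Local lnVal, lnI
   lnVal = 0
   For lnI = 1 To Len(tcBytes)
      lnVal = lnVal * 256 + Asc(Substr(tcBytes, lnI, 1))
   Endfor
   Return lnVal
Endfunc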

Bye, Olaf.
 
LVL 30

Expert Comment

by:Olaf Doschke
ID: 25670082
I provided code to replicate a memo field. It can be applied to more than one field by calling it several times, once for each memo value you want saved separately.

And to answer the final question, "Is there a way to dump the entire contents of the memo file to another format (text, csv, xml, etc...)?": sure, you can scan through all records and save the data to any other file. But what good would that be if the FPT file grows very large, say hundreds of megabytes, and you do that with every insert or update?
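Such a full dump could be as simple as the following sketch; "mytable", "ID", and "mplacement" are hypothetical stand-ins for the real table, key field, and memo field:

* Dump every record's memo content to one text file.
Local lnHandle
lnHandle = Fcreate("c:\backup\memodump.txt")
Select mytable
Scan
   =Fputs(lnHandle, "--- record " + Transform(mytable.ID) + " ---")
   =Fputs(lnHandle, mytable.mplacement)
Endscan
=Fclose(lnHandle)

A loop like this is needed in the first place because COPY TO with TYPE XL5 or CSV does not export memo field contents.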

The questions were all answered as far as they could be. I don't think HbugProject has a sufficient reason for a refund of points. The general problem of file corruption in file-based databases is not a VFP-only problem. If he experiences corrupt FPT files and lost data often, then he's quite alone with this; such reports are rather rare. In my own 10 years of VFP experience I never lost FPT contents. I had corrupted files, FPTs too, but they didn't lose records, even viewed in a hex editor. There's of course nothing you can do about lost data other than having a backup.

Bye, Olaf.
 

Author Comment

by:HbugProject
ID: 25686913
Olaf's accepted solution slows the performance of our software down drastically, because the insert and update triggers have to make a call for EACH memo field, and we have a lot of memo fields on various forms.
I am aware that network performance contributes a lot to whether data gets corrupted or not. Our software is distributed statewide to hundreds of different locations, so it is impossible for me to know which sites have slower-performing networks. Since we can't keep the data from getting corrupted on a poor network, we have introduced automatic backups of the database to help reduce data loss.
I hope the Admin of this forum considers this a valid response.
 
LVL 30

Expert Comment

by:Olaf Doschke
ID: 25688445
Hello HbugProject,

okay, accepted. You could have said so earlier.

This is just a demo, not optimized for performance. You could easily modify it to save many memos at once. Of course, as I already said, double-storing data does cost performance, quite drastically. There would be many ways to improve it, for example storing the backup locally and letting a parallel process copy it to the server.
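One hypothetical shape for that local-first variant (the folder names are placeholders, and a second VFP process or timer would run MoveLogs() periodically):

#Define ccLocalPath  "c:\temp\memolog\"
#Define ccServerPath "\\server\data\backup\"

* Run from a separate process: move locally written backup files
* to the server share, so the trigger itself never waits on the network.
Procedure MoveLogs
   Local laFiles[1], lnCount, lnI
   lnCount = Adir(laFiles, ccLocalPath + "*.txt")
   For lnI = 1 To lnCount
      Copy File (ccLocalPath + laFiles[lnI, 1]) To (ccServerPath + laFiles[lnI, 1])
      Erase (ccLocalPath + laFiles[lnI, 1])
   Endfor
Endproc

The logmemo() trigger would then write to ccLocalPath instead of the server share.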

You already made your choice. Nevertheless, you could spend these 500 points. You're expecting a bit much if you will only accept a full-blown solution; this is just a few forum posts, not a job.

Bye, Olaf.
 

Author Closing Comment

by:HbugProject
ID: 31535490
The provided answer will slow the program down drastically if you have a lot of memo fields. For just a couple of memo fields, this would work great.
