Not sure where to put this question really, so hopefully it's OK here!
I've been asked to write some backup software which will compare yesterday's copy of a file with today's and show me the differences via a binary comparison. I can do that, but it means storing yesterday's file and comparing it to today's, which results in a large database.
I've been looking at software such as Backup Exec and Super Flexible File Synchronizer, which seem to do binary comparisons, but their databases aren't nearly as big as mine.
Does anyone have any suggestions on how to reduce the size of my historic file database used for the comparison?
I did think of using checksums: compare a checksum of each 500KB chunk of the file, and if the checksum doesn't match, back up that entire 500KB chunk again. But that seems a strange way of doing it.
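For what it's worth, that checksum idea isn't strange at all; it's roughly how rsync-style delta tools avoid storing yesterday's full file (rsync itself is more sophisticated, using rolling checksums so it can detect shifted data). A minimal Python sketch of the fixed-block version, where you keep only a small digest per block instead of the whole previous file (block size, function names, and the SHA-256 choice here are all illustrative, not from any particular product):

```python
import hashlib

BLOCK_SIZE = 512 * 1024  # 500KB-ish blocks, as in the question

def block_checksums(path):
    """Return one SHA-256 digest per fixed-size block of the file."""
    sums = []
    with open(path, "rb") as f:
        while True:
            block = f.read(BLOCK_SIZE)
            if not block:
                break
            sums.append(hashlib.sha256(block).hexdigest())
    return sums

def changed_blocks(old_sums, new_sums):
    """Indices of blocks needing backup: changed, plus any added/removed."""
    changed = [i for i, (a, b) in enumerate(zip(old_sums, new_sums)) if a != b]
    # Blocks past the end of the shorter list (file grew or shrank)
    # also need backing up.
    changed.extend(range(min(len(old_sums), len(new_sums)),
                         max(len(old_sums), len(new_sums))))
    return changed
```

The historic database then holds 32 bytes per 512KB block rather than the full file, which is likely why those commercial tools' databases stay so small; each day you recompute today's checksums, diff against yesterday's list, and back up only the changed blocks.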