I'm attempting to write a Windows service that uses FileSystemWatcher to monitor for file creations, changes, renames, and deletions. When it detects any of those events, it passes information such as the date/time, owner, full path, file name, and an MD5 checksum to a SQL database. Whenever a file is created, the MD5 checksum of the new file is checked against all previous records in the database to search for duplicates; if a duplicate is found, an e-mail is sent to the appropriate people.
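For context, the checksum step is essentially the following (sketched here in Python for brevity rather than the .NET code the service actually uses), reading the file in chunks so large files don't have to fit in memory:

```python
import hashlib

def md5_of_file(path, chunk_size=64 * 1024):
    """Compute an MD5 checksum of a file without loading it all into memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        # iter() with a sentinel keeps calling f.read() until it returns b""
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```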
All of the above works fine. However, if I do a large file copy (17,000 files), the service should see two events per file: a creation when the new file appears, and then a change once the copy of that particular file is complete. I'd therefore expect to see 34,000 records in the database, but I'm only seeing somewhere in the region of 150.
I'm just wondering whether the files are being copied faster than the service can run the SQL query. If so, how would I go about running the same code in multiple threads while ensuring that no two threads pick up the same file?
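To show what I mean, here is a rough producer-consumer sketch (in Python for illustration; `process` is a hypothetical placeholder for my hash-and-SQL work). The idea would be that the watcher callback only enqueues the path and returns immediately, and a thread-safe queue hands each path to exactly one worker, so no two threads touch the same file:

```python
import queue
import threading

work = queue.Queue()  # thread-safe: each queued path is handed to exactly one worker

def on_file_event(path):
    """Called by the watcher; just enqueue and return so events aren't missed."""
    work.put(path)

def worker():
    while True:
        path = work.get()
        if path is None:      # sentinel value used to shut a worker down
            break
        try:
            process(path)     # placeholder: hash the file, insert/check in SQL
        finally:
            work.task_done()

def start_workers(n=4):
    """Start n worker threads that drain the shared queue."""
    threads = [threading.Thread(target=worker, daemon=True) for _ in range(n)]
    for t in threads:
        t.start()
    return threads
```

Is something along these lines the right approach, or is there a better pattern for this?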
Thanks in advance.