I am allowing users on my website to place their votes. Their submissions go to my VOTE.CGI Perl script, which opens the VOTESUB.DAT file to add/update the vote tally.
Now, if I have millions of users hitting VOTE.CGI (and thus VOTESUB.DAT) all at the same time, wouldn't VOTESUB.DAT get all screwed up? It seems I've got to hold back the user submissions and handle them one at a time so as not to screw up the numbers held within it. Is this true?
If so, I have thought about writing out a dummy file called HOLD.TMP every time VOTE.CGI starts. That is, every time VOTE.CGI starts, it checks for the existence of HOLD.TMP. If it exists, then another instance of VOTE.CGI must still be running right now, so I 'pause' (sleep) until the file has disappeared (deleted by that previous instance of VOTE.CGI once it completes). As soon as the file is gone, I immediately create a new HOLD.TMP (so as to pause any subsequent VOTE.CGI instances) and then handle the voter submission.
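In rough Perl, the scheme I have in mind looks like this (just a sketch; HOLD.TMP and the sleep interval are placeholders):

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $holdfile = 'HOLD.TMP';

    # Wait until any earlier instance of VOTE.CGI has finished,
    # i.e. until it has deleted HOLD.TMP.
    sleep 1 while -e $holdfile;

    # Claim the hold so the next instance pauses behind us.
    open my $hold, '>', $holdfile or die "Can't create $holdfile: $!";
    close $hold;

    # ... open VOTESUB.DAT, add/update the vote tally ...

    # Release the hold for the next instance.
    unlink $holdfile or die "Can't remove $holdfile: $!";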
Is this the best method to handle multiple writes to a single file, or is there some other way to make all the clients wait while I do this sort of thing?
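For instance, I've seen Perl's flock mentioned; would something along these lines be the better route? (Again just a sketch; I'm not sure I'm using it correctly.)

    use Fcntl qw(:flock);

    open my $fh, '+<', 'VOTESUB.DAT' or die "Can't open VOTESUB.DAT: $!";
    flock $fh, LOCK_EX or die "Can't lock VOTESUB.DAT: $!";  # blocks until the lock is ours
    # ... read the tallies, update them, seek back and rewrite ...
    close $fh;  # closing the handle releases the lock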
One more thing: is it true that the only way to update a file (i.e. my VOTESUB.DAT) is to read it all in (or in parts) and then write it all out, even if all you wanted to do was change a single value at the halfway point in the file? Let's say VOTESUB.DAT looked like:
1: Hello Line 1
2: Hello Line 2
3: Hello Line 3
If I wanted to alter line 2 to read "Hello From Line 2", must I really read in ALL of VOTESUB.DAT and write it back out again?!
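Right now the only way I know is something like this, which reads the whole file just to change one line:

    # Slurp the whole file, change one line, write the whole thing back.
    open my $in, '<', 'VOTESUB.DAT' or die "Can't read VOTESUB.DAT: $!";
    my @lines = <$in>;
    close $in;

    $lines[1] = "Hello From Line 2\n";   # line 2 is index 1

    open my $out, '>', 'VOTESUB.DAT' or die "Can't write VOTESUB.DAT: $!";
    print $out @lines;
    close $out;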