I'm going to be developing an application shortly that will need to store most of its data publicly on the internet so that other instances of the software on other computers can access it. No instance of the software will have the privilege of acting as the server; they'll have to operate peer-to-peer. My question is: how can I store this data so that every instance can access it without a substantial delay?
The software is essentially going to be used to make equipment reservations, so each instance must always have an up-to-date copy of the reservations calendar.
This will be my first piece of software that uses the internet, so I obviously don't yet know a whole lot about it. My first thought was to store the data on an FTP server /and/ locally on the hard drive, only accessing the FTP server when making new reservations or checking the calendar. But that won't work well, because there will always be a delay when logging into the server, and that seems terribly inefficient. So what would be a better alternative?
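To make the idea concrete, here's a rough Python sketch of what I was imagining. Everything here is hypothetical: the `fetch_remote`/`push_remote` callbacks just stand in for whatever transport ends up being used (FTP download/upload, in my first thought), and the class and parameter names are placeholders I made up.

```python
import json
import time

class ReservationCache:
    """Keeps a local copy of the shared reservations calendar and only
    contacts the server when the copy might be stale or when writing.

    fetch_remote() -> str and push_remote(str) are placeholders for the
    actual transport; they're injected so the caching logic can be tried
    out locally without any server.
    """

    def __init__(self, fetch_remote, push_remote, max_age_seconds=30):
        self._fetch = fetch_remote
        self._push = push_remote
        self._max_age = max_age_seconds
        self._calendar = {}      # e.g. {"2024-06-01 09:00": "projector"}
        self._fetched_at = 0.0   # timestamp of the last refresh

    def _refresh_if_stale(self):
        # Only hit the server if the local copy is older than max_age.
        if time.time() - self._fetched_at > self._max_age:
            self._calendar = json.loads(self._fetch())
            self._fetched_at = time.time()

    def calendar(self):
        self._refresh_if_stale()
        return dict(self._calendar)

    def reserve(self, slot, equipment):
        # Force a fresh fetch before writing, so two instances are less
        # likely to double-book the same slot.
        self._fetched_at = 0.0
        self._refresh_if_stale()
        if slot in self._calendar:
            return False  # slot already taken
        self._calendar[slot] = equipment
        self._push(json.dumps(self._calendar))
        return True
```

For example, simulating the remote store with a plain dict, `reserve("2024-06-01 09:00", "projector")` succeeds the first time and a second attempt on the same slot is refused. But every `reserve` still pays the round-trip to the server, which is exactly the delay I'm worried about.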
I don't need a tremendous amount of security for this, either. As long as someone can't accidentally stumble along and corrupt the data file, it's good enough.