Software Design Question - Internet Data

I'm going to be developing an application shortly that will need to store most of its data publicly on the internet so that other instances of the software on other computers can access it.  No instance of the software will have the privilege of acting as the server; they'll have to be peer-to-peer.  My question is, how can I store this data so that every instance can access it without a substantial delay?

The software is essentially going to be used to make equipment reservations, so each instance must always have an up-to-date copy of the reservations calendar.

This will be my first piece of software that uses the internet, so I obviously don't yet know a whole lot about it.  My first thought was to store the data on an FTP server /and/ locally on the hard drive, only accessing the FTP server when making new reservations or checking the calendar.  But that won't work: there will always be a delay when logging into the server, and that's terribly inefficient.  So what would be a better alternative?

I don't need a tremendous amount of security for this, either.  As long as someone can't accidentally fumble along and screw up the data file, it's good enough.


zaphod_uk Commented:
OK - any kind of database product is out, unless you can find someone who knows something about firewalls and can set up some kind of authentication that would allow specific, known traffic to get through. The uni might agree to that.

Assuming not, then what you seemed most concerned about is speed of access. I wouldn't think that processing a file pulled from an FTP server would take very long, nor would writing it back again, and server-side VBScript certainly has the facilities to let you do so.

I would imagine that you'd have to process the file twice: once to get the "current" information to display, and again when the user makes the reservation, to ensure that no one else has booked the same resource in the meantime.
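A rough sketch of that two-pass check in Python (the calendar file format, the `fetch`/`store` callbacks, and the slot names here are my own illustration, not anything from the thread):

```python
# Sketch of the "check, then re-check before writing" pattern.
# The calendar is assumed to be a plain text file with one
# "resource,slot,user" reservation per line (a made-up format).

def parse_calendar(text):
    """Return the set of (resource, slot) pairs already booked."""
    booked = set()
    for line in text.splitlines():
        if line.strip():
            resource, slot, _user = line.split(",")
            booked.add((resource, slot))
    return booked

def try_reserve(fetch, store, resource, slot, user):
    """fetch() downloads the calendar text; store(text) uploads it.
    Returns True if the reservation was made."""
    # Pass 1: the availability shown to the user.
    if (resource, slot) in parse_calendar(fetch()):
        return False
    # Pass 2: re-fetch just before writing, in case someone
    # else booked the same slot in the meantime.
    text = fetch()
    if (resource, slot) in parse_calendar(text):
        return False
    store(text + f"{resource},{slot},{user}\n")
    return True
```

Note there is still a small race window between the second check and the upload; closing it completely needs some form of server-side lock-out.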

You said that you wanted each machine to have its own copy of the calendar, but I would suggest that this could lead to problems, since keeping multiple copies in line with each other would be far harder. Just have the one set on the FTP server.

If I'm oversimplifying the problem then let me know.
zaphod_uk Commented:
If you're resigned to using a central store for the information, I would suggest that you consider storing the data in something like a SQL Server database. The recent work I've done has used this setup and it's been very fast indeed.

I would have thought this would be far easier to code, and you'll be much more certain of everyone seeing the updates quickly, compared with having to use a file.
Artine (Author) Commented:
That poses a bit of a problem, I'm afraid.  While it would be the ideal solution, I can't set up a server of any sort (SQL or otherwise) because of the network that this software will be running on: it has a nasty old firewall setup on it, and so even if I set up a SQL server or set one instance of the software as a server, the clients couldn't access it.

zaphod_uk Commented:
Despite its name, SQL Server is pretty much just a database, so couldn't you install a copy on one of the web servers - or does the arrangement you described mean that no web server can see the others? Being on the same machine would mean even faster access, too!
Artine (Author) Commented:
That poses a bit of a problem, I'm afraid.  While it would be the ideal solution, I can't set up a server of any sort (SQL or otherwise) because of the network that this software will be running on: it has a nasty old firewall setup on it, and so even if I set up a SQL server or set one instance of the software as a server, the clients couldn't access it.
Artine (Author) Commented:
Hmm..  Don't know why that message got posted again; didn't mean for it to.

At any rate, I don't have access to any of the web servers.  This is at a university, and the television dept. needs some software to facilitate reserving equipment.  The university has a firewall set up so that all of the computers on the network are invisible to everyone else.  Crummy arrangement, if you ask me.
Artine (Author) Commented:
You've definitely got the problem right, and you made an excellent point about each instance having its own copy of the calendar.  When I thought of that, I was thinking only of data redundancy: I didn't want the people making the reservations--they're not the brightest crayons in the box--to freak out if something happened to the internet connection and the calendar was inaccessible.  But then again, it'd be even more of a problem if equipment was overbooked because each machine thought the same time slot was free.  So yeah, I have to come up with some way to keep all the data on a remote server.  Until a better idea comes along, I'll just write it to use the FTP server approach.  (The uni definitely wouldn't consider opening up the firewall, even for known traffic.  They're quite stubborn.)  If any inspiration hits you in the shower or something, pass it on.  Thanks for your ideas, though! :-)
CJ_S Commented:
Jumping in here.

You can create components and make them available on the server so that other users can connect to them. Such a component can then do anything on the server that you want: database access, filesystem access, etc.

On the client you can use DCOM to access the component and make calls to it.

This design does what you want.

Another approach could be a web interface.
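To make the component idea concrete, here is a cross-platform sketch using Python's XML-RPC in place of DCOM (the class, method names, and the Python stack are my own illustration, not part of the original suggestion): the component lives on the server and owns the data, so clients only ever make remote calls.

```python
# A server-side reservation component exposed over RPC; clients
# call reserve() remotely instead of touching the data directly.
# (DCOM plays this role in the Windows/VB world the thread assumes.)
from xmlrpc.server import SimpleXMLRPCServer

class ReservationComponent:
    """Lives on the server; owns the booking data."""
    def __init__(self):
        self.booked = {}  # (resource, slot) -> user

    def reserve(self, resource, slot, user):
        """Book a slot; return False if it was already taken."""
        key = (resource, slot)
        if key in self.booked:
            return False
        self.booked[key] = user
        return True

def make_server(port=0):
    """port=0 picks a free port; caller runs serve_forever()."""
    server = SimpleXMLRPCServer(("localhost", port), logRequests=False)
    server.register_instance(ReservationComponent())
    return server
```

A client would then do `xmlrpc.client.ServerProxy("http://host:port").reserve(...)`; because a single component instance handles every call, the double-booking check happens in one place.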

Artine (Author) Commented:
CJ: But given that I can't set up any kind of server--due to the uni's firewall--how does that help me?
zaphod_uk Commented:
At first I was about to take offence at the suggestion that I should take a shower!! :>   (Actually, I decided to have a bath.)

You mentioned at one point that the machines the users would access were Web servers so I figured that the solution you were proposing would be web-based. I still don't see that you'll have any problems there. It's getting to a central information source that's the issue.

You said that you weren't particularly interested in security, so I'd check with the DBAs to see if there is SOME machine available which isn't behind a firewall that you could use.

Otherwise I think the FTP route is all you have available. If anyone else wants to jump in I'm quite happy to be proved wrong.
Yes, if the Rat may come into this hole.

A reservation system is going to have to store article/user/period triplets in a place where ALL clients can find them. So some sort of central storage is unavoidable; or at least, follow this argument:

There are basically two scenarios to be considered:

1. Fetch and Discard
2. Fetch and Update

"Fetch and Discard" is the process of investigating a user, item, or period: for example, checking whether something is free, finding out it isn't, and going away. Most queries are of this form. Data integrity is always preserved by this type of query, since nothing changes.

"Fetch and Update" is the variation where the client decides to make a reservation. Few queries are of this type. Data integrity is very difficult with this type of query: two people may try to make the same update at the same time.

All of this implies a central data store with lock-out updates, be that SQL Server, a web service, or whatever. It breaks, however, your requirement that "No instance of the software will have the privilege of acting as the server".
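That lock-out update is exactly what a database transaction plus a uniqueness constraint gives you for free. A minimal sketch with SQLite standing in for SQL Server (the schema and column names are invented for illustration):

```python
import sqlite3

def make_db(path=":memory:"):
    conn = sqlite3.connect(path)
    # One row per booked slot; the UNIQUE constraint is the lock-out:
    # the second writer for the same (resource, slot) fails atomically.
    conn.execute("""CREATE TABLE IF NOT EXISTS reservations (
        resource TEXT, slot TEXT, user TEXT,
        UNIQUE (resource, slot))""")
    return conn

def reserve(conn, resource, slot, user):
    """Return True if the slot was free and is now booked."""
    try:
        with conn:  # commit on success, roll back on error
            conn.execute("INSERT INTO reservations VALUES (?, ?, ?)",
                         (resource, slot, user))
        return True
    except sqlite3.IntegrityError:
        return False
```

Two clients racing for the same slot can both attempt the INSERT; the database serialises them and exactly one succeeds, which is the integrity guarantee the file-based approaches have to build by hand.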

The only way of enforcing your requirement is to break up the entities. That is, each item for each period becomes a web object (in this case, a web address). You now only have the problem of allocating a user to it. This you could do by allocating a directory per object/period, which includes a small description file, and uploading a file containing the user (for the allocation). A VB program could do this using the FTP protocol, refusing the upload if the file already exists (i.e. the object/period is already allocated). This would be true peer-to-peer, and the objects/periods could be placed literally anywhere on the web. You could make it a little more flexible if the file names included the periods, the web address then being only the object.
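The "refuse the upload if the file already exists" trick works because exclusive file creation is atomic. Here is the same idea sketched against a local directory (an FTP STOR to a server configured to reject overwrites would play the same role; the naming scheme is made up):

```python
import os

def reserve_slot(calendar_dir, resource, period, user):
    """One file per (resource, period); creating it atomically
    claims the slot, and failure means someone got there first."""
    path = os.path.join(calendar_dir, f"{resource}_{period}")
    try:
        # O_EXCL makes the create fail if the file already exists,
        # so two peers can never both claim the same slot.
        fd = os.open(path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    except FileExistsError:
        return False
    with os.fdopen(fd, "w") as f:
        f.write(user + "\n")
    return True
```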

There are some problems with this approach. Firstly, finding what is available is not easy: you don't know all the addresses, and as soon as you start to catalogue them you're back to a server. Secondly, FTP servers spawn a process per connection, and there could be a race problem in detecting the presence of files. And lastly, this would be a highly unusual approach, which I would only recommend for research purposes; you'd be on your own as regards problems, and couldn't count on the wealth of experience available with client-server systems.

You should also be aware that any uni system will allow HTTP requests through its firewall, so a conventional system using SQL and a web server would not be incompatible. If someone gave me that job I'd take Apache, Tomcat and MySQL.
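Since plain HTTP is the one kind of traffic the firewall lets through, the client side of such a system stays very simple. A sketch of a client querying a hypothetical web endpoint (the URL, query parameters, and JSON response shape are all assumptions, just to show the idea of putting the data behind a web server):

```python
# The client only ever speaks HTTP, which the firewall allows.
# The "/free" endpoint here is hypothetical; any CGI script or
# servlet backed by a database could answer it.
import json
import urllib.request

def check_slot(base_url, resource, slot):
    """Ask the web server whether a slot is free."""
    url = f"{base_url}/free?resource={resource}&slot={slot}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["free"]
```

The booking call would be a similar POST; because the database behind the web server is the single source of truth, the clients never need local copies of the calendar at all.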



No comment has been added lately, so it's time to clean up this topic area.
I will leave a recommendation in the Cleanup topic area that this question is:
 - Answered by: CJ_S (50) and zaphod_uk (50)
Please leave any comments here within the next seven days.


Question has been closed as per recommendation

CJ_S points for you here >

JGould-EE Moderator
Question has a verified solution.
