VFP9 options in the cloud (and elsewhere)

Posted on 2016-07-29
Last Modified: 2016-08-02
I am looking at options for keeping a 20 user VFP9 application on a LAN happy for the future.
Data volumes are not high, but it does handle accounting, inventory, scheduling, orders, EDI etc. for a manufacturing company.
About 150 tables in 6 DBCs, total size about 360 MB.

After a server crash that took a day to recover from we are looking at options for faster recovery.

We do not use client/server, so from what I read, cloud-based storage for DBFs (as opposed to SQL Server) is an invitation to data corruption unless a very fast connection is available. Our current connection is 55 Mbps down and 11 Mbps up, and I suspect we can pay for faster. But I still just don't trust the internet. Could be wrong?

 We have had no corruption issues since day one, and I would like to keep it that way.

Not sure I have the energy, knowledge or time (66 years old) to rewrite (and test) everything to make it client/server.
Our current network supplier, who also offers cloud hosting, will build a 'resilient cluster' as a local hardware solution. So the options seem to be:
1. Resilient cluster (local hardware)
2. DBFs in the cloud (EXEs on local stations)
3. Rewrite for remote access in the cloud
4. New software entirely.

 This company has done well for me for 40 years and vice versa, so I want to be sure they are in good shape. If that means stepping aside for entirely new software, or bringing someone on for a rewrite--fine by me!

I've taken a quick look at ActiveVFP and FoxInCloud and not sure they fit this situation.

 All thoughts welcome.
Question by:terrypba1
Accepted Solution

pcelba earned 250 total points
ID: 41735377
Have you tried running the VFP app under RDS (Remote Desktop Services)? It is effectively option 1 from your list. The user logs into the server, gets their own Windows desktop, and can then run whatever application is installed on the server... All the other solutions require an app update or rewrite.

The advantages:
- The application runs unchanged, and from the application's point of view the data sits on a local hard drive.
- Everything is very fast, with no negative network influence. A network interruption cannot corrupt data, because the RDS session keeps running even when the user disconnects.
- The RDS host may run on a server in your server room or in the cloud; users won't notice any difference.
- Cloud services are scalable, so adding RAM, disk, or CPU is easy.
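On the workstation side, each user only needs a saved RDP connection file pointing at the session host. A minimal sketch, where the host names and user name are placeholder values, not real ones:

```
full address:s:rds-host.example.local
screen mode id:i:2
username:s:COMPANY\terry
gatewayhostname:s:gateway.example.com
gatewayusagemethod:i:1
```

Saved as e.g. vfpapp.rdp, a double-click opens the full-screen session; the two gateway lines only matter when connecting from outside the LAN, routing RDP through an RD Gateway instead of exposing the host directly to the internet.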

The disadvantages are common to all cloud implementations:
- Slow printing on local printers.
- You'll need a Windows RDS (Terminal Server) Client Access License for each connected user.
- Data are outside your control, and you are fully dependent on the cloud service and internet provider.
- Backups are even more important than ever before.
- A reliable cloud service is expensive, and even a provider like IBM cannot ensure 100% availability. A problem on the provider side can make your servers unavailable for hours or even days.
- You must be much stricter about security than ever before. Use a VPN connection to the cloud and disable any public access.

DBFs in the cloud with EXEs on local workstations is a non-starter. Shared DBF file access over the internet is possible, but not for 20 simultaneous users working on critical data... If you have not observed data corruption yet, this approach will bring it with near-100% probability. Of course, the VFP app can run as a multithreaded COM server, which could be a solution for this topology, but that would mean a major app rewrite. ActiveVFP works this way, and I like the approach. I have not used ActiveVFP itself, but an in-house solution built the same way served 100 concurrent users on relatively weak hardware and worked very reliably.

Options 3 and 4 are outside the VFP world, and whether you have the resources for such a change is up to you. You could try FoxInCloud and tell us your findings.

And BTW, you may meet 80 years old FoxPro programmers here. :-)
Assisted Solution

by:Olaf Doschke
Olaf Doschke earned 250 total points
ID: 41735493
Well, I'm a youngster at 47.

Pavel is right that remoting the desktop is a solution, but for a single LAN and 20 users, why move to the cloud at all? Just because of this server crash? Mainly your DBF file server went down and it took too long to get it back up? That doesn't need the cloud.

Moving to cloud services is the way the world is going today, but is it actually needed here? The first thing to think about is putting data into the cloud. In most cases, a company's data is its value; maybe I feel this more strongly because I have to protect the formulas of cosmetic products. A hack of this data could mean exact product imitations, so it is data you really don't want in the cloud. No matter whether all data transfer is secured, and no matter whether the database files themselves are encrypted, the cloud provider will have access to the data too. Even if the virtual machine files are secured, it all runs on some host system, from which an employee or hacker can take a system image at any point in time, with all in-memory and cached data unencrypted.

The argument that makes this point moot is that hackers don't only target the data centers of cloud providers; they can also target your LAN, which is most probably an even easier target. Anyway, I'm not at all an early adopter of the cloud. Compared with classic simple hosting it has some pros in scalability and stability, but in the end it's old wine in new bottles.

The mechanism making the cloud more stable is virtualisation, and that's also possible through server virtualisation in your own LAN, so solving this problem does not necessarily need the cloud. And the security argument? Even with the data off premise you have to secure your LAN anyway, so why not apply the general security measures locally, since you have to anyway?

Another aspect is performance, and that depends on where you cut the tiers of the application: the data tier, the business logic tier, and the user interface tier. Putting a slower connection anywhere creates a local and a remote end and slows things down, and in the end there is only one practical choice for the cut:

The user interface is local and the rest is remote.

There are essentially two ways to do that: a web application or remote desktop. It doesn't matter much whether you transfer the graphics of the remote desktop to a client, or HTML. HTML has the advantage that it can be accompanied by JavaScript, making use of the client's computing power for a more advanced and prettier user interface. But of course this needs a rewrite. FoxInCloud goes in that direction, with the downside that such automatic UI recoding in HTML is not ideal.

The other cut, the one you actually plan, putting only the data remote, is the worst way to solve this. Then you have a time lag that is long even with a 1 Gbit internet connection. The lag is bad for data access not because file integrity is at greater risk, but because of the way VFP works: it does not profit from good bandwidth so much as from low latency. Rushmore issues many short requests and roundtrips to determine the "bitmap" of records to fetch, and that performs very badly over a high-latency link. Even a LAN connection is already a bottleneck compared to the performance VFP can reach with local DBFs, but that is a workable slowdown for even about 1000 LAN users.
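A rough back-of-the-envelope sketch of why latency, not bandwidth, dominates. The roundtrip count, payload size, and latencies below are illustrative assumptions, not measurements:

```python
# Crude model: total query time is roundtrips x latency
# plus the time to move the payload over the link.

def query_time(roundtrips, latency_s, payload_bytes, bandwidth_bps):
    """Each roundtrip costs one network latency; the payload
    transfer time is payload size over link bandwidth."""
    return roundtrips * latency_s + payload_bytes * 8 / bandwidth_bps

payload = 2_000_000   # ~2 MB of index/record data (assumed)
roundtrips = 300      # short Rushmore-style requests (assumed)

# Same 1 Gbit bandwidth in both cases; only the latency differs.
lan = query_time(roundtrips, 0.0005, payload, 1_000_000_000)  # 0.5 ms LAN
wan = query_time(roundtrips, 0.030,  payload, 1_000_000_000)  # 30 ms WAN

print(f"LAN: {lan:.2f} s, WAN: {wan:.2f} s")
```

With identical bandwidth, the WAN case comes out roughly fifty times slower purely from the per-roundtrip latency, which is exactly the pattern that hurts shared DBF access over the internet.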

There is a further solution if you need to use the same app with all data in affiliates around the world: one that makes no cut at all, but extends the local backend with replication. Data is replicated around the world (and this may include the cloud) over perfectly encrypted point-to-point transfer, and an intelligently set up replication can let many sites work with the same data without putting it into a cloud.

What you would have needed for your case is a redundant backup system serving the files, and again replication is the basis for that. I don't see a reason to move to the cloud, unless you see a benefit in having data up there to let customers connect more directly.

A shop with only the current orders in the cloud does not risk having product formulas online. Aside from the orders themselves and the users' data, which need the care of only being shown to the right user (his own orders and their state, his own profile, etc.), you only have data that is public anyway.

In your case I would say EDI, scheduling, inventory, and accounting data do not belong in the cloud; why should they be there?

What you can put into the cloud easily, and without much thought about risk, is backups, as long as they are encrypted first. Encrypted locally and decryptable only locally, you can put everything into the cloud, and also recover from a hardware failure or even a virus with healthy backup restores.
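A minimal sketch of the encrypt-locally-first idea, using openssl symmetric encryption. The file names and the upload command are placeholders, and the dummy archive stands in for a real backup; any cloud storage CLI could take the place of the upload step:

```shell
#!/bin/sh
set -e
# Stand-in for the real backup archive (illustrative only):
printf 'dummy backup data' > backup.zip

# One-time: generate a strong passphrase and keep it OFF the cloud.
openssl rand -base64 32 > backup.key

# Encrypt locally with AES-256; -pbkdf2 strengthens key derivation.
openssl enc -aes-256-cbc -salt -pbkdf2 \
    -in backup.zip -out backup.zip.enc -pass file:backup.key

# Upload ONLY the encrypted file (placeholder command):
# some-cloud-cli upload backup.zip.enc remote:/backups/

# Restore path: decrypt locally with the same key.
openssl enc -d -aes-256-cbc -pbkdf2 \
    -in backup.zip.enc -out backup-restored.zip -pass file:backup.key
cmp backup.zip backup-restored.zip && echo "round-trip OK"
```

The cloud provider only ever sees backup.zip.enc; without the locally held key file, the backup is useless to anyone who gets hold of it.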

Bye, Olaf.

Author Comment

ID: 41735772
Excellent angles, pcelba and Olaf.
RDS sounds well worth exploring.
The appeal of the cloud (more to the owners than to me) is the fact that server maintenance and upgrade issues 'disappear'. But a whole new set of concerns appear, of course, with a new company in charge of our IT, and the internet acting as our lifeline.
The redundant backup system is more expensive initially, but does keep things going as they currently are. (We do currently backup locally and to the cloud).
We don't currently need customer access to data, but do give 2 workers remote desktop access through VPN to their apps.
I did run one module through the Foxincloud adaptation assistant and projected that thousands of manual revisions would be needed to the entire project. Then the testing . . . I'm sure it's terrific for many purposes, but perhaps not mine.
