Solved

MySQL transactions: strategy for unreliable connections

Posted on 2014-01-14
Medium Priority
123 Views
Last Modified: 2016-05-29
I have developed a C++ Qt application which interfaces with a MySQL database.

Currently, the database is local, on the machine running the application.  I'm now scaling up to have multiple instances of the application running on several machines.

It's necessary to get all the data from the various clients into a single database that's used by an Apache/MySQL/PHP setup, which builds and updates web pages when the application does SQL inserts and deletes.  It's desirable to have the data in a single database immediately after the applications do SQL transactions (call it "day of"), but that's not absolutely required.

The venues where the application will be run don't always have internet connectivity.  And when they do have connectivity, it's often unreliable: maybe only cell phone access, with a weak signal.

So, I'm trying to devise a strategy that will always allow the application to run and save data locally, regardless of whether an external connection exists.  If a connection doesn't exist, local data could be sent to the remote server(s) at a later time when a reliable connection is available.

I see three scenarios:

1) No connectivity of any kind "day of".  The application should do inserts to the local database.  Data is uploaded to the web at a later time.

2) Connectivity via wireless LAN is available, but that network isn't connected to the internet.  The wireless LAN could go down, but the application still needs to keep working.  Data is uploaded to a common server later, when an internet connection is available.

3) A connection to the internet is available.  But, like the wireless LAN, I don't want it to be a dependency for using the application "day of".

Seems like this is probably a common situation, so I thought I'd ask for input on the best way to do it.  Here's what I'm thinking so far:

1)  Make sure inserts that use an auto-incremented id from a prior insert are included in a transaction with those inserts.  
2)  Check for success of every transaction; on failure, save the SQL text to a file for later use, when server connectivity becomes available.

So, I'm planning on having connections to up to three databases: a local one, one on a local network, and one on the internet.  The latter two will have associated files holding the SQL commands for all the transactions that didn't succeed.
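Something like the following is a minimal sketch of points 1 and 2, using Qt's QtSql module; the table names, journal format, and replay scheme are invented for illustration.  Keeping LAST_INSERT_ID() in the SQL text itself, rather than resolving the id in C++, means the journaled statements stay valid when they're replayed on a remote server later.

```cpp
#include <QFile>
#include <QSqlDatabase>
#include <QSqlQuery>
#include <QStringList>
#include <QTextStream>

// Append the statements to a per-connection journal file so they can be
// replayed as one transaction once that server is reachable again.
static void journal(const QStringList &stmts, const QString &journalPath)
{
    QFile f(journalPath);
    if (f.open(QIODevice::Append | QIODevice::Text)) {
        QTextStream out(&f);
        out << "START TRANSACTION;\n";
        for (const QString &s : stmts)
            out << s << ";\n";
        out << "COMMIT;\n";
    }
}

// Run a group of dependent statements as one transaction; on any failure,
// roll back and journal the SQL text for later replay.
bool runTransaction(QSqlDatabase &db, const QStringList &stmts,
                    const QString &journalPath)
{
    if (!db.isOpen() || !db.transaction()) {   // no usable connection
        journal(stmts, journalPath);
        return false;
    }
    QSqlQuery q(db);
    for (const QString &s : stmts) {
        if (!q.exec(s)) {                      // statement failed mid-way
            db.rollback();
            journal(stmts, journalPath);
            return false;
        }
    }
    if (!db.commit()) {                        // link dropped at commit time
        db.rollback();
        journal(stmts, journalPath);
        return false;
    }
    return true;
}
```

Called with, say, { "INSERT INTO events (name) VALUES ('Spring Regatta')", "INSERT INTO results (event_id, place) VALUES (LAST_INSERT_ID(), 1)" }, the same statement list works against the local, LAN, and internet connections, and each connection gets its own journal file.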

Does this make sense?  Is there a better way?

Thanks

Dave Thomas
Question by: DaveThomasPilot
 
Assisted Solution
by: Aaron Tomosky (LVL 39), earned 1000 total points
ID: 39789275
High-level view:
Save everything locally, marked dirty by default.
Make something that sends the local stuff to the central DB.
Mark things as clean when you receive a good return value.

Periodically check for dirty stuff.

If you always design this way, you never have to worry about different versions for different setups.
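A minimal sketch of one sync pass along these lines, assuming the local table has an added flag column (e.g. ALTER TABLE results ADD COLUMN dirty TINYINT(1) NOT NULL DEFAULT 1); table and column names are invented.  One caveat this sketch omits: if several venue machines generate their own auto-increment ids, the central table needs a client identifier in its key to avoid collisions.

```cpp
#include <QSqlDatabase>
#include <QSqlQuery>
#include <QVariant>

// One sync pass: push every dirty local row to the central database and
// mark it clean only after the central insert reports success.
void syncDirtyRows(QSqlDatabase &local, QSqlDatabase &central)
{
    QSqlQuery pick(local);
    pick.exec("SELECT id, event_id, place FROM results WHERE dirty = 1");

    while (pick.next()) {
        QSqlQuery push(central);
        push.prepare("REPLACE INTO results (id, event_id, place) "
                     "VALUES (?, ?, ?)");
        push.addBindValue(pick.value(0));
        push.addBindValue(pick.value(1));
        push.addBindValue(pick.value(2));

        if (push.exec()) {                     // good return value: clean
            QSqlQuery clean(local);
            clean.prepare("UPDATE results SET dirty = 0 WHERE id = ?");
            clean.addBindValue(pick.value(0));
            clean.exec();
        }
        // On failure the row stays dirty and is retried on the next pass.
    }
}
```

Driving this from a QTimer gives the "periodically check for dirty stuff" part, and REPLACE INTO makes a replayed row harmless if an earlier pass already delivered it.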
 
Accepted Solution
by: Phil Davidson (LVL 7), earned 1000 total points
ID: 40463821
MySQL isn't ACID compliant with every storage engine (MyISAM, for example, is not), though InnoDB is; PostgreSQL is ACID compliant.  Mission-critical databases have successfully run on MySQL, however.

It sounds like you have thought about this a fair amount.  I think you'll succeed based on your preparation.  I would read about:

1) asynchronous replication and/or long-distance replication (which is designed for intermittent breaks in connectivity) for this project:  http://www.clusterdb.com/mysql-cluster/setting-up-mysql-asynchronous-replication-for-high-availability

2) MySQL clustering: http://dev.mysql.com/doc/refman/5.0/en/mysql-cluster.html
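For reference, the basic asynchronous replication setup from the first link boils down to something like the following; hostnames, credentials, and log coordinates are placeholders.  One caution for this project: classic MySQL replication is one-way and a slave follows a single master, so consolidating several venue databases into one central server would take additional design beyond a single master/slave pair.

```sql
-- Master side (the data source): my.cnf needs a unique server-id and
-- binary logging enabled (log-bin = mysql-bin).  Then create a user the
-- slave can replicate as; the name and password here are placeholders.
CREATE USER 'repl'@'%' IDENTIFIED BY 'secret';
GRANT REPLICATION SLAVE ON *.* TO 'repl'@'%';

-- Slave side (the copy), with a different server-id in its my.cnf:
CHANGE MASTER TO
    MASTER_HOST = 'central.example.com',
    MASTER_USER = 'repl',
    MASTER_PASSWORD = 'secret',
    MASTER_LOG_FILE = 'mysql-bin.000001',
    MASTER_LOG_POS = 4;
START SLAVE;
```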