Solved

MySql transactions--strategy for unreliable connections

Posted on 2014-01-14
Last Modified: 2016-05-29
I have developed a C++ Qt application which interfaces with a MySQL database.

Currently, the database is local, on the machine running the application.  I'm now scaling up to have multiple instances of the application running on several machines.

It's necessary to get all the data from the various clients into a single database that's used by an Apache/MySQL/PHP setup that builds and updates web pages when the application does SQL inserts and deletes.  It's desirable to have the data in a single database immediately after applications do SQL transactions (call it "day of"), but that's not absolutely required.

The venues where the application will be run don't always have internet connectivity.  And, when they do have connectivity it's often unreliable.  Maybe only cell phone access and with weak signal.

So, I'm trying to devise a strategy that will always allow the application to run and save data locally, regardless of whether an external connection exists.  If a connection doesn't exist, local data could be sent to the remote server(s) at a later time when a reliable connection is available.

I see three scenarios:

1)   No connectivity of any kind "day of".  The application should do inserts to the local  database.  Data is uploaded to the web at a later time.

2)  Connectivity via wireless LAN is available, but that network isn't connected to the internet.  The wireless LAN could go down, but the application still needs to keep working.  Data is uploaded to a common server later, when an internet connection is available.

3)  A connection to the internet is available.  But, like the wireless LAN, I don't want it to be a dependency for using the application "day of".

Seems like this is probably a common situation, so I thought I'd ask for input on the best way to do it.  Here's what I'm thinking so far:

1)  Make sure inserts that use an auto-incremented id from a prior insert are included in the same transaction as those inserts.
2)  Check for success of every transaction--on fail, save the SQL text to a file for later use when server connectivity becomes available.

So, I'm planning on having connections to up to three databases: a local one, one on a local network, and one on the internet.  The latter two will each have an associated file containing the SQL commands for all the transactions that didn't succeed.

Does this make sense?  Is there a better way?

Thanks

Dave Thomas
Question by:DaveThomasPilot
4 Comments
 
LVL 39

Assisted Solution

by:Aaron Tomosky
Aaron Tomosky earned 250 total points
ID: 39789275
High level view:
- Save everything locally, marked dirty by default.
- Make something that sends the local stuff to the central DB.
- Mark things as clean when you receive a good return value.
- Periodically check for dirty stuff.

If you always design this way, you never have to worry about different versions for different setups.
 
LVL 7

Accepted Solution

by:
Phil Davidson earned 250 total points
ID: 40463821
MySQL isn't ACID compliant with its older MyISAM storage engine; the InnoDB engine is, as is PostgreSQL.  Mission-critical databases have successfully run on MySQL, however.

It sounds like you have thought about this a fair amount.  I think you'll succeed based on your preparation.  I would read about

1) asynchronous replication and/or long-distance replication (which is designed for intermittent breaks in connectivity) for this project:  http://www.clusterdb.com/mysql-cluster/setting-up-mysql-asynchronous-replication-for-high-availability

2) MySQL clustering: http://dev.mysql.com/doc/refman/5.0/en/mysql-cluster.html
