I'm looking for suggestions regarding how I can transfer MS SQL data up to a remote Linux/MySQL server, keeping the two databases in sync on a regular basis.
While that seems fairly straightforward, what makes this more problematic is that the structure of the remote database is *not* the same as the local database. The remote DB consists of customized X-Cart tables that use a product & sub-product relationship to mimic the functionality of the local database, but implemented in a different manner.
Our current solution uses a custom Windows application that checks for changes on the local copy, exports them to CSV, uploads the file via FTP, and kicks off a PHP script on the remote end, which takes over, importing and parsing the CSV data. This method doesn't seem streamlined enough for my tastes, though. If there are large changes across the products (which number just under 20,000), the PHP script becomes a noticeable resource hog on the web server. We also don't like relying on the middle step (Windows app + CSV file). It would be nice if we could transfer the result set directly to the remote server for the sync process, and if we could eliminate the need for the PHP script (that step might be difficult to remove), that would be golden.
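To give a sense of what the PHP script does with each CSV row, here's a rough sketch of the flat-product → product/sub-product mapping in Python. The column names (`sku`, `variants`, etc.) are simplified placeholders, not our real schemas:

```python
def split_product(row):
    """Map one flat local product row to an X-Cart-style parent row
    plus one sub-product row per variant.

    Field names here are illustrative placeholders only.
    """
    parent = {
        "productid": row["sku"],
        "product": row["name"],
        "price": row["base_price"],
    }
    children = [
        {
            "parentid": row["sku"],
            "variant": v["code"],
            "price": row["base_price"] + v["surcharge"],
        }
        for v in row.get("variants", [])
    ]
    return parent, children
```

So a single local row carrying a list of variants fans out into one parent insert plus N sub-product inserts on the remote side, which is part of why big change sets get expensive.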
1. Sync 2 databases (one-way sync), 1 remote MySQL, 1 local MS SQL, with differing data structures
2. Make the process as straightforward as possible, removing the need for the local Windows export executable and, if possible, the PHP script. (Optional goal)
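One direction I've been toying with for goal 2: if the local side could talk to the remote MySQL directly (e.g. over an SSH tunnel), it could push only the changed rows in small batches instead of one giant CSV, which would spread the load out. A minimal sketch of just the batching side; the commented-out DB call is a hypothetical stand-in, not working code:

```python
from itertools import islice

def batches(rows, size=500):
    """Yield lists of at most `size` rows, so each remote write
    stays small instead of one huge import hammering the server."""
    it = iter(rows)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

# Hypothetical usage against the remote MySQL (not real table/column names):
# for chunk in batches(changed_rows, 500):
#     cursor.executemany(
#         "INSERT INTO xcart_products (productid, product, price) "
#         "VALUES (%s, %s, %s) ON DUPLICATE KEY UPDATE price = VALUES(price)",
#         chunk)
#     connection.commit()
```

No idea yet whether that's feasible from our MS SQL side, but it would remove both the CSV and the PHP parsing step.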
Anyone have any ideas? :-) I have a fear that we're stuck with this method...