Hi everyone,
As the thread title suggests, I am trying to think of the best way to maintain and synchronise local and remote databases. To explain this more clearly, I'll outline the scenario and my thoughts for tackling the problem.
I have a website on a remote server, which obviously has a database containing customer information, orders, etc. This information is entered into the database via the website by customers directly, through signing up, placing orders, etc.
Customers are also able to call the office to place orders. At the moment, this data is being stored in a local database, entirely separate from the website. This essentially means that two versions of the database are being maintained, which is a waste of time and money.
The obvious solution here would be for the office staff to enter the customer information and telephone orders directly into the website database. However, what if the office internet connection goes down, the latency is too high or, heaven forbid, the website itself goes down...
This made me think about setting up a duplicate of the website - well, the administration interface at least - on a local server based in the office. Office staff could then enter data directly into the local database, irrespective of whether there was an internet connection. This then raises the issue of synchronising or combining the two databases...
My thought was that the local and remote databases could use different primary keys for all tables: for example, an auto-increment ID combined with a 'local' or 'remote' identifier, forming a composite primary key. When merging the records from the two databases, there would then be no key clashes.
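To make the idea concrete, here's a rough sketch of what I have in mind, using SQLite and Python purely for illustration - the table, column names and the per-source ID logic are all invented, and a real setup would use the database's own auto-increment/sequence instead:

```python
import sqlite3

# Sketch: each database tags its rows with a fixed source ('local' or
# 'remote'); the composite key (source, id) keeps the two id sequences
# from clashing when the rows are merged.

def make_db():
    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE orders (
            source   TEXT NOT NULL,      -- 'local' or 'remote'
            id       INTEGER NOT NULL,   -- per-source auto-increment
            customer TEXT NOT NULL,
            PRIMARY KEY (source, id)
        )
    """)
    return conn

def add_order(conn, source, customer):
    # Emulate a per-source auto-increment for the sketch; a real server
    # would use its own AUTO_INCREMENT / sequence for this.
    next_id = conn.execute(
        "SELECT COALESCE(MAX(id), 0) + 1 FROM orders WHERE source = ?",
        (source,),
    ).fetchone()[0]
    conn.execute(
        "INSERT INTO orders (source, id, customer) VALUES (?, ?, ?)",
        (source, next_id, customer),
    )

def merge(dest, src):
    # Copy rows across; INSERT OR IGNORE makes the merge idempotent,
    # because (source, id) uniquely identifies a row in both databases.
    for row in src.execute("SELECT source, id, customer FROM orders"):
        dest.execute(
            "INSERT OR IGNORE INTO orders (source, id, customer) VALUES (?, ?, ?)",
            row,
        )

remote = make_db()
local = make_db()
add_order(remote, "remote", "Alice (signed up on the website)")
add_order(local, "local", "Bob (phoned the office)")
add_order(local, "local", "Carol (phoned the office)")

merge(remote, local)  # id 1 exists in both, but the sources differ, so no clash
for row in remote.execute("SELECT source, id, customer FROM orders ORDER BY source, id"):
    print(row)
```

The point is just that the source column becomes part of every key (and of any foreign keys referencing it), so records created offline in the office and records created on the website can be combined later without renumbering anything.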
Finally, my question is: has anyone implemented anything similar and, if so, how did you do it? Or can you see any issues with my idea?
Thanks for your time and for reading this far,
R.