Hello,

I was hoping for a nudge in the right direction. I am creating a PHP/MySQL script and I was wondering about the best way to use it as a standalone application on a local computer.

It's a little complicated, but let's say it's an online shopping cart script. Say someone comes into the local store and purchases a product: I want the employee to be able to ring up that product using the website's shopping cart script. Simple enough.

However, I have some issues. I want it to be able to do some things that you can only do with a standalone application. For example, when you click "Print", you shouldn't have to deal with a printer dialog box; it would just work as a quick-print button and start the job on your default printer.

Most importantly, let's say the store's Internet connection goes down for a few hours. The store employee should still be able to let the customer purchase the goods, ring it up through the system, and have the info stored offline somehow (maybe a temporary SQL database?). Once the Internet connection is restored, all offline transactions can be added to the online SQL database.

I hope I explained this properly. I REALLY am in desperate need of the best way to do this. I have seen various PHP compilers, but do they let you use everything like JavaScript and jQuery, and how do they communicate with your online MySQL database? What happens if your Internet connection is lost?

Any help is greatly appreciated!

Any suggestions???

If you are planning on using the system at a local level, i.e., within a shop rather than across stores, you do not need Internet access to store the data.

Consider a store that has multiple counters with a computer at each checkout. In most cases, these systems will be connected through a LAN of some kind. In that case, set up one of the systems as a server and have all the others connect to the PHP application on that system.

If there is only one system in the store, it becomes even easier, since that system would be the server and the application would connect to "localhost".

So how do you set up a system as a server? The easiest way is to install XAMPP. XAMPP is a package installer for Apache, MySQL, and some other useful tools. Once this is done, you can create the DB using phpMyAdmin and store the PHP files in the htdocs folder. All the other systems in the store can then access the server via the IP address of the machine on which XAMPP was installed.
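As a sketch of how the tills could point at that server: the connection host is the only thing that differs between the server machine and the other counters. The LAN IP, credentials, and function name below are made-up examples, not anything prescribed in this thread.

```php
<?php
// Hypothetical helper: pick the MySQL host depending on which machine
// the till software is running on. The IP is just an example; use the
// address of the machine where XAMPP is installed.
function dbHost(bool $isServerMachine, string $serverLanIp = '192.168.1.10'): string
{
    // On the XAMPP box itself you connect to localhost; every other
    // counter on the LAN connects to the server's IP address.
    return $isServerMachine ? 'localhost' : $serverLanIp;
}

// Usage (placeholder credentials):
// $db = new mysqli(dbHost(false), 'pos_user', 'secret', 'store');
```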

So in this case, you will not need an online MySQL DB or Internet access. As far as Print is concerned, it's a JavaScript thing, so I'm not so sure about the quick-print feature. Worth a try though.

Any other questions, please let me know.

Sudeepjd:

Thank you for your helpful information. I was thinking along the same lines as you about using a local app over a LAN; however, the PRIMARY use of this script WILL be online. For instance, the store owner can log in from any computer and view sales reports, etc. However, if the Internet goes down at the store, I want them to still have access to at least some functions of the online script.

Without being online, they should still be able to check out customers/make new sales transactions and have these stored somewhere (I was thinking some sort of temporary MySQL DB). Once the connection to the Internet is restored, the new transactions are added to the main online MySQL DB and they operate 100% online again (as they would 99% of the time).
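One minimal way to picture that temporary offline storage, assuming a single till: append each sale to a local spool file while the connection is down, then replay the file against the online DB later. Everything here (file format, field names, function names) is illustrative, not part of any real implementation in this thread.

```php
<?php
// Illustrative offline spool: one JSON object per line. While offline,
// sales are appended here; once the connection returns, each line is
// read back and inserted into the online MySQL DB (the INSERT itself
// is omitted since it depends on the schema).

function queueOfflineSale(array $sale, string $spoolFile): void
{
    // LOCK_EX guards against interleaved writes from concurrent requests.
    file_put_contents($spoolFile, json_encode($sale) . "\n", FILE_APPEND | LOCK_EX);
}

function readQueuedSales(string $spoolFile): array
{
    if (!is_file($spoolFile)) {
        return [];
    }
    $lines = file($spoolFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    return array_map(
        fn (string $line): array => json_decode($line, true),
        $lines
    );
}
```

Replaying would loop over `readQueuedSales()`, INSERT each row online, and clear the spool only after every insert succeeds.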

My suggestion would be to do this the other way round. Run the main application at a local level over the LAN, since the main operations of the system happen locally. This would make the system much faster and less susceptible to downtime of either the Internet connection or the web hosting server. Also, for PHP to be available during a downtime, it needs to run from a local server.

So that the store owner can see sales reports or stats online, think about periodically uploading the data to an online server. This can be done asynchronously via a timed job. You can extract the required data using mysqldump or SELECT ... INTO OUTFILE, upload the data to the server, and insert it into the online DB for report generation and stats.
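A rough sketch of the mysqldump route. The table name, id column, and output path are assumptions for the example, and in a real setup the credentials would come from an options file rather than the command line.

```php
<?php
// Build a mysqldump command that exports only the sales rows added
// since the last sync, as INSERT statements ready to replay online.
// The --where filter is what limits the dump to new rows.
function dumpCommand(string $db, string $table, int $lastSyncedId, string $outFile): string
{
    return sprintf(
        'mysqldump --no-create-info --where=%s %s %s > %s',
        escapeshellarg('id > ' . $lastSyncedId),
        escapeshellcmd($db),
        escapeshellcmd($table),
        escapeshellarg($outFile)
    );
}

// A timed job would run this command, then upload $outFile to the server.
```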

The other way around is also possible via the same method, but as I said before, PHP might not be available on a local system during the downtime.

Do you think there is a way to make an application that just does the sales end? It stores the sales data, and once every X hours it uploads the SQL data to your website.

This way, the online PHP application can have all the reports, etc. which will just be X hours delayed.

I know PHP/MySQL but nothing of any sort of standalone application programming such as C++, VB, etc. Which do you think this would best be built with?

It is possible to do it in Visual Basic, or at least, since I know the language, I know it is possible to write the sales-end application and then generate a MySQL dump file which can be uploaded to the server's MySQL DB. The problem you will face, though, is that interfacing VB with a local DB would be a bit difficult, especially if you need it to be MySQL. You could probably try MS Access, but that would complicate things.

It is easier to write both the sales-end program and the server reporting program in PHP/MySQL and then sync them at an X-hour interval. And since you know PHP, it is one way to go. But there is always the risk of exposing your code to the client, which is why you will need to decide how to balance the two factors.

To answer your question however, yes, it is possible but a little difficult.

Thank you for your help. I'll see where I can go with this unless someone else has any other ideas...

OK, here's my 2 cents worth (or maybe 3 cents)!

I remember some dialogue back and forth with someone a while back on this same topic. I don't think that we ever came to a really good answer. I have done a little bit of work since then that has some of the same elements, so I know how I would approach it if I had to do it.

At the risk of repeating what has already been said, I believe that what you need is a client-server approach where you have a local client on every machine that does the basic functions. I think that you will want to keep the client as simple and as basic as you can and not try to have full functionality when the Internet isn't available. Its main function will be to log the data and upload it to the server (when the Internet/server is available). You could do that upload every time you get a transaction unless you can't reach the server, but there are reasons why you might not want to. You may need to batch the uploads so that the process doesn't get in the way of entering sales info. Trying to run the two in parallel might add some extra complexity.

Where it gets a little bit tricky is the catch-up processing, where you need to re-sync your local log with the server after an outage. If you don't need to always be in sync, then you could possibly have this as a periodic/end-of-day process where you spin through your local log and verify that every entry was also recorded on the server (and add it if it wasn't). You'd probably want to do a full reconciliation: verify that there were no duplicates recorded, and compute some financial totals locally and on the server and compare them. You might also want to start a new log every day and only keep the old logs for a certain period of time. They provide you with an extra level of backup, so if they don't take a lot of room, you could keep them a long time.
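The reconciliation step described above boils down to two set comparisons. A sketch, assuming each transaction gets a unique ID that is logged both locally and on the server (the function names and plain ID arrays are illustrative):

```php
<?php
// Catch-up checks after an outage: which local log entries never made it
// to the server, and whether the server recorded any entry twice.

function missingOnServer(array $localIds, array $serverIds): array
{
    // Anything logged locally but absent on the server must be re-uploaded.
    return array_values(array_diff($localIds, $serverIds));
}

function duplicateCount(array $serverIds): int
{
    // A nonzero result means the same transaction was recorded twice.
    return count($serverIds) - count(array_unique($serverIds));
}
```

The financial-totals comparison would be a third check on top of these, summing the amounts on each side and comparing.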

OK, the big question: how to do this without it becoming too complicated! My choice would be PHP on the server and AutoIt on the local machines. AutoIt is a free variant of BASIC that is designed to support automation on Windows machines. That is your first 'limitation': it has to be Windows. It has a whole bunch of plugins that help you do semi-magical things. One of those plugins provides an interface to drive Internet Explorer. There is one for Firefox as well, but I believe that the one for Internet Explorer is more complete. So your second 'limitation' is that you need Internet Explorer. Usually not an issue! The advantages of AutoIt are that it is free; it is relatively simple to learn and use; it can do almost anything on a Windows machine; most things have been done before, so you can get plugins or specific code from their forum; the forum is very active and you can probably get some help if you need it; and it can be compiled and doesn't have any special operating environment requirements.

I have used exactly this approach to develop some programs that retrieve information from websites. It is the best screen-scraping tool that I have found. You want to go in the reverse direction, and that isn't any more difficult. The AutoIt IE interface works at a web page level. You give it a URL and it gets to the page and gives you access to what is on the page. If it is a form, then you can fill the fields and click the "submit" button. Your PHP is on the other end taking that data and updating a database or doing whatever else it needs to do. The PHP program will probably be returning simple responses like "Accepted" or "Failed" or error codes where necessary. The AutoIt program will be waiting for the submit to complete, and it will check for the response and take appropriate action. It is like an automated operator, and the pages from your site will probably be flashing on the screen as it does its thing (but you might be able to keep that in the background by doing it in a separate window). There can be a bit of lag while it opens IE and connects to the site, so that would be a consideration in terms of what has to happen locally. It is single-threaded, so you can't be entering a local sale while it is doing its thing with the server. That might be a reason to do it periodically rather than after every transaction. You could probably start a second process to do the upload while you are processing the next sale, but then it starts to get more complicated.
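On the PHP side, the endpoint the automated client posts to only needs to validate the form fields and reply with one of those terse responses. A sketch, where the field names and the omitted INSERT are assumptions for illustration:

```php
<?php
// Server-side receiver for the automated form post. It returns a short,
// machine-checkable string that the client can look for on the result page.
function handleUpload(array $post): string
{
    if (!isset($post['sale_id'], $post['total']) || !is_numeric($post['total'])) {
        return 'Failed: bad input';
    }
    // ... insert the sale into the database here ...
    return 'Accepted';
}

// In the real page: echo handleUpload($_POST);
```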

So there it is. I have no doubt that this could be made to work, because I have already done versions of this myself. It didn't take all that long to code and test them, and it was quite pleasing to watch them flipping through pages on IE doing their thing. I've done a lot of BASIC programming in the past and I have also used AutoIt, but if you are a competent PHP programmer, you won't find the AutoIt syntax too different. It shouldn't take you long to feel comfortable using it. They provide an editor with the package, and I suggest that you use it, as it brings everything together. Even though the need for it comes and goes, it is great to have the ability to do things in the local Windows environment and to have a relatively easy interface that allows data to be read from or written to a server.

My contribution to you was this relatively long post. I am not offering to give you the code or tutor you through it. I had to do that for myself and if you are the man for this job, then you will be able to do the same. If you do take this approach and if you do run into any glitches and have a very focussed question you can send me a PM and I will try to answer it.

Thanks again for the help so far. I've been thinking of something along those lines (running only the basics from the local machine) and everything else online. I thought about using something like XAMPP which will allow me to run PHP/MySQL from a Windows or a Mac computer very easily.

I already installed it, and some first steps in this direction are looking promising. However, there may be some issues getting your localhost to connect with your online web server's MySQL database to upload transactions/sync the database.

Running this as PHP from the local machine will also allow the entire thing to load MUCH faster when you are at the store. When you are away, everything will look the same, only you will be browsing the files on the remote web server.

I think that you are adding a lot of extra complexity going that way, but that is up to you. I could be wrong, but I think that you will have some real challenges having a local PHP app connect to a local database and to a remote server/database and still function properly even if it can't reach the remote server (assuming that you figure out how to do that in the first place). In terms of installation effort and maintainability, there is a big difference between installing one or two standalone programs and an empty log file on each local machine versus installing and maintaining a server environment, PHP programs, and a database on every machine. Those who don't stick to the KISS principle often live to regret it.

If you really really decide to go down that path, then I think that you will want to look at Uniform Server instead of XAMPP.

You don't need an online site. Set up your Internet connection so that the store LAN is accessible from outside through a specific port (which you configure). Establish a VPN from outside into your LAN, and the store owner can use this VPN to browse the store-local installation.

If you're developing a web-based application, then an architectural requirement is an Internet connection. This really can't be avoided.

In the case of lower-end connections, a load-balancing router with links to two or more separate ISPs would be the best way to overcome coincidental downtime. Most "business"-grade carriers will have an SLA stating their acceptable downtime and what packet deliverability to expect. In most cases, I'd wager you'll deal with power outages more frequently than Internet outages, in which case you'll be without power anyway.

Personally, if I were going to architect the environment, I would have the database server in a remote data center. These facilities are designed with redundancy in mind to prevent power outages, hardware failure, etc. If you're looking for additional redundancy, then having your database replicated over several machines on a private LAN in the data center would give you just that. Or, if you're very worried, you'd have it replicate to additional servers in multiple data centers.

That way you already have precautions in place and a plan for scaling should it become necessary.
Internally, the stores would deploy an application server with a web interface that is served locally to the point-of-sale terminals and isolated from the external world. I particularly like the VPN access for remote connections here, as smantscheff already suggested.

This internal application server would have its own database that powers the interface of the application and possibly receives updates from a centralized system if necessary, e.g., if the store has multiple locations, each location would then synchronize with a central database. In this scenario, 95% of your data traffic would stay on the local network. Only transactions and payment information would be transmitted to the central database server(s) and payment gateway.

Now again, we've already designed for redundancy. If the internal database server were to fail, the application could be configured to immediately fail over to the central database to handle product lookups, etc. This would slow the system down, as you have remote latency, but essentially the system would not miss a beat.

Finally, the biggest issue I see would be queuing up your transactions.
If you were to lose your Internet connection (which should be addressed by redundant connections, barring physical loss of both), the system would lose access to its payment gateway, which means you could not process credit card transactions.

To stay "online" you would need to accept all payment cards on the assumption that they are valid AND have sufficient funds, etc. Then, once the connection to the gateway was restored, you'd want to process this batch of transactions. The problem with this would be storing credit card information, even for a temporary period of time.

I know in the States a lot of merchant agreements have legalese that essentially makes the merchant responsible for any breach of customers' credit information if it is determined the merchant is storing the credit card numbers, etc. This is why with gateways like Authorize.net you post the data to them via SSL and never store the full card number, but simply keep their response with a reference ID in case you need to modify/refund the transaction.

If you really intend to store the credit card numbers, or have the ability for the system to store them, even temporarily, then for liability purposes you would probably want to look into PCI compliance (https://www.pcisecuritystandards.org/) for both your application and your business practices.

Hope this helps.

Disclaimer: I scanned this thread but didn't read it like a book. However, I recently made an application for a store that does almost exactly what you are doing: a "point of sale" PHP application.

It took some testing of various possible implementations, but this is the setup I came up with:

-One computer on the LAN is the server, with Apache, MySQL, and PHP installed. I used Uniform Server (XAMPP shouldn't really be used for production, IMHO), and I think that was probably the most hassle-free way to do it.

-The other computers access the URL of the local server to use the application, BUT...

-...For simplicity, I used Mozilla Prism, which is a nifty little tool that allows you to make a "mini-browser" in a few clicks. The reason I did this was so that there could be dedicated access to the program, instead of having to navigate to it through a regular web browser. Prism makes a shortcut with a customizable icon that you can have point to the root of the PHP application on the server. No address bar, so it feels like a genuine application you run from the desktop.

-Once a day (can be configured to do this more often), the database backs up to a remote server. This is good in case something happens to your local server or if you want to access the application/database remotely. You can do this with Cron on *nix or with the Task Scheduler in Windows.
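If the scheduler simply fires a sync script every few minutes, the script itself can decide whether the configured interval has elapsed since the last successful backup. A sketch of that guard, where the function name and 24-hour default are assumptions mirroring the once-a-day setup described above:

```php
<?php
// Guard for a cron / Task Scheduler job: only perform the remote backup
// when the configured interval has passed since the last successful one.
function backupDue(int $lastBackupTs, int $nowTs, int $intervalHours = 24): bool
{
    return ($nowTs - $lastBackupTs) >= $intervalHours * 3600;
}

// The job would record time() after a successful upload and pass it
// back in as $lastBackupTs on the next run.
```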

Hmm... Chrishea,

I've installed and used XAMPP with a server-and-LAN configuration several times before. I have not tried Uniform Server before; I'll check it out as well. Thanks for the tip.

What you have done through AutoItScript (the catch-up process) can easily be done via PHP. What I have done is, at a specified interval of time (when the user is logged in, of course), dump the DB changes since the last sync to a .sql file, upload it, and insert it into the online DB. I did try getting a local DB to connect to an online one via Remote MySQL in cPanel, but it was a bit difficult and I resorted to the easier way out.

Thanks, everyone, for the kind thoughts so far. What TySkby was working on sounds very similar to my project. I want the point-of-sale system to basically be an online app so that you can be anywhere and manage your store. The only snag I'm running into is the terminal part.

As TySkby mentioned, I need a way to store transactions locally if there is no Internet connection so they can be processed at a later time. Other than the "if there's no Internet connection" issues, just about everything else seems to not be such a big problem.

I was worried too about installing my PHP app (or even part of it) on local computers, but then I suppose I could encrypt the files with Zend or something.

The other issue I was thinking of (which I mentioned previously) is how to control the locally connected thermal printer (you don't want a printer dialog box popping up each time you have a transaction; you just want it to print) and also how to control the cash drawer (which usually works off the thermal printer's signals), all through the PHP program.

I believe that disabling the print dialog box is not allowed in most browsers, as it's considered a security issue (random websites printing porn without your consent is oft-cited when researching this issue).

However, that being said, there are different ways around that problem (as there often are). It usually is browser-specific. In my last post, I suggested Prism, but there are different ways of going about this depending on whether you're using Firefox, Chrome, IE, or Prism.

So: what browser are you going to use?

Hmm.
I used to have a server at the home office with a static IP and various open NAT ports, but it was non-mission-critical and I'm a web app developer (aren't we all). My main motive was that the ISP/host were charging a bomb for database space.

I've subsequently gone cloud, but during that time I dual-hosted (duplicated) a shopping website with the primary database on the home box. The code on the live server always looked for the home-served database first and used a local one on connection error. Not too difficult code: instead of 'or die', try a different connection.
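That "try a different connection" fallback can be kept testable by separating the host-selection loop from the actual mysqli call. A sketch under stated assumptions: the host names, credentials, and function name below are made up for illustration.

```php
<?php
// Instead of `... or die`, walk a list of candidate hosts and use the
// first one that answers. $connect is any callable that returns a
// connection handle, or null/false on failure.
function firstWorkingConnection(array $hosts, callable $connect)
{
    foreach ($hosts as $host) {
        $link = $connect($host);
        if ($link) {
            return $link;
        }
    }
    return null; // every candidate failed
}

// Roughly how it would be wired up (placeholder credentials):
// $db = firstWorkingConnection(
//     ['home.example.com', 'localhost'],
//     fn ($h) => @mysqli_connect($h, 'user', 'pass', 'shop') ?: null
// );
```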

Online payment processing on the other hand is entirely dependent on you being online.
