Hey all!

I am writing an application that will read data coming in through a serial port (RS232) and write that data to a file. This file will then be parsed by a PHP script and the data will be uploaded to a server. Is there a function that allows me to make the file auto-update after each data input?

If this is unclear, let me shed some light. Say you write a piece of data to a file using the << operator with a variable of type ofstream. If you open the file while the application is running, it will show the last data that was input. If you input another piece of data, it will prompt you to "reload" the file to be able to see the new data. Is there a way to do this automatically, so the application does not have to be shut down every so often? This would also allow the PHP script to run non-stop.

tip.

You need to use a while loop to keep the program running, taking input and writing it to disk by closing the file. That is, put your file operations inside the loop and reopen the file after each piece of data received from the serial port.

Makes sense. The only problem I see is that when you use the .open("filename.ext") member function, it will create a new file with the given name. That is, it will overwrite any existing file with the same name. I do not think closing and reopening the file would be viable, as I foresee it overwriting the current file.

std::ofstream::open takes a second argument that says what you want to do with the file you're opening. std::ios::app opens the file in 'append' mode:

std::ofstream myFile;
myFile.open( "filename.txt", std::ios::app );

You can also cause buffered data to be written to the file by calling flush(), without closing the file each time you want the data to reach the disk.

I think ravenous said it all ;-)

If there is no need to store the data for later use in the file you might do better to use a socket or named pipe. Your application can write to the one end of the pipe and the php script can read from the other end when data is available. In fact, the php script will block while reading the pipe/socket until data is available so there is no need to busy loop until that point.

Could you point me in the direction of more information on this? This might actually work.

EDIT:

I have tried doing what ravenous suggested, to no avail. If I have my file (in this case it is called log.txt) open while running the application, it will still prompt me to "reload" the file after more data has been written to it in order to see that data. This is what I am trying to avoid in my code.

Your code is editing data in memory, not on disk, so in order for others to see the changes the data must be written to disk and re-loaded by whoever needs it.

Windows: CreateNamedPipe
Linux: Pipes, FIFO, and IPC

Thanks for the info! This may be the way I go. However, is there anything special I have to do for cross-platform pipes? The data will be going from Linux-based slave computers to a Windows Server 2003 machine running MySQL.

At least paste some updated code so far, OK?

This is all preliminary research for this project. I am still trying to figure out which route is going to be best for my situation. In a nutshell, I have to read data coming in over an RS232 connection (this application will later be run on 6-7 machines simultaneously), write that data to a file (or to a pipe, as I have just recently been told), and that data will be parsed by a PHP script and processed by the server.

I once had to send data (weight of several silos) received on eight RS232 interfaces to a central server.
What I did was: Write data continuously to file(s). Every (configurable) time interval I closed the files and sent them via FTP to the server which itself was looking for new files every so and so minutes.
If you need real-time data you could use sockets - I don't think there are cross-platform pipes (though I don't know for sure).
If your server is MySQL you could send the data directly to it.
(http://www.mysql.com/downloads/connector/cpp/)

If you are dealing with multiple machines with varying platforms you may be best just opening a network socket and communicating that way. This would serve you best if at some point you needed to support machines hosted in various locations.
