I have the following script snippet, adapted slightly from something I found online a couple of weeks ago:

# -i disables interactive prompting during mget; -n suppresses auto-login
# so USER/PASS can be sent explicitly below ($TODAY must already exist)
ftp -i -n "$HOST" >"$TODAY/ftp.log" <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
lcd $TODAY
mget *
quit
END_SCRIPT

$TODAY is defined like this:

TODAY="`date +%m%d%H%M`"

It's just to create a unique directory based on the timestamp. This will be scheduled to run every 15 minutes. Right now, the files are purged by the server every 4 days, so each file gets downloaded many times before it disappears. I'd like to delete files after downloading them; however, because files are added continuously at random intervals, I can't guarantee that a file hasn't been added between the MGET and the MDELETE, which could lead to my missing a file.

What seems like it might work is to do a DIR on the server and then turn the results into paired GET/DELETE commands, so that the script only deletes files it has already downloaded, by name, eliminating the danger. However, I don't know enough about scripting to do this myself. Can anyone help me do this, or suggest a better solution?

Thanks!
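
One way to get the DIR-then-paired-commands behaviour is to run two ftp sessions: the first captures a listing of the remote directory, and the second replays a get/delete pair for each listed name. Below is a rough sketch only, assuming the stock command-line ftp client, the same HOST, USER, and PASSWD variables as in your script, a server that accepts "." as a directory argument to NLST, and remote filenames without spaces:

#!/bin/sh
# Rough sketch, not a drop-in solution. HOST, USER, PASSWD are assumed
# to be set as in the original script.

TODAY=`date +%m%d%H%M`
mkdir -p "$TODAY"

# Pass 1: capture a bare-names listing (NLST) of the remote directory.
# "nlist remote-dir local-file" writes the listing to a local file,
# relative to the lcd'ed directory, so it ends up in $TODAY/list.txt.
ftp -i -n "$HOST" >"$TODAY/ftp.log" <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
lcd $TODAY
nlist . list.txt
quit
END_SCRIPT

# Pass 2: build an ftp command file with one get/delete pair per listed
# name, so only files that were actually fetched get deleted.
{
  echo "quote USER $USER"
  echo "quote PASS $PASSWD"
  echo "lcd $TODAY"
  while read f; do
    echo "get $f"
    echo "delete $f"
  done <"$TODAY/list.txt"
  echo "quit"
} >"$TODAY/cmds.txt"

ftp -i -n "$HOST" <"$TODAY/cmds.txt" >>"$TODAY/ftp.log"

Anything uploaded after the listing is taken simply stays on the server until the next run, so nothing gets deleted unseen. One remaining gap: a file that is still being written when it is listed would be fetched (and deleted) incomplete, which is exactly the issue raised in the note below.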

Note: files appear in a directory and can be ftp'ed even though another process is still writing to them.

Okay? Yes, I knew that, but I'm not sure how it applies. The reason I don't want to use mget/mdelete is the potential for a file to be written after the mget but before the mdelete; it would then be deleted without ever having been copied from the remote server to the local directory.

Why not just download a tool like wget, or similar, that's designed to automate recursive FTP mirroring?

You might want to check if there's something suitable in this list:

http://www.usinglinux.org/ftp/
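
For example, a mirror run could be as simple as the following (host, credentials, and paths are placeholders; -m bundles recursion with timestamping, so files already downloaded and unchanged are skipped on later runs):

# Hypothetical host/credentials/path, for illustration only.
# -m  : mirror (recursion + timestamping)
# -nH : don't create a hostname directory locally
# -P  : local directory prefix for downloads
wget -m -nH -P incoming "ftp://user:password@ftp.example.com/outgoing/"

Note that wget won't delete anything on the server, so this addresses the repeated downloads rather than the cleanup step.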
