I have the following script snippet, adapted slightly from something I found online a couple of weeks ago:
ftp -i -n "$HOST" > "$TODAY/ftp.log" <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
lcd $TODAY
mget *
quit
END_SCRIPT
$TODAY is defined as such:
TODAY="$(date +%m%d%H%M)"
It just creates a unique directory name from the timestamp. The script will be scheduled to run every 15 minutes. Right now the server purges the files after 4 days, so the same files get downloaded multiple times. I'd like to delete each file after downloading it; however, because files are added continuously at random intervals, I can't guarantee that a file hasn't been added between the MGET and an MDELETE, and that could cause me to miss a file.
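(To be clear, the directory has to be created before the ftp session starts, since lcd only changes into an existing local directory and won't create one, something like:)

```shell
# Same naming scheme as above; mkdir -p is a no-op if the directory
# already exists, so it's safe to run every 15 minutes.
TODAY="$(date +%m%d%H%M)"
mkdir -p "$TODAY"
```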
What seems like it might work is to do a DIR on the server and pipe the results into paired GET/DELETE commands, so that the script deletes only files it has already downloaded, by name, eliminating that risk. However, I don't know enough about shell scripting to do this myself. Can anyone help me write this, or suggest a better solution?
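Roughly what I have in mind (untested sketch; it assumes $HOST, $USER, and $PASSWD are set as in the snippet above, and filelist.txt / build_ftp_script are just names I made up):

```shell
#!/bin/sh
# Snapshot the remote listing first, then delete only names from that
# snapshot, so files uploaded mid-run survive until the next 15-minute run.

TODAY="${TODAY:-$(date +%m%d%H%M)}"   # reuse $TODAY if it's already set
mkdir -p "$TODAY"

# Step 1 (shown commented out since it needs a live server): grab a
# bare-name listing with nlist. With -n and a non-terminal stdin the ftp
# client is non-verbose, so only the listing reaches stdout; tr strips
# any CRs from the server's CRLF line endings.
#
# ftp -i -n "$HOST" <<END_LIST | tr -d '\r' > "$TODAY/filelist.txt"
# quote USER $USER
# quote PASS $PASSWD
# nlist
# quit
# END_LIST

# Step 2: expand the snapshot into one get/delete pair per file name, so
# the session deletes only files it has already fetched.
build_ftp_script() {
    printf 'quote USER %s\n' "$USER"
    printf 'quote PASS %s\n' "$PASSWD"
    printf 'lcd %s\n' "$TODAY"
    while IFS= read -r f; do
        [ -n "$f" ] || continue        # skip blank lines
        printf 'get %s\n' "$f"
        printf 'delete %s\n' "$f"
    done
    printf 'quit\n'
}

# Feed the generated commands to a second ftp session:
# build_ftp_script < "$TODAY/filelist.txt" | ftp -i -n "$HOST" > "$TODAY/ftp.log"
```

Does something like this look reasonable, or is there a standard tool for this?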
Thanks!