Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Hi PerlMonks

This seems to be an interesting problem, which I'm sure some of you will have some thoughts on.

I have made a small script, and I would like to include a built-in updater which connects to my server, downloads a file, and replaces the local copy.

I can't think of the best way to do this. Should I wget a .tar file and use a system command to overwrite the files? Or read the file remotely and replace the local copy directly?

This isn't a vital question, just something that caught my interest :)

Thanks!

Re: Remote File Updating
by tmiklas (Hermit) on Apr 25, 2002 at 08:53 UTC
    In short... TMTOWTDI ;-)
    If your program is, for example, waiting for some input (realtime log parsing?), you can use that as a trigger (with a counter or time-interval check) to fetch the required files with LWP (whatever data your script needs) and rewrite the disk file. Then just gently close all open descriptors and try exec(), or even exit() if your script would start again automagically ;-p
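
    A minimal sketch of that fetch-and-restart idea. The URL and file names are placeholders, not from the original post, and I've used the core HTTP::Tiny module where the Monk suggests LWP (LWP's mirror() works the same way):

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;
    use HTTP::Tiny;   # core since Perl 5.14; LWP::Simple's mirror() is equivalent

    # Hypothetical update URL -- replace with your own server.
    my $url = 'http://example.com/myscript.pl';

    sub self_update {
        my $tmp = "$0.new";
        # mirror() sends If-Modified-Since, so the file is only
        # fetched when the server copy is newer (304 otherwise).
        my $res = HTTP::Tiny->new->mirror($url, $tmp);
        return unless $res->{success} && $res->{status} != 304;
        chmod 0755, $tmp;
        rename $tmp, $0 or die "rename failed: $!";
        # Restart under the new code ($^X is the running perl binary).
        exec($^X, $0, @ARGV) or die "exec failed: $!";
    }
    ```

    Writing to a temporary file and rename()-ing over the original keeps the replacement atomic, so a half-finished download never clobbers the running script.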
    Another way: fork, sleep, and send a kill signal to the second process, which then handles it by fetching with LWP, and so on... But remember: $SIG{anything} routines should be as short as possible!
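
    The "keep $SIG handlers short" advice usually means the handler should do nothing but set a flag, and the main loop does the real work. A tiny sketch (signalling ourselves here just to demonstrate):

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;

    # Keep the handler minimal: just record that an update was requested.
    my $update_pending = 0;
    $SIG{USR1} = sub { $update_pending = 1 };

    # Another process (or a forked child) would do: kill 'USR1', $parent_pid;
    # Here we signal our own pid to show the round trip.
    kill 'USR1', $$;

    # The main loop notices the flag and does the heavy lifting
    # (LWP fetch, reload, etc.) safely outside the handler.
    if ($update_pending) {
        $update_pending = 0;
        print "update requested\n";
    }
    ```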
    You can also write another program that only fetches those files and sends SIGUSRn to your script (which should then (re)load those files)...
    I prefer the 'at start' method - just check for a new version at startup. Just K.I.S.S. Simple solutions are the best solutions (IMHO), even if that sometimes means having a bunch of small scripts that work together ;-)
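
    The check-at-startup approach can be as simple as fetching a tiny version file from the server and comparing it against the script's own $VERSION. The URL and version numbers below are illustrative:

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;

    our $VERSION = '1.02';

    # Compare dotted version strings numerically, piece by piece,
    # so '1.10' correctly beats '1.2'.
    sub newer_than {
        my ($remote, $local) = @_;
        my @r = split /\./, $remote;
        my @l = split /\./, $local;
        for my $i (0 .. $#r) {
            my $cmp = ($r[$i] // 0) <=> ($l[$i] // 0);
            return $cmp > 0 if $cmp;
        }
        return 0;
    }

    # At startup you would fetch the version file, e.g. with LWP::Simple:
    #   my $remote = get('http://example.com/myscript.version');
    # For this sketch, pretend the server answered:
    my $remote = '1.03';
    if (newer_than($remote, $VERSION)) {
        print "new version $remote available, updating...\n";
        # ... download and exec() the new copy ...
    }
    ```

    Keeping the version check in a separate one-line file means the startup probe is cheap, and the full download only happens when there really is something new.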

    Greetz, Tom.