I have a question about the best strategy for a problem I'm trying to solve. I am working on a rather large web-based database/content management project. Part of the goal is to make it as easy as possible to install and use, and to make it usable in virtual hosting setups. What I'd like to do is have a script in the cgi-bin (or similar) directory that can update itself as well as the other .cgi scripts in that directory.
My initial thought is to use Net::FTP: after determining whether an upgrade is available, create a backup directory and copy all existing files into it, then fetch the new set of files from a predetermined server location into the current directory (a rough sketch follows below). The primary issue I am dealing with here is permissions. I (or anyone with root access) can deal with it by changing ownership of the directory to the apache user. Making the directories world-writable is another avenue, although I think not at all a good one. But folks with virtual host setups often cannot change the permissions on their files, and certainly cannot change the owner.
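Here's a rough sketch of the kind of thing I have in mind (the host, paths, and login are placeholders, and this of course only works where the web server user can write to the directory):

    #!/usr/bin/perl
    use strict;
    use warnings;

    use Net::FTP;
    use File::Copy     qw(copy);
    use File::Path     qw(mkpath);
    use File::Basename qw(basename);

    # Placeholder values -- adjust for the real setup.
    my $host       = 'ftp.example.com';
    my $remote_dir = '/pub/myproject/latest';
    my $local_dir  = '.';                      # the cgi-bin directory itself
    my $backup_dir = './backup-' . time();

    # Connect to the distribution server (anonymous login shown here).
    my $ftp = Net::FTP->new($host, Timeout => 60)
        or die "Cannot connect to $host: $@";
    $ftp->login('anonymous', 'me@example.com')
        or die "Login failed: ", $ftp->message;
    $ftp->cwd($remote_dir)
        or die "Cannot cwd to $remote_dir: ", $ftp->message;
    $ftp->binary;

    # Back up the existing scripts before overwriting anything.
    mkpath($backup_dir);
    for my $file (glob "$local_dir/*.cgi") {
        copy($file, "$backup_dir/" . basename($file))
            or die "Backup of $file failed: $!";
    }

    # Pull down each remote file into the current directory and
    # make the scripts executable again.
    for my $remote_file ($ftp->ls) {
        my $local_file = "$local_dir/" . basename($remote_file);
        $ftp->get($remote_file, $local_file)
            or warn "Could not get $remote_file: ", $ftp->message;
        chmod 0755, $local_file if $local_file =~ /\.cgi$/;
    }

    $ftp->quit;

The catch, as noted above, is that under a typical virtual host the files are owned by the account holder, not by the apache user this script would run as, so the copy and chmod steps fail.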
So, first, so that I don't reinvent the wheel: are there any great Perl modules I should know about for doing this sort of thing? And second, what would be the best strategy for dealing with the permissions issue?
Thanks!
Michelle