wylie has asked for the wisdom of the Perl Monks concerning the following question:
My plan is to add some code to the `wanted` subroutine which will gzip the logs held in /home/www/*/logs and mail them to the site owners. Now to my problem: the number of virtual hosts changes from week to week, and I would like to totally automate this process using Perl. I need to run the find2perl command to get a complete listing of all the relevant directories, but each time I run it, it creates new code with a correct directory listing and an empty `wanted` subroutine. Is it possible to run my find2perl command and add the Perl code I want to use to gzip and mail the logs to the `wanted` subroutine at the same time?

```perl
#! /usr/local/bin/perl -w
eval 'exec /usr/local/bin/perl -S $0 ${1+"$@"}'
    if 0; #$running_under_some_shell

use strict;
use File::Find ();

# Set the variable $File::Find::dont_use_nlink if you're using AFS,
# since AFS cheats.

# for the convenience of &wanted calls, including -eval statements:
use vars qw/*name *dir *prune/;
*name  = *File::Find::name;
*dir   = *File::Find::dir;
*prune = *File::Find::prune;

# Traverse desired filesystems
File::Find::find({wanted => \&wanted},
    '/home/www/admin/logs/weekly',
    '/home/www/nsms/logs/weekly');
exit;

sub wanted {
    ;
}
```
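One possible approach, sketched below, is to skip regenerating the find2perl output altogether: call File::Find directly and let a `glob` discover the current set of virtual-host log directories at run time, so a changing host list needs no code changes. This is a minimal sketch, not the poster's actual solution; the directory layout, the availability of a `gzip` binary on the PATH, and the mailing step (left as a comment) are all assumptions.

```perl
#!/usr/local/bin/perl -w
use strict;
use File::Find;

# Discover whichever vhost log directories exist right now,
# instead of hard-coding the list via find2perl output.
# (Path layout /home/www/*/logs is an assumption from the question.)
my @log_dirs = glob '/home/www/*/logs';

find(\&wanted, @log_dirs) if @log_dirs;

sub wanted {
    # Only plain files, and skip anything already gzipped.
    return unless -f $_ && $_ !~ /\.gz$/;

    my $file = $File::Find::name;    # full path of the current file
    system('gzip', $file) == 0
        or warn "gzip failed for $file: $?\n";

    # Mailing "$file.gz" to the site owner would go here
    # (e.g. via Mail::Send or a piped open to sendmail).
}
```

Because the gzip-and-mail logic lives in your own `wanted` sub, there is nothing to merge back into freshly generated find2perl code each week.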
Replies are listed 'Best First'.
Re: Searching a directory tree
by davorg (Chancellor) on Aug 10, 2001 at 14:10 UTC
by wylie (Novice) on Aug 10, 2001 at 15:06 UTC
by davorg (Chancellor) on Aug 10, 2001 at 15:34 UTC
by wylie (Novice) on Aug 10, 2001 at 15:59 UTC
Re: Searching a directory tree
by larryk (Friar) on Aug 10, 2001 at 15:53 UTC