dushu has asked for the wisdom of the Perl Monks concerning the following question:

I want to be able to open a site directory and read txt files from it.

For example: http://concam.net/kp/kp021020/kp021020ut112957x.txt

I would only need to search through the files.
Is this possible?

Thanx,
Dushu

Replies are listed 'Best First'.
Re: reading hosted files
by Enlil (Parson) on Oct 25, 2002 at 00:18 UTC
    You might want to look at the LWP module (there are some further examples of its use here). You would probably want to fetch the site directory first, parse it to get a list of the .txt files, and then fetch each one in turn while searching it. If you can FTP to the site, it might be easier to use Net::FTP to get a listing of the files and transfer them over for further use.

    -enlil
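
    The steps Enlil describes can be sketched roughly as follows. This is only a sketch: it assumes the server exposes a browsable directory index, the `extract_txt_links` helper is hypothetical, and a real script might prefer HTML::LinkExtor over a regex for parsing the links.

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical helper: pull .txt hrefs out of an HTML directory index.
    # A regex is fragile; HTML::LinkExtor is the more robust choice.
    sub extract_txt_links {
        my ($html) = @_;
        return $html =~ /href="([^"]+\.txt)"/gi;
    }

    # Usage sketch (assumes the directory URL returns an HTML index):
    # use LWP::Simple;
    # my $base  = 'http://concam.net/kp/kp021020/';
    # my $index = get($base) or die "Couldn't fetch $base";
    # for my $file ( extract_txt_links($index) ) {
    #     my $text = get($base . $file) or next;
    #     print "$file matches\n" if $text =~ /UMa/;
    # }
    ```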

Re: reading hosted files
by DamnDirtyApe (Curate) on Oct 25, 2002 at 17:22 UTC

    Something like this...

    perl -MLWP::Simple -e 'print map { "$_\n" } grep { m/UMa/ } split "\n" => get( "http://concam.net/kp/kp021020/kp021020ut112957x.txt" )'

    Where the grep regex defines what you're searching for.
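
    For readability, the same one-liner can be written out as a script. The `matching_lines` helper below is just an illustrative wrapper around the same split/grep; the URL and the /UMa/ pattern come from the one-liner above.

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;

    # Return the lines of $text that match $pattern.
    sub matching_lines {
        my ($text, $pattern) = @_;
        return grep { /$pattern/ } split /\n/, $text;
    }

    # Usage sketch:
    # use LWP::Simple;
    # my $text = get('http://concam.net/kp/kp021020/kp021020ut112957x.txt')
    #     or die "Couldn't fetch the file";
    # print "$_\n" for matching_lines($text, qr/UMa/);
    ```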


    _______________
    DamnDirtyApe
    Those who know that they are profound strive for clarity. Those who
    would like to seem profound to the crowd strive for obscurity.
                --Friedrich Nietzsche