in reply to fetching ftp site info

I don't really understand the problem; I'd choose the UNIX file listing over some HTML any day, since it's much easier to parse.
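In fact, you usually don't have to parse it by hand: File::Listing (bundled with LWP) already understands that format. A minimal sketch, with a made-up two-line listing standing in for whatever your fetch gives you:

  use strict;
  use warnings;
  use File::Listing qw(parse_dir);

  # A made-up listing; in practice this is what Net::FTP->dir
  # (or your saved text file) would give you.
  my $listing = join "\n",
      '-rw-r--r--  1 ftp ftp  1234567 Jan 20 12:00 defs-20040120.exe',
      '-rw-r--r--  1 ftp ftp  2345678 Jan 26 12:00 defs-20040126.exe';

  # parse_dir returns one [name, type, size, mtime, mode] per entry.
  for my $entry (parse_dir($listing)) {
      my ($name, $type, $size, $mtime) = @$entry;
      next unless $type eq 'f';    # plain files only
      printf "%s  %d bytes  %s\n", $name, $size, scalar localtime $mtime;
  }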

Couldn't you get the time of each file and figure out whether it is new enough? You could use Date::Manip, for example, to check whether it's, say, "older than one week ago".
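A rough sketch of that check, assuming Date::Manip's functional interface and an epoch timestamp like the mtime from File::Listing above ("epoch SECS" is one of the formats ParseDate accepts):

  use strict;
  use warnings;
  use Date::Manip;    # exports ParseDate, Date_Cmp, ...

  my $cutoff = ParseDate("1 week ago");

  # True when an epoch timestamp (e.g. from File::Listing or
  # Net::FTP->mdtm) is newer than the one-week cutoff.
  sub new_enough {
      my ($mtime) = @_;
      return Date_Cmp(ParseDate("epoch $mtime"), $cutoff) > 0;
  }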

Could you please state exactly what your problem is? It's still unclear to me.

--
b10m

Re: Re: fetching ftp site info
by Anonymous Monk on Jan 27, 2004 at 14:59 UTC
    Sorry about not being clear. When I fetch the FTP site and save the result to a local text file, I get the Unix file system listing. I want to fetch the FTP site into my local text file and get the HTML source code instead.
      I believe the reason is this:

      You will get the HTML source code only when you use an HTML client (a browser) to view an FTP site. FTP by itself is doing what it is supposed to do: giving you a listing. You are using a screwdriver to hammer a nail, and then getting surprised that it goes straight in instead of turning.

      In your code, try using the http:// protocol instead of ftp:// and you should get the HTML source code (assuming the server offers the same directory over HTTP).

      The HTML source is created by your client. You can do a couple of things (not involving Perl) like:
      $ wget -O temp.html 'ftp://ftp.symantec.com/public/english_us_canada/antivirus_definitions/norton_antivirus_corp/'
      or
      $ lynx -source 'ftp://ftp.symantec.com/public/english_us_canada/antivirus_definitions/norton_antivirus_corp/' > temp.html
      and they will create the HTML source for you.
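
      If you'd rather stay in Perl, here's a rough, untested sketch with Net::FTP against that same Symantec path, fetching the listing and wrapping it in HTML itself, since the "HTML source" never exists on the server in the first place:

      use strict;
      use warnings;
      use Net::FTP;

      my $host = 'ftp.symantec.com';
      my $dir  = '/public/english_us_canada/antivirus_definitions/norton_antivirus_corp/';

      my $ftp = Net::FTP->new($host) or die "connect failed: $@";
      $ftp->login('anonymous', 'me@example.com') or die "login: " . $ftp->message;
      $ftp->cwd($dir)                            or die "cwd: "   . $ftp->message;

      # Emit a minimal HTML index, much like a browser or wget would.
      print "<html><body><pre>\n";
      print qq{<a href="ftp://$host$dir$_">$_</a>\n} for $ftp->ls;
      print "</pre></body></html>\n";
      $ftp->quit;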

      I tried using lwp-request, but it just gave me the Unix listing.

      --

      flounder