jira0004 has asked for the wisdom of the Perl Monks concerning the following question:
Hi all,
Okay, here is the situation:
I am a developer and I can post data files to a production server through a staging process. I don't have FTP access to the directory where the files are posted, so I can't simply log onto the production server via FTP and run ls. However, I have posted a number of data files in that directory and in sub-directories under it, and now I no longer know which files I've posted.
I want to write a Perl script, maybe using the LWP module, that lets me give the directory part of a URL and generates a list of all of the files available under that directory.
This program would work as follows:
Command:
prompt> perl http_ls.pl http://www.moonpie.com/
Output:
The following files exist under the URL http://www.moonpie.com/ :
frodo.html

The Perl script would use something like LWP to get a listing of all of the files available given the base URL, http://www.moonpie.com/ , in the above example.
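Here is a rough sketch of what I have in mind. It assumes the web server has directory indexing enabled, so that a GET on the directory URL returns an auto-generated HTML page of links (as far as I can tell, HTTP itself has no ls-style operation). The script name http_ls.pl and the helper subs are just my own invention:

```perl
#!/usr/bin/perl
# http_ls.pl -- sketch only. Assumes the server returns an
# auto-generated HTML index page when you GET a directory URL.
use strict;
use warnings;
use LWP::UserAgent;
use HTML::LinkExtor;

# Pull the href targets of <a> tags out of an HTML index page.
sub index_links {
    my ($html) = @_;
    my @links;
    my $p = HTML::LinkExtor->new(sub {
        my ($tag, %attr) = @_;
        push @links, $attr{href} if $tag eq 'a' && defined $attr{href};
    });
    $p->parse($html);
    $p->eof;
    return @links;
}

# Fetch a directory URL and report what it links to.
sub http_ls {
    my ($url) = @_;
    my $ua  = LWP::UserAgent->new;
    my $res = $ua->get($url);
    die "GET $url failed: ", $res->status_line, "\n"
        unless $res->is_success;
    return index_links($res->decoded_content);
}

if (@ARGV) {
    print "The following files exist under the URL $ARGV[0] :\n";
    print "  $_\n" for http_ls($ARGV[0]);
}
```

Sub-directories show up in an index page as links ending in "/", so the same routine could be called recursively on those to walk the whole tree.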
Note that http://www.moonpie.com/ is not my URL, so please don't visit it; I don't know what it is, I just used it as an example.
I don't need someone to write the code for me (although I suppose you could if you wanted to). I do need someone to let me know if this is possible and, if so, which modules/protocols I should use to get a listing of files via HTTP.
I have checked through the LWP documentation, but it doesn't seem that LWP supports an ls-type operation. Maybe this just isn't supported via HTTP (in which case I am just out of luck). Does anyone know if the operation I am trying to do is supported via HTTP? And if so, how do I do it?
Any pointers, advice or insight anyone has would be greatly appreciated.
Thanks, Regards,
Peter Jirak