O monks,
For my work, I end up every six months needing to do a particular task that requires tediously clicking through my employer's web interface to a database to retrieve 100 different data records. I'm interested in automating the process. Strangely, this web interface doesn't seem to use https anywhere, just plain http.

A casual search turned up Web::Scraper, as well as a ton of other CPAN modules. Web::Scraper looks cool, but there's not much documentation. I'm also not clear on which modules would make it convenient to handle both GET and POST, as well as cookies. The canonical example everyone seems to have in mind is ebay auctions, but that only requires going to a particular public URL and retrieving the result, without any POST or cookies. For my application, I need to be able to log in with my username and password via POST and store a cookie.
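For what it's worth, here is a rough sketch of the kind of thing I have in mind, using WWW::Mechanize (which wraps LWP::UserAgent and keeps a cookie jar by default, so the login cookie is carried across requests automatically). The URL, form number, and field names (`username`, `password`) are placeholders, not the actual interface:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize;

# WWW::Mechanize enables an in-memory cookie jar by default, so the
# session cookie set at login is sent on every subsequent request.
my $mech = WWW::Mechanize->new();

# Log in: GET the login page, then submit the login form via POST.
# The URL and field names here are made up -- substitute the real ones.
$mech->get('http://example.com/login');
$mech->submit_form(
    form_number => 1,    # or form_name => '...', if the form is named
    fields      => {
        username => 'my_user',
        password => 'my_pass',
    },
);
die "Login failed: ", $mech->status unless $mech->success;

# Now fetch each record; the cookie rides along automatically.
for my $id (1 .. 100) {
    $mech->get("http://example.com/records?id=$id");
    # $mech->content holds the HTML, which could then be fed to
    # Web::Scraper or HTML::TreeBuilder for the actual extraction.
    print $mech->content;
}
```

Is something along these lines reasonable, or is there a better-suited module?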
Any suggestions? Any good examples of code that does this kind of thing?
Thanks!
Ben