
Cookies, JavaScript and User Agent Problems

by koolgirl (Hermit)
on Oct 25, 2011 at 00:28 UTC ( #933506=perlquestion )

koolgirl has asked for the wisdom of the Perl Monks concerning the following question:

Dear Monks,

OK, I am using a web scraper to loop through several pages, each with about 20 or so different links to individual records. I have three phases: get the pages, separate the individual records, and strip those records.

All was going rather nicely until I hit these darn sites with JavaScript, and I'm thinking the problem (from what research I've done) is the cookies, not really the JavaScript. So this is my scraper here:

#!/usr/bin/perl -w
use strict;
use LWP::Simple;
use LWP::UserAgent;
use HTTP::Cookies;

my $counter  = 0;
my $max_page = 49;   # Maximum page number (number of pages displayed per search)
my $ua = LWP::UserAgent->new;
my $cookie_jar;

## This is a small program that uses a user agent (robot) module to get web pages
## of multiple record links, and store each web page in its own separate file.

while ($counter <= $max_page) {   ## loops through all pages of records
    #$ua->cookie_jar(HTTP::Cookies->new(file => "$ENV{HOME}/.cookies.txt"));
    getstore('eweb/docSearchResults.jsp?page=' . $counter . '&searchId=0',
             'ml' . $counter)
        or die 'Unable to get page';
    $ua->cookie_jar({});
    $counter++;
    print "created test.html $counter\n";
} # end while

The two cookie jar lines are what I've been playing around with; I'm not sure if they should be inside or outside my loop, so one is commented out. I also tried two different methods: an empty jar, and the %ENV version. (I have also tried HTTP::Request, because I was reading about using the two together, but that didn't seem to help, and I have tried every combination I can think of or find.)

Now, what happens with these sites is that I pass the query string with the search results (the page number looped, which has been working fine thus far), as you can see above, but it keeps giving me the log-in page over and over. I've been doing a lot of research and reading, and I think this is a cookie problem. I have read the docs on all the modules I'm using, and I really need to find something that works with one of these modules, as I'm having installation issues right now; that's another topic, but it is why I haven't tried WWW::Mechanize. I have tried a lot of different combinations, but I keep coming back to the two cookie jar lines above, because based on all I've read, they should work.

Please...a little help....thanks


Replies are listed 'Best First'.
Re: Cookies, JavaScript and User Agent Problems
by planetscape (Chancellor) on Oct 25, 2011 at 00:47 UTC

    IMHO, you need to see what's actually being sent between an actual browser and the site you're scraping. I use:

    to find out this kind of information, then code accordingly.
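
    Short of an external traffic sniffer, LWP itself can show what's being sent: LWP::UserAgent's handler hooks can print every request and response header. A minimal sketch; the URL is a placeholder, since the question's real address is truncated:

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;
    use LWP::UserAgent;

    my $ua = LWP::UserAgent->new;

    # Print the method, URL, and headers of every outgoing request, and
    # the status line and headers of every response. This is often enough
    # to spot a missing cookie or an unexpected redirect to the login page.
    $ua->add_handler( request_send => sub {
        my $req = shift;
        print '>> ', $req->method, ' ', $req->uri, "\n",
              $req->headers_as_string, "\n";
        return;    # returning undef lets the request proceed normally
    } );
    $ua->add_handler( response_done => sub {
        my $res = shift;
        print '<< ', $res->status_line, "\n",
              $res->headers_as_string, "\n";
        return;
    } );

    # Placeholder for the real search page.
    $ua->get('http://example.com/eweb/docSearchResults.jsp?page=0&searchId=0');
    ```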


Re: Cookies, JavaScript and User Agent Problems
by keszler (Priest) on Oct 25, 2011 at 01:01 UTC

    Your first, commented out, cookie_jar line is fine; it should be outside the loop. I'd put it right after you create $ua. What seems to me to be missing is the bit where you log in. It's likely that the site only assigns session cookies, so you need to log in each time you start scraping.
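
    A minimal sketch of that flow (create $ua once, attach a cookie jar, log in, then scrape), with a placeholder login URL and form field names; the real ones have to come from the site's login form:

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;
    use LWP::UserAgent;
    use HTTP::Cookies;

    # Create the agent once, with a cookie jar attached, so the session
    # cookie handed out at login is sent on every later request.
    my $ua = LWP::UserAgent->new;
    $ua->cookie_jar( HTTP::Cookies->new );    # in-memory jar

    # Hypothetical login request: the URL and field names below are
    # placeholders; read the real ones out of the site's login <form>.
    my $login = $ua->post(
        'http://example.com/eweb/login.jsp',
        { username => 'me', password => 'secret' },
    );
    die 'login failed: ' . $login->status_line unless $login->is_success;

    # Requests through the SAME $ua now carry the session cookie.
    for my $page ( 0 .. 49 ) {
        my $res = $ua->get(
            "http://example.com/eweb/docSearchResults.jsp?page=$page&searchId=0"
        );
        warn "page $page failed: " . $res->status_line . "\n"
            unless $res->is_success;
    }
    ```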

      Thank you. I am trying to create that, but see - there's either a log in with name/password, or a press this button to log in publicly, so that kind of throws a wrench in the mix....

        ** Sorry, I wrote this before I saw that you already mentioned WWW::Mechanize

        If the login is a problem, then you could simply use WWW::Mechanize. Generally speaking, you can usually figure out a way with WWW::Mechanize to get through a login screen after a bit of playing around, and you can then easily retrieve data on any subsequent pages.
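
        If the installation issue gets sorted out, the WWW::Mechanize version is short. A sketch with a placeholder URL and field names (take the real ones from the login page's HTML):

        ```perl
        #!/usr/bin/perl
        use strict;
        use warnings;
        use WWW::Mechanize;

        # Mechanize keeps its own cookie jar, so the session persists
        # automatically across requests. autocheck => 1 dies on HTTP errors.
        my $mech = WWW::Mechanize->new( autocheck => 1 );

        # Placeholder URL and form fields; inspect the real login page.
        $mech->get('http://example.com/eweb/login.jsp');
        $mech->submit_form(
            form_number => 1,
            fields      => { username => 'me', password => 'secret' },
        );

        # For a "log in publicly" button with no name/password, submitting
        # the form with no fields filled in is often all it takes:
        # $mech->submit_form( form_number => 1 );

        # Subsequent requests carry the session cookie.
        $mech->get('http://example.com/eweb/docSearchResults.jsp?page=0&searchId=0');
        print $mech->content;
        ```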

Re: Cookies, JavaScript and User Agent Problems
by Util (Priest) on Oct 25, 2011 at 23:12 UTC
    You are creating $ua, but LWP::Simple already has its own $ua. You need to import that $ua instead of creating your own, since yours will not be used by LWP::Simple.
    use LWP::Simple qw( $ua getstore );
    # ...
    $ua->cookie_jar(
        HTTP::Cookies->new( file => $cookie_path, autosave => 1 )
    );
    This is documented in LWP::Simple, but only by a single line of text, with no example.
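
    Putting that together with the original loop, a sketch; the host in the URL is a placeholder, since the address in the question is truncated:

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;
    # Import LWP::Simple's own $ua so getstore() and the cookie jar
    # refer to the same agent object.
    use LWP::Simple qw( $ua getstore );
    use HTTP::Cookies;
    use HTTP::Status qw( is_success );

    $ua->cookie_jar(
        HTTP::Cookies->new(
            file     => "$ENV{HOME}/.cookies.txt",
            autosave => 1,    # save cookies back to the file on exit
        )
    );

    for my $page ( 0 .. 49 ) {
        # getstore() returns the HTTP status code, not a boolean, so
        # "or die" would never trigger; a 404 is still a true value.
        my $status = getstore(
            "http://example.com/eweb/docSearchResults.jsp?page=$page&searchId=0",
            "test.html$page",
        );
        die "Unable to get page $page (status $status)"
            unless is_success($status);
        print "created test.html$page\n";
    }
    ```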

Node Type: perlquestion [id://933506]
Approved by planetscape