in reply to Fetching Web Pages using get
I hope what you're trying to do is still ethical...
Some web pages check all kinds of parameters. I once had the problem that I wanted people to be able to send me SMS via e-mail. I used a Perl script that read the e-mail and accessed my phone company's website to fill in a form and submit the SMS. I started out being honest, giving a nice and true user-agent name (with my e-mail address) etc., but made more and more changes until the site accepted my submissions.
Well, to cut this short:
use LWP::UserAgent;
use HTTP::Request::Common qw(GET);

my $agentname = 'Mozilla/4.0 (compatible; MSIE 5.5; Windows NT 5.0)';
my $ua = LWP::UserAgent->new;
$ua->agent($agentname);

# Fake IE: send the headers a real browser would send
my $request = GET $send_url;
$request->header('Accept' => 'image/gif, image/x-xbitmap, image/jpeg, image/pjpeg, application/vnd.ms-powerpoint, application/vnd.ms-excel, application/msword, */*');
$request->header('Accept-Language' => 'en-us');
$request->header('Referer' => $referer);

my $response = $ua->request($request);
It seems likely that you need to accept cookies as well. Take a look at what you are getting back -- maybe you need to re-access the page because you got redirected together with a cookie:
if ($response->is_redirect()) {
    my $loc = $response->header('Location');
    # create a new request like above
    # and re-access the site at $loc
}
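Rather than handling the redirect and cookie by hand, you can let LWP do it. Here is a minimal sketch (untested against your phone company's site, and `$send_url` is assumed to come from the snippet above): give the user agent an HTTP::Cookies jar so Set-Cookie headers are stored and resent automatically, and let it follow the Location header itself.

```perl
use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Cookies;

# In-memory cookie jar; pass file => '/path/to/cookies.txt' and
# autosave => 1 to HTTP::Cookies->new if you want them to persist.
my $jar = HTTP::Cookies->new;

my $ua = LWP::UserAgent->new;
$ua->agent('Mozilla/4.0 (compatible; MSIE 5.5; Windows NT 5.0)');
$ua->cookie_jar($jar);

# Follow up to 5 redirects; cookies collected along the way
# are sent on each follow-up request automatically.
$ua->max_redirect(5);

# my $response = $ua->get($send_url);
# print $response->is_success ? "submitted\n" : $response->status_line, "\n";
```

With the cookie jar attached you no longer need the manual `is_redirect()` check above for the common case, though it is still useful if the site does something odd with the Location header.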
Replies are listed 'Best First'.
- Fetching Web Pages using get, by Baz (Friar) on Aug 02, 2002 at 12:31 UTC
- by crenz (Priest) on Aug 02, 2002 at 18:18 UTC
- by Poblachtach32 (Acolyte) on Aug 02, 2002 at 13:32 UTC
- by Baz (Friar) on Aug 02, 2002 at 14:26 UTC