I hope what you're trying to do is still ethical...
Some web pages check all kinds of parameters. I once had the problem that I wanted people to be able to send me SMS via e-mail. I used a perl script that read the e-mail and accessed my phone company's website to fill in a form to submit the SMS. I started out being honest, giving a nice and true user-agent name (with my e-mail address) etc., but made more and more changes until the site accepted my submissions.
Well, to cut this short:
    use LWP::UserAgent;
    use HTTP::Request::Common qw(GET);

    my $agentname = 'Mozilla/4.0 (compatible; MSIE 5.5; Windows NT 5.0)';
    my $ua = LWP::UserAgent->new;
    $ua->agent($agentname);

    # Fake IE by sending the headers a real browser would send
    my $request = GET $send_url;
    $request->header('Accept' => 'image/gif, image/x-xbitmap, image/jpeg, image/pjpeg, application/vnd.ms-powerpoint, application/vnd.ms-excel, application/msword, */*');
    $request->header('Accept-Language' => 'en-us');
    $request->header('Referer' => $referer);

    my $response = $ua->request($request);
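Once the request has gone through, check whether it actually worked before trusting the result. A minimal sketch -- the error handling here is my own addition, not part of the original script:

    if ($response->is_success) {
        print $response->content;                        # the page the site sent back
    } else {
        warn "Request failed: ", $response->status_line, "\n";
    }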
It seems likely that you need to accept cookies as well. Take a look at what you are getting back -- maybe you need to request the page again because you got redirected and handed a cookie at the same time:
    if ($response->is_redirect()) {
        my $loc = $response->header('Location');
        # Build a new request for $loc exactly like above
        # and fetch the site again
    }
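Instead of re-requesting by hand, you can also let LWP carry the cookies and follow the redirect for you. A minimal sketch, assuming you use HTTP::Cookies (it ships with libwww-perl) and that the form is submitted via POST:

    use HTTP::Cookies;

    # In-memory cookie jar: cookies set by the redirect response are
    # sent back automatically on the follow-up request
    $ua->cookie_jar(HTTP::Cookies->new);

    # By default LWP only follows redirects for GET/HEAD; allow POST too
    push @{ $ua->requests_redirectable }, 'POST';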