Aldebaran has asked for the wisdom of the Perl Monks concerning the following question:
Greetings good people
I've been trying everything I can to download the images off a webpage. Although the content is ideological, I can assure you that I am not a fascist.
I redirected the output of this to get a list of the images: $ cat hitler1.pl
#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize;

my $domain = 'http://www.nobeliefs.com/nazis.htm';
my $m = WWW::Mechanize->new;
$m->get($domain);
my @list = $m->dump_images();
print "@list\n";
The @list variable doesn't end up holding anything there (why not?), so I redirected the output to text1.txt instead.
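As far as I can tell, dump_images() prints the image URLs to STDOUT rather than returning them, which would explain why @list stays empty. A minimal sketch of fetching the list directly, assuming WWW::Mechanize's images(), url_abs(), and mirror() methods (mirror() is inherited from LWP::UserAgent):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize;
use File::Basename qw(basename);

my $m = WWW::Mechanize->new;
$m->get('http://www.nobeliefs.com/nazis.htm');

# images() returns a list of WWW::Mechanize::Image objects;
# url_abs() resolves each relative src against the page URL,
# so no manual prefixing of 'http://www.nobeliefs.com/nazis/' is needed.
for my $img ( $m->images ) {
    my $url  = $img->url_abs;                # a URI object
    my $file = basename( $url->path );       # e.g. images/foo.jpg -> foo.jpg
    print "$url -> $file\n";
    $m->mirror( $url, $file );               # fetch and save in one step
}
```

This skips the intermediate text1.txt file entirely; whether that suits the workflow depends on whether the URL list is needed for anything else.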
Then I tried to download these images. With getstore I got only HTML as output, and this version gives me JPGs of zero size. What gives? $ cat hitler5.pl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::Simple;

open my $fh, '<', 'text1.txt' or die $!;
while ( my $text = <$fh> ) {
    chomp $text;    # strip the newline before building the URL
    my $url = 'http://www.nobeliefs.com/nazis/' . $text;
    ( my $file = $text ) =~ s{images/}{};
    print "$url\n$file\n";
    # The original called LWP::Simple::get($params{URL}), but %params was
    # never populated, so get() received undef and every file came out empty.
    my $data = LWP::Simple::get($url);
    if ( defined $data ) {
        open my $out, '>', $file or die $!;
        binmode $out;
        print {$out} $data;
        close $out;
    }
    else {
        warn "Failed to fetch $url\n";
    }
}
Thanks for your comments, and happy Easter.
Re: downloading images from a webpage
by blakew (Monk) on Apr 06, 2012 at 21:47 UTC
by Aldebaran (Curate) on Apr 07, 2012 at 18:33 UTC
by Aldebaran (Curate) on Apr 09, 2012 at 05:44 UTC
by blakew (Monk) on Apr 09, 2012 at 15:54 UTC
by Aldebaran (Curate) on Apr 14, 2012 at 00:56 UTC