Sj03rd has asked for the wisdom of the Perl Monks concerning the following question:
Hello, I'm trying to crawl webpages that consist of nothing but simple text. However, when saved to the hard drive, the page becomes unreadable and consists of nothing but scrambled text (such as í½isÛH²úùøW 4ísä, etc.). Does anybody know what goes wrong? Thanks!
    use WWW::Mechanize;
    my $mech = WWW::Mechanize->new( autocheck => 0 );
    my @arraydatad = split( /\//, $arraydata[4] );   # @arraydata is populated elsewhere in the script
    my $filename  = 'test';
    my $filecrawl = 'http://www.sec.gov/Archives/edgar/data/1000045/0001144204-06-005708.txt';
    $mech->get( $filecrawl, ':content_file' => $filename );
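If the server delivers the file gzip-compressed, the ':content_file' option is a likely culprit: it streams the raw response body to disk before any Content-Encoding is decoded, so the compressed bytes land in the file unchanged. A minimal sketch of a workaround under that assumption, using save_content(), which writes the decoded content instead (the filename and URL here are placeholders standing in for the values the script computes):

    use strict;
    use warnings;
    use WWW::Mechanize;

    my $mech = WWW::Mechanize->new( autocheck => 1 );

    # Placeholder values; the real script derives these from @arraydata.
    my $filename  = 'test.txt';
    my $filecrawl = 'http://www.sec.gov/Archives/edgar/data/1000045/0001144204-06-005708.txt';

    # Fetch the page; Mechanize decodes any Content-Encoding internally.
    $mech->get($filecrawl);

    # save_content() writes the decoded content, unlike ':content_file',
    # which saves the raw (possibly still-compressed) bytes.
    $mech->save_content($filename);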
Replies are listed 'Best First'.
Re: Scrambled text in downloaded webpages
by Corion (Patriarch) on Aug 01, 2012 at 10:53 UTC
  by daxim (Curate) on Aug 01, 2012 at 11:43 UTC
  by Sj03rd (Initiate) on Aug 01, 2012 at 11:56 UTC
Re: Scrambled text in downloaded webpages
by daxim (Curate) on Aug 01, 2012 at 11:36 UTC
  by Sj03rd (Initiate) on Aug 02, 2012 at 08:20 UTC