SergioQ has asked for the wisdom of the Perl Monks concerning the following question:
So I have searched for how to scrape webpages with a Perl script when the page uses JavaScript, which means LWP::Simple will not do.

The closest I've come is WWW::Mechanize::Firefox, yet I can't get past the "Failed to connect to, problem connecting to "localhost", port 4242" error. From what I've seen, it depends on MozRepl, which is no longer around?

All I want is a Perl script that runs from the command line, takes a URL, and scrapes all the images. What I get now is just the thumbnails, since (I'm guessing) JavaScript does the rest after the page loads. The best example I can give: if I go to a URL that searches the web for movie posters, what I pull down with LWP::Simple is nothing compared to what I see if I go to the browser manually and "View Source." That's where all the meat is. Is there a workable solution for a newbie to Perl?
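(Editor's note: not part of the original post.) One workable route is WWW::Mechanize::Chrome, by the same author as WWW::Mechanize::Firefox; it drives a real Chrome/Chromium instance over the DevTools protocol instead of the defunct MozRepl, so the JavaScript on the page actually runs before you scrape. Below is a minimal sketch, assuming Chrome or Chromium is installed and findable on your PATH; the fixed 3-second sleep is a crude placeholder for a real "wait until loaded" condition, and method details may vary by module version.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize::Chrome;

# Usage: perl scrape_images.pl http://example.com/some-search-page
my $url = shift @ARGV or die "Usage: $0 URL\n";

# Launch a headless Chrome; requires chrome/chromium on the PATH.
my $mech = WWW::Mechanize::Chrome->new( headless => 1 );

$mech->get($url);
$mech->sleep(3);    # crude: give the page's JavaScript time to fill in the real images

# Walk every <img> in the *rendered* DOM, not just the raw HTML LWP would see.
for my $img ( $mech->selector('img') ) {
    my $src = $img->get_attribute('src');
    print "$src\n" if defined $src && length $src;
}
```

From here you could feed each `$src` to LWP::UserAgent (or `$mech->get` itself) to download the files; the point of the browser step is only to obtain the post-JavaScript list of image URLs that LWP::Simple never sees.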
Replies are listed 'Best First'.

Re: How to have Perlscript scrape images from a URL that has Javascript?
by marto (Cardinal) on Dec 16, 2019 at 07:37 UTC

Re: How to have Perlscript scrape images from a URL that has Javascript?
by harangzsolt33 (Deacon) on Dec 16, 2019 at 07:50 UTC
    by soonix (Chancellor) on Dec 16, 2019 at 09:45 UTC
        by harangzsolt33 (Deacon) on Dec 16, 2019 at 13:49 UTC
    by Anonymous Monk on Dec 16, 2019 at 10:27 UTC