For many years I have used WWW::Mechanize with excellent results to scrape a weekly magazine's website for offline consumption on my Palm Pilot.
Now, all of a sudden, they have changed their table of contents ...
Where there used to be nice HTML there is now an ugly mix of HTML fragments with a lot of JavaScript mixed in that builds the page dynamically (a lot of document.write), which unfortunately completely breaks my conversion scripts...
So what I want now is a way to capture the HTML that the JavaScript generates - i.e. a tool that interprets the JavaScript and saves the resulting document HTML to a file.
Any ideas on how to achieve this?
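One possible approach (a minimal sketch, not necessarily the best answer): let a real browser execute the JavaScript and read the finished DOM back from it. The example below assumes the CPAN module WWW::Mechanize::Firefox is installed and that Firefox is running with the MozRepl add-on enabled; the URL and output filename are placeholders.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Sketch: drive a running Firefox instance so the browser executes the
    # JavaScript (document.write etc.), then save the resulting HTML.
    use WWW::Mechanize::Firefox;

    my $url  = 'http://example.com/toc.html';   # placeholder URL
    my $mech = WWW::Mechanize::Firefox->new();

    $mech->get($url);

    # content() returns the page as the browser sees it,
    # i.e. after the JavaScript has built the document.
    my $html = $mech->content;

    open my $fh, '>:encoding(UTF-8)', 'toc.html' or die "Cannot write: $!";
    print {$fh} $html;
    close $fh;

The trade-off is that this needs a browser instance available, unlike plain WWW::Mechanize, but it captures exactly what the JavaScript produces.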