in reply to Extract Web Page with Images and HTML to PS

The easiest way to download web pages is to use wget. You'll probably find it preinstalled on most Linux systems.
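For instance, grabbing a single page looks something like this (the URL is just a placeholder):

```shell
# Download one page; -O names the local output file.
wget -O page.html 'http://example.com/'
```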

And to convert to PS I often use Firefox's "print to file" feature. It's just a shame that (as far as I know) it can't be controlled programmatically.

--
<http://dave.org.uk>

"The first rule of Perl club is you do not talk about Perl club."
-- Chip Salzenberg


Re^2: Extract Web Page with Images and HTML to PS
by prostoalex (Scribe) on Jul 20, 2006 at 21:43 UTC
    wget would get you the HTML returned by the server, but not the images, etc. If he's generating a snapshot of a Web site, he needs some rendering as well, I'd assume.

      If you use the -p option with wget then you get all the images too.
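      A minimal sketch (the URL is a placeholder; -k is an optional extra that rewrites links for local viewing):

```shell
# -p (--page-requisites) fetches the images, CSS, etc. needed to display the page;
# -k (--convert-links) rewrites links so the saved copy works offline.
wget -p -k 'http://example.com/'
```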
