in reply to website archiver

I think you're looking for the "wget" command. I tend to use something like the command below, but if you're doing this for archival purposes you might prefer to adjust it (e.g. without "-k" or "-H", and maybe without "-l"):
wget -r -l 8 -w 100 -k -p -np -H <URL>
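For example, an archival-oriented variant along those lines might look like this (my sketch, not the original command; note that "-l inf" lifts the depth cap entirely, since plain "-r" with no "-l" still defaults to a depth of 5):
wget -r -l inf -w 100 -p -np <URL>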
Briefly, what these options do (see the man page for full details):
-r   recursive retrieval
-l   maximum recursion depth
-w   wait time, in seconds, between retrievals
-k   convert links for local viewing
-p   get all "page requisites", e.g. images and stylesheets
-np  "no parent": avoid following links to levels above the starting point
-H   enable spanning across hosts when retrieving recursively
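To make it concrete, here's the same command against a hypothetical site (example.com and the /docs/ path are just placeholders):
wget -r -l 8 -w 100 -k -p -np -H https://example.com/docs/
Pages are saved under ./example.com/... mirroring the site's directory layout (files pulled in from other hosts via "-H" land under their own hostname directories), and "-np" keeps wget from climbing above /docs/ on the starting host.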