A way to capture a webpage and first layer of links?

Is there a way to save everything on a webpage and also everything on each page that its links go to? I assume there might be a way if they are all on the same site and I just want to go one level in the links, so to speak. I think I heard of a way to do this, but I don’t recall. Thanks. :stuck_out_tongue:

In IE you can use the “make available offline” option and pick how deep you want to go. But what you can do with it afterwards beyond looking at it is a mystery to me.
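For what it's worth, the command-line tool GNU wget can do the same kind of one-level-deep grab. A sketch (assuming wget is installed, and with a made-up example URL):

```shell
# Save a page plus every page it links to (one level deep), staying on the same site.
#   -r -l 1  : recurse, but only follow links one level deep
#   -p       : also fetch the images/CSS each page needs to display
#   -k       : rewrite links in the saved copies so they work offline
#   -np      : don't climb up into parent directories
wget -r -l 1 -p -k -np http://www.example.com/page.html
```

By default wget won't wander off to other hosts, which matches the "all on the same site" part of the question.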

You’re looking for software such as:

http://www.spidersoft.com/
Save entire Web sites - view offline
Backup selected webpages or entire websites to hard disk or CD. Archive to ZIP or CHM (HTML Help) files.

The ideal tool for all Web researchers. Browse and save collections of complete Web pages into Web archive files (.mht) for fast, efficient offline viewing.

or:

http://www.bluesquirrel.com/products/whacker/index.html

First layer = Direct links to other pages, but not the links those links hold?
Good luck saving Yahoo! then, or SD!!