Free(-ish) tool to pull downloads from all links on a web page

As the title suggests …

Imagine I’ve browsed to a page that has a bunch of links, not to other pages, but to docs or images or whatever. I want to download each of those items in turn to a folder on my PC.

And since the documents’ names are often cryptic to useless, I’d like the filename on the PC to include the verbiage from the <a> tag.

So ideally I’d have a dialog box which asks which extensions to save and whether to include the link text in the saved file name. Supply answers, click [OK] and away it goes.

This would be easy enough to dev up but I’m expecting somebody has already done it.
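(To give a sense of scale, here’s a rough sketch of the roll-your-own version in Python, using the requests and BeautifulSoup libraries; the page URL, extension list, and output folder below are just placeholders:)

[code]
import os
import re
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

PAGE_URL = "https://example.com/documents.html"  # placeholder page
WANTED_EXTS = {".pdf", ".doc", ".jpg"}           # extensions to save
OUT_DIR = "downloads"                            # folder on the PC

os.makedirs(OUT_DIR, exist_ok=True)
soup = BeautifulSoup(requests.get(PAGE_URL).text, "html.parser")

for a in soup.find_all("a", href=True):
    url = urljoin(PAGE_URL, a["href"])           # resolve relative links
    name = os.path.basename(urlparse(url).path)
    if os.path.splitext(name)[1].lower() not in WANTED_EXTS:
        continue
    label = a.get_text(strip=True)
    if label:
        # Fold the link text into the filename, minus any
        # characters that aren't filesystem-safe.
        safe = re.sub(r'[\\/:*?"<>|]+', "_", label)[:80]
        name = f"{safe} - {name}"
    with open(os.path.join(OUT_DIR, name), "wb") as f:
        f.write(requests.get(url).content)
    print("saved", name)
[/code]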

Any suggestions?

[aside]
IIRC we had a thread a year or more ago where somebody asked for something close to this & got some decent answers. But I can’t Google it out of the boards since all the terms I can think of are waaay too common. Quick, how many threads include the word “download”? Answer: ~ 2300
[/aside]

I think you’re describing DownThemAll.

The Firefox add-on DownThemAll will do this.

FlashGet does that too.

Can it also sort my [del]porn[/del] files by content? And masturbate for me? I’d do it myself, but I’m busy. :smiley:

And, while we’re here, if you want the entire website, you can use HTTrack. I’ve used that when there are index pages that organize the files better than I could, like, say, an image gallery.
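(If it helps, a basic invocation looks something like this; the URL and output folder are just placeholders:)

[code]
httrack "https://example.com/gallery/" -O ./mirror
[/code]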

Thanks all.

And no, it’s not for pr0n. Although I half-suspect my denial will be seen as proof of guilt by some.

Another recommendation for the “DownThemAll” Firefox add-on. I have had very good success with it.

Huh. So am I the only person whose first thought was wget?

pavuk or nothing, infidel! :wink:

Nah, wget is a great tool. I use it quite often myself.
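For the OP’s case, something along these lines grabs every linked file of a given type from a single page (URL and extension list are placeholders; note that wget can’t fold the link text into the filename, though):

[code]
wget -r -l 1 -nd -np -A pdf,doc,jpg -P downloads/ https://example.com/page.html
[/code]

-r with -l 1 follows links one level deep, -nd flattens the directory structure, -np keeps it from wandering up the tree, and -A keeps only the listed extensions.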