I'm looking for a way to get only the images from a website. I've been looking all over the forums and haven't found the correct steps on how to do this, so please forgive a complete noob for asking here.
Just looking for the ability to:
Crawl a website and all subpages of the site, and pull only files of type .jpg.
It's probably a really easy setting I need to do, so apologies once more.
BlackWidow scans websites (it's a site ripper); it can download an entire website, or just portions of a site.
Basically, you would set the filters not to scan everything, then add one filter to follow all links and another to add only .jpg files. Here are the filters: copy the block of text below, click the "Paste Settings" button in the Filters window, then set the website URL and scan.
Code:
[BlackWidow v6.00 filters]
[ ] Expert mode
[ ] Scan everything
[x] Scan whole site
Local depth: 0
[ ] Scan external links
[ ] Only verify external links
External depth: 0
Default index page:
Startup referrer:
[ ] Slow down by 2:2 seconds
4 threads
[x] Follow .* using regular expression
[x] Add \.jpg$ from URL using regular expression
[end]
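In case the two regular expressions above look opaque: `.*` matches any URL at all (so every link gets followed), while `\.jpg$` matches only URLs ending in ".jpg". Here is a small illustrative sketch of how those patterns behave (Python is used purely for illustration; BlackWidow applies the patterns internally):

```python
import re

# The "Follow" filter: .* matches any URL, so every link is crawled.
follow = re.compile(r".*")
# The "Add" filter: \.jpg$ matches only URLs that end in ".jpg".
add = re.compile(r"\.jpg$")

urls = [
    "http://example.com/photos/cat.jpg",
    "http://example.com/index.html",
    "http://example.com/logo.jpeg",
]

followed = [u for u in urls if follow.search(u)]
added = [u for u in urls if add.search(u)]

print(followed)  # all three URLs pass the Follow filter
print(added)     # only .../cat.jpg passes the Add filter
```

Note that `\.jpg$` as written is case-sensitive, so a site serving files named ".JPG" would need the pattern adjusted to allow for that.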
Your support team.