Filter to get images from website (noob question)

BlackWidow is a site ripper: it scans websites and can download an entire site or just portions of it.
stacyi
Posts: 1
Joined: Fri Jan 18, 2013 3:30 am

Filter to get images from website (noob question)

Post by stacyi » Fri Jan 18, 2013 3:33 am

I'm looking for a way to get only the images from a website. I've been looking all over the forums and haven't found the correct steps on how to do this, so please forgive a complete noob for asking here.

Just looking for the ability to:

Go to a website
Crawl the site and all of its subpages, pulling every file of type .jpg

It's probably a really easy setting I'm missing, so apologies once more.
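To put the idea another way, here is a rough sketch in Python of the kind of crawl I mean (purely illustrative; the start URL and page limit are made-up placeholders, and I assume BlackWidow handles all of this through its own settings):

Code:

import re
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collects href/src attributes from a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            self.links.append(attrs["href"])
        elif tag == "img" and "src" in attrs:
            self.links.append(attrs["src"])

def collect_jpg_urls(start_url, max_pages=50):
    """Follow links within one site and return every URL ending in .jpg."""
    site = urlparse(start_url).netloc
    to_visit, seen, jpgs = [start_url], set(), set()
    while to_visit and len(seen) < max_pages:
        url = to_visit.pop()
        if url in seen:
            continue
        seen.add(url)
        try:
            page = urlopen(url).read().decode("utf-8", errors="ignore")
        except OSError:
            continue
        collector = LinkCollector()
        collector.feed(page)
        for link in collector.links:
            full = urljoin(url, link)
            if re.search(r"\.jpg$", full, re.IGNORECASE):
                jpgs.add(full)          # keep image URLs
            elif urlparse(full).netloc == site:
                to_visit.append(full)   # stay within the same site
    return jpgs

print(collect_jpg_urls("http://example.com/"))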

Support
Site Admin
Posts: 1720
Joined: Sun Oct 02, 2011 10:49 am

Re: Filter to get images from website (noob question)

Post by Support » Fri Jan 18, 2013 10:22 am

Basically, you would set the filters not to scan everything, then add one filter to follow all links and another to add only .jpg files. Here are the filters: copy the block of text below, click the "Paste Settings" button in the Filters window, set the website URL, and start the scan.

Code:

[BlackWidow v6.00 filters]
[ ] Expert mode
[ ] Scan everything
[x] Scan whole site
Local depth: 0
[ ] Scan external links
[ ] Only verify external links
External depth: 0
Default index page: 
Startup referrer: 
[ ] Slow down by 2:2 seconds
4 threads
[x] Follow .* using regular expression
[x] Add \.jpg$ from URL using regular expression
[end]
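
In case it helps to see what the two regular expressions do, here is a minimal sketch in Python (illustrative only; the sample URLs are made up, and BlackWidow evaluates its filters internally rather than through code like this):

Code:

import re

FOLLOW = re.compile(r".*")      # "Follow" filter: matches every URL, so every link gets scanned
ADD = re.compile(r"\.jpg$")     # "Add" filter: matches only URLs ending in .jpg

for url in ("http://example.com/page.html",
            "http://example.com/photos/cat.jpg",
            "http://example.com/logo.png"):
    print(url, "follow:", bool(FOLLOW.search(url)), "add:", bool(ADD.search(url)))

So every page is followed, but only the .jpg URLs end up being added to the download list.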
Your support team.
http://SoftByteLabs.com
