Excluding a website during a scan

BlackWidow scans websites (it is a site ripper). It can download an entire website or just portions of a site.
Posts: 3
Joined: Sat Oct 05, 2013 12:07 pm

Excluding a website during a scan

Post by LAWMANM16 »

How do you exclude a website when fuskering a site that has multiple external links to other sites?
Site Admin
Posts: 1904
Joined: Sun Oct 02, 2011 10:49 am

Re: Excluding a website during a scan

Post by Support »

Uncheck "Scan external links" and BlackWidow will scan only the site itself, skipping all external sites. If you want to scan some external sites but not all of them, you can add a filter to exclude specific sites. With "Scan everything" selected, click Add Rule and enter the site to exclude, for example google.com; any URL containing google.com anywhere in it will then be skipped. If "Scan everything" is not selected, external sites are not scanned at all, and you must instead add filters for the links to scan and the files to add to the structure.
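The "Add Rule" behavior described above amounts to a substring match on the URL. As a rough illustration only (the function and rule list below are hypothetical and not part of BlackWidow), the logic works like this:

```python
def should_scan(url, exclude_rules):
    """Return True if the URL passes the exclusion filter.

    Mimics the rule behavior described above: a URL is skipped
    when any rule string appears anywhere in the URL.
    (Illustrative sketch only, not BlackWidow's actual API.)
    """
    return not any(rule in url for rule in exclude_rules)

rules = ["google.com"]
print(should_scan("http://example.com/page", rules))       # True (scanned)
print(should_scan("http://www.google.com/search", rules))  # False (skipped)
```

Note that because the match is anywhere in the URL, a rule like google.com would also skip a URL such as http://example.com/about-google.com.html.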
Your support team.