Hello,
I need some advice on which product would best help me accomplish what I want to do.
I monitor a number of sites that regularly publish articles / catalogs (in PDF, flipbook, or web format). I would like to automatically detect when a new article has been published (e.g. by comparing timestamps against the last scan and filtering the site for PDFs or JPEGs with a newer timestamp) and then download the article to a specific folder per website. The scan should run once per day through my list of sites. Each site may also need a custom filter / script to resolve the right URL for the article. For articles published on the web, I would like to take a screenshot of the entire website and save it to my local drive.
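To make the requested workflow concrete, here is a minimal sketch of the daily-scan logic in Python. It is purely illustrative (BlackWidow and BrownRecluse use their own scripting); the site list, folder names, and the `Last-Modified` check are assumptions, and a real per-site filter would replace the simple link regex.

```python
# Hypothetical sketch of the daily scan: find PDF/JPEG links on each
# monitored page, keep the ones newer than the last scan, download them
# into a per-site folder. Not a BlackWidow/BrownRecluse script.
import os
import re
import time
import urllib.request
from urllib.parse import urljoin

# Hypothetical mapping of monitored sites to local folders.
SITES = {
    "https://example.com/catalogs/": "example_com",
}

def find_article_links(html, base_url):
    """Extract absolute links to PDF/JPEG files from a page's HTML."""
    links = re.findall(r'href="([^"]+\.(?:pdf|jpe?g))"', html, re.IGNORECASE)
    return [urljoin(base_url, link) for link in links]

def is_newer(url, last_scan):
    """Compare the server's Last-Modified header against the last scan time."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        modified = resp.headers.get("Last-Modified")
    if modified is None:
        return True  # no timestamp available: download to be safe
    stamp = time.mktime(time.strptime(modified, "%a, %d %b %Y %H:%M:%S %Z"))
    return stamp > last_scan

def scan_site(site_url, folder, last_scan):
    """Download every article newer than last_scan into the site's folder."""
    os.makedirs(folder, exist_ok=True)
    with urllib.request.urlopen(site_url) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    for link in find_article_links(html, site_url):
        if is_newer(link, last_scan):
            name = os.path.join(folder, os.path.basename(link))
            urllib.request.urlretrieve(link, name)
```

A daily run would simply loop over `SITES` with the timestamp of the previous scan and then record the current time for the next run; scheduling once per day would be left to cron or Task Scheduler.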
Could you please advise whether BlackWidow is best suited for this task, or should I be looking at BrownRecluse? Do you offer any paid assistance to write scripts, for example if I provided the cases I am looking at?
Where can I find more detailed tutorials for both products?
Thank you
Website monitoring and download
Re: Website monitoring and download
BrownRecluse is the way to go. As for the screenshot, BrownRecluse can't do that. BlackWidow can, but you have to manually browse to the site and click a button to take a snapshot of the entire page (not the entire site). We do provide paid assistance; usually we charge nothing if you write your own script and just need some fixing, advice, etc.
Your support team.
http://SoftByteLabs.com