Based on your requirements, I recommend that you try HTTrack
Website Copier. It is free, available for both Windows and Linux,
supports downloading multiple sites at once, supports password-protected
websites, and has configurable robots.txt handling. You can get it here:
- HTTrack Website Copier
Follow the instructions in the Quickstart Guide to create a new project:
- HTTrack manual: How to start, Step-by-step
Once you have created a project, you can configure its options. In the
options window, click the 'Spider' tab and set the spider option to
'no robots.txt rules'.
- HTTrack manual: Spider Options Panel
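If you prefer the command line (handy on Linux), the same behavior can be set with httrack's `-sN` spider option, where `-s0` means "never follow robots.txt rules". A minimal sketch, with the URL and output directory as placeholders:

```shell
# Mirror a site into ./mirror while ignoring robots.txt rules.
# -O sets the output path; -s0 disables robots.txt handling
# (equivalent to the GUI's 'no robots.txt rules' spider setting).
# https://example.com/ is a placeholder; httrack must be installed first
# (e.g. 'sudo apt install httrack' on Debian/Ubuntu).
httrack "https://example.com/" -O ./mirror -s0
```

Run `httrack --help` to see the other spider values (`-s1`, `-s2`).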
Hope this helps.
If you need any clarifications, just ask!
Google search terms used: HTTrack ignore robots.txt