Subject: Download entire directory from URL
Category: Computers > Internet
Asked by: lspy1-ga
List Price: $2.00
Posted: 01 Feb 2005 19:50 PST
Expires: 03 Mar 2005 19:50 PST
Question ID: 467318
I need to download all the images in a folder on a website. Say an image lives at http://www.rrrrr.com/images/whatever.jpg; I need all the images in the /images/ directory. I've tried countless software programs that claim to automate this, but they all fail. I think it's a 403 Forbidden error, because the site sees the request as coming from something other than a normal browser. Am I doing something wrong? Is this even possible without manually going to each page and saving each image? We're talking a few thousand images. Is there an easy way to do this?
There is no answer at this time.
Subject: Re: Download entire directory from URL
From: anotherbrian-ga on 02 Feb 2005 02:12 PST
The website may be running a program that is designed to prevent exactly this. There are also ways of naming image files that make bulk downloading harder. They don't want you swiping their entire collection (this is common with porn sites). You might be able to use wget, which is the standard Unix command for making a local copy of a website. If you post the address of the site, I could try wget on it.
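For reference, a typical wget invocation for this kind of job might look like the sketch below. The URL is a placeholder, and the --user-agent flag is one common workaround when a server returns 403 for requests that don't look like they came from a browser:

```shell
# Sketch only: recursively fetch image files from one directory.
# www.example.com stands in for the real site.
wget --recursive --level=1 --no-parent \
     --accept jpg,jpeg,gif,png \
     --user-agent='Mozilla/5.0' \
     http://www.example.com/images/
```

--no-parent keeps wget inside the /images/ directory, and --accept restricts the download to image file extensions.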
Subject: Re: Download entire directory from URL
From: lspy1-ga on 02 Feb 2005 11:23 PST
www dot juno dot co dot uk
Subject: Re: Download entire directory from URL
From: nelson-ga on 02 Feb 2005 20:58 PST
You mean www.juno.co.uk?
Subject: Re: Download entire directory from URL
From: anotherbrian-ga on 03 Feb 2005 01:36 PST
I was able to grab the /images/ directory from www.juno.co.uk. However, it only had 14 files in it. They are the pictures that make up the graphic portions of the site, not the pictures of the CDs, which is what I'm guessing you want. Those files reside on a different server (images.juno.co.uk). The problem is that when I wget the image server, it won't return the files, because there are no HTML pages on that server that reference the images. I'm guessing the way to solve this is to write a script that crawls the HTML files in http://www.juno.co.uk/covers/ for the string "http://images.juno.co.uk/full/*.jpg" and then downloads those images to a local directory. Unfortunately, I don't have the knowledge to do this, but I'm sure there are some computer geeks at your local college who could.
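That crawl-then-extract idea can be sketched with standard shell tools. This assumes the /covers/ pages have first been mirrored locally with wget; the directory names and URL pattern are taken from the thread and are unverified. The extraction step is demonstrated on a sample line of HTML:

```shell
# Step 1 (network, shown for context): mirror the pages that reference the images.
#   wget --recursive --no-parent http://www.juno.co.uk/covers/

# Step 2: pull a full-size image URL out of a line of HTML.
# Demonstrated here on a sample tag rather than a real downloaded page.
sample='<img src="http://images.juno.co.uk/full/12345.jpg" alt="cover">'
echo "$sample" | grep -o 'http://images\.juno\.co\.uk/full/[^"]*\.jpg'

# Step 3 (network): run the same pattern over the whole mirror,
# de-duplicate, and feed the URL list back into wget.
#   grep -rho 'http://images\.juno\.co\.uk/full/[^"]*\.jpg' www.juno.co.uk/covers/ \
#     | sort -u \
#     | wget --input-file=- --directory-prefix=covers/
```

grep -o prints only the matched URL, -r scans the mirrored directory recursively, and wget --input-file=- reads the URL list from standard input.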