Hello,
I have about a thousand images (.jpg) whose source paths point to the web (img src="http://www......"). I have all the paths (URLs) for the images in an Excel file. How can I batch save them to my computer so that the images reside on my hard drive instead of on the web? My concern is that if the website goes down, I might lose all the images.
I know that I can individually open each image in Internet Explorer and then save it. However, this would be too time consuming. I would like to enter the URLs of all the images into a database and batch save them (the images) all to my desktop.
Thanks |
Request for Question Clarification by rainbow-ga on 07 Aug 2006 13:26 PDT
Please take a look at this and let me know if it fits your needs:
Image Quick Saver
http://forusoftware.com/imagequicksaver.htm
Best regards,
Rainbow
|
Clarification of Question by havequestions-ga on 07 Aug 2006 13:37 PDT
No, this software requires the images to already be open so they can be selected and then batch downloaded.
I only have a database of the URLs where the images are stored.
Thanks
|
Request for Question Clarification by tutuzdad-ga on 07 Aug 2006 14:44 PDT
Try Medusa and let me know if it works for you:
"You start by entering a starting URL and Medusa searches for the file
types you are interested in on this page and all pages found up to a
given depth.
If a file has a thumbnail the thumbnail is downloaded and displayed in
Medusa; making it easy to choose which files to download. Found files
are grouped by the servers they are located on in a very well-arranged
manner which is useful for downloading e.g. series of images.
You can download files to your default download folder really easy by
simply double-clicking on them. Files can also be downloaded by drag
'n drop and copy and paste in a familiar Explorer-style way. The
multithreaded architecture allows for many concurrent downloads and
searches on pages for files. This means that you will save a lot of
the time it would take to surf the web for the files traditionally."
http://www.softpile.com/Internet/Search_Tools/Review_06853_index.html
tutuzdad-ga
|
Request for Question Clarification by tutuzdad-ga on 07 Aug 2006 14:46 PDT
Sorry, forgot to add the homepage:
MEDUSA
http://www.candego.com/candego.shtml
tutuzdad-ga
|
Request for Question Clarification by sycophant-ga on 07 Aug 2006 18:35 PDT
Hi,
Another option is 'wget' - a common and widely-used Unix application
that is also available for Windows.
It can be downloaded here:
http://www.christopherlewis.com/WGet/WGetFiles.htm
Information on using wget can be found at Wikipedia:
http://en.wikipedia.org/wiki/Wget
Essentially, if you save your Excel file with the paths to a plain
text file with one line per entry (called urls.txt, for example), you
should be able to run wget like this:
wget.exe -i urls.txt -o log.txt
This would attempt to fetch every URL in urls.txt and would log the
results in log.txt.
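For example (these addresses are just placeholders, not your real
ones), urls.txt would contain nothing but one complete image URL per
line:
http://www.example.com/images/photo001.jpg
http://www.example.com/images/photo002.jpg
http://www.example.com/images/photo003.jpg
If the path to urls.txt contains spaces, wrap it in double quotes on
the command line; otherwise wget treats each piece of the path as a
separate URL.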
wget is a command-line application, so it is only suitable if you are
somewhat comfortable with the Windows command prompt.
Alternatively, a GUI version of 'wget' called 'WinWGet' is available
here:
http://www.cybershade.us/winwget/
Again, by saving or copying the Excel URL list to a text file, you can
use that 'input file' as the list of URLs to download.
Let me know if either of these options seem suitable.
Regards,
Sycophant
|
Clarification of Question by havequestions-ga on 08 Aug 2006 10:13 PDT
Hello Sycophant:
I downloaded the GUI version of 'wget' and used it with an input text
file (a file into which I cut and pasted each URL, one per line),
running it in recursive mode. However, I keep getting the following
error. Can you please let me know what I am doing wrong?
-12:10:00-- http://and/
=> `index.html'
Resolving and... failed: Host not found.
--12:10:02-- http://settings%5chp_administrator%5cdesktop%5cmanage.txt/
=> `index.html'
Resolving settings%5chp_administrator%5cdesktop%5cmanage.txt...
failed: Host not found.
C:/Documents: No such file or directory
No URLs found in C:/Documents.
FINISHED --12:10:02--
Downloaded: 0 bytes in 0 files
|
Clarification of Question by havequestions-ga on 08 Aug 2006 11:07 PDT
Is there a macro that can be written for the Excel file that will
automatically save each image to a folder on the desktop, given that
each URL is in a separate cell in the Excel file? It should save each
image under its original name as it appears in the URL.
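A minimal sketch of such a macro, assuming the URLs sit one per cell
in column A of the active sheet, that the files go to a 'WebImages'
folder already created on the desktop, and that each image keeps
whatever name follows the last '/' in its URL (the column, the folder
name, and the use of the Windows URLDownloadToFile API are
assumptions, not something tested against your workbook):

' Rough sketch only -- place in a standard VBA module.
Private Declare Function URLDownloadToFile Lib "urlmon" _
    Alias "URLDownloadToFileA" (ByVal pCaller As Long, _
    ByVal szURL As String, ByVal szFileName As String, _
    ByVal dwReserved As Long, ByVal lpfnCB As Long) As Long

Sub SaveImagesFromColumnA()
    Dim targetFolder As String
    Dim cell As Range
    Dim url As String
    Dim fileName As String

    ' Assumed destination: a WebImages folder on the desktop (create it first).
    targetFolder = Environ("USERPROFILE") & "\Desktop\WebImages\"

    ' Walk every non-empty cell in column A.
    For Each cell In Range("A1", Range("A" & Rows.Count).End(xlUp))
        url = Trim(cell.Value)
        If Len(url) > 0 Then
            ' Keep the original file name: everything after the last "/".
            fileName = Mid(url, InStrRev(url, "/") + 1)
            ' URLDownloadToFile returns 0 on success.
            If URLDownloadToFile(0, url, targetFolder & fileName, 0, 0) <> 0 Then
                cell.Offset(0, 1).Value = "download failed"
            End If
        End If
    Next cell
End Sub

To try it, open the VBA editor with Alt+F11, paste the code into a new
module, and run SaveImagesFromColumnA; any URL that cannot be fetched
is flagged in column B.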
|