Q: save multiple images simultaneously (No Answer, 2 Comments)
Question  
Subject: save multiple images simultaneously
Category: Computers > Software
Asked by: havequestions-ga
List Price: $5.00
Posted: 07 Aug 2006 13:04 PDT
Expires: 06 Sep 2006 13:04 PDT
Question ID: 753529
Hello,

I have about a thousand images (.jpg) whose source paths point to the
web: img src="http://www......"

I have all the paths (URLs) for the images in an Excel file. How can I
batch save them on my computer so that the images reside on my hard
drive instead of on the web? My concern is that if the website goes
down, I might lose all the images.

I know that I can individually open each image in Internet Explorer
and then save it. However, this would be too time consuming. I would
like to enter the URLs of all the images into a database and batch save
them (the images) all to my desktop.

Thanks

Request for Question Clarification by rainbow-ga on 07 Aug 2006 13:26 PDT
Please take a look at this and let me know if it fits your needs:

Image Quick Saver
http://forusoftware.com/imagequicksaver.htm

Best regards,
Rainbow

Clarification of Question by havequestions-ga on 07 Aug 2006 13:37 PDT
No, this software requires the images to be open before it can select
them and then batch download them.

I only have a database of the URLs where the images are stored.

Thanks

Request for Question Clarification by tutuzdad-ga on 07 Aug 2006 14:44 PDT
Try Medusa and let me know if it works for you:

"You start by entering a starting URL and Medusa searches for the file
types you are interested in on this page and all pages found up to a
given depth.

If a file has a thumbnail the thumbnail is downloaded and displayed in
Medusa; making it easy to choose which files to download. Found files
are grouped by the servers they are located on in a very well-arranged
manner which is useful for downloading e.g. series of images.

You can download files to your default download folder really easy by
simply double-clicking on them. Files can also be downloaded by drag
'n drop and copy and paste in a familiar Explorer-style way. The
multithreaded architecture allows for many concurrent downloads and
searches on pages for files. This means that you will save a lot of
the time it would take to surf the web for the files traditionally."

http://www.softpile.com/Internet/Search_Tools/Review_06853_index.html

tutuzdad-ga

Request for Question Clarification by tutuzdad-ga on 07 Aug 2006 14:46 PDT
Sorry, forgot to add the homepage:

MEDUSA
http://www.candego.com/candego.shtml

tutuzdad-ga

Request for Question Clarification by sycophant-ga on 07 Aug 2006 18:35 PDT
Hi, 

Another option is 'wget' - a common and widely-used Unix application
that is also available for Windows.

It can be downloaded here:
http://www.christopherlewis.com/WGet/WGetFiles.htm

Information on using wget can be found at Wikipedia:
http://en.wikipedia.org/wiki/Wget

Essentially, if you save your Excel file with the paths to a plain
text file with one line per entry (called urls.txt, for example), you
should be able to run wget like this:

wget.exe -i urls.txt -o log.txt

This will attempt to fetch every URL listed in urls.txt and log the
results to log.txt.

wget is a command-line application, so it is only suitable if you are
somewhat comfortable with the Windows command prompt.

Alternatively, a GUI front end for 'wget' called 'WinWGet' is
available here:
http://www.cybershade.us/winwget/

Again, by saving or copying the Excel URL list to a text file you can
use that 'input file' as the list of URLs to download.

Let me know if either of these options seems suitable.

Regards,
Sycophant
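
For illustration, a minimal VBA sketch of the Excel-to-text-file step might
look like the following. It assumes the URLs sit in column A of the first
worksheet and writes them to C:\urls.txt; both the column and the output path
are placeholders to adjust.

Sub ExportUrlsToTextFile()
    ' Write every URL in column A of the first worksheet to a plain text
    ' file, one per line, for use with "wget.exe -i urls.txt".
    ' Sheet, column, and output path are assumptions; adjust as needed.
    Dim ws As Worksheet
    Dim lastRow As Long, r As Long
    Dim f As Integer

    Set ws = ThisWorkbook.Worksheets(1)
    lastRow = ws.Cells(ws.Rows.Count, 1).End(xlUp).Row

    f = FreeFile
    Open "C:\urls.txt" For Output As #f
    For r = 1 To lastRow
        If Len(Trim(ws.Cells(r, 1).Value)) > 0 Then
            Print #f, Trim(ws.Cells(r, 1).Value)
        End If
    Next r
    Close #f
End Sub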

Clarification of Question by havequestions-ga on 08 Aug 2006 10:13 PDT
Hello Sycophant:

I downloaded the GUI version of 'wget' and used it to input a text file
(a file into which I cut and pasted each URL, one per line) and ran the
program in recursive mode. However, I keep getting the following error.
Can you please let me know what it is that I am doing wrong?



-12:10:00--  http://and/
           => `index.html'
Resolving and... failed: Host not found.
--12:10:02--  http://settings%5chp_administrator%5cdesktop%5cmanage.txt/
           => `index.html'
Resolving settings%5chp_administrator%5cdesktop%5cmanage.txt...
failed: Host not found.
C:/Documents: No such file or directory
No URLs found in C:/Documents.

FINISHED --12:10:02--
Downloaded: 0 bytes in 0 files

Clarification of Question by havequestions-ga on 08 Aug 2006 11:07 PDT
Is there a macro that can be written for the Excel file that will
automatically save each image to a folder on the desktop (given that
each URL is in a separate cell in the Excel file)? It should save each
image with its original name from the URL.
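
One possible approach, sketched here as an illustration rather than a
definitive answer, is a VBA macro that calls the Windows URLDownloadToFile
API. The sketch assumes the URLs are in column A of the active sheet and that
a folder named "images" already exists on the desktop; the Declare line is
written for 32-bit Office and would need PtrSafe on 64-bit Office.

' Windows API declaration (module level, above any Sub).
Declare Function URLDownloadToFile Lib "urlmon" Alias "URLDownloadToFileA" _
    (ByVal pCaller As Long, ByVal szURL As String, ByVal szFileName As String, _
     ByVal dwReserved As Long, ByVal lpfnCB As Long) As Long

Sub DownloadImagesFromColumnA()
    ' Download every URL in column A of the active sheet into a desktop
    ' folder, keeping the file name from the end of each URL.
    ' Folder path and column are assumptions; adjust to your workbook.
    Dim destFolder As String
    Dim lastRow As Long, r As Long
    Dim imgUrl As String, imgName As String

    destFolder = Environ("USERPROFILE") & "\Desktop\images\"
    lastRow = ActiveSheet.Cells(ActiveSheet.Rows.Count, 1).End(xlUp).Row

    For r = 1 To lastRow
        imgUrl = Trim(ActiveSheet.Cells(r, 1).Value)
        If Len(imgUrl) > 0 Then
            imgName = Mid(imgUrl, InStrRev(imgUrl, "/") + 1)  ' original name from the URL
            If URLDownloadToFile(0, imgUrl, destFolder & imgName, 0, 0) <> 0 Then
                ActiveSheet.Cells(r, 2).Value = "download failed"  ' note failures in column B
            End If
        End If
    Next r
End Sub
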
Answer  
There is no answer at this time.

Comments  
Subject: Re: save multiple images simultaneously
From: jammiprotein-ga on 05 Sep 2006 21:09 PDT
 
This tool can help you do that. Follow these steps:
1. Download the tool (http://www.netants.com/en/download.html) and install it.
2. Copy all your URLs and save them as a TXT file. (You can write an Excel
macro to create this file, and you can also write code to save the URLs to
a database.)
3. Run the tool, click the menu "Files --> Import list...", and select the
   TXT file from step 2. Specify your save path and click OK. The tool will
   download all the images you specified to that path.
Thanks.
Subject: Re: save multiple images simultaneously
From: forusoftware-ga on 11 Sep 2006 08:15 PDT
 
1) You can create an HTML page whose content is:
<img src="http://..../001.jpg"><br>
<img src="http://.../002.jpg"><br>
...
2) Save it as a.htm.
3) Open a.htm in Internet Explorer.
4) Save the whole HTML page as ...
