Q: Can a program be written to allow web site to search networked hard drives? ( No Answer,   1 Comment )
Question  
Subject: Can a program be written to allow web site to search networked hard drives?
Category: Computers > Software
Asked by: istion-ga
List Price: $10.00
Posted: 15 Jun 2006 16:44 PDT
Expires: 15 Jul 2006 16:44 PDT
Question ID: 738544
I have a photo website designed in PHP. It shows previews of photos
taken by customers. Only previews have been uploaded, as I don't know
how I could ever handle a site containing full-sized originals
uploaded to the same location.

What I want is the ability to click on the file name of the preview
and have it search through lots of different hard drives on the
network where the webserver resides. These drives would be on Windows
computers networked to the webserver. The photos are currently
contained on three different machines and 10+ different hard drives.

What I would want is a search on the file name and a return with the
properties and thumbnails of the photos found with that name... then
the ability to download the full-sized original of around 10 MB each.

Is this possible?

Know anyone who could do it?

How do stock agencies do it? How do they store and allow full-sized
downloads of photos that are all at least 10 MB in size?

Any help or steering would be greatly appreciated.

Clarification of Question by istion-ga on 15 Jun 2006 16:45 PDT
Apache server.  PHP 4.  MySQL.  Windows 2000.
Answer  
There is no answer at this time.

Comments  
Subject: Re: Can a program be written to allow web site to search networked hard drives?
From: frankcorrao-ga on 15 Jun 2006 18:46 PDT
 
Yes, this is certainly possible.  I have no idea specifically how
stock photo sites do it, but I can think of lots of ways to do it.
There will be some trade-offs between simplicity and speed.  Leaving
aside secure network design, something like the following would work,
and should perform reasonably well without being too complicated:

1) Have a MySQL database on the webserver that stores photo
metadata, including the location, and thumbnails.

2) Write a program that periodically crawls the mapped network
drives for pictures.  It gets the details and thumbnails and stores
them in the MySQL database.  It can be a simple batch script that is
run every 5 minutes by the Windows scheduler or cron; essentially a
much less sophisticated version of the Google Desktop crawler.  (A
rough sketch of such a crawler follows this list.)

3) Your PHP site will access the MySQL database to present the basic
info to the user.  If the user chooses to download the photo, your PHP
can copy the photo from wherever MySQL says it is to the webserver and
send it to the user (or stick it in the webpage).  (A sketch of the
download step also follows the list.)

4) Write another script that periodically cleans whatever
directory you copy the large photos to in step 3.
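
As a rough illustration of steps 1 and 2, here is a minimal PHP sketch
of the crawler, written against the PHP 4 / MySQL stack mentioned in the
clarification (the old mysql_* extension).  The table layout, database
credentials, drive letters and paths are only assumptions for the sketch,
not anything from the question; thumbnail generation (e.g. with GD) is
left out to keep it short.

<?php
// crawl_photos.php -- periodically scan the mapped network drives and
// record every JPEG's name, full path, size and modification time in
// MySQL.  Assumes a table such as:
//
//   CREATE TABLE photos (
//       id       INT AUTO_INCREMENT PRIMARY KEY,
//       filename VARCHAR(255),
//       path     VARCHAR(500),
//       bytes    INT,
//       modified DATETIME
//   );

$drives = array('P:/photos', 'Q:/photos');   // mapped network drives to scan

mysql_connect('localhost', 'photouser', 'secret');
mysql_select_db('photosite');

foreach ($drives as $drive) {
    crawl_dir($drive);
}

// Recursively walk a directory tree and record every JPEG found.
function crawl_dir($dir)
{
    $handle = opendir($dir);
    if (!$handle) {
        return;
    }
    while (($entry = readdir($handle)) !== false) {
        if ($entry == '.' || $entry == '..') {
            continue;
        }
        $full = $dir . '/' . $entry;
        if (is_dir($full)) {
            crawl_dir($full);
        } elseif (preg_match('/\.jpe?g$/i', $entry)) {
            record_photo($entry, $full);
        }
    }
    closedir($handle);
}

// Insert (or refresh) one photo's metadata row.
function record_photo($name, $path)
{
    $size  = filesize($path);
    $mtime = date('Y-m-d H:i:s', filemtime($path));

    $name = mysql_real_escape_string($name);
    $path = mysql_real_escape_string($path);

    mysql_query("DELETE FROM photos WHERE path = '$path'");
    mysql_query("INSERT INTO photos (filename, path, bytes, modified)
                 VALUES ('$name', '$path', $size, '$mtime')");
}
?>

Scheduling this every few minutes with the Windows task scheduler
(php.exe crawl_photos.php) or cron covers the "crawl periodically" part
of step 2.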
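
And a minimal sketch of the download step (3), against the same
hypothetical photos table: the preview page links to something like
download.php?id=123, and the script looks up where the original lives
and streams it back.  Streaming with readfile() sidesteps the
copy-then-clean-up of steps 3 and 4; copying into a temporary web
directory and cleaning it later would work just as well.

<?php
// download.php -- look up one photo's location in MySQL and send the
// full-sized original (roughly 10 MB) to the browser.

mysql_connect('localhost', 'photouser', 'secret');
mysql_select_db('photosite');

$id     = (int) $_GET['id'];
$result = mysql_query("SELECT filename, path FROM photos WHERE id = $id");
$row    = mysql_fetch_assoc($result);

if (!$row || !is_file($row['path'])) {
    header('HTTP/1.0 404 Not Found');
    echo 'Photo not found.';
    exit;
}

// Stream the original straight from the mapped network drive.
header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize($row['path']));
header('Content-Disposition: attachment; filename="' . $row['filename'] . '"');
readfile($row['path']);
?>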

This is a very naive way to do it, but unless you are looking for an
industrial-strength solution, I think it would work OK.  It would work
better with an app server to cache the picture info and field requests
rather than just a crawler, and I would not store everything on some
world-accessible webserver.  You want to have some kind of
demilitarized zone between your webserver and your network.  But if
it's that serious of a project, you are best off hiring some
professional developers.
