I have a list of 600 URLs (trade journal sites, portals, etc.) focused
on a specific topic. I would like to build a search engine or use the
functionality of an existing search engine to enable visitors to my
Web site to search only these 600 sites. In other words, I would
like to create a search engine for only a specific list of URLs.
For example, a visitor to my site would type the term "General
Motors" into the search box, and the search engine would search only
these 600 URLs for Web pages containing those words.
I would like to know if:
1) There are any inexpensive (under $1,000, preferably under $500)
existing tools that would allow me to do this. I am not a programmer
and do not know Perl, Java, C# or any other language, and I would like
to avoid having to build my own Web spider, etc.
2) If there are no existing solutions, is there any way to leverage
the functionality of Google to search only these 600 sites. I know
this is possible for up to ten URLs, but I do not know of a way to
limit a Google search to only a list of 600 URLs.
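For reference, the ten-site technique I mentioned is just a matter of chaining "site:" filters together with OR in a single query. A minimal sketch of that approach (the domains here are placeholders, not my actual list):

```python
def build_site_query(terms, domains):
    """Combine search terms with site: restrictions joined by OR."""
    sites = " OR ".join(f"site:{d}" for d in domains)
    return f"{terms} ({sites})"

# Example with two placeholder domains:
query = build_site_query(
    '"General Motors"',
    ["example-journal.com", "example-portal.net"],
)
print(query)
# Workable for a handful of domains, but a query string built this
# way becomes unmanageable long before it covers 600 sites.
```

So the question is whether there is a supported way to apply this kind of restriction at the scale of hundreds of sites.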
Thank you.