Subject: Technical capabilities of Google/search engines
Category: Computers > Internet
Asked by: roelof-ga
List Price: $5.00
Posted: 25 Sep 2002 05:46 PDT
Expires: 25 Oct 2002 05:46 PDT
Question ID: 68854
I am looking for a way to obtain all the links from a site to other sites. E.g., what links does CNN.com have, across its entire site, to other sites? I know this can be done by spidering the site yourself and extracting the links, but since Google already has the source of the entire site, it is something they could do. They don't. Who does?
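The do-it-yourself approach mentioned in the question (spider the site, extract the links, keep only those pointing off-site) can be sketched with Python's standard library alone. This is a minimal illustration, not a full crawler: it parses one page's HTML and filters for links whose host differs from the site's own; fetching pages and following internal links recursively would be layered on top.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collects the href targets of all <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def external_links(html, base_url):
    """Return absolute links in `html` that point outside base_url's host."""
    parser = LinkExtractor()
    parser.feed(html)
    base_host = urlparse(base_url).netloc
    result = []
    for href in parser.links:
        absolute = urljoin(base_url, href)   # resolve relative hrefs
        if urlparse(absolute).netloc not in ("", base_host):
            result.append(absolute)
    return result
```

For example, a page on www.cnn.com containing `<a href="/world">` and `<a href="http://example.org/x">` would yield only the example.org link, since `/world` resolves to the site's own host.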
There is no answer at this time. |
Subject: Re: Technical capabilities of Google/search engines
From: thomasbonk-ga on 05 Oct 2002 01:28 PDT
roelof-ga, there is an article in the German Linux-Magazin that presents a Perl script that searches for all references to a web page (http://www.linux-magazin.de/Artikel/ausgabe/2002/10/perl/perl.html?print=y). The Perl script uses the Google Web Service (http://www.google.com/apis/). The article is in German, but the Google Language Tools (http://www.google.com/language_tools) might help you... HTH. Yours, Thomas
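Note that the approach in the article answers the inverse of the original question: Google's `link:` operator lists pages that link *to* a given URL (inbound links), not the links a site makes to others (outbound links). As a rough sketch, the same kind of query the article's Perl script issues through the API can also be formed against the ordinary web search interface:

```python
from urllib.parse import urlencode

def link_search_url(target):
    """URL for a Google web search using the link: operator, which
    lists pages that link TO `target` -- inbound, not outbound, links."""
    return "http://www.google.com/search?" + urlencode({"q": "link:" + target})

print(link_search_url("www.cnn.com"))
# http://www.google.com/search?q=link%3Awww.cnn.com
```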