Subject: Dynamically produced sub-domains
Category: Computers > Programming
Asked by: stevepoulson-ga
List Price: $27.00
Posted: 02 Feb 2004 19:53 PST
Expires: 03 Mar 2004 19:53 PST
Question ID: 303013
Hello, how can I get my internal pages found by Google? The internal pages are dynamically produced by trapping the 404 error response and then querying the database with the requested name. Google seems to stop before reaching any of the dynamically produced pages. Each dynamically produced page has its own dynamically created sub-domain, so an "allinurl" search finds only one other link. Any ideas, or should I go and make the sub-domains hardwired instead of dynamic? That would involve a little extra work, but querying the database is a really nice thing to be able to do, as no work is needed when new clients/pages are put into the directory. Please help.
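The core issue in the setup described above can be sketched in a few lines: a crawler that receives a 404 status treats the page as nonexistent, even if the error handler writes real content into the response body. A minimal sketch (the in-memory dict and the `resolve` function are hypothetical stand-ins for the site's database lookup):

```python
# Hypothetical stand-in for the database of client pages.
PAGES = {"/tuffstuff": "<html>Tuff Stuff client page</html>"}

def resolve(path):
    """Look up a requested path and return (status, body).

    The key point for crawlers: a page found in the database must be
    served with status 200, not as the body of a 404 error response.
    Googlebot treats a 404 status as "no page here" and stops, no
    matter what content the error handler generates.
    """
    body = PAGES.get(path)
    if body is not None:
        return 200, body
    return 404, "<html>Not found</html>"
```

In other words, the dynamic lookup can stay; only the HTTP status code sent back to the crawler needs to change.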
There is no answer at this time.
|
Subject: Re: Dynamically produced sub-domains
From: paulrobinson-ga on 05 Feb 2004 15:23 PST
Is your robots.txt file telling Google not to visit those pages?
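To check this, look at the robots.txt file at the site root. A fully permissive file (i.e. one that does not block any crawler) looks like this; an empty `Disallow:` line means "nothing is disallowed":

```
User-agent: *
Disallow:
```

If instead the file contains lines such as `Disallow: /` or a `Disallow:` covering the dynamic paths, Googlebot will skip them.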
Subject: Re: Dynamically produced sub-domains
From: robertskelton-ga on 05 Feb 2004 16:55 PST
These days most of Google's spam problems come from dynamically produced pages. The nature of search engines (record what is there and report on it later) means they will always have an aversion to sites that vary depending on who is looking at them, and when. Show us your URL and you might get some constructive comments or an answer.
Subject: Re: Dynamically produced sub-domains
From: stevepoulson-ga on 05 Feb 2004 17:16 PST
Thanks for your comments.

To Paul Robinson: I am pretty sure that I don't have a robots.txt file rejecting Google. As a sample from the log files:

2004-02-02 12:57:54 203.164.170.107 - 10.0.1.3 80 GET /testbanner2.prx - 200 Mozilla/4.0+(compatible;+MSIE+5.5;+Windows+NT+5.0;+OptusIE55-31)
2004-02-02 12:58:00 203.164.170.107 - 10.0.1.3 80 GET /clientimages/974/banner_boats.gif - 200 Mozilla/4.0+(compatible;+MSIE+5.5;+Windows+NT+5.0;+OptusIE55-31)
2004-02-02 16:36:21 66.150.40.221 - 10.0.1.3 80 HEAD /index.prx - 200 InternetSeer.com
2004-02-02 18:57:48 64.68.82.137 - 10.0.1.3 80 GET /ss/clienturl.prx 404;http://www.searchsmart.com.au/robots.txt 200 Googlebot/2.1+(+http://www.googlebot.com/bot.html)
2004-02-02 18:57:49 64.68.82.137 - 10.0.1.3 80 GET /index.prx - 200 Googlebot/2.1+(+http://www.googlebot.com/bot.html)

To Robert Skelton: the URL is www.searchsmart.com.au. All the clients have their own dynamic sub-domains, such as www.tuffstuff.s2s.com.au; everyone has their own sub-domain that is not listed within IIS. I hope these descriptions make it easier for you to comment; sorry they weren't clearer to begin with.

Regards, Steve
Subject: Re: Dynamically produced sub-domains
From: kurok-ga on 10 Feb 2004 13:09 PST
There are two cases:

1. The Google robot stops reading a page if the web server responds with 404 instead of 200. Solution: configure the web server (e.g. with a rewrite module) to analyse the URL before it is handled and rewrite it to the dynamic page, so the response is 200 and your page is served.

2. Google tends not to store dynamic pages on third-level domains. Solution: use SSI so that all pages appear to be plain HTML; inside the HTML you can use an included script (e.g. Perl) whose output is written into the page, and Google will index the result as a static page.
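The first suggestion above can be sketched as a rewrite rule. The question's server appears to be IIS (judging by the log format), where an equivalent ISAPI rewrite filter would be needed; the Apache mod_rewrite version, with the script name and query parameter purely illustrative, would look like:

```
# Sketch only: map any request that does not match a real file or
# directory to the database-driven script *before* a 404 is raised,
# so the crawler sees a 200 response.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ /clienturl.prx?page=$1 [L]
```

The effect is the same dynamic lookup the site already does via its 404 handler, but the rewrite happens internally, so the HTTP status returned to Googlebot is 200.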