Hi Aki,
1. There is nothing in the Google guidelines against a redirected
URL. However, there is no use in having two domains unless people are
likely to try guessing your URL. For example, when Compaq and HP
merged, perhaps they created a new site called compaqhp.com. Most of
their customers didn't know they had merged, so they would have needed
to redirect compaq.com and hp.com to the new site. That is where you
would need it. You don't have to keep two domains, and you don't want
to - because it can confuse search engines.
Look at this search results page:
link:peyb6E3C9CQJ:www.weddingbythesea-hawaii.com/
It has links to your weddingbythesea-hawaii site, and lists two pages
from mydearhawaii. One of those has a PageRank of zero:
www.mydearhawaii.com/packages.html
This is probably caused by "duplicate content", a Google no-no. The
weddingbythesea-hawaii version is not in the Google index. So you have
some small problems created by the two domains.
2. The easy way is to just include a robots.txt file at mydearhawaii
that tells search engines to go away. Then if you re-submit that site
to Google, it will be removed from Google's index in due course. It
only needs to contain the following:
User-agent: *
Disallow: /
and be placed at:
mydearhawaii.com/robots.txt
3. Ranking is highly dependent upon incoming links. To retain good
ranking you need to contact every site that links to mydearhawaii, and
ask them to change the link to weddingbythesea-hawaii.
Incoming links to mydearhawaii:
Google
http://www.google.com/search?hl=en&lr=&ie=UTF-8&oe=UTF-8&newwindow=1&safe=off&q=link:peyb6E3C9CQJ:www.weddingbythesea-hawaii.com/
AlltheWeb
http://www.alltheweb.com/search?q=%2Blink.all%3Amydearhawaii.com+-site%3Amydearhawaii.com&c=web&cs=utf-8&co=1&no=off&l=any
(Note that Google has combined links to both sites)
Then resubmit weddingbythesea-hawaii to Google every fortnight for the
next two months, to make sure it re-indexes your site after removing
the other one.
4. Additional insight! Do your best to distance yourself from
Build-More-Pages.com - Google can penalize sites that participate in
link farming. Try to get the link from this page removed:
http://www.just-wedding-links.com/hawaii_beach_wedding.html
Best wishes,
robertskelton-ga

Request for Answer Clarification by gox7-ga on 26 Jun 2003 01:45 PDT
Dear Sir,
Thank you for your reply.
Please clarify the removal process specifically.
Where exactly do I place the following?
User-agent: *
Disallow: /
You said it should be placed at:
mydearhawaii.com/robots.txt
But where in mydearhawaii.com do I place it? Where do I place robots.txt?
Please email the sample HTML passage to remove mydearhawaii.com.
Regards,
Aki
PS How long does it take to be removed?
Clarification of Answer by robertskelton-ga on 26 Jun 2003 02:53 PDT
The robots.txt is just those two lines:
User-agent: *
Disallow: /
...saved as a text file. It's as easy as pasting them into Notepad and
saving it as robots.txt, then uploading it to the top level of your
site (not in a directory), so that when the GoogleBot looks for the
file mydearhawaii.com/robots.txt it will find it.
How long it will take is impossible to predict. Google's update cycle
has been erratic lately as they have been doing a lot of adjustments
and improvements. After submitting the URL, the GoogleBot will visit
sometime in the next month, and tag your site as one to be removed
from the index. Typically the index is updated each month. So it will
be somewhere between a couple of days and two months.
Request for Answer Clarification by gox7-ga on 26 Jun 2003 04:34 PDT
Unfortunately, I still don't understand precisely what to do.
Please list what to do step by step.
The removal process on Google's website is just as unclear.
PS How did you find the link-farming page
http://www.just-wedding-links.com/hawaii_beach_wedding.html
I didn't even submit my URLs to them.
It is unfair that I should be penalized for something I didn't initiate.
I want to find more of those and remove my URLs from them.
Regards,
Clarification of Answer by robertskelton-ga on 26 Jun 2003 16:24 PDT
A couple of questions to help me explain the process step-by-step...
Are you using a Windows PC, and are you familiar with the Notepad
program?
Are you responsible for the creation of pages on your websites, and if
so, how do you upload them to your server?
Regarding the link farm, I found it by accident while checking to see
if your content was duplicated anywhere. I didn't find any others. It
appears to be a one-off, with several links to your sites and many to
about.com - a site that would be unlikely to participate in link
farms. I guess they must have an automated way to grab links from
search results, so they can populate new link pages.
Request for Answer Clarification by gox7-ga on 26 Jun 2003 16:42 PDT
Hello,
Yes, I am the webmaster, using Dreamweaver on Windows 2000.
I upload the site by providing FTP info to the server.
I am not familiar with Notepad.
I can do cut & paste, though.
Everything I have to do is inside the HTML code, right?
I appreciate your help.
Aki
Clarification of Answer by robertskelton-ga on 26 Jun 2003 16:54 PDT
It is very important not to use Dreamweaver to create the robots.txt
file. It is a plain text file. Although there are many text editors
that can do the job, NotePad is a standard program that is on all
Windows installations by default.
On my PC it is located at:
C:\WINDOWS\NOTEPAD.EXE
It is also available from the Start Menu under Accessories.
1) Run Notepad and open a new file
2) Cut and paste these two lines. Make sure they appear as two
separate lines.
User-agent: *
Disallow: /
3) Save the file as robots.txt in the directory where you keep your
website files
4) Upload it to your server, so that it is located at
mydearhawaii.com/robots.txt
5) Go to http://www.google.com/addurl.html and submit mydearhawaii.com
6) In due course the GoogleBot will visit your site. It automatically
checks to see if there is a robots.txt, and obeys the instructions,
which in this case is: all search engine robots go away, don't index
any part of this site
7) This will lead to mydearhawaii.com disappearing from Google's
index, most probably sometime within the next 2 months
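If you ever want to sanity-check that those two lines really do block
all crawlers, here is a small sketch (my addition, not part of the steps
above) using Python's standard urllib.robotparser; the URL is just the
site from this thread:

```python
# Sketch: confirm the two-line robots.txt blocks every crawler.
# Uses only Python's standard library; no network access needed.
from urllib import robotparser

# The exact contents of the robots.txt described above
rules = ["User-agent: *", "Disallow: /"]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# With "Disallow: /" under "User-agent: *", no bot may fetch any page.
print(rp.can_fetch("Googlebot", "http://mydearhawaii.com/packages.html"))  # False
```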
Request for Answer Clarification by gox7-ga on 05 Jul 2003 02:47 PDT
Hello,
How about a 301 code in the HTML header?
Will this clear up the redirecting issue completely for Googlebot?
If so, please email the exact header I should put.
Thanks
Clarification of Answer by robertskelton-ga on 06 Jul 2003 02:00 PDT
Hi again,
301 is the method Google recommends for when you permanently move your
site from one URL to another:
http://www.google.com/webmasters/4.html
I could not find any solid evidence that you should also do this for
duplicate sites, although there doesn't seem to be any reason why you
can't.
Search engine expert Jill Whalen is fine with the idea:
http://www.cre8asiteforums.com/viewtopic.php?t=1535
Search engine expert Brett Tabke seems unsure of the answer:
http://www.webmasterworld.com/forum10003/3847-1-15.htm
This article explains how to do it; just keep in mind that your server
needs to be running Apache:
http://www.tamingthebeast.net/articles3/spiders-301-redirect.htm
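For what it's worth, the Apache approach in that article boils down to
a single directive. As a rough sketch only - it assumes Apache with
mod_alias enabled, and an .htaccess file at the top level of
mydearhawaii.com that the server is configured to honour:

```apache
# Hypothetical .htaccess for mydearhawaii.com (assumes Apache with
# mod_alias enabled and AllowOverride permitting FileInfo directives).
# Permanently (301) redirects every request to the same path on the
# main site, e.g. /packages.html -> the weddingbythesea-hawaii copy.
Redirect permanent / http://www.weddingbythesea-hawaii.com/
```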
This appears to be a PHP way to do it if you aren't running Apache:
http://www.webmasterworld.com/forum3/11541.htm
Here's a server header checker:
http://www.searchengineworld.com/cgi-bin/servercheck.cgi
Although a 301 redirect is an option in your situation, I have
generally found that keeping things simple tends to work best with
search engines.