The question is not what you are doing wrong, but rather how Google
search results have changed.
Last November a large number of websites lost their good Google
rankings overnight, and it became a very hot topic amongst search
engine optimizers and webmasters. Pages didn't just drop out of the
top 10; they plummeted to below position 100.
I spent a week soliciting examples from webmasters for analysis, and
having discussions with search engine experts. I believe I worked out
what was going on, but my opinion is just one of many, and all we can
ever have are educated guesses, because Google keeps its search
algorithm secret (even from us at Google Answers).
So, here is what I think has happened, followed by links to the opinions of others.
Florida Update / Filter
Last November Google updated its index, and it has been dubbed by
webmasters "The Florida Update". At the same time Google introduced a
filter to combat the use of artificial link strategies to boost
search rankings.
Before the update, having a single phrase in every link to your site,
and using that same phrase in on-the-page optimizing, was an extremely
powerful strategy, albeit in a grey area of Google's guidelines.
With the Florida update, a huge number of sites targeting commercial
keywords dropped out of the search results. Debate
is continuing, and an agreement on what has caused this is far from
being reached. I have been in discussions involving many experts in
the field, and this is my own take on what has happened:
1) Google uses the words in and around links to a page as a ranking factor
2) Either due to the introduction of stemming in the search results,
or just a general desire to make results more relevant, Google
introduced a filter to dampen the effect link text has on sites that
are using link text to influence search results
3) Link text works great for non-commercial results, so the filter
only affects commercial keywords (possibly those that attract
advertising bids)
4) The filter dampens the effect of all links pointing to a page,
internal or external
5) The filter only kicks in when it sees the same keyphrase repeated
over and over again
6) The filter might then also look for on-the-page optimizing for that
phrase, and possibly determine whether a page is being overly
optimized for it
7) Sites affected by the filter almost always drop a few hundred
spots. I think that this is because link text was the only factor that
had them so high in the first place, but it could also be a set
penalty applied by the filter.
Late in January 2004 there was a new update. Some new sites are now
also being affected by the filter.
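As a hypothetical illustration of the trigger described in point 5, imagine a site whose internal and inbound links all use one identical phrase (the domain and phrase here are invented for the example):

```html
<!-- Hypothetical links found on several different pages,
     every one of them using the exact same link text -->
<a href="http://example.com/">cheap widgets</a>
<a href="http://example.com/widgets.htm">cheap widgets</a>
<a href="http://example.com/" title="cheap widgets">cheap widgets</a>
```

In my view it is this kind of identical, repeated link text that trips the filter, whereas varied, natural phrasing would not.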
Your Site's Ranking
To see the filter in action, visit the search tool at:
Enter your search terms: the first list shows where results would have
appeared without the filter, and the second list shows the current results.
Note that your site is far from being the only one affected!
The trigger for the filter (in my opinion, but I've yet to see a case
that doesn't fit) is link text. Google places a lot of emphasis on the
words in and around links to a page, as representative of the page's
content. Traditionally this has been used by webmasters to improve
search ranking. It still works in general, but no longer for many
commercial search terms.
The reason your site previously ranked well was link text containing
the keywords. For example, on your page for Hindu Wedding Cards there
is a link at the top of the page (obscured) that points to itself.
Having a link point to the page someone is already on is a bit dodgy.
In this instance it is:
<a title="Hindu Invitation Cards" href="hindu_wedding_cards.htm">Hindu
Wedding Cards</a>
On the left side is an image that is also a link referring to the page
it is on, and also with a title of "Hindu Wedding Cards". Unless you
have a menu comprised of images, the only linked images on your page
should be your site's logo pointing to your home page.
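To sketch what the fix looks like for the link quoted above (treating the anchor text as "Hindu Wedding Cards"), the self-referencing link can simply become plain text:

```html
<!-- Before: a link that points to the very page it appears on -->
<a title="Hindu Invitation Cards" href="hindu_wedding_cards.htm">Hindu
Wedding Cards</a>

<!-- After: the same words as plain text, with no self-referencing link -->
Hindu Wedding Cards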
Google's software will have decided that this could be an attempt to
artificially influence search result rankings, and has filtered your
page from the search results for that phrase.
Based on the new filter, and the way Google works out search ranking,
the "trick" of having links to a page using your targeted keywords is
on its way out.
The answer in your case is to greatly vary the keywords used in your
internal links, and to reduce the amount of internal linking within
your site, and remove all links that refer to the page they are on.
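Here is a sketch of what varied internal link text might look like for one of your pages (the alternative phrases are invented for the example):

```html
<!-- Instead of every link reading "Hindu Wedding Cards",
     vary the wording from page to page -->
<a href="hindu_wedding_cards.htm">wedding invitations for Hindu ceremonies</a>
<a href="hindu_wedding_cards.htm">traditional Hindu invitation designs</a>
<a href="hindu_wedding_cards.htm">browse our range of Hindu cards</a>
```

The point is not any particular phrase, but that the links describe the page in several natural ways rather than repeating one keyphrase.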
While this should stop the filter from kicking in, it will also remove
the most likely reason your site was ranking well previously.
The filter has effectively eliminated any ranking advantage your site
previously had over your opposition, creating a level playing field.
To rank highly now you will need to concentrate on the classic
attributes - lots of useful content and many genuine links pointing to
your site.
Search Engine Watch - Florida Google Dance Resources
Analysis and Implications of Hilltop Algorithm
WebProWorld :: Google's Response to Florida Update
Your site also has pages with substantially duplicate content.
Google Guidelines say:
"Don't create multiple pages, subdomains, or domains with
substantially duplicate content."
This is unrelated to your query, but is something your site could be
penalized for in the future.
I have tried to summarize what is a tricky and complicated situation.
If any of the above is unclear, or if you have further questions on
this topic, just ask for a clarification and I'll get back to you.