Hi again IONWeb ~
Cross-browser compatibility is a problem for web designers. Netscape
4.x handles some of the font codes, but beyond that it falls apart
miserably. It doesn't handle inline styles very well, handles inline
frames even worse, and the most reliable way to get absolute placement
is to use tables, which every browser renders without trouble.
===================================
CSS & Cross-Browser Compatibility
===================================
Some web designers feel it's not too important to design for Netscape
4.x, as it amounts to less than 2 or 3 percent of their visitors.
Others use a combination of tables and CSS. Still others use what is
referred to as the "@import hack" for CSS and Netscape. The Elegant
Hack weblog has a good example of the "@import hack" here ...
- http://www.eleganthack.com/blog/archives/00000019.html
Here's another page from the Newmoon.nl site, "Netscape 4.x CSS
problems and solutions"
- http://www.newmoon.nl/pip/code/css_netscape4.php
Interestingly enough, amznVibe found a workaround, which he described
on Webmaster World in the discussion thread "New Hack for CSS and
NN4! Great NN4 workaround", which is here ...
- http://www.webmasterworld.com/forum83/518.htm
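To give you a rough idea of how the "@import hack" works (the file
names below are just placeholders): Netscape 4.x does not understand
the @import rule, so you link a basic, NN4-safe stylesheet the normal
way, and put the fancier rules in a second stylesheet that only newer
browsers will load ...

```html
<head>
  <!-- Netscape 4.x reads this linked stylesheet (basic, safe rules) -->
  <link rel="stylesheet" type="text/css" href="basic.css">
  <!-- Netscape 4.x ignores @import, so only newer browsers load this -->
  <style type="text/css">
    @import url("modern.css");
  </style>
</head>
```

That way NN4 visitors get a plain but readable page, and everyone else
gets the full styling.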
Any of the above hacks will help to some extent with cross-browser
compatibility for your visitors, and they all still leave your pages
readable by the search engine crawlers. I would specifically stay away
from inline frames, since you don't need them with CSS, and I'd keep
in mind that some browser defaults may keep your visitors from seeing
what you intend.
There is a marvelous page - actually, a collaborative experiment in
CSS design. It's lovely to look at and gives you some good ideas about
what does - and doesn't - work. That site is CSS Zen Garden, and the
premise is the total separation of content from presentation. As such
it looks great. Except, of course, in older browsers, which get to see
a plain text version of the content. It also doesn't look "quite" the
same in Mozilla as it does in Netscape 7.x, or in IE, either, if
there are different defaults set. But the concept is
wonderful, and what the contributors have done is worth studying, if
only to get an idea of what you can do with CSS. Take a look for
yourself here ...
- http://www.csszengarden.com
What I have found *does* work, despite all the diehard CSS fanatics
who say it's cumbersome and so yesterday, is to use tables for the
structure and CSS to handle the content fonts, sizes, etc. That way I
know the site can be crawled and indexed properly by the search
engines with no problem, and it looks pretty much the same across all
browsers at all defaults, browser window sizes, and monitor
resolutions.
It's not so much laziness as the fact that there are no real savings
in coding (and subsequently file size, etc.) once you have to start
using hacks to accommodate different browsers, especially when there
is still a way in HTML (the use of tables) to make a site look the
way you intend and still be available to search engines.
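To make that concrete, here's a bare-bones sketch of the approach (the
class names and text are just placeholders): the table provides the
structure and placement, while a simple stylesheet handles only the
fonts and sizes ...

```html
<html>
<head>
  <title>Example layout</title>
  <!-- CSS handles fonts and sizes only; the table handles placement -->
  <style type="text/css">
    body, td { font-family: Verdana, Arial, sans-serif; font-size: 12px; }
    .heading { font-size: 16px; font-weight: bold; }
  </style>
</head>
<body>
  <table width="100%" cellpadding="10" cellspacing="0" border="0">
    <tr>
      <td width="20%" valign="top">Navigation links here</td>
      <td valign="top">
        <span class="heading">Page title</span><br>
        Main content here
      </td>
    </tr>
  </table>
</body>
</html>
```

Since the content is plain HTML inside table cells, the crawlers read
it just fine, and even the oldest browsers render the layout
consistently.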
=====================
Robots.txt
=====================
Regarding the robots.txt file: if you are new to this method of
allowing or preventing indexing, just use a simple statement like
this ...
User-agent: *
Disallow:
Save it as robots.txt and place it in the root directory of your
site; that is the only place it works. If nothing else, it will
minimize the 404 errors you may be seeing for robots.txt requests.
If you wish to exclude some content from being indexed, here's an
excellent tutorial by Search Engine World ...
- http://www.searchengineworld.com/robots/robots_tutorial.htm
Make sure you validate it with the Robots Text Validator!
- http://www.searchengineworld.com/cgi-bin/robotcheck.cgi/url
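For example (the directory names here are only placeholders), a
robots.txt that lets every crawler in but asks them to stay out of a
couple of directories would look like this ...

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
```

Each Disallow line is matched against the start of the requested path,
so "Disallow: /cgi-bin/" covers everything under that directory.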
Search terms used:
- CSS + Netscape
- import hack for css/Netscape
- Google + robots.txt
=============
Summary
=============
CSS is a valuable tool, but unfortunately, things start to fall apart
because each browser interprets it a little differently. Even current
browsers, such as Netscape 7.x, Mozilla 1.4 or 1.5 (and the Netscape
browser is built on Mozilla's Gecko engine), and IE 5.x and 6.x don't
all render a page the same way.
You can design using tables in HTML 4.01 Transitional, and it *does*
look the way you wish across the greatest percentage of browsers in
use. There is no problem with search engine crawlers, either, but the
choice is yours.
You asked for advice: I'd do it in tables, and, if you want, use
server side includes where appropriate. Anything the browser presents
to the viewer, including includes, is crawlable (is that even a word?)
by the search engine bots.
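If you do go the server side include route, a typical include looks
something like this (assuming your server has SSI enabled; the file
name is just a placeholder). The server stitches the file in before
the page is sent, which is why crawlers see it as ordinary HTML ...

```html
<!-- In a page served with SSI enabled (often a .shtml file), the
     server replaces this directive with the contents of footer.html
     before sending the page to the browser or the crawler. -->
<!--#include virtual="/includes/footer.html" -->
```

This is handy for navigation bars and footers you'd otherwise have to
repeat on every page.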
As for robots.txt, the simpler you can keep it, the better. A
robots.txt file doesn't *really* prevent any crawler, but the search
engine crawlers are nice guys and will not crawl what you prefer they
NOT index. The simple robots.txt is above, with links to further
information if you want to exclude any portion of your site.
Good luck in your redesign,
Serenata
Google Answers Researcher |