Google's quality guidelines for webmasters include the following specific advice:
"Don't create multiple pages, subdomains, or domains
with substantially duplicate content."
What are Google's Quality Guidelines:
Google, in common with other search engines, does not disclose the
precise effect that duplicate content will have on your search engine
placement. However, it is clearly unwise to deliberately violate such
guidelines.
I presume you want to have the same content available from both URLs
as a convenience to a websurfer who might omit or include the hyphen
when typing in the URL. In that case you can choose one of the
addresses as the one that you wish to be indexed under.
Users who type in the other address can still reach your content if
you configure your site to use an HTTP "301 redirect" to deliver them
to your preferred address.
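As a minimal sketch, such a redirect can be set up in an Apache .htaccess file with mod_rewrite. The domain names here are purely hypothetical (a hyphenated and an unhyphenated spelling); substitute your own addresses:

```apache
# Send every request for the unhyphenated spelling to the
# hyphenated address with a 301 (permanent) redirect.
# "examplesite.com" and "example-site.com" are placeholders.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?examplesite\.com$ [NC]
RewriteRule ^(.*)$ http://www.example-site.com/$1 [R=301,L]
```

The [R=301] flag is what marks the redirect as permanent; without it, Apache would issue a temporary (302) redirect, which search engines treat differently.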
In the Google Sitemaps Blog, there are some additional tips regarding
duplicate domains and 301 redirects:
"If your site runs on an Apache server, you can do this using
an .htaccess file. You can also use a script ... Once you
implement the 301 redirect, it may take some time for
Googlebot to recrawl the pages, follow the redirects, and
adjust the index.
"If your pages are listed under both versions of the domain,
don't use our URL removal tool to remove one version of the
pages. Since the pages are at the same physical location for
both versions of the domain, using the URL removal tool will
remove both versions from the index.
"We also suggest that you link to other pages of your site
using absolute, rather than relative, links with the version
of the domain you want to be indexed under ... and, wherever
possible, make sure that other sites are linking to you using
the version of the domain name that you prefer."
Inside Google Sitemaps: A few questions from our Google Group:
The webconfs.com SEO toolset states:
"301 redirect is the most efficient and Search Engine Friendly
method for webpage redirection. It's not that hard to implement
and it should preserve your search engine rankings..."
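The "script" route mentioned above can also be sketched in a few lines. Here is a hypothetical example using Python's standard http.server module; the preferred host name is an assumption for illustration only:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical preferred domain -- substitute the address you chose.
PREFERRED_HOST = "www.example-site.com"

class RedirectHandler(BaseHTTPRequestHandler):
    """Answer every GET with a 301 permanent redirect to the preferred domain."""

    def do_GET(self):
        self.send_response(301)  # 301 = moved permanently
        self.send_header("Location", "http://%s%s" % (PREFERRED_HOST, self.path))
        self.end_headers()

    def log_message(self, format, *args):
        pass  # suppress per-request logging in this sketch

# To serve the non-preferred domain with it:
#     HTTPServer(("", 80), RedirectHandler).serve_forever()
```

Because the redirect preserves the request path, a visitor asking the old domain for /page.html lands on the same page at the preferred domain.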
How To Create Redirects
Alternatively, you could use your robots.txt file to make sure that
crawlers can only reach one of your domains:
"If you do create duplicate domains, we suggest using a
robots.txt file to block our crawler from accessing all but
your preferred one."
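For illustration, the robots.txt served on the non-preferred domain would block all compliant crawlers like this (standard robots.txt syntax; which domain carries it is your choice):

```
# robots.txt on the duplicate (non-preferred) domain only.
# Do NOT place this on the domain you want indexed.
User-agent: *
Disallow: /
```

Note that each domain must serve its own robots.txt for this to work; the preferred domain's file should not contain the Disallow rule.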
Google Information for Webmasters:
I trust this provides the information that you are seeking. If not,
please request clarification.
Google Search Strategy:
google webmaster guidelines duplicate domains