I wonder whether a multilingual site that serves pretty much the same content under different language URLs, with only the navigation differing, is considered "duplicate content"?
Do you think I should use a robots.txt file to block all robots from crawling the versions of the site in other languages?
If so, do you think these robots.txt wildcard rules would work:
Code:
User-agent: *
Disallow: /*?$
Disallow: /*?
Disallow: /*lang=en
Disallow: /*lang
I'm storing the language in the URL; every URL ends in a language parameter like "?lang=en".
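For comparison, here is a variant I was considering that would keep only the English version crawlable while blocking the other languages. This is just a sketch: it assumes the crawler supports Allow directives and Googlebot-style wildcards (* and $), which not every robot does, and that ?lang= is the only query parameter in use.

Code:
# Sketch, assuming Googlebot-style wildcard/Allow support
# and that ?lang= is the only query parameter on the site.
User-agent: *
Allow: /*?lang=en$
Disallow: /*?lang=

Crawlers that follow Google's precedence rules pick the most specific (longest) matching rule, so the longer Allow line should win for English URLs while everything else with ?lang= stays blocked.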
Help is appreciated!
Olli