I keep seeing people in the domain industry write about how Google can detect websites with duplicate content and penalize them in the SERPs, and how uniqueness and freshness (among other factors) are key to ranking well.
On the other hand, I've found many sources stating that duplicate content isn't really a problem and that the penalty is just a widespread myth. The only case that may cause an issue, they say, is when your website consists entirely of content carbon-copied from other websites.
How can these two views be reconciled?
http://www.webpronews.com/topnews/2009/09/16/google-busts-the-duplicate-content-myth