Duplicate content in the SEO space has been an important topic for as
long as I've been in the industry. I've covered it here countless
times.
Today, I wanted to share what Google's John Mueller says about the "effects you'd see with content duplication within a website."
John says there are two main issues with duplicate content:
(1) Google's algorithms will choose one URL to show for the content in search. Maybe it won't choose the URL you'd choose. If you have a preference, make it known (through redirects, rel=canonical, internal links, etc.).
(2) Depending on the amount of duplication (is each piece of content hosted 2x, 20x or 200x?), crawling all of those copies can put too much load on the server, or new/updated content may not be picked up as quickly as it otherwise would be.
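To illustrate the first point, one common way to make your URL preference known is a rel=canonical tag pointing every duplicate variant at your preferred URL. Here's a minimal sketch; the normalization rules, parameter names, and helper functions are my own illustration, not anything Google prescribes:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Illustrative only: tracking parameters that commonly create duplicate
# URLs for the same underlying page.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign"}

def canonical_url(url: str) -> str:
    """Normalize duplicate URL variants to one preferred form."""
    scheme, netloc, path, query, _ = urlsplit(url)
    netloc = netloc.lower()
    # Drop tracking parameters so /page?utm_source=x maps to /page.
    kept = [(k, v) for k, v in parse_qsl(query) if k not in TRACKING_PARAMS]
    # Treat /page and /page/ as the same URL.
    path = path.rstrip("/") or "/"
    return urlunsplit((scheme, netloc, path, urlencode(kept), ""))

def canonical_link_tag(url: str) -> str:
    """The tag you'd place in each duplicate page's <head>."""
    return f'<link rel="canonical" href="{canonical_url(url)}">'
```

So `canonical_link_tag("https://Example.com/page/?utm_source=news")` yields `<link rel="canonical" href="https://example.com/page">`. The same normalized URL could equally serve as the target of a 301 redirect, the other signal John mentions.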
John said that in most cases, with a "reasonable amount of duplication" and a "reasonably strong server", these are not issues: "neither of these are real problems. Most users won't notice the choice of URL, and crawling can still be sufficient."
Read More: http://goo.gl/qFnwYP