Increasing traffic to our blog is one of the main objectives of everyone who blogs professionally. Duplicate content is one of the most common SEO problems that prevents or hinders organic traffic growth, adversely affecting our site.
Below I explain why it happens, how to detect it, and how you can fix it.
What is duplicate content? It is text repeated across multiple pages, either within your own blog or elsewhere (hence the problem with the well-known practice of publishing "press releases" or similar material). Google wants original, diverse, unique information so it can give its users a good experience, so when its crawlers find one, several, or many pages with similar or identical content (title and text), it simply filters them out and excludes them from its results.
This can happen for various reasons, among them:
On sites external to your blog: when someone copies and pastes the full text of one of your posts (even if the copier makes some modifications, the search engine detects the copy). This is plagiarism, or at least it is detected as such. It is often done without malice, both because of the popular idea that information on the Web is free and out of ignorance of the problem it causes: the search engine, unable to recognize who the original owner of the text is, penalizes both sites, or worse, it can sometimes recognize the site that copied you as the original author and penalize you instead. A horrible situation, but it has happened.
When using a content syndication system (RSS, FeedBurner, etc.), we must always make sure it publishes only an excerpt of our content, because if these systems publish the full text it will be considered duplicate.
Within your own blog: when you file a single post under different categories, tags, or pages. If a post is archived in several sections at once, it will appear at different URLs, which is a common problem or mistake on blogs. The same happens when different pages within the blog share the same title or description.
Tools to detect duplicate content and solve the problem
Google Webmaster Tools: use this tool to monitor the status of your blog with Google (if you have not done so yet, I recommend registering your blog in Google Webmaster Tools, or the Spanish-language version if you use Google in Spanish; it is simple and very effective).
Log in to your account, go to "Optimization" and then "HTML Improvements", and review the duplicate title tags and duplicate meta descriptions. The tool tells you how many copies exist and on which pages it found them, so you can review and resolve each one.
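If you prefer to check your pages yourself, the same grouping the tool performs can be sketched in a few lines of Python using only the standard library. The URLs and HTML below are illustrative placeholders, not real pages; in practice you would fetch each page of your blog.

```python
from html.parser import HTMLParser
from collections import defaultdict

class HeadScraper(HTMLParser):
    """Collects the <title> text of a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Illustrative pages: the same post filed under two URLs shares a title.
pages = {
    "/post-a": "<html><head><title>My Post</title></head></html>",
    "/cat/post-a": "<html><head><title>My Post</title></head></html>",
    "/post-b": "<html><head><title>Another Post</title></head></html>",
}

# Group URLs by title, exactly what the "HTML Improvements" report does.
by_title = defaultdict(list)
for url, html in pages.items():
    scraper = HeadScraper()
    scraper.feed(html)
    by_title[scraper.title].append(url)

# Any title shared by more than one URL is a duplicate to fix.
duplicates = {t: urls for t, urls in by_title.items() if len(urls) > 1}
print(duplicates)  # → {'My Post': ['/post-a', '/cat/post-a']}
```

The same loop works for meta descriptions: collect the `content` attribute of `<meta name="description">` tags and group by that instead.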
Google Analytics: use this tool to measure and monitor your blog's traffic statistics (if you do not, I recommend doing so, as it is excellent and gives you very valuable information). Log in to your account, go to "Content", then "Site Content", and finally "Landing Pages". The key is to look for suspicious URLs and pages that receive less organic traffic than they should.
To track duplicate content outside your site, you can also use CopyScape or Plagiarisma.com.
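If you already suspect a specific page of copying yours, you can also get a rough similarity score yourself with Python's standard `difflib` module. The two text snippets here are made up for illustration; the 80% threshold is just a reasonable rule of thumb, not a value Google publishes.

```python
import difflib

# Illustrative texts: your original paragraph and a suspected copy
# with a couple of words changed.
original = ("Duplicate content is one of the most common SEO problems "
            "that hinder organic traffic growth.")
suspect = ("Duplicate content is one of the most common SEO problems "
           "that slow down organic traffic growth.")

# ratio() ranges from 0.0 (nothing in common) to 1.0 (identical).
ratio = difflib.SequenceMatcher(None, original, suspect).ratio()
print(f"Similarity: {ratio:.0%}")

# A high ratio is worth investigating by hand before contacting anyone.
if ratio > 0.8:
    print("Likely a copy; review both pages and contact the site owner.")
```

Note that small wording changes barely lower the score, which mirrors what the post says: modifying a few words does not stop the copy from being detected.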
And once the problem is detected, how do you solve it?
It is a truly cumbersome problem, and you must arm yourself with patience to fix the errors manually, one by one, if your site is several years old. If it is new, you may not have much duplicate content yet, and the earlier you tackle the problem the better.
If Google Webmaster Tools reports internal duplicate content issues on your blog, you need to correct the page titles and meta tags, manually, one by one.
If the copy is on an external site, you will have to contact its administrators and ask them to remove your content from their pages. Communicate via email and try (even if you are very annoyed) to handle it amicably, as there is always the possibility that it was unintentional.
Some things you can do to avoid duplicate content:
- Do not use the same title and/or description on more than one page.
- The text of each page should be unique, both within your blog and across the Web.
- When you quote another site, always include a link to the original.
- Never copy an entire page; but if you do decide to, ask permission first, include a link to the source, and keep search engines from indexing the resulting page (you can use rel="nofollow" on links to it, or better, a noindex robots meta tag on the page itself).
We will continue discussing this issue and possible solutions in future posts, as it has been one of the main problems bloggers are facing with Google's new Panda algorithm update.