Study Shows 29% Of Sites Face Duplicate Content Issues & 80% Aren’t Using Microdata

Recently, Raven Tools conducted a study that uncovered several major on-page elements that are commonly being overlooked.  One of the biggest offenders identified in the study was duplicate content.  The results show that 29 percent of websites have duplicate content, and 80 percent of websites don't use microdata.

For the study, Raven's Site Auditor Tool was used to identify 4 billion issues across 200 million page crawls.  In case some of you aren't familiar with the Site Auditor Tool, it is an industry-leading on-page reporting tool that surfaces code and search issues across a site.  According to the factors, as determined by Raven, the average site has 4,500 issues related to search.


As stated before, the biggest issue discovered by Raven's tool was duplicate content.  The Site Auditor tool found that 22 percent of title tags were duplicates, and 17 percent of meta descriptions were duplicates as well.
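The kind of duplicate title/meta-description check the study describes can be sketched in a few lines.  This is not Raven's implementation; the page data below is purely hypothetical, standing in for what a crawler would collect:

```python
from collections import Counter

# Hypothetical crawl results: page URL -> (title tag, meta description).
# In a real audit these values would come from crawling the site.
pages = {
    "/home":    ("Acme Widgets", "Buy widgets online."),
    "/shop":    ("Acme Widgets", "Browse our widget catalog."),
    "/about":   ("About Acme", "Buy widgets online."),
    "/contact": ("Contact Us", "Get in touch with Acme."),
}

def duplicate_values(values):
    """Return the set of values that appear on more than one page."""
    counts = Counter(values)
    return {value for value, n in counts.items() if n > 1}

# Titles and descriptions shared by multiple pages.
dup_titles = duplicate_values(title for title, _ in pages.values())
dup_descriptions = duplicate_values(desc for _, desc in pages.values())

print(dup_titles)
print(dup_descriptions)
```

Running a check like this across a full crawl is essentially how a site audit flags the 22 percent duplicate-title and 17 percent duplicate-description figures reported above.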

Although microdata has been a big topic lately, apparently using it hasn't been.  Despite its popularity in the industry, a mere 20 percent of sites are using microdata.  Last year, a survey showed that 36 percent of results in the SERPs were displaying Schema markup.  Because microdata may play a role in how results are displayed, here's hoping that number will increase by the time the next survey takes place.
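For context, "using microdata" means marking up HTML with `itemscope`/`itemtype`/`itemprop` attributes pointing at Schema.org types.  A minimal sketch of how a crawler might detect that (checking only microdata attributes, not JSON-LD or RDFa, which a full audit would also cover) could look like this:

```python
from html.parser import HTMLParser

class MicrodataDetector(HTMLParser):
    """Flag a page as using microdata if any tag carries
    an itemscope or itemtype attribute."""

    def __init__(self):
        super().__init__()
        self.has_microdata = False

    def handle_starttag(self, tag, attrs):
        names = {name for name, _ in attrs}
        if "itemscope" in names or "itemtype" in names:
            self.has_microdata = True

def uses_microdata(html: str) -> bool:
    detector = MicrodataDetector()
    detector.feed(html)
    return detector.has_microdata

# A page with Schema.org Product markup vs. one without.
marked_up = '<div itemscope itemtype="https://schema.org/Product">' \
            '<span itemprop="name">Widget</span></div>'
plain = "<div><p>No structured data here.</p></div>"

print(uses_microdata(marked_up))  # detected
print(uses_microdata(plain))      # not detected
```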


And as if this was any big surprise, the study found that 83.13 percent of sites use Google Analytics for tracking.  Previous reports had put Google Analytics usage at only 53 percent and 59 percent, so Google's hold on tracking has clearly gained serious traction as of late.

Head over to Raven for the full report details.

The images used in this post are from Raven Tools.

