How Duplicate Content Can Totally Kill Your Website

There are many ways to create content that will attract visitors to your site and rank it high in Google for tons of long-tail keywords. You could design an infographic, shoot a video, record a podcast, and more. However, nothing beats good ol’ text. Writing articles is still the best way to convey information to your readers.

However, writing isn’t fun – at least for most people. You are much more likely to enjoy recording a video or podcast of yourself speaking than sitting in front of your PC and putting your fingers to work.

As a result, many internet marketers and bloggers take the easy way out – copying articles from other sites and publishing them on their sites on a large scale.

This results in tons of duplicated articles, and it’s a sure recipe for destroying a site’s potential or whatever you’ve achieved with it so far.

Want some proof? Let me tell you a short story of how duplicate content took its toll on me.

How Duplicate Content Almost Crippled My Blog Network

I run a fairly large blog network for one of my clients. He wanted to rank for some huge national terms in a highly competitive niche, so I built a network for him.

Right now, he’s ranking high for some of the top keywords in his niche, and making a killing as a result. These keywords don’t get a ton of search traffic per month, but a single lead can mean thousands of dollars in his pocket once the deal is closed.

The blog network I built for him is also responsible for the huge success he has achieved maintaining his reputation online. I simply directed some links from this network to some websites optimized to rank for his name and business and we were able to push down all the negative reviews showing up whenever we Googled his name or business.

The network consists of hundreds of blogs built on aged, expired domain names. I also integrated a smaller network into mine to boost his rankings even further.

I provided templates and instructions to the freelancer who built the smaller (supplementary) network. Therefore, I saw no need to go through it and make sure it was built properly. I assumed all was well.

As you already know, blog networks require content, and you can’t realistically buy unique content for hundreds of blogs. Even though the cash was available, it didn’t seem like it was worth it since there were way cheaper alternatives to get content on those blogs.

The cheapest solution was, of course, reusing articles already published on the web. So, that’s how we started using duplicate content on hundreds of blogs and it was a smooth ride for the short period of time it lasted.

As time went by, we started losing blogs. They were being removed from the search results at a faster than normal rate. The smaller network was affected as well.


Of course, if you own a PBN, you may lose one or two blogs once in a while. It’s perfectly normal and not a cause for concern.

Nonetheless, I was losing as many as 5 blogs per week from my network and about 20 per week from the smaller network. It was an alarming deindexing rate and I knew I had to do something about it.

So, I began to seek out the culprit.

I started with the domain names. I double-checked each of them to make sure they were actually spam free since they were bought from an expired domain seller. It was a tedious process as I had to do this for hundreds of blogs. My VA assisted me and we eventually completed the check. Each domain was clean in the end, so the digging continued.

Next, we assessed the unique IPs each blog was hosted on. None of them were blacklisted, so we went ahead to compare the deindexing rate of blogs recently added to the network and those added months and years ago. We couldn’t find anything unusual. New and old blogs were getting deindexed.
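For anyone who wants to run the same kind of IP check, here’s a minimal sketch of how a DNS-blacklist lookup works, assuming Python and the public Spamhaus ZEN zone (this is the standard DNSBL mechanism, not necessarily the exact tool we used):

```python
import socket

def dnsbl_query_name(ip: str, zone: str = "zen.spamhaus.org") -> str:
    """Build the reversed-octet hostname used to query a DNS blacklist.
    e.g. 203.0.113.7 -> 7.113.0.203.zen.spamhaus.org
    """
    reversed_octets = ".".join(reversed(ip.split(".")))
    return f"{reversed_octets}.{zone}"

def is_blacklisted(ip: str, zone: str = "zen.spamhaus.org") -> bool:
    """A listed IP resolves to a 127.0.0.x answer; an unlisted one fails to resolve."""
    try:
        socket.gethostbyname(dnsbl_query_name(ip, zone))
        return True
    except socket.gaierror:
        return False
```

Loop that over every blog’s hosting IP and you can rule blacklisting in or out in minutes rather than checking each IP by hand.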

After a good deal of thinking and brainstorming, we decided to visit each site and have a look. That was when we finally spotted the simple culprit behind all that strain and mental exhaustion – duplicate content.

After this discovery, it became clear that copying and pasting articles was no longer a viable content strategy, so we decided to spin each article to make it unique before using it.

This change was rolled out to both existing and new blogs. It took a while, but we finally got it done and voila, the problem was solved. I’m no longer losing 25 blogs per week across the two networks, just the occasional 1 or 2.

I actually developed my own article spinner and it’s what I used to resolve this problem. It’s an incredible spinner and works better than most of the spinners on the market. You can check it out at SpinnerBros.com.

The True Definition of Duplicate Content

Not everything copied from one site to another constitutes duplicate content to search engines. For example, copying one of your favorite quotes and using it word-for-word within your article won’t result in any problems for your site.

In the same vein, copying a customer review, for example, and using it in a product review article on your blog won’t cause any issues.

I like to call any piece of written content that isn’t at least 50% unique duplicate content. By that measure, you could copy and paste an entire paragraph from another site into an otherwise original article and your site would be fine.
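If you want to put a rough number on that 50% threshold yourself, here’s a quick sketch using Python’s standard difflib module. It’s a crude approximation – commercial plagiarism checkers use more sophisticated scoring – but it illustrates the idea:

```python
from difflib import SequenceMatcher

def percent_unique(article: str, source: str) -> float:
    """Rough share of the article NOT matched against the source, 0-100.

    SequenceMatcher.ratio() returns 1.0 for identical texts and 0.0 for
    completely different ones, so we invert it to get a uniqueness score.
    """
    ratio = SequenceMatcher(None, article.lower(), source.lower()).ratio()
    return round((1 - ratio) * 100, 1)
```

An article scoring below 50 against any single source would count as duplicate content under my definition above.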

You could also publish ten copied articles (with attribution links obviously) on a site containing 100 unique quality articles and your site would have no problems.

That said, publish 50 articles copied from start to finish on a blog that is otherwise all duplicate content, and that’s likely to land you in trouble with both the original authors and Google.

What about Viral News Blogs?

Why are sites like ViralNova and BuzzFeed getting tons of traffic?

I run a popular viral news publishing service, Trending Traffic. My team and I have used this very platform to grow sites to hundreds of thousands of visitors, thanks to the fascinating nature of viral content.

It worked quite well, and we’re still using it today, along with thousands of clients who are pleased with the service. The big question is: why?

We didn’t depend on Google for traffic (even though they did send us a good portion of the traffic those sites received).

You see, as long as you’re linking back to the original authors of the content you’re duplicating, they aren’t complaining, and you aren’t depending on Google, you’ll be fine.

We primarily used social platforms like Facebook to grow our viral sites, and Facebook doesn’t care about duplicate content. However, Google does.

This is why we advise our users to curate their viral articles before publishing them. This means putting a spin on the articles by adding your own thoughts to them before publishing. This is exactly what sites like BuzzFeed do, and look where it has gotten them:

[Screenshot: BuzzFeed’s search traffic growth, via SEMRush]

That’s 14.8 million visitors per month from search engines alone after just 6 years online. According to SEMRush, they were getting more than 500,000 visitors per month just after their first year in existence.

Curating, instead of duplicating, works best and can double or even triple your traffic because you’ll be getting traffic from Google as well since your articles are not 100% duplicated.

Do Auto Blogs Work The Same Way?

Auto blogs are the worst offenders you can think of. Tons of duplicate articles are published on them daily, and they aren’t even checked for accuracy, nor do they have the viral element that attracts visitors from social platforms.

Therefore, they rely on Google and other search engines for their traffic. This used to work and still works, especially if the blog has a lot of link authority (high domain authority or citation/trust flow).

Another big question…why?

How Does Google Treat Duplicate Content?

Have you ever published an article only to find out that it was scraped off your site and the scraper site is outranking yours for your target keywords?

Perhaps it’s never happened to you, but it’s happened to tons of webmasters. You can Google it. It doesn’t matter whether your content was indexed first. Anyone can copy your content and outrank you with it any day, as long as they build more link authority to their piece of content.

Extreme example…

If Amazon.com republished a review of a product originally published on your blog, which do you think would rank higher in Google, the reproduced article on Amazon or the original article on your blog?

Question answered. This is one of the reasons why auto blogs still work.

Are Penalties Being Dished Out for Duplicate Content?


If you’re involved in what I like to call soft duplication (copying quotes, single paragraphs, curating, etc.), you have nothing to worry about.

Whenever Google notices two identical articles on the web, they don’t penalize either site for having the same content. They simply rank one above the other.

If there are many copies of the article on different websites, the majority of those will be archived in the supplemental results. This means you won’t ever find them in the search results even if you click to the last page.
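To get an intuition for how a search engine can group near-identical copies like this, here’s a sketch of word-shingling with Jaccard similarity – a classic near-duplicate detection technique from the information-retrieval literature. Google’s actual pipeline isn’t public, so treat this as an illustration of the idea, not their implementation:

```python
def shingles(text: str, w: int = 3) -> set:
    """All w-word shingles (overlapping word windows) of the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a: str, b: str, w: int = 3) -> float:
    """Jaccard similarity of the two shingle sets.
    Values near 1.0 mean the documents are near-duplicates."""
    sa, sb = shingles(a, w), shingles(b, w)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)
```

Two copies of the same article score close to 1.0 even if a few words were tweaked, which is exactly why light edits don’t fool duplicate filters – only the lowest-similarity copies escape being grouped together.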

Nevertheless, if you’re involved in ‘hard’ or ‘heavy’ duplication, that’s where things get interesting!

Google’s constantly on the lookout for auto blogs and other sites publishing 100% duplicate content. These sites are removed from the index (in other words, deindexed) when they get on Google’s radar.

Duplicate content is a form of spam, but it happens on the web all the time for varying reasons. If you look like you’re spamming, your site will get deindexed.

How Duplicate Content Kills Your Site

If you’re filling your auto blog with duplicate content, as already mentioned above, it can get your site removed from the search results. This will likely lead to complete loss of traffic (if you’re dependent on Google) and earnings, if any.

Duplicate content can also prevent your site from getting significant traffic from Google. If you’re mostly copying articles from sites that are more authoritative than yours, you’ll probably never outrank them for their own content. Therefore, you won’t get as much traffic as you should.

These apply to new websites and old websites alike.

If you’re running a 5-year-old authority site and you suddenly switch to a duplicate content publishing schedule, don’t expect Google to spare your site.

Authority sites that contain mostly duplicate or low quality articles usually don’t get deindexed. Instead, a site-wide penalty is applied to them and their search traffic plummets overnight.

This reminds me of the panda update that wiped out many article directories and content farms.

How to Solve Duplicate Content Issues

The only way is to start publishing unique content or curate copied articles before using them. If you don’t have the cash to hire a writer or you think it’s not worth it, curate the articles. If you don’t have the time for that, use an article spinner.

Article spinners are my favorite and the cheapest method of getting unlimited unique content to use as you like without worrying about duplicate content problems.
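Most spinners, whatever the brand, are built around the same "spintax" format, where `{option one|option two}` marks interchangeable phrases. Here’s a toy expander to show the mechanics – a simplified sketch, not the engine behind SpinnerBros or any commercial tool:

```python
import itertools
import random
import re

# Matches an innermost {a|b|c} spintax group (no nested braces inside).
SPINTAX = re.compile(r"\{([^{}]*)\}")

def spin(text: str, rng: random.Random) -> str:
    """Replace each spintax group with one randomly chosen option."""
    while True:
        m = SPINTAX.search(text)
        if not m:
            return text
        choice = rng.choice(m.group(1).split("|"))
        text = text[:m.start()] + choice + text[m.end():]

def variants(text: str) -> set:
    """All unique expansions of a non-nested spintax string."""
    parts = SPINTAX.split(text)  # odd indices hold the option groups
    pools = [p.split("|") if i % 2 else [p] for i, p in enumerate(parts)]
    return {"".join(combo) for combo in itertools.product(*pools)}
```

For example, `"{Hi|Hello} there"` expands to two variants, and a template with a dozen groups of three options yields over half a million combinations – which is how one seed article becomes "unique" content for hundreds of blogs.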

Conclusion

So, I’ve done my best to clear up some of the confusion regarding what constitutes duplicate content and explain how it can kill your site. However, I’m in no way saying you should avoid using it. It’s totally fine if that’s what works for you, but it’s risky.

I strongly recommend that you go the spinning route or take the time to curate articles. It’s definitely worth it.

If you have any questions, just leave a comment below.
