The impacts of Google’s Panda 4.0 algorithm update continue to be uncovered.
One of the assumed consequences of the recent update to the search giant's algorithm was that it would place even more emphasis on quality copy for websites. It has also been a standard rule of thumb that up to about 25% of a website's copy could be duplicate, whether scraped, lifted, or whatever your verb of choice, from elsewhere online, as long as the rest was original.
However, Glenn Gabe, writing for Search Engine Watch, reports that Panda 4.0 appears to have reduced that acceptable amount of duplicate copy dramatically.
“[T]he body of the page wasn’t copied in its entirety. There were just pieces of the content that were copied word for word. I saw this across several sites experiencing a steep drop in traffic due to Panda 4.0.”
For a while now, Google has frowned upon websites that lift large chunks of copy from other websites, whether affiliated with their own business or not. However, Gabe's findings indicate that even small amounts of duplicate copy can result in a website getting dinged in search results.
Again we see an algorithm update that favors websites that "do it right", meaning websites that take the time to write original, quality copy and follow SEO best practices.
The one area where this may have unintended consequences is for websites of retail businesses that sell third-party products. For example, a business that sells Widgetco-brand hot tubs likely relies on Widgetco's product specs to sell those products. If Panda 4.0 really cracks down on copy lifted from third parties, even copy shared between official business partners, retailers may need to write fresh copy for each product spec.
But, generally, for businesses that take the time to build quality websites that win traffic by offering original content and answering the questions search engine users are asking, Panda 4.0 is another welcome update.