In this blog we are going to address some of the main questions we are asked when discussing search engine optimisation with clients.
What is an “Algorithm” and how does this affect my website? What is “Google Panda” or “Google Penguin”? – Why does Google use animal names to refer to SEO and what does it all mean? We will attempt to simplify and answer these questions here in our latest blog.
What is Google Panda?
“Google Panda” was first released back in February 2011, and the update was named “Panda” internally after Navneet Panda, one of the Google engineers who developed it.
The Google Panda release primarily targeted websites with duplicate or repeated content, along with content farms, in order to protect the websites out there with genuine and unique content.
It is always SO important to create your own unique and quality content that is cohesive and informative. The key to SEO isn’t to stuff pages with keywords in the hope that Google will pick up your website, but to deliver content that actually means something – content of value. Because of this it is vital to spend the time and effort to write good quality web page content.
Just so you know, as part of our SEO service we do the content for you. Our experienced team hand-write every last piece of content that goes into our customers’ websites, ensuring that they achieve the highest positions with purposeful, informative and quality page content.
What are “Spun” or “Re-Written” articles? Did Google Panda target these websites as well?
“Spun” or “re-written” page content (or articles) is content lifted from another website and vaguely re-written, or “spun” using software. Spinning is essentially re-writing done automatically by a program – the problem is that article spinning only produces many slightly different versions of the original unique article. It amounts to saying the same thing in many different ways, and most spinning software simply replaces the original words with synonyms.
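To show just how crude this synonym-swapping really is, here is a toy sketch (the synonym table and `spin` function are invented for illustration – real spinning software uses large thesauri, but the technique is the same):

```python
import random

# A toy synonym table; real spinning software ships with huge thesauri.
SYNONYMS = {
    "quick": ["fast", "rapid", "speedy"],
    "article": ["post", "write-up", "piece"],
    "good": ["great", "fine", "decent"],
}

def spin(text):
    """Replace each known word with a random synonym - the crude
    substitution technique described above. The output reads awkwardly
    because synonyms are chosen with no regard for context, which is
    partly how spun content gets detected."""
    words = []
    for word in text.split():
        choices = SYNONYMS.get(word.lower())
        words.append(random.choice(choices) if choices else word)
    return " ".join(words)

print(spin("a quick good article"))
```

Run it a few times and you get several “different” sentences that all say exactly the same thing – which is precisely what Panda was built to catch.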
Google is fully aware of this unethical practice and can detect spun or duplicated web page content a mile off. Search engines are always on the look-out for spun content, and any website carrying it will have its online credibility and reputation seriously damaged as the site is downgraded to a lower rank on results pages.
More importantly, I cannot see why anyone would actually want to copy or spin someone else’s web page content – surely your business is unique, and you would want to spend the time creating your own text that is a direct reflection of you and your business rather than using someone else’s page content?
Whilst article duplication and spinning weren’t so heavily frowned upon in the first Panda release, in subsequent Google releases this has become a big no-no, and offending websites that don’t have at LEAST 80% unique content will be pushed down the ranks.
In summary, the main points that the Google Panda release addressed were:
- Creating unique content is king.
- Don’t copy, re-write or spin content or articles from other websites to reuse.
- Don’t spam blogs or use “auto blogs” to write duplicated content.
- Take the time and effort to ensure your website content is totally unique.
- Be careful you don’t “over-optimise” your pages (with keyword stuffing, etc).
What is Google Penguin?
Google has released a number of “Penguin” updates – a significant step forward in the world of SEO. The first release was back in April 2012, and it was aimed at penalising websites that violated the Google Webmaster Guidelines by using known “Black Hat” or incorrect SEO methods.
Among the “Black Hat” methods targeted were link schemes and the creation of duplicated and spun content. The release went live on April 24th, 2012, although Google didn’t announce its official name until two days later.
The difference between Penguin and the previous Panda updates was Penguin’s focus on de-ranking websites that provided a bad or poor user experience. Even smaller aspects, such as page content and its positioning, could potentially push a website down search engine results pages.
Alongside the new Penguin updates, the themes of the earlier Panda release still resonated: a continued focus on eliminating (as much as possible) content duplication, link farming and auto-blogging.
Since Google first released its Penguin update, there have been subsequent releases that address other issues. These include hidden text and hidden links – words or links added to a page in the same colour as the background (white text or links on a white background, for example), text placed behind an image, tiny unreadable fonts, and CSS used to hide text.
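As a rough illustration of how some of these tricks might be spotted, here is a toy checker (the patterns and `flag_hidden_text` function are invented for this sketch – real crawlers render pages and use far more signals than a handful of inline-style patterns):

```python
import re

# Crude heuristic patterns for common hidden-text tricks. This is a
# toy sketch only; a real detector would render the page and compare
# computed styles, not grep for inline CSS.
HIDDEN_PATTERNS = [
    r"display\s*:\s*none",        # element hidden entirely via CSS
    r"visibility\s*:\s*hidden",   # element present but invisible
    r"font-size\s*:\s*0",         # unreadably tiny text
    r"color\s*:\s*#?fff(?:fff)?\b",  # white text (suspect on white pages)
]

def flag_hidden_text(html):
    """Return the patterns found in a page's markup - a crude sketch
    of how same-colour or CSS-hidden text might be flagged."""
    return [p for p in HIDDEN_PATTERNS if re.search(p, html, re.IGNORECASE)]

sample = '<p style="display:none">cheap keywords here</p>'
print(flag_hidden_text(sample))  # flags the display:none rule
```

The point isn’t the detection code itself – it’s that these tricks are mechanical and repetitive, which makes them easy for search engines to find and penalise.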
Penguin 2.0 went live in May 2013, and Matt Cutts (then head of Google’s webspam team) promised that this release would ensure any SEO company flouting the rules came up against far more in-depth updates – he wasn’t wrong.
This release brought a mass of changes and was heavily focused on link-farming, comment spamming on blogs and pages where commenting functions were enabled, guest posting spam and article marketing sites to name but a few.
Penguin 2.0 also focused on the quality of page content, its links and the overall “authority” of the website. “Authority” was one of Google’s bigger changes: Google wants to help websites and their owners become an authority in their respective niche, and will rank those sites higher if they show promise of this. Some of the ways to assert your authority are to share your information socially and to connect your business to Google+.
Google also advised in the Penguin update that your page content should be created for a target audience. This is something Google has been telling everyone to do for years and years; in the earlier days, search engines found it difficult to distinguish between good and bad page content, but the Penguin 2.0 release shows real progress in validating page content and separating remarkable content from bad content.
Good quality online content must be shared and absorbed if it is to gain any credibility and more importantly, visibility in search engine results. If content is of high quality then search visibility happens naturally throughout the SEO process.
A point to consider here is that the Penguin 2.0 release specifically targets websites (and pages) that get a large proportion of their links from untrusted, third-party websites (link farms, etc.) – generally a hallmark of what would then be considered low-quality content.
In summary, the main points that the Google Penguin releases addressed were:
- Link-farming – this was addressed and far more in-depth updates were put into place to prevent this sort of activity.
- Authority – build a website with purposeful, unique content providing compelling content that is both worthy of reading and shows that you are knowledgeable (or an authority) in your area.
- Google+ – build trust and respect with Google by associating your website with Google+.
- Use only high-quality links – the RIGHT linking continues to be a fundamental aspect of Google’s algorithms. Links to infographic resources, guest blog links, and genuine links in forums and bulletin boards are still considered safe, although don’t over-do it – work on building links over a period of months rather than weeks.
Do EVERYTHING you can to remove bad links from your website. Vaccoda Design can help with this process.
What is an Algorithm?
Okay, I want to keep this as simple and easy to understand as possible, and avoid blinding everyone with science.
Very simply put, an “algorithm” is a step-by-step, automated (computer-processed) procedure for calculation and data processing. Applied to search engines, this means that when you type a search query into Google or any other search engine, a series of near-instantaneous processes runs to return and display the best possible and closest matches to your query.
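To make that “step-by-step” idea concrete, here is a vastly simplified sketch of a search-ranking algorithm (the `rank` function and the sample pages are invented for illustration – real search engines weigh hundreds of signals, but the step-by-step shape is the same):

```python
def rank(pages, query):
    """Toy search ranking: score each page by how many times it
    contains the query words, then return the best matches first."""
    terms = query.lower().split()
    scored = []
    for url, text in pages.items():
        words = text.lower().split()
        score = sum(words.count(t) for t in terms)  # step 1: score each page
        if score:
            scored.append((score, url))
    scored.sort(reverse=True)                       # step 2: order by score
    return [url for _, url in scored]               # step 3: display results

pages = {
    "a.html": "panda update targets duplicate content",
    "b.html": "penguin update targets duplicate links and duplicate content",
}
print(rank(pages, "duplicate content"))  # b.html ranks above a.html
```

Every step is mechanical and repeatable – which is exactly why well-optimised, genuinely relevant content is what an algorithm like this rewards.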
SEO plays a big part in this process, ensuring that websites are correctly optimised so that search engines can provide the user with the best and most relevant search results available.