About two weeks ago Matt Cutts warned us that the next Google Penguin update was anticipated to be rolled out ‘sometime in the next few weeks’.
In the video he responds to the question ‘What should we expect in the next few months in terms of SEO from Google?’ and points out that one of the reasons that they (Google) don’t make ‘what’s coming’ announcements is that plans can and do change. But with that disclaimer he outlined some of what the Google engineers are working on and he described the Penguin 2.0 update as a webspam change that is expected to go a bit deeper and have more of an impact than the original Penguin roll out.
Penguin 2.0 Rolled Out
Mr Cutts then announced from his blog that the next generation of the Penguin webspam algorithm had begun to roll out during the afternoon of the 22nd of May 2013. He states that around 2.3% of English-US queries are affected to a degree that a regular search user may notice. He also tells us that the change has already finished rolling out for other languages worldwide and that languages with more webspam will see more of an impact.
The original Google Penguin update in April 2012 targeted sites that exhibited signals indicating spammy tactics had been used to manipulate the search rankings. Typically these tactics included gaining lots of links to specific site pages using targeted keywords as anchor text. Some of the sites which used this tactic received the widely reported Google link warning message via Google webmaster tools.
“We’ve detected that some of the links pointing to your site are using techniques outside Google’s Webmaster Guidelines. We don’t want to put any trust in links that are unnatural or artificial, and we recommend removing any unnatural links to your site. However, we do realise that some links may be outside of your control. As a result, for this specific incident we are taking very targeted action to reduce trust in the unnatural links. If you are able to remove any of the links, you can submit a reconsideration request, including the actions that you took.”
Recovering sites from the impact of this assessment has been a long and hard struggle for many. Later in 2012 Google introduced their disavow tool, which enabled webmasters who were making efforts to clean up their site’s backlink profiles to disavow spammy links which they had been unable to remove or change. Many believe that the link data gathered via the Google disavow tool underpins the Penguin 2.0 webspam update, and Matt Cutts has spoken of a completely different, more sophisticated link analysis system which is in the early days of development.
Although the original Penguin roll out had an enormous impact, spammy, low quality link building has continued to be effectively employed in many industries. One area in which this has been abundantly apparent and noted by Mr Cutts is the Payday Loans sector in the Google UK search results.
Brief examination of the sites and pages which were ranking prominently for queries such as ‘Payday loans’ and ‘short term loans’ before the Penguin 2.0 update quickly showed that spammy links had been used to drive relatively fresh domains to the top of the SERPs. The top ranking site in the UK for these queries was a domain which was only one month old but had a high number of inbound links using these keywords as anchor text. Re-examining the search results subsequent to the roll out shows this site ranking at around #70 for the term ‘payday loans’ and outside the top 100 for the term ‘short term loans’, both of which were previously returning the site in the #1 spot.
Glancing at the anchor text used in backlinks clearly highlights the queries which had been targeted.
The rate at which links have been acquired shows the heavy focus on link-building since the site launch.
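The kind of anchor-text check described above can be roughed out in a few lines. This is a minimal sketch, assuming you have exported a site’s backlinks to a CSV file with an `anchor_text` column; the file layout and column name here are hypothetical, as each backlink tool uses its own export format:

```python
import csv
from collections import Counter

def anchor_text_distribution(path):
    """Count each distinct anchor text in a backlink export and
    return (anchor, count, percentage) tuples, most common first."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row["anchor_text"].strip().lower()] += 1
    total = sum(counts.values())
    return [(anchor, n, round(100 * n / total, 1))
            for anchor, n in counts.most_common()]
```

A profile where one or two money keywords dominate the distribution is exactly the pattern of targeted anchor text described above.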
Relevance and Authority
Examination of just a few of the pages hosting links to this domain confirms the low quality, non-editorial nature of these links.
This example shows that although many websites were affected by Penguin 1.0 (and subsequent 1.1 and 1.2 updates) the impact was not universal. Link spam continued to be an effective tactic used by some to push fresh domains to the top of the search results in competitive sectors.
Matt Cutts has stated that the Payday loans sector in the Google UK results is being tackled in two ways and that, by the end of the summer, link spammers will stand less chance of ranking. He has talked about going upstream in order to deny the value of links to spammers and it is clear that Google aims to stamp out the use of low quality link building tactics, tools and services.
It looks like this first iteration of the Penguin 2.0 webspam algorithm update has already had a significant impact on this sector in the Google UK search results.
Are You Ready for Penguin 2.0?
Although you may think that your site complies with Google’s goals and is free from any whiff of spammy links, are you certain that this is how it looks to the search giant? You may never have purchased any links, or advertorials which pass PageRank, or used tools like SENuke to gain links, but this doesn’t necessarily mean that your site is free from risk.
Has your site suffered a sudden decline in ranking visibility? Have you noticed a sudden drop in your site traffic? Here are just a handful of questions to ask yourself along with a few recommendations that will help you to assess the Penguin-compliance status of your website.
- Is your site submitted to Google Webmaster Tools and verified?
- If Google is important to your website then it’s essential to have submitted and verified ownership of your site in Google Webmaster Tools.
- Do you know what your site’s backlink profile looks like?
- Auditing your site’s backlinks is highly recommended. This is most effectively carried out using multiple tools in order to gain a comprehensive assessment. Recommended tools include the Google Webmaster Tools inbound link report, Ahrefs.com, MajesticSEO.com and the excellent SEOmoz Open Site Explorer tool.
- How does your site compare with others in your sector?
- Consider aspects of your site’s usability, performance, authority, engagement and the frequency of updates compared with competitor websites. A great tool for monitoring competitor site updates is changedetection.com.
- What are you doing to fulfil the needs and desires of the visitors attracted to your site?
- Who are your target audience?
- Why would they visit your website?
- What is the intent that drives their visits?
- How does your website support their desires?
- What do you do to encourage visitors to share your content with their friends?
- What value does your site offer over and above that which visitors would expect?
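On the backlink-auditing point above: since no single tool sees every link, it helps to merge the exports into one de-duplicated list before reviewing them. Here is a minimal sketch; the function names are my own and the URL normalisation is deliberately crude, so treat it as a starting point rather than a complete audit tool:

```python
from urllib.parse import urlsplit

def normalise_link(url):
    """Reduce a linking URL to host + path (no scheme, no 'www.',
    no trailing slash) so the same page reported by different
    tools dedupes to a single entry."""
    parts = urlsplit(url.strip().lower())
    host = parts.netloc
    if host.startswith("www."):
        host = host[4:]
    return host + parts.path.rstrip("/")

def merge_backlink_reports(*reports):
    """Union the linking URLs from several tool exports,
    keeping the first spelling seen of each unique link."""
    seen = {}
    for report in reports:
        for url in report:
            seen.setdefault(normalise_link(url), url)
    return sorted(seen.values())
```

For example, `http://www.example.com/page/` from one tool and `https://example.com/page` from another collapse into one entry, so you review each linking page only once.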
What is Penguin 2.0 Looking For?
Over the forthcoming weeks and months stories will be reported of the Penguin 2.0 winners and losers. As with Penguin 1.0 (and 1.1 and 1.2) these stories will help to identify the specific signals and tactics which the webspam algorithm is using to identify sites that have engaged in practices which don’t comply with Google’s guidelines.
The tactics and practices targeted by Penguin 2.0 will more than likely include some of the following.
- Advertorial style backlinks which pass PageRank
- Use of link networks
- Overuse of specific keywords as anchor text
- Low quality, easily acquired links (forum profile links, blog comment links, sponsorship links etc.) often sold in ‘link packets’
- Links from non-relevant sites and pages
- Multi-tiered link-building
- Links from low authority sites and pages
- Non-editorial links
- Links from spammed bookmarking sites
- Links from low quality, automated content
- Blackhat SEO practices
- Spammy practices intended to manipulate search rankings
It’s clear that the Penguin 2.0 update has already had a dramatic impact on certain high-profile sectors of search in which aggressive, spammy tactics were continuing to be used to manipulate rankings.
If your site is one of those affected by this latest update and you know that you have engaged in these practices then it’s a fair cop. You will need to decide whether recovering your site or sites from the Penguin slap is a worthwhile, cost-effective strategy. But if you have been adversely affected by this update and you don’t know why then a detailed site audit and investigation is what you will need.
How important are spelling and grammar to Google? And do they affect your search results positions?
Back in late 2011 Google’s Matt Cutts released a video explaining how the search engine prioritises content with correct spelling and grammar over stuff that’s full of mistakes. Clever, that.
If you fancy taking a look, here’s a link to Matt Cutts’ video.
Apparently good editing also has a positive effect on search engine rankings. So it’s worth spending time getting your thoughts in order – introduction, beginning, middle, end, conclusion – to create clear, logical arguments instead of simply brain-dumping a load of random information and hoping for the best.
What about ‘reading levels’?
Google also breaks content into three reading levels: basic, intermediate and advanced.
If you operate B2C it’s usually a good idea to create basic level content that appeals to a broad range of people with varying reading and comprehension skills. My site contains 64% basic level content, 36% intermediate and no advanced-level copy. Because I work B2B I often find intermediate level plain English fits the marketing bill best.
What about your site?
Here’s how to identify your site’s reading levels
- go to http://www.google.com/advanced_search
- type your site or page URL into the site or domain box halfway down the page, under the ‘Then narrow your results by…’ header
- further down in the same section you’ll find a reading level option. Choose ‘annotate results by reading levels’ from the drop-down list then press the advanced search button
- Google returns a page of results showing the reading level for each page, headed by a graph showing how your content is split between the three levels
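Google doesn’t publish how it assigns these levels, but classic readability formulas give a feel for the idea. Here’s a rough sketch using the Flesch reading-ease score, with syllables approximated by counting vowel groups; the thresholds mapping scores onto ‘basic’, ‘intermediate’ and ‘advanced’ are purely illustrative and are not Google’s:

```python
import re

def flesch_reading_ease(text):
    """Rough Flesch reading-ease score: higher means easier to read.
    Syllables are approximated by counting vowel groups per word."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(max(1, len(re.findall(r"[aeiouy]+", w.lower())))
                    for w in words)
    n = max(1, len(words))
    return 206.835 - 1.015 * (n / sentences) - 84.6 * (syllables / n)

def rough_level(score):
    # These bands are illustrative only; Google's are not published.
    if score >= 70:
        return "basic"
    if score >= 50:
        return "intermediate"
    return "advanced"
```

Short sentences built from short words score as ‘basic’, while long sentences stuffed with polysyllabic corporate speak drop quickly into ‘advanced’ territory.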
If you’re not 100% confident in your writing, grammar, spelling and editing skills, get a freelance writer on the case. The same goes if you can’t string a sentence together without blinding readers with complex corporate speak.
How does Google decide your website’s search results positions? Via a series of incredibly complicated mathematical algorithms. So it helps to give them what they need and abide by the rules.
The thing is, Google’s rules aren’t obvious. They don’t publish ‘how to’ information about what they do and don’t want us to do. It’s a matter of educated guesswork, based on trial and error, established by brave SEOs who work at the uncomfortably pointy end of online marketing.
Fierce competition is another issue. The online marketing landscape is so competitive that if there’s a way to circumvent the accepted way of doing things, people will give it a go. That’s what appears to have happened with a certain link network, which nosedived last week. As a result, online marketers who placed too much reliance on it ended up losing search results positions.
Google has sent out warning emails informing site owners that their backlink activity is probably dubious. But it isn’t all bad news. This is the perfect time to educate ourselves about the realities of SEO.
SEO has always been a moveable feast and some link building methods inevitably come close to the edge of what search engines regard as acceptable. SEO is a risky business by nature, and wise search engine optimisers take great care not to put all their link building eggs in one basket.
If the network’s demise has caused your business website to drop search positions, a knee-jerk reaction won’t help. Nor will playing the blame game. Try overhauling your content so it’s the best it can be. Attempt to de-activate the links you think are causing issues. Then submit your site to Google for reconsideration. And bear in mind for the future that this kind of thing is more or less inevitable unless you spread your inbound link building load.
Make sure you build links using a wide variety of methods and tools, not just a handful, and you’ll reduce the risk… probably!
(Thank you to http://www.sxc.hu/profile/benipop for the fab free image)
Google’s official search blog has announced a grand total of forty algorithm changes that either came about during February or are scheduled for March 2012.
Overall, Google says that “Each individual change is subtle and important, and over time they add up to a radically improved search engine”. Great stuff. But what do the latest changes mean for searchers? And what about website owners?
Here’s a quick ‘n’ dirty look at a few of the adjustments most likely to affect British Google users
- you’ll notice more locally-relevant predictions in YouTube based on your location. There’s a new system to find results from a user’s city more reliably, so Google can detect when queries and documents are local to the user. Updated country associations for urls deliver greater accuracy. And there have been improvements to local search results
- consistently sized thumbnails on the results page make for a better visual experience. New images will be delivered faster and there’s improved detection for ‘safe image search’
- the disabling of two old classifiers relating to ‘query freshness’ means freshness is more important than ever. New signals have been applied to help Google ‘surface’ fresh content even faster. The latest generation Panda update makes the search engine more accurate and sensitive to recent changes. And because Google has consolidated some of the signals it uses to detect when a new topic is trending, they can compute in real time… as if you needed another excuse for creating fresh, unique content!
- shopping-rich snippets have been launched globally for the first time, to help searchers identify which sites are likely to have the most relevant product, highlighting product prices, availability, ratings and the number of reviews
- the way Google evaluates links has changed. They’re switching off their old link analysis system and ‘re-architecting’ it
Notice any patterns?
Here’s what appears to be important in Google’s eyes right now: local relevance, fresh content, a clean visual experience and natural, good quality links.
No surprises there, then!
…or risk putting your foot in your mouth!
In a recent interview Dan Russell, Google’s intellectual heavyweight tech usability expert, revealed something really interesting.
Apparently Google staff were amazed to find out how few people knew about the Ctrl+F function, which lets you search a web page or document by keyword and find the information you need faster. I’m one of the 90% of searchers who had never heard of it.
It just goes to show how dangerous it is to make assumptions about your target market!
Luckily Google makes a genuine effort to keep a finger on the pulse of ordinary searchers. It’s Russell’s job to “understand what people do when they search online” using classical analytics, search anthropology and analysis of users’ eye movements millisecond by millisecond.
And they act upon their insights. Shortly after the discovery, Google improved their service by adding new Ctrl+F features to their Chrome browser. Good stuff.