Back in late 2011, Google's Matt Cutts released a video explaining how the search engine prioritises content with correct spelling and grammar over content that's full of mistakes. Clever, that.
If you fancy taking a look, here’s a link to Matt Cutts’ video.
Apparently good editing also has a positive effect on search engine rankings. So it's worth spending time getting your thoughts in order – a clear introduction, middle and conclusion – to create logical arguments instead of simply brain-dumping a load of random information and hoping for the best.
What about ‘reading levels’?
Google also breaks content into three reading levels: basic, intermediate and advanced.
If you operate B2C, it's usually a good idea to create basic-level content that appeals to a broad range of people with varying reading and comprehension skills. My site contains 64% basic-level content, 36% intermediate and no advanced-level copy. Because I work B2B, I often find intermediate-level plain English fits the marketing bill best.
What about your site?
Here’s how to identify your site’s reading levels
- go to http://www.google.com/advanced_search
- type your site or page URL into the 'site or domain' box halfway down the page, under the 'Then narrow your results by…' header
- further down in the same section you'll find a 'reading level' option. Choose 'annotate results by reading levels' from the drop-down list, then press the 'Advanced Search' button
- Google returns a page of results showing the reading level for each page, headed by a graph showing how your content splits between the three levels
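If you'd rather get a rough feel for reading level offline, the classic Flesch reading-ease formula gives a comparable basic/intermediate/advanced split. This is purely an illustrative sketch – it isn't Google's classifier, the score thresholds are my own guess, and the syllable counter is a crude heuristic:

```python
import re

def count_syllables(word):
    # Crude heuristic: count groups of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def reading_level(text):
    """Classify text as basic/intermediate/advanced using the
    Flesch reading-ease score. The 70/50 cut-offs are illustrative
    assumptions, not Google's actual boundaries."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return "unknown"
    syllables = sum(count_syllables(w) for w in words)
    score = (206.835
             - 1.015 * (len(words) / len(sentences))
             - 84.6 * (syllables / len(words)))
    if score >= 70:
        return "basic"
    if score >= 50:
        return "intermediate"
    return "advanced"

print(reading_level("The cat sat on the mat. It was warm."))  # → basic
```

Short words and short sentences push the score up (basic); long, polysyllabic sentences drag it down (advanced) – which matches the intuition behind writing plain English for a broad audience.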
If you’re not 100% confident in your writing, grammar, spelling and editing skills, get a freelance writer on the case. The same goes if you can’t string a sentence together without blinding readers with complex corporate speak.
The thing is, Google’s rules aren’t obvious. They don’t publish ‘how to’ information about what they do and don’t want us to do. It’s a matter of educated guesswork, based on trial and error, established by brave SEOs who work at the uncomfortably pointy end of online marketing.
Fierce competition is another issue. The online marketing landscape is so competitive that if there's a way to circumvent the accepted way of doing things, people will give it a go. That's what appears to have happened to a certain link network, which nosedived last week. As a result, online marketers who placed too much reliance on it ended up losing search results positions.
Google has sent out warning emails informing site owners that their backlink activity is probably dubious. But it isn't all bad news. This is the perfect time to educate ourselves about the realities of SEO.
SEO has always been a moveable feast, and some link building methods inevitably come close to the edge of what search engines regard as acceptable. SEO is a risky business by nature, and wise search engine optimisers take great care not to put all their link building eggs in one basket.
If the network’s demise has caused your business website to drop search positions, a knee-jerk reaction won’t help. Nor will playing the blame game. Try overhauling your content so it’s the best it can be. Attempt to deactivate the links you think are causing issues. Then submit your site to Google for reconsideration. And bear in mind for the future that this kind of thing is more or less inevitable unless you spread your inbound link building load.
Make sure you build links using a wide variety of methods and tools, not just a handful, and you’ll reduce the risk… probably!
(Thank you to http://www.sxc.hu/profile/benipop for the fab free image)
Overall, Google says that “Each individual change is subtle and important, and over time they add up to a radically improved search engine”. Great stuff. But what do the latest changes mean for searchers? And what about website owners?
Here’s a quick ‘n’ dirty look at a few of the adjustments most likely to affect British Google users:
- you’ll notice more locally relevant predictions in YouTube based on your location. There’s a new system to find results from a user’s city more reliably, so Google can detect when queries and documents are local to the user. Updated country associations for URLs deliver greater accuracy. And there have been improvements to local search results
- consistent-sized thumbnails on the results page make for a better visual experience. New images will be delivered faster, and there’s improved detection for ‘safe image search’
- the disabling of two old classifiers relating to ‘query freshness’ means freshness is more important than ever. New signals have been applied to help Google ‘surface’ fresh content even faster. The latest generation Panda update makes the search engine more accurate and sensitive to recent changes. And because Google has consolidated some of the signals it uses to detect when a new topic is trending, they can compute in real time… as if you needed another excuse for creating fresh, unique content!
- shopping-rich snippets have been launched globally for the first time, to help searchers identify which sites are likely to have the most relevant product, highlighting product prices, availability, ratings and the number of reviews
- the way Google evaluates links has changed. They’re switching off their old link analysis system and ‘re-architecting’ it
Notice any patterns?
Here’s what appears to be important in Google’s eyes right now:
No surprises there, then!
In a recent interview Dan Russell, Google’s intellectual heavyweight tech usability expert, revealed something really interesting.
Apparently Google staff were amazed to find out how few people knew about the Ctrl+F function, which lets you search a web page or document by keyword and find the information you need faster. I’m one of the 90% of searchers who had never heard of it.
It just goes to show how dangerous it is to make assumptions about your target market!
Luckily, Google makes a genuine effort to keep a finger on the pulse of ordinary searchers. It’s Russell’s job to “understand what people do when they search online” using classical analytics, search anthropology and millisecond-by-millisecond analysis of users’ eye movements.
And they act upon their insights. Shortly after the discovery, Google improved their service by adding new Ctrl+F features to their Chrome browser. Good stuff.
Google routinely update their ranking algorithms – the computational procedures they use to determine where web pages should rank in response to a searcher’s query. Most of these updates are subtle and go completely unnoticed by Google users, but on 24 February they launched a major algorithm update in the United States that affects approximately 12% of search queries. It hasn’t hit UK sites yet, but it’s anticipated to do so within about three months.
This algo change is intended to reward high quality sites with better rankings and reduce the rankings of lower quality sites. The effects seen so far have been neatly collated by SEOmoz: Google’s Farmer Update – Analysis of Winners and Losers.
Based on these, we have some guesses about what signals Google may have used in this update:
- User/usage data – signals like click-through-rate, time-on-site, “success” of the search visit (based on other usage data)
- Quality raters – a machine-learning type algorithm could be applied to sites quality raters liked vs. didn’t to build features/factors that would boost the “liked” sites and lower the “disliked” sites. This can be a dangerous way to build algorithms, though, because no human can really say why a site is ranking higher vs. lower or what the factors are – they might be derivatives of very weird datapoints rather than explainable mechanisms.
- Content analysis – topic modeling algorithms, those that calculate/score readability, uniqueness/robustness analysis and perhaps even visual “attractiveness” of content presentation could be used (or other signals that conform well to these).
Why is this update referred to as Panda or Farmer?
There is a tradition in the SEO industry to name key algo updates and the term ‘Farmer’ was first proposed by Danny Sullivan from Search Engine Land as the update appeared to target content farms.
However, in an interview conducted by Wired magazine with Amit Singhal and Matt Cutts from Google it was pointed out that the internal name used by them was ‘Panda’ as this is the name of one of the key engineers responsible for the change.
How is Google assessing quality?
In developing this algo change, Amit Singhal stated that Google used its standard evaluation system. This involves sending documents (web pages) to outside testers along with a whole host of questions, such as: Would you be comfortable giving this site your credit card? Would you be comfortable giving medicine prescribed by this site to your kids? Do you consider this site to be authoritative? Would it be okay if this was in a magazine? Does this site have excessive ads? And others along the same lines.
At the same time they released a Google Chrome add-on that enables users to block websites from their search results. They say they didn’t use the data provided by this add-on in the algo change, but they found an 84% overlap between sites people had blocked and those downgraded by the update, which gave them confidence that the change was headed in the right direction.
They have looked for signals that provide an indication of quality and it is thought that these include more emphasis on user behaviour along with a revision to the document level classifier.
Google can readily track click-through rates (CTR) from the organic SERPs. They can also estimate how long a visitor spends on a site page by detecting whether the visitor immediately clicks the back button to return to the search results, and by collating data provided by the Google toolbar. Any toolbar that includes a Google PageRank meter will be feeding information back to Google regarding the user’s web activity.
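We can’t see Google’s internal data, but the kind of signal described above is easy to illustrate. The sketch below computes a per-page bounce rate – the fraction of visits where the searcher clicked straight back to the results page – from a simplified click log. The log format and the ten-second threshold are invented for the example, not anything Google has published:

```python
from collections import defaultdict

# Hypothetical click log: (page, seconds on page before returning to the SERP)
clicks = [
    ("/widgets", 4), ("/widgets", 210), ("/widgets", 2),
    ("/about", 95), ("/about", 120),
]

BOUNCE_THRESHOLD = 10  # seconds; an arbitrary illustrative cut-off

def bounce_rates(log):
    """Fraction of visits per page that bounced back to the
    results page within the threshold."""
    totals, bounces = defaultdict(int), defaultdict(int)
    for page, dwell in log:
        totals[page] += 1
        if dwell < BOUNCE_THRESHOLD:
            bounces[page] += 1
    return {page: bounces[page] / totals[page] for page in totals}

print(bounce_rates(clicks))
# /widgets bounces on 2 of 3 visits; /about on none
```

A page with a high bounce rate against its SERP click-throughs is exactly the sort of ‘unsuccessful visit’ signal the patent discussion below is concerned with.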
Some time ago Google filed a patent application titled ‘Method and apparatus for classifying documents based on user inputs’, and it is highly likely that Google is now using these signals, in combination with other factors, to provide an indication of quality.
What is the Document Level Classifier?
Back in January Google announced that they had launched a redesigned document level classifier that would make it harder for spammy content to rank prominently in their search results.
A document level classifier is basically a program that will look at various attributes of a web page and use probabilistic analysis to classify that page. It is now thought that Google’s document level classifier is looking at a multitude of page attributes and signals to determine whether or not a page should be considered as web spam.
What does this algo change mean for your website?
UK site owners have yet to experience the impact of the Panda update and many are naturally very concerned. It’s worthwhile looking at some of the sites that have so far experienced reduced rankings as a result of the update and considering whether your website is anything like them. Spend some time assessing the quality of your site content: is it unique, valuable and engaging?
If you are ranking prominently in the Google SERPs is it because your site is delivering quality content that has naturally attracted valuable and relevant backlinks? And if you do attract traffic from Google organic SERPs are you getting a good click-through rate? When visitors arrive on your site are they staying and do they move on to other pages within your site?
Basically, if your site pages are the best they can possibly be then Google will be working for you.
How to make your site pages the ones that Google wants to rank prominently
- Focus on your visitors – not the Google search engine technology. This means focusing on what people want when they come to your site. If you run an ecommerce site focus on what people buy and the value you are providing to your visitors.
- Pay attention to your online business strategy which should be all about your visitors and prospective customers.
- Don’t get hung up on achieving #1 positions in the Google SERPs. It’s generally more beneficial to have multiple keyterms ranking in the top 5 or top 10 positions than to have a few generic terms ranking at position 1. And it’s far more time consuming to achieve and retain #1 positions than it is to achieve and retain multiple top 5 or top 10 positions.
- Don’t waste time looking for shortcuts, tricks and loopholes. Shortcuts are only likely to bring short term gain and could result in issues arising with the search engines.
- Pay attention to your key site metrics including the number of visitors coming to your site each month, how many pages from your site are indexed by the search engines, the content on your site that is proving to be the most popular etc. Look closely at the paths that lead to conversion and determine why these are working.
- Use your analytics data to identify new keywords that will provide genuine value.
- Ensure that all pages on your site adhere to SEO best practices. This means:
- Concise, people-friendly URLs;
- Descriptive, accurate page titles;
- Powerful ‘calls to action’ in your meta descriptions;
- Content that is well structured and good for both visitors and search engines;
- Effective use of heading tags (h1, h2, h3 etc.);
- Content that provides value to visitors and contains relevant keyterms;
- Keyterms emboldened or emphasised (using em or strong tags);
- Images implemented correctly with accurate ALT descriptions;
- Pages that load quickly for all visitors (even those on slow internet connections);
- Internal links make effective use of appropriate anchor text.
- Provide content that will naturally attract attention and links. Sometimes called link-bait or magnetic content. Creating and publicising attractive, valuable content is the safest way to garner valuable backlinks.
- Make sure that it is easy for visitors to share your content via social networks by presenting prominent sharing icons (e.g. Facebook, Twitter, StumbleUpon, Reddit, Digg).
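Several items on the checklist above can be checked automatically. This sketch uses Python’s standard-library HTML parser to flag three of them – a title present, exactly one h1, and images carrying ALT text. It’s a rough illustration on a made-up page, not a full audit tool:

```python
from html.parser import HTMLParser

class SEOCheck(HTMLParser):
    """Collects a few on-page signals from the checklist."""
    def __init__(self):
        super().__init__()
        self.has_title = False
        self.h1_count = 0
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.has_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "img" and not dict(attrs).get("alt"):
            self.images_missing_alt += 1

# Invented example page: one img has ALT text, one doesn't.
page = """<html><head><title>Widgets | Acme</title></head>
<body><h1>Widgets</h1><img src="w.png"><img src="x.png" alt="A widget">
</body></html>"""

checker = SEOCheck()
checker.feed(page)
print(checker.has_title, checker.h1_count, checker.images_missing_alt)
# → True 1 1
```

Running something like this across your site is a quick way to catch missing titles, duplicate h1s and bare images before a human review.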
This is a guest post by Tony Goldstone who is Head of Search at Fresh Egg.