Tag: google

What happens when Google doesn’t trust you?

February 1, 2014

Just a quickie…

Great article about Google penalties

Here’s an excellent article by Alex Graves on DavidNaylor.co.uk. It talks about how flouting the search engine’s link building guidelines can land you in deep poop, and can even eliminate your brand from the organic rankings altogether.

If you’d like some reliable pointers about what not to do, you’ll appreciate this: Irwin Mitchell Suffer Google Whiplash As Google Penalty Knocks Them Out of Index.

Google in data collection hot water

January 20, 2014

Google’s data collection antics have finally landed it in hot legal water. As reported by The Telegraph on 16th January:

“The High Court ruled on Thursday that Google can be sued by a group of Britons over an alleged breach of privacy, despite the company being based in the US and claiming that the case was not serious enough to fall under British jurisdiction.

Google faced a group action by users of Apple’s Safari browser who were angered by the way their online habits were apparently tracked against their wishes in order to provide targeted advertising. But because Google is based in the US they needed to seek the court’s permission to bring the case in the UK, something which the search company claimed was inappropriate.

That claim has now been thrown out, as Mr Justice Tugendhat, sitting at London’s High Court, ruled that the UK courts were the “appropriate jurisdiction” to try the claims. Had the decision gone the other way, the claimants would have had to take their case to the US courts.

“I am satisfied that there is a serious issue to be tried in each of the claimant’s claims for misuse of private information,” he said.”

I bet the Big G is concerned. Brits tend to be more cynical and less forgiving than our US counterparts, and the search engine could find itself in real trouble. Don’t get me wrong, I admire Google. But I think their data collection methods are at best greedy, at worst sinister.

Is it time consumers took control of data collection?

Is it time to change the way search engines collect consumer data? Should internet users own their data and have full control over what’s done with it? Consumer data is commercially valuable stuff. Why should Google be able to gather and use it without paying us a penny? They’re big questions, and it’s about time they were answered.

Putting consumers in charge of their own data would be a revolutionary step. But it could also be the best thing that ever happened to marketing.

Turning marketing on its head – Revolution!

Here’s what happens now. Google collects data about our shopping and surfing habits, which it uses to drive ‘targeted’ advertising and search results. Facebook does the same, as do many others. And you don’t get a penny in return for increasing advertisers’ profits.

Targeting is the marketer’s holy grail. If you can pinpoint the goods and services folk want to buy and put them in front of the right people at the right point in the buying cycle, the world is your oyster. At the moment targeting is a blunt instrument, not much better than informed guesswork. Advertisers might come close now and again. But let’s get real – nobody knows what you want and don’t want better than you.

Imagine how much better advertising and marketing would work if people could choose the kind of ads they were presented with, based on their needs, wants and desires. A kind of self-targeting, it might work something like this:

  • you are given a simple, universal yes/no choice about whether or not you want your personal data to be used for advertising and marketing online
  • if you tick the ‘no’ box, nobody collects, messes with or uses your data, not even Google
  • if you tick the ‘yes’ box, you are given a choice of subjects, products and services to opt in or out of, choosing those you’d like to know more about or might eventually want to buy
  • Google pays you for the privilege of knowing what subjects you are and aren’t interested in
  • you re-do your opt in/out and products/services choices annually to keep them current and relevant, and claim a repeat fee

In a scenario like this, which turns marketing on its head by giving us the choice of how our data is or isn’t used, marketers should get much better conversion rates because the people who click through their ads are more likely to buy, having self-targeted by opting in.
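
To make the idea concrete, here’s a minimal sketch of what a self-targeting preference record could look like. It’s purely illustrative – the field names, the example topic and the renewal rule are my own assumptions, not any real Google or industry scheme:

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class AdPreferences:
    """One user's self-targeting choices (hypothetical)."""
    allow_data_use: bool                                  # the universal yes/no choice
    opted_in_topics: set = field(default_factory=set)     # subjects the user wants ads for
    last_reviewed: date = field(default_factory=date.today)

    def is_current(self) -> bool:
        # Choices are re-done annually, so consent lapses after a year.
        return date.today() - self.last_reviewed < timedelta(days=365)

    def may_target(self, topic: str) -> bool:
        # Advertisers may only target topics the user has opted into,
        # and only while the annual consent is still current.
        return self.allow_data_use and self.is_current() and topic in self.opted_in_topics

# A user who ticked 'yes' and opted into a single topic:
prefs = AdPreferences(allow_data_use=True, opted_in_topics={"garden sheds"})
print(prefs.may_target("garden sheds"))      # True
print(prefs.may_target("short term loans"))  # False
```

Under a scheme like this, ticking ‘no’ would simply mean no preference record exists at all, and the annual re-confirmation is the point at which the repeat fee would be claimed.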

What do you think?

If you’ve ever wondered why an internet search for a garden shed results in a flurry of adverts and pop-ups for garden sheds and related stuff, now you know.

Are you happy to let Google collect and use your data without paying a penny for it? Are you OK with advertisers and marketers using your data to make assumptions about your buying habits and interests, or would you rather be in the driving seat?

Is Your Website Ready for Penguin 2.0?

May 23, 2013

About two weeks ago Matt Cutts warned us that the next Google Penguin update was expected to roll out ‘sometime in the next few weeks’.

In the video he responds to the question ‘What should we expect in the next few months in terms of SEO from Google?’ and points out that one of the reasons Google doesn’t make ‘what’s coming’ announcements is that plans can and do change. But with that disclaimer in place, he outlined some of what the Google engineers are working on and described the Penguin 2.0 update as a webspam change expected to go a bit deeper and have more of an impact than the original Penguin roll out.

Penguin 2.0 Rolled Out

Mr Cutts then announced on his blog that the next generation of the Penguin webspam algorithm had begun to roll out during the afternoon of the 22nd of May 2013. He states that around 2.3% of English-US queries are affected to a degree that a regular search user may notice. He also tells us that the change has already finished rolling out for other languages worldwide, and that languages with more webspam will see more of an impact.

Penguin 1.0

The original Google Penguin update in April 2012 targeted sites that exhibited signals indicating spammy tactics had been used to manipulate the search rankings. Typically these tactics included gaining lots of links to specific site pages using targeted keywords as anchor text. Some of the sites which used this tactic received the widely reported Google link warning message via Google Webmaster Tools.

“We’ve detected that some of the links pointing to your site are using techniques outside Google’s Webmaster Guidelines. We don’t want to put any trust in links that are unnatural or artificial, and we recommend removing any unnatural links to your site. However, we do realise that some links may be outside of your control. As a result, for this specific incident we are taking very targeted action to reduce trust in the unnatural links. If you are able to remove any of the links, you can submit a reconsideration request, including the actions that you took.”

Recovering from the impact of this assessment has been a long, hard struggle for many sites. Later in 2012 Google introduced their disavow tool, which enabled webmasters who were making efforts to clean up their sites’ backlink profiles to disavow spammy links which they had been unable to remove or change. Many believe that the link data gathered by the disavow tool underpins the Penguin 2.0 webspam update, and Matt Cutts has spoken of a completely different, sophisticated link analysis system which is in the early days of development.
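
For anyone doing that clean-up work: the disavow tool takes a plain text file uploaded through Google Webmaster Tools. Lines beginning with # are treated as comments, a domain: line disavows every link from that domain, and a bare URL disavows just that page. The domains below are made-up examples:

```
# Contacted the site owner on 01/05/2013 - no reply
domain:spammy-directory.example
# Individual pages we couldn't get taken down
http://blog-network.example/best-payday-loans/
http://forum.example/profile/12345
```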

Payday Loans

Although the original Penguin roll out had an enormous impact, spammy, low-quality link building has continued to be effectively employed in many industries. One area in which this has been abundantly apparent, and noted by Mr Cutts, is the payday loans sector in the Google UK search results.

Brief examination of the sites and pages which were ranking prominently for queries such as ‘payday loans’ and ‘short term loans’ before the Penguin 2.0 update quickly showed that spammy links had been used to drive relatively fresh domains to the top of the SERPs. The top ranking site in the UK for these queries was a domain which was only one month old but had a high number of inbound links using these keywords as anchor text. Re-examining the search results after the roll out shows this site ranking at around #70 for the term ‘payday loans’ and outside the top 100 for the term ‘short term loans’ – two queries for which it previously held the #1 spot.

Anchor Text

Glancing at the anchor text used in backlinks clearly highlights the queries which had been targeted.

Short Term Loans – Link Anchor Text

Link Velocity

The rate at which links have been acquired shows the heavy focus on link-building since the site launch.

Short Term Loans – Link Velocity

Relevance and Authority

Examination of just a few of the pages hosting links to this domain confirms the low quality, non-editorial nature of these links.

Short Term Loans – Link Example

Link Spam

This example shows that although many websites were affected by Penguin 1.0 (and the subsequent 1.1 and 1.2 updates), the impact was not universal. Link spam continued to be an effective tactic used by some to push fresh domains to the top of the search results in competitive sectors.

Matt Cutts has stated that the payday loans sector in the Google UK results is being tackled in two ways and that, by the end of the summer, link spammers will stand less chance of ranking. He has talked about going upstream in order to deny the value of links to spammers, and it is clear that Google aims to stamp out the use of low-quality link building tactics, tools and services.

It looks like this first iteration of the Penguin 2.0 webspam algorithm update has already had a significant impact on this sector in the Google UK search results.

Are You Ready for Penguin 2.0?

Although you may think that your site complies with Google’s goals and is free from any whiff of spammy links, are you certain that this is how it looks to the search giant? You may never have purchased any links, or advertorials which pass PageRank, or used tools like SENuke to gain links, but this doesn’t necessarily mean that your site is free from risk.

Has your site suffered a sudden decline in ranking visibility? Have you noticed a sudden drop in your site traffic? Here are just a handful of questions to ask yourself, along with a few recommendations, that will help you to assess the Penguin-compliance status of your website.

  1. Is your site submitted to Google Webmaster Tools and verified?
    • If Google is important to your website then it’s essential to have submitted and verified ownership of your site in Google Webmaster Tools.
  2. Do you know what your site’s backlink profile looks like?
    • Auditing your site’s backlinks is highly recommended. This is most effectively carried out using multiple tools in order to gain a comprehensive assessment. Recommended tools include the Google Webmaster Tools inbound link report, Ahrefs.com, MajesticSEO.com and the excellent SEOmoz Open Site Explorer tool. There’s a minimal audit sketch after this list.
  3. How does your site compare with others in your sector?
    • Consider aspects of your site’s usability, performance, authority, engagement and the frequency of updates compared with competitor websites. A great tool for monitoring competitor site updates is changedetection.com.
  4. What are you doing to fulfil the needs and desires of the visitors attracted to your site?
    • Who are your target audience?
    • Why would they visit your website?
    • What is the intent that drives their visits?
    • How does your website support their desires?
    • What do you do to encourage visitors to share your content with their friends?
    • What value does your site offer over and above that which visitors would expect?
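
As promised above, here’s a minimal sketch of the kind of anchor-text audit suggested in question 2. It assumes you’ve exported your backlinks to a CSV file with an ‘anchor’ column (each of the tools listed above offers some form of CSV export, though the column names vary), and the 20% flagging threshold is an arbitrary rule of thumb of mine, not a known Google figure:

```python
import csv
from collections import Counter

def audit_anchors(csv_path, threshold=0.2):
    """Print each anchor text's share of the backlink profile.

    Assumes a CSV export with an 'anchor' column; adjust the column
    name to match whichever tool produced the export. The threshold
    is an arbitrary rule of thumb, not a Google number.
    """
    with open(csv_path, newline="", encoding="utf-8") as f:
        anchors = [row["anchor"].strip().lower() for row in csv.DictReader(f)]
    if not anchors:
        return
    counts = Counter(anchors)
    for anchor, n in counts.most_common():
        share = n / len(anchors)
        flag = "  <-- possibly over-optimised" if share > threshold else ""
        print(f"{anchor!r}: {n} links ({share:.1%}){flag}")

# Example usage with a hypothetical export file:
audit_anchors("backlinks.csv")
```

A profile where one keyword-rich anchor accounts for a large slice of all inbound links – like the ‘short term loans’ example above – is exactly the pattern Penguin appears to punish.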

What is Penguin 2.0 Looking For?

Over the coming weeks and months, stories will be reported of the Penguin 2.0 winners and losers. As with Penguin 1.0 (and 1.1 and 1.2), these stories will help to identify the specific signals and tactics which the webspam algorithm is using to identify sites that have engaged in practices which don’t comply with Google’s guidelines.

The tactics and practices targeted by Penguin 2.0 will more than likely include some of the following.

  1. Advertorial style backlinks which pass PageRank
  2. Use of link networks
  3. Overuse of specific keywords as anchor text
  4. Low quality, easily acquired links (forum profile links, blog comment links, sponsorship links etc.) often sold in ‘link packets’
  5. Links from non-relevant sites and pages
  6. Multi-tiered link-building
  7. Links from low authority sites and pages
  8. Non-editorial links
  9. Links from spammed bookmarking sites
  10. Links from low quality, automated content
  11. Blackhat SEO practices
  12. Spammy practices intended to manipulate search rankings

It’s clear that the Penguin 2.0 update has already had a dramatic impact on certain high-profile sectors of search in which aggressive, spammy tactics were continuing to be used to manipulate rankings.

If your site is one of those affected by this latest update and you know that you have engaged in these practices, then it’s a fair cop. You will need to decide whether recovering your site or sites from the Penguin slap is a worthwhile, cost-effective strategy. But if you have been adversely affected by this update and you don’t know why, then a detailed site audit and investigation is what you will need.

Spelling and grammar on planet Google

August 21, 2012

How important are spelling and grammar to Google?

And do they affect your search results positions?

Back in late 2011 Google’s Matt Cutts released a video explaining how the search engine prioritises content with correct spelling and grammar over stuff that’s full of mistakes. Clever, that.

If you fancy taking a look, here’s a link to Matt Cutts’ video.

Apparently good editing also has a positive effect on search engine rankings. So it’s worth spending time getting your thoughts in order – introduction, beginning, middle, end, conclusion – to create clear, logical arguments instead of simply brain-dumping a load of random information and hoping for the best.

What about ‘reading levels’?

Google also breaks content into three reading levels: basic, intermediate and advanced. If you operate B2C it’s usually a good idea to create basic level content that appeals to a broad range of people with varying reading and comprehension skills. My site contains 64% basic level content, 36% intermediate and no advanced-level copy. Because I work B2B I often find intermediate level plain English fits the marketing bill best.

What about your site?

Here’s how to identify your site’s reading levels:

  1. go to http://www.google.com/advanced_search
  2. type your site or page URL into the ‘site or domain’ box halfway down the page, under the ‘Then narrow your results by…’ header
  3. further down in the same section you’ll find a reading level option. Choose ‘annotate results by reading levels’ from the drop-down list, then press the advanced search button
  4. Google returns a page of results showing the reading level for each page, headed by a graph showing how your content is split between the three levels
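
Google doesn’t say how it assigns its three reading levels, but if you want a rough local proxy before running the advanced search above, a classic readability formula gives a comparable basic/intermediate/advanced split. This sketch uses Flesch reading ease; the syllable counter is a crude heuristic and the score bands are a conventional rule of thumb, not Google’s own thresholds:

```python
import re

def count_syllables(word):
    # Crude heuristic: count runs of vowels in the word.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    # Flesch reading ease:
    # 206.835 - 1.015 * (words/sentences) - 84.6 * (syllables/words)
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (n_words / sentences) - 84.6 * (syllables / n_words)

score = flesch_reading_ease("Google also breaks content into three reading levels.")
# Rough mapping: 60+ reads as 'basic', 30-60 'intermediate', under 30 'advanced'.
level = "basic" if score >= 60 else "intermediate" if score >= 30 else "advanced"
print(f"{score:.0f} -> {level}")
```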

If you’re not 100% confident in your writing, grammar, spelling and editing skills, get a freelance writer on the case. The same goes if you can’t string a sentence together without blinding readers with complex corporate speak.

SEO consternation as Google bites back

March 28, 2012

How does Google decide your website’s search results positions?

It does so via a series of incredibly complicated mathematical algorithms. So it helps to provide what the search engine wants and abide by the rules.

The thing is, Google’s rules aren’t obvious. They don’t publish ‘how to’ information about what they do and don’t want us to do. It’s a matter of educated guesswork, based on trial and error, established by brave SEOs who work at the uncomfortably pointy end of online marketing.

Fierce competition is another issue. The online marketing landscape is extremely competitive – so much so that if there’s a way to circumvent the accepted way of doing things, people will give it a go. That appears to be what happened to a certain link network, which nosedived last week. As a result, online marketers who placed too much reliance on it ended up losing search results positions – catastrophic stuff.

Bad backlinks can mean Google warnings

Google has sent out warning emails informing site owners that their backlink activity is probably dubious. But it isn’t all bad news. This is the perfect time to educate yourself about the realities of SEO.

SEO has always been a moveable feast. Some link building methods inevitably come close to the edge of what search engines regard as acceptable. SEO is a risky business by nature, and wise search engine optimisers take great care not to put all their link building eggs in one basket.

If the network’s demise has caused your business website to drop search positions, a knee-jerk reaction won’t help. Nor will playing the blame game. Try overhauling your content so it’s the best it can be. Attempt to deactivate or disavow the links you think are causing issues. Then submit your site to Google for reconsideration. And bear in mind that this kind of thing is more or less inevitable unless you spread your inbound link building load.

Make sure you build links using a wide variety of methods and tools, not just a handful, and you’ll reduce the risk… probably!

(Thank you to http://www.sxc.hu/profile/benipop for the fab free image)