SEO predictions for 2017

I predict there’s going to be lots of prediction posts…

Seriously though, amongst the clickbait guff there are a few good ones. Here's one from Rand, for instance.

#8 is on the money in my opinion:

#8: 2017 will be the year Google admits publicly they use engagement data as an input to their ranking systems, not just for training/learning

The implications are clear: User Experience will be the path to better rankings.

Website engagement and Google rankings

A thought-provoking article by Larry Kim about how Google may (or may not) use website dwell time as an indirect ranking signal.

Summary: there are some correlations and personal viewpoints, but nothing concrete. That's not to say it isn't actionable – just thinking about how to improve your site engagement, and then implementing improvements, can only be a good thing.

As is often the case with well written articles, the comments have further gems. Consider for example the concerns raised by Mark Traphagen and then Larry’s response. Both really useful.

Local SEO Myths

A useful list from Joy Hawkins on some common local SEO misconceptions.

#4 is still the main misconception: that posting on Google+ will improve your rankings.

This is no longer the case. I say ‘no longer’ because it is true that in the early days of G+ there was a benefit to updating your G+ profile. And that continued to a certain extent with G+ Authorship. But those days are long gone.

Search Results Ranking Based on Satisfaction?

An interesting find by Bill Slawski of a newly published Google patent:

A newly published Google patent application describes technology that would modify scoring and ranking of query results using biometric indicators of user satisfaction or negative engagement with a search result. In other words; Google would track how satisfied or unsatisfied someone might be with search results, and using machine learning, build a model based upon that satisfaction, raising or lowering search results for a query. This kind of reaction might be captured using a camera on a searcher’s phone to see their reaction to a search result.

The immediate questions related to this are around how much access you’d give Google to look at your reactions…

But I wonder what our search tools will look like in a few years anyway – will we still be peering at a search result listing on our phone, or will bots and voice input have become more the norm by then?

Link thinking

A common assumption made in the SEO world is that links are a key ranking factor (and they are). But Jon Cooper thinks through the assumption, with some useful thought-bytes from industry luminaries. (And BTW good to be reading a blog post from Jon again – it’s been a while.)

From his conclusion:

Tomorrow is not a guarantee. As we’ve seen, Google can move very quickly. With that said, even if Google decided this very morning to move away from links as a significant factor, I highly doubt they could make a major change within a ~12-18 month timeframe, just because links are so foundational to their search engine.


The real threat is more foundational than links. Justin Briggs explained it best in his response earlier. The aspect of ranking a page organically in Google’s results has slowly declined in value, both because of other SERP features & search ads. There’s still a ton of money to be made, but we should work like we’re living on borrowed time.

Organic is just one channel in your inbound marketing – don’t rely on it too much… borrowed time indeed.

How Google reinforces, then manages to overcome consensus bias

From Aaron Wall at SEObook:

Consensus bias is set to an absurdly high level to block out competition, slow innovation, and make the search ecosystem easier to police. This acts as a tax on newer and lesser-known players and a subsidy toward larger players.

Eventually that subsidy would be a problem to Google if the algorithm was the only thing that matters, however if the entire result set itself can be displaced then that subsidy doesn’t really matter, as it can be retracted overnight.

Whenever Google has a competing offering ready, they put it up top even if they are embarrassed by it and 100% certain it is a vastly inferior option to other options in the marketplace.

The whole post is (as usual) an excellent read.

Surprising Google Search ‘card’ error

Google search card error

It’s strange to see Google’s card results being so out of date.

As I mentioned on Twitter the other day, here’s an example of a search on the term ‘austrade ceo’. The current Austrade CEO Bruce Gosper has been at the helm since February 2013. And yet, Google shows this ‘card’ for Peter Grey (who retired in November 2012).

What’s even stranger though is that all the top results on the search term are correct – it’s only this card that is wrong.

It’s not as though it’s a small site, or a little known executive – this is the CEO of a major Australian Government site. Google, sometimes you surprise me.

(Disclaimer: Austrade is a client of mine.)

Austrade CEO Bruce Gosper

This Is What Losing Money Looks Like

The following graph shows the daily traffic for a quality consumer information site (I was brought in to analyse the issue this week):

Heartbreak City

The pain starts in early May and then turns to heartbreak as traffic declines over the following month. From almost 20K visits to under 2K visits daily. That’s a 90% drop in traffic. And for this site – where the business model revolves around referring visitors to suppliers and taking a commission – that’s a massive drop in income.

Here’s the similar downward trend in their rankings (ie the number of terms they rank for):

Organic rankings drop

So, what happened in early May? Well, that’s when they went live with a brand new site. New design, new CMS, new site layout and a new hope…

What caused the drop then? I’ll spare you the investigative process (was it the CMS, a site architecture issue, a Google algorithm change, a site content change, a penalty?) and simply cut to the chase – the main problem is they didn’t put in redirects from the old site pages to the new pages.

For a site like this one (with close to 5000 pages) it’s absolutely crucial that key technical SEO considerations such as redirects are implemented – they simply aren’t optional. I say ‘key’ technical SEO consideration because SEO is a mix of technical configuration and art. Even if you are dubious about the ‘art’ side of SEO, don’t neglect the technical side – things like redirects, robots.txt files, sitemap.xml files, Title tags, and proper caching and expiry settings – things that most web developers should understand.
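As a tiny illustration of those technical basics, here's what a minimal robots.txt that allows crawling and declares the sitemap location looks like (the domain is purely illustrative):

```
# robots.txt at the site root – allow all crawling, point to the sitemap
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```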

Sadly though, it wasn’t until late July that someone realised the redirects were missing and started putting them in place (and only a subset at that):

Redirects put in place a month too late

But by then it was too late – the damage was done and any chance of getting the traffic back quickly is small.

Here’s why.

In an ideal world here’s how Googlebot would work:

Hmmm, I’m seeing a lot of 404s on the site, but I know the site is OK because the home page is fine, and the first level of links are all crawlable. Plus I’ve had a good history of crawling this site. I wonder if they’ve moved to a new system and are using a new URL structure. I’ll check. Yes, it looks as though they are. I’ll use some of my infinite crawling and indexing resource to go through the entire new site and intelligently update the old links I have in my index with the new ones.

Sadly though, that’s not how it works. Here’s a more accurate picture of how Googlebot works:

WTF? I’m getting lots of 404s on this site. I’ll check back tomorrow and see if the errors are still occurring. Yep, looks like there are still lots of errors. Well I’m not going to waste much of my finite crawling and indexing resource on this site. In fact I’m going to significantly reduce how much resource I allocate the site, and also drop the thousands of pages that are giving me 404s from my index. Ahh that’s better, I only have a few hundred pages to worry about now. And in the coming months I might (just might) spend a tiny bit of resource looking for new pages on the site.

Admittedly the above is greatly simplified – Googlebot is actually incredibly sophisticated (it does all sorts of cool things like indexing the content in lazy-loading JavaScript comment plugins) – but on the crawling basics the above is what you can expect.

With a new site the redirects need to be in place from day one – they can’t just be added later. Redirects aren’t like a performance tune-up (‘oops, I forgot to add caching on the server – I’ll just add that now’) – once your URLs are out of Google’s index, they’re gone.

How to Set Up 301 Redirects

Implementing 301 redirects is actually really easy (although preparing the actual mappings of old to new pages can be very time-consuming).

For bulk redirects (a new site going live), these are normally done at the server level.

IIS Redirects

On Windows/IIS hosting the easiest way to configure redirects is via rewritemaps.config files. A rewritemaps.config file is a simple text file placed in the same directory as the web.config file, with an entry added to web.config that refers to it. Ruslan covers using rewritemaps.config files really well.
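To give a feel for the shape of these files, here's a minimal sketch (it assumes the IIS URL Rewrite module is installed; the map name OldUrls and the paths are purely illustrative). The rewritemaps.config file holds just the old-to-new pairs:

```xml
<!-- rewritemaps.config : just the old-to-new URL pairs -->
<rewriteMaps>
  <rewriteMap name="OldUrls">
    <add key="/old/about-us.aspx" value="/about/" />
    <add key="/old/contact.aspx" value="/contact/" />
  </rewriteMap>
</rewriteMaps>
```

while web.config references the map file and contains a rule that issues a permanent redirect for any request whose path appears in the map:

```xml
<!-- web.config : reference the map file and redirect matching requests -->
<configuration>
  <system.webServer>
    <rewrite>
      <rewriteMaps configSource="rewritemaps.config" />
      <rules>
        <rule name="Redirects from rewritemaps.config" stopProcessing="true">
          <match url=".*" />
          <conditions>
            <add input="{OldUrls:{REQUEST_URI}}" pattern="(.+)" />
          </conditions>
          <action type="Redirect" url="{C:1}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

Keeping the mappings in their own file is what makes the hand-over workflow possible – the IT team only ever swaps one text file.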

My usual process when engaging with larger companies who use IIS is to provide their IT team with a fully prepared rewritemaps.config file which they just need to place on the server. I never have to go near their servers or even be given FTP access to anything.

Apache Redirects

On Linux/Apache hosting (the majority of PHP and open source CMS hosting scenarios) redirects are implemented in a .htaccess file (also a simple text file). I won’t go through the details – there are hundreds of posts on how to set up .htaccess files – but each line simply takes the form Redirect 301 old-path new-URL. Here’s a typical example of a redirect line in a .htaccess file (from this site):

Redirect 301 /live/post/2004/06/27/Single-Malt-Scotch.aspx

My site currently uses WordPress, before that it was BlogEngine, before that it was a custom rolled blog [hey, we’ve all been there :-)], and before that it was Blogger, so there’s been some changes over the years. Because of that my current .htaccess file has more than 1100 redirects from old URLs to current locations. These all sit in the .htaccess file at the domain root.

Both IIS and Apache redirects support cross domain redirects, and both support parameters as well.
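As a sketch of both of those on the Apache side (all URLs here are hypothetical), a cross-domain redirect just uses a full URL as the destination, and RedirectMatch captures parameters from the old path:

```apache
# Cross-domain: the destination can be a full URL on another domain
Redirect 301 /old-page https://www.newdomain.example/new-page

# Parameterised: capture the year and slug from the old blog URLs
# and carry them into the new archive structure
RedirectMatch 301 ^/blog/([0-9]{4})/(.+)$ https://www.newdomain.example/archive/$1/$2
```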

Content Management System Redirects

Increasingly you can set up redirects within the CMS used as well.

For example, in WordPress I always install the Redirection plugin. This can help with bulk URL changes eg if you change the permalink structure – it will automatically catch the change and set up redirects from all the old URLs to the new ones. (Make sure you have the plugin enabled before you make the permalink changes!) This plugin will also catch any ad hoc changes made (eg you create a new page, and then a few days later change its URL).

WordPress Permalink Structure

elcomCMS (a CMS I deal with a lot in enterprises) also has this as a standard feature. Any changes to pages have redirects automatically created for them, and you can easily add multiple redirects for any page. You’d be surprised at how rare this is among enterprise CMSs.

How To Prepare Redirect Mappings

Preparing the mappings is actually the time-consuming part. Most clients I work with these days have production and staging sites. In the weeks leading up to the cutover I request that the staging site be available externally (usually IP limited) so I can fully crawl it. I usually use ScreamingFrog to crawl the staging site and the current (production) site. There are tons of other tools of course, eg I also use this one from time to time.

Most CMSs have an export tool as well, but I don’t typically use them unless I’m familiar with the export format. I’m really familiar with ScreamingFrog so it’s my usual go-to tool. I also prefer to use a tool that crawls the site (as opposed to exporting from an internal database) because the URLs discovered will more closely match what Google has likely found in the past (or will find in the future).

From there it’s a manual process of matching the old to the new URLs.

Excel is Your Friend

I do this in Excel, and depending on patterns in the before and after URLs you can often use sorting, formulas and lookups to help expedite the process. On big sites I usually need to engage with the subject matter experts and content authors to check on information architecture flows. I mention this because it’s important to understand any content strategy changes being implemented as part of the new site.

Once my spreadsheet is ready I then simply format the mappings into either .htaccess or rewritemaps.config files and hand over to the client.
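That last formatting step is easy to script. Here's a rough sketch for the .htaccess case, assuming the spreadsheet has been exported as a two-column old-path,new-path CSV (the function and file names are illustrative, not part of any particular toolchain):

```python
import csv


def mappings_to_htaccess(csv_path):
    """Turn a CSV of old-path,new-path rows into Apache 'Redirect 301' lines."""
    lines = []
    with open(csv_path, newline="") as f:
        for row in csv.reader(f):
            if len(row) < 2 or not row[0].strip():
                continue  # skip blank or malformed rows
            lines.append(f"Redirect 301 {row[0].strip()} {row[1].strip()}")
    return "\n".join(lines)
```

The output can be pasted straight into the .htaccess file at the domain root (or reshaped into rewritemaps.config entries for IIS).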

This will cover 95% of all the URL changes.

Bonus Redirects

Sometimes there are old links out there that give errors eg an external site might link to a page that no longer exists on your current site. I use backlink tools (eg ahrefs, Majestic, OpenSiteExplorer) to find all the backlinks from other external sites. I run these through a link checker and note any 404s. I then add redirects for these too. This can result in a little boost for the site when it goes live. Bonus!
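The bonus-redirect step can be sketched along these lines (the function names and the injected status lookup are illustrative – in practice the status check would be an HTTP HEAD request, and the new-home mapping comes from the same spreadsheet work as above):

```python
from urllib.parse import urlparse


def redirects_for_broken_backlinks(urls, status_for, new_home):
    """Emit 'Redirect 301' lines for backlinked URLs that return 404.

    status_for: callable mapping a URL to its HTTP status code; injected
    here so the logic is testable offline -- in practice it would issue
    a HEAD request (eg via urllib.request).
    new_home:   dict mapping old paths to their replacement paths.
    """
    lines = []
    for url in sorted(set(urls)):  # dedupe the backlink targets
        if status_for(url) != 404:
            continue
        path = urlparse(url).path
        if path in new_home:
            lines.append(f"Redirect 301 {path} {new_home[path]}")
    return lines
```

Any 404ing backlink target without an entry in the mapping is simply skipped – those are the ones worth a manual look before go-live.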

A Guide To Effort Required

As mentioned, the time consuming part of setting up redirects is preparing the mappings between old URLs and new ones.

As an example, I was involved in a large-ish Federal Government site revamp last year – they had close to 2000 pages of content, plus hundreds of additional documents (remember Google indexes PDF files and other documents – you can often rank well for PDFs). In that case constructing the redirect mappings alone took me close to 4 whole days. And would have taken much longer if not for the help of the content authors who helped with specific sections of the site. At my hourly rates, that might seem a high price to pay for something that doesn’t actually add any additional value, but compared to the potential cost of not doing it, it’s a no-brainer.

This is worth highlighting – because the next time you’re preparing a large site go-live and the project manager gets a quote for $5K+ just to prepare the redirects, don’t be surprised. In the case of the site example at the start, that cost would have been recouped within a few hours of a typical day’s traffic (pre-fall). Currently they are spending multiples of that working with an agency to incrementally rebuild the traffic.


Sadly, not putting redirects in place is something I see far too often. It’s probably the biggest issue (in terms of impact) I encounter. And it’s totally avoidable. Please ensure you plan for and implement redirects in all your upcoming web projects.

Google Instant Roundup

It’s been a week now since Google Instant hit the news desks of IT blogs everywhere.

I’ve started getting a few questions from clients and friends about it, so I thought I’d put together a simple collection of recommended links, and finish off with some of my thoughts. There’s been tons of discussion of course, but for a good summary of the main points, the following 4 or 5 posts will serve you well:

SEObook – How Google Instant Changes the SEO Landscape

Aaron Wall has a good overview of the announcement and summary of the main effects (and if you are an SEObook member a very detailed analysis inside) as well as some interesting thoughts in the comments. Primarily it seems as though the changes are profit motivated, wrapped up in a ‘it’s faster and saves time’ ribbon.

Search Engine Land – Google Instant: The Complete User’s Guide

Matt McGee provides a good analysis of the user experience, covering what kinds of searches work well and which don’t, as well as how it works (it’s localised and personalised) and how to disable it.

John Ellis covers probably the main concern of SEOs: that Google Instant will reduce the number of long tail searches (ie searches with 4 or more keywords). His post – Will Google Instant Kill the Long Tail? is pretty much on the money as far as I’m concerned.

Danny Sullivan has a great post – SEO is here to stay, It will never die – where he answers the inevitable ‘does this mean SEO is irrelevant now’ questions. Answer: no. Danny also puts Google Instant through the George Carlin ‘seven words’ test in this post.

PPCblog – How AdWords Counts Ad Impressions with Google Instant

Geordie Carswell clarifies how impressions are counted. Initially some advertisers were worried that their impression counts would be based on the instant search keystrokes. But as per the Google AdWords help, impressions are only counted when a user clicks, presses enter, selects a predicted query or results are displayed for 3+ seconds.

That 3 second rule… BTW here’s a possible gotcha – type a keyword term and hold the Enter key for 3 seconds. When you release you’ll be taken auto-magically to the first result (this result will be the first paid result if some are listed, otherwise the first organic result). Yep – in a competitive field, someone might be getting a bunch of unexpected clicks if they are in the first ad position. This could be good and bad. Good in terms of raising your CTR, bad in terms of sending you possibly inappropriate clicks. [Hat tip to the SEOBook forums (where I’m a member) for first discussing this issue.]

Matt Cutts – Thoughts on Google Instant

Finally, Matt Cutts finishes with a nice Q+A style post where he outlines why SEO is still important, in spite of what some people trot out. Summary: SEO is very important, and gets more important every time Google puts out a major change (not to mention the hundreds of minor changes they release every year).

SEOmoz – strangely, no post from Rand on the Google Instant release…

Here’s the obligatory Introducing Google Instant video, complete with ‘zetabyte barrier’ and ‘magic’ references as well as footage of people being astounded by how useful Google Instant is – ‘oh look at that… I like that’.

To be fair, Google Instant is pretty cool. The impressive part is how they’ve managed to scale this out. As many have pointed out, the technology itself is not that innovative – Long Zheng had a similar concept he dubbed The Real Live Search (based on the Bing APIs) working over a year ago. And Yahoo had their own Instant Search plans as far back as 2005.

Instead what is impressive with Google Instant is how they are providing the results plus advertising so quickly. It’s a massive undertaking.

My experiences so far

Personally, as a user, I quite like the experience.

But putting my SEO and AdWords hat on, what are the effects?

It’s only early days so it’s not yet clear how big an impact this is going to have on SEO and PPC. In the private forums I’m a member of, the jury is still out, with people having varied experiences. With that caveat, I thought I’d share some of my own observations based on my own sites, and some of my clients’ sites.

The summary (thankfully) appears to be that there are only minimal changes on both the organic and paid fronts:


After a week I’m relieved that our revenue-producing sites have experienced little impact. Our organizing site for example has had a small decrease in organic traffic, mostly in long tail terms (which we rank really well for). This is understandable given that Google Instant is encouraging shorter keyword phrases. Users are getting results immediately – they get distracted and don’t bother finishing the longer phrase they might have otherwise entered (as discussed earlier).

The result is that I’ll probably start focussing on optimising for some of the shorter terms again – something I haven’t put a high priority on lately, since shorter terms in our niches tend to have lower buyer intent.

On client sites that have focussed on shorter terms there has been a slight rise in traffic. This is pleasing but isn’t significant enough to say is definitely Google Instant having an impact – it may just be seasonal fluctuations at play. It’ll take a few months to see whether the trends really pan out.


On the other side of the page (ie the paid results via AdWords) we’ve seen an increase in both impressions and clicks. Again this is understandable because in our AdWords campaigns we tend to be top-heavy in the shorter terms. Our average CPC is also slightly up. As our campaigns mature the terms head more to longer tail, but when I’m experimenting and looking for new terms to target the terms are usually shorter (and broad match even!). Google Instant means I’ll probably be keeping short terms in campaigns longer…

I don’t run any local campaigns on my own sites, but I do manage them for clients – and this has been one area of noticeable improvement. The click through rate on location based ad groups started increasing on 12 September (a few days after Google Instant was released) – I assume as it started rolling out by default to Australian users. I’ve had specific location keywords jumping from less than 1% CTR to up to 20% in some cases. CPCs are slightly down also, so this is a great result… but I suspect this window of opportunity won’t last long as most PPC managers cotton on to it.


Overall then, the Google Instant changes haven’t had a major negative effect on my SEO or PPC efforts, and there’s even some positive opportunity in the local side of things. So, Google Instant will affect my approach in some areas going forward.

But What About Bing?

Does Google Instant mean I should focus more on Bing?

This shouldn’t really change your attitude to Bing. Because basically, if you haven’t been targeting Bing yet, then you definitely should be – but not because of Google Instant. You should be targeting Bing because it’s now heading to the number 2 spot, and commands close to a quarter of all US searches.

I get asked all the time how to target Bing over Google. It’s not a simple question to answer of course, but when forced I usually mention that Bing tends to favour fresh links – so if you’ve got a blog that’s regularly updated and getting freshly linked to, that’s a nice advantage to have (over a static web site that has an aging link graph). Just sayin’.

Introduction to Search Engine Marketing

Just realised I hadn’t posted this presentation I gave at the NSW.NET event last month. I was lucky enough to be presenting after Mark Vozzo who gave an excellent introduction to SEO, so my session on SEM flowed on nicely.

In the SEM presentation (embedded below) I went through the following main points:

  • How SEO and SEM work together (and covered the various definitions of SEM)
  • Why you should use paid search to complement your SEO activities
  • Advice on using AdWords effectively including coverage of 3 useful strategies:
    • tight grouping
    • ad positioning
    • bid stacking
  • Facebook marketing strategies
  • Facebook Page tips and Facebook advertising tips

A big thank you to NSW.NET for having me along.

For additional links and resources check out my SEO Resources page.