Conducting An SEO Audit To Troubleshoot Problems – SMX Advanced

We’re back for day 2 of SMX Advanced. My first session of the day: “Conducting An SEO Audit To Troubleshoot Problems & Tune-Up Performance”, moderated by Vanessa Fox. Speaking on the panel:

Adam Audette, Founder, Audettemedia, Inc.
Vanessa Fox, Contributing Editor, Search Engine Land
Derrick Wheeler, Senior SEO Architect, Microsoft

Here’s the overview of the panel:

Has something gone wrong with your organic search engine traffic? An SEO audit might be in order. This session covers how to conduct an efficient audit that troubleshoots real problems, rather than taking you down blind alleys. It also helps you reassess your current SEO efforts for areas that can be tweaked and improved.

On with business. First up, Derrick Wheeler.

Derrick starts with an overview of how search engines work, talking us through how a search engine crawler finds and indexes your site. He’s using some pretty cool pointer graphics in PowerPoint, and though the material is basic, he’s an entertaining speaker.

Derrick explains that we’ll be looking at the process below and learning at which points SEOs can find and fix problems:

Crawl, Index, Rank, Traffic, Action:

1) Search engine crawls site
2) Search engine indexes site
3) Users perform targeted queries
4) Search engine ranks appropriate site
5) Users click
6) Users take action

Successful troubleshooting requires data. Derrick advises us to keep several lists of keywords and track rankings for all of them on a regular basis. Keep the data so you can monitor your progress and analyse historical trends when problems need resolving.
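
That advice is easy to put into practice. As a minimal sketch of the kind of tracking Derrick describes – assuming you already pull positions from your rank-checking tool of choice, and with made-up keywords and file name – you could append each day’s numbers to a CSV:

```python
import csv
from datetime import date
from pathlib import Path

HISTORY_FILE = Path("rank_history.csv")  # hypothetical file name

def record_rankings(rankings: dict[str, int]) -> None:
    """Append today's keyword positions to a CSV so trends can be analysed later."""
    is_new = not HISTORY_FILE.exists()
    with HISTORY_FILE.open("a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["date", "keyword", "position"])
        for keyword, position in rankings.items():
            writer.writerow([date.today().isoformat(), keyword, position])

# Example: positions pulled from your rank-checking tool today (made-up data)
record_rankings({"seo audit": 4, "site architecture": 12})
```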

That was a pretty short presentation, and I realise the format of this session is quite different to the usual – a lot more time will be dedicated to Q&A, which is quite cool.

Adam Audette comes on stage.

Site audits are part art, part science. With experience, SEOs get a “feel” for problems. Adam explains that site audits are extremely work-intensive and rely heavily on experience. Problem solving is crucial: the basics are easily taught, but deep expertise takes time to develop.

After a brief introduction to himself, Adam continues.

SEO Audits are part art. Follow your nose and use your experience to find problems. It requires diligence and trust in yourself and your abilities. Your clients trust you to deliver value.

Part science

Use tools to diagnose, run calculations and watch for a defined set of factors in your work. Adam presents an entire checklist (there are at least 50 bullets on this slide).

A framework for site audits:

On page

  • domains
  • sections and categories
  • pages
  • media

Off page

  • Backlinks
  • Social media signals
  • Cache dates, indexed pages
  • Toolbar pagerank

Marketleap has a tool that shows historical indexed pages.

The Big 4 factors

  • URLs
  • Site architecture and navigation
  • Product level pages
  • Site latency (page load times – slow = bad user experience! – a quick timing sketch follows this list)
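
As a quick way to get a feel for latency without firing up a full proxy, you can simply time a handful of requests. A rough sketch (the URLs are placeholders for your own templates):

```python
import time
import urllib.request

# Placeholder URLs: substitute the page templates you care about
PAGES = [
    "https://www.example.com/",
    "https://www.example.com/products/widget",
]

for url in PAGES:
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as resp:
        resp.read()  # include the body download in the timing
    elapsed = time.perf_counter() - start
    print(f"{url}: {elapsed * 1000:.0f} ms")
```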

What about deliverables?

  • Summarise
  • Keep it prioritised
  • Keep it actionable
  • Build in follow up – roadmap your next steps
  • “Sizzle” matters – presentation, your work must look professional and authoritative

Adam shows us a screenshot of an audit document with an executive summary covering all of the problems identified, explaining each problem and summarising its impact. Site audits are typically large documents.

Some cool tools

  • Google searches (e.g. inurl:, site:). Use screenshots of Google SERPs to explain the problems you find
  • Use Lynx and SEO-browser.com to see the site the way a search engine crawler does
  • Charles is a debugging proxy for analysing performance, page load and response times (Fiddler is the Windows equivalent)
  • YSlow with Firebug
  • Check out these Firefox plugins for SEO
  • Wave toolbar for accessibility monitoring
  • Wget, Linkscape and Audette Media’s Free SEO diagnostics tool (log file based)
  • SEMrush

Vanessa comes back on and mentions the IIS SEO Toolkit – check it out. Then she begins her own presentation.

You really have to dive in and work through all of your potential issues to find the problem. Vanessa presents a flow chart of the process she follows.

Some cool tips and links from Vanessa:

  • Jane and Robot – Vanessa’s own site – has checklists and SEO tools.
  • Firefox SEO tools list (already covered above)
  • Do you really have a problem? Checks like indexed page counts don’t necessarily indicate one.
  • Take a benchmark of your top 10 queries and use Rank Checker to keep an eye on their positions (keep the ranking URL in the data).
  • Use an Apache log analyser to see how search engine crawlers work through your site, and categorise your pages to spot patterns (see the log-parsing sketch after this list).
  • Latency is not a rankings factor but it is a crawl factor. Check your latency and keep your page load times low.
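
To sketch the log-analysis idea: pull the lines a crawler requested out of a combined-format Apache log and bucket them by top-level section. The log path and the “category = first path segment” rule are my assumptions – adapt them to your own URL structure:

```python
import re
from collections import Counter

# Combined log format: capture the request path and the quoted user agent
LINE_RE = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[^"]*".*"([^"]*)"$')

def crawl_by_section(log_path: str, bot: str = "Googlebot") -> Counter:
    """Count crawler hits per top-level site section (first path segment)."""
    sections = Counter()
    with open(log_path) as log:
        for line in log:
            m = LINE_RE.search(line)
            if not m or bot not in m.group(2):
                continue
            path = m.group(1)
            section = path.split("/")[1] if "/" in path else ""
            sections["/" + section] += 1
    return sections

# Hypothetical log location – point this at your own access log
for section, hits in crawl_by_section("/var/log/apache2/access.log").most_common(10):
    print(f"{section}: {hits} crawler hits")
```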

Vanessa covered the common issues you’ll come across with link discoverability in Flash and Ajax-based sites – use Google Webmaster Tools to check for problems. She (very quickly) worked through a URL structure checklist, a canonicalisation checklist and a crawl efficiency checklist – so fast I could barely read the slides! Hopefully she’ll publish the lists on her blog. A very inspiring idea, and the best advice I can give you in this post is to sit down, construct those checklists yourself and develop them as you carry out site audits.

Submit multiple sitemaps arranged by category (e.g. products) and use the Google Webmaster Tools “number of indexed pages” data as a more accurate way of getting a sense of how well your site sections are indexed. That’s a nice tip!
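
If your pages are already grouped by category, generating a sitemap index that points at one sitemap per category is straightforward. A minimal sketch with invented categories and URLs:

```python
from datetime import date

# Hypothetical categories, each already written out as its own sitemap file
CATEGORIES = ["products", "support", "downloads"]

def sitemap_index(base_url: str) -> str:
    """Build a sitemap index file referencing one sitemap per site category."""
    entries = "\n".join(
        f"  <sitemap>\n"
        f"    <loc>{base_url}/sitemap-{cat}.xml</loc>\n"
        f"    <lastmod>{date.today().isoformat()}</lastmod>\n"
        f"  </sitemap>"
        for cat in CATEGORIES
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</sitemapindex>\n"
    )

print(sitemap_index("https://www.example.com"))
```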

Vanessa rounds up with some advice on how to submit a reinclusion request. Don’t file for reinclusion until you’ve actually fixed the issue!

Sometimes you have to look at the SERPs and see what’s there. How are you displaying in the search results?

Q&A

Vanessa took a question about the <noframes> and <noscript> tags, and recommended we read this post on Search Engine Land.

Nofollow – an attendee raised the contradiction that Google advises webmasters not to use nofollow, while the SEO speakers recommend its use. Vanessa suggests you “could” use nofollow but wants to check in with Matt before giving her full answer. For background, this was a question about nofollowing shopping cart URLs. Maile stepped in and said it’s better to disallow such URLs in robots.txt – shopping cart URLs evolve as customers add products – rather than marking the links rel=”nofollow”.
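
For reference, blocking cart URLs the way Maile describes takes only a couple of robots.txt lines – the /cart/ path here is purely illustrative, so match it to your own URL pattern:

```
User-agent: *
Disallow: /cart/
```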

Derrick comes back on stage to discuss Microsoft.com (“Taming the MSCOM Beast”) and gives us a three-tier framework for the components of SEO: Authority, Content, Structure. He mentioned he may add a fourth tier covering the organisational aspect of SEO.

The biggest challenge for Derrick at Microsoft.com is site structure. He identified his central theme as crawler efficiency. Microsoft have billions of indexed URLs and run at least 10 different content management systems. At this scale, the problems faced are huge. If one CMS has a spider trap, Google might only crawl 1 billion URLs, not 1.4 billion (I’m amazed at the size of their site).

Derrick presented the Microsoft.com SEO framework – a structured diagram encompassing the prioritised order of SEO initiatives, programs, and a measurement framework.

Site-wide SEO initiatives at Microsoft

1) Duplicate and undesirable pages (billions of pages)
2) Excessive use of redirects (40% of their URLs internally redirect)
3) Improper error handling, e.g. responding with a 200 when the page is actually a 404 (see the sketch after this list)
4) Structure of subsidiary content, e.g. microsoft.com/australia vs microsoft.com/en/australia
5) Low quality page titles and meta tags
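
Items 2 and 3 are easy to spot-check from the outside: request a URL you know is missing and confirm it returns a genuine 404, and fetch normal pages to count redirect hops. A hedged sketch using the third-party requests library (all URLs are placeholders):

```python
import requests  # third-party: pip install requests

# Placeholder URLs: one that should 404, plus pages to test for redirect chains
SHOULD_BE_404 = "https://www.example.com/this-page-does-not-exist"
PAGES = ["https://www.example.com/products", "https://www.example.com/Products/"]

# A missing page that answers 200 is a "soft 404" and wastes crawl budget
status = requests.get(SHOULD_BE_404, timeout=10, allow_redirects=False).status_code
print(f"missing page returned {status} (want 404)")

# Every internal redirect is crawl effort spent before any content is served
for url in PAGES:
    resp = requests.get(url, timeout=10)  # follows redirects by default
    print(f"{url}: {len(resp.history)} redirect(s) -> final status {resp.status_code}")
```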

Microsoft have an incredible framework for managing their site, but Derrick said the biggest challenge is getting recommendations actually implemented – 50% of their recommendations don’t get through.

This was a brilliant presentation and it was genuinely tough keeping up with all of the info covered. I recommend you check out Lisa’s site to pull together her notes too!