SEO Checkup Using Google Webmaster Tools


If you’re an SEO beginner, Google Webmaster Tools (also known as GWT, if my fingers get lazy) is a great place to start a site tuneup. If you’re buried in SEO minutiae and need to pull together some intelligent, actionable to-do items for your site, you could do a lot worse than signing in at Google.com/webmasters/tools. Here’s my quick guide to an SEO checkup, GWT-style:

1. Go Looking For Trouble

First, fix what’s broke. The Crawl Errors report gives you a great head start.

Click Diagnostics::Crawl Errors. This report shows all of the broken links Googlebot found in its last crawl of your site. It may also show ‘soft’ 404 errors – pages that look broken to visitors but return a 200 instead of a proper 404 code – and pages where your server said “nope, I’m not responding. Phhbbbtttt.”

[Screenshot: broken links in the Crawl Errors report]

Fix the broken links wherever possible. If you can’t fix ’em, build pages at the broken link destinations or use 301 redirects to reroute visitors.
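Want to triage that list before you start fixing things? A few lines of Python will do it. This is just a rough sketch: it assumes the third-party requests library, the URLs are made up, and the ‘soft 404’ check is only a crude heuristic.

```python
import requests

# URLs copied out of the Crawl Errors report (made-up examples).
broken_urls = [
    "http://www.example.com/old-page",
    "http://www.example.com/deleted-product",
]

for url in broken_urls:
    try:
        resp = requests.get(url, allow_redirects=False, timeout=10)
    except requests.RequestException as exc:
        print(f"{url}: no response from the server ({exc})")
        continue

    if resp.status_code in (301, 302):
        # Already rerouted -- just confirm the target is the page you meant.
        print(f"{url}: redirects to {resp.headers.get('Location')}")
    elif resp.status_code == 404:
        print(f"{url}: true 404 -- fix the link, build a page here, or 301 it")
    elif resp.status_code == 200 and "not found" in resp.text.lower():
        # Crude heuristic for a soft 404: a 200 response that reads like an error page.
        print(f"{url}: possible soft 404 -- returns 200 but looks like an error page")
    else:
        print(f"{url}: returned {resp.status_code}")
```

GWT’s report is still the authoritative list; this just helps you sort the quick fixes from the real rebuilds.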

2. Find Duplicates Duplicates

Haha. Get it? Duplicates twice… it’s funny… sniff. I crack myself up.

Duplicate content is a long-standing SEO bugaboo. Use the HTML Suggestions report to help diagnose duplication problems.

Click Diagnostics::HTML suggestions, then click duplicate meta descriptions or duplicate title tags. The resulting report lists the pages on your site that share identical description or title tags:

[Screenshot: duplicate meta descriptions report]

This report can give you great insight into duplicate content. Where there’s duplicate metadata, there may be duplicate pages. For example, I clicked one of the pages listed and found this:

[Screenshot: duplicate description tag example]

The ?param=hello is creating a duplicate of my home page.
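If you suspect URL parameters are spawning duplicates like this across your site, a quick check is to strip the query strings and see which URLs collapse into the same page. Here’s a rough, standard-library-only sketch; the URL list is made up for illustration.

```python
from collections import defaultdict
from urllib.parse import urlsplit, urlunsplit

# Hypothetical list of URLs, e.g. exported from your analytics or a crawl.
urls = [
    "http://www.example.com/",
    "http://www.example.com/?param=hello",
    "http://www.example.com/widgets?sort=price",
]

groups = defaultdict(list)
for url in urls:
    parts = urlsplit(url)
    # Drop the query string and fragment so parameter variants collapse together.
    normalized = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    groups[normalized].append(url)

for normalized, variants in groups.items():
    if len(variants) > 1:
        print(f"{normalized} has {len(variants)} parameter variants: {variants}")
```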

Even if you don’t find a single duplicate page, it’s important that you have unique, descriptive title tags and compelling description meta tags for each page. So combing through the HTML suggestions report is always helpful.
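If you’d rather not comb through by hand, a small script can pull each page’s title tag and meta description and flag repeats, so you can double-check the report against your live pages. Again, just a sketch: it assumes the third-party requests and beautifulsoup4 libraries, and the page list is hypothetical.

```python
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

# Hypothetical list of pages to audit.
pages = [
    "http://www.example.com/",
    "http://www.example.com/about",
    "http://www.example.com/contact",
]

titles = defaultdict(list)
descriptions = defaultdict(list)

for url in pages:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else "(missing)"
    meta = soup.find("meta", attrs={"name": "description"})
    description = meta["content"].strip() if meta and meta.get("content") else "(missing)"
    titles[title].append(url)
    descriptions[description].append(url)

# Any title or description shared by more than one URL is worth a look.
for label, index in (("title", titles), ("description", descriptions)):
    for text, shared_urls in index.items():
        if len(shared_urls) > 1:
            print(f"Duplicate {label} '{text}' on: {shared_urls}")
```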

3. Find Crawl Depth Problems

In a perfect world, you want Google to crawl 100% of your site’s visible pages. Use the crawl stats report to see how close you’re getting to this ideal.

Now, click Diagnostics::Crawl stats. Google gives you a succinct report showing you how many pages Googlebot’s crawling per day, the number of kilobytes downloaded, and the average time spent downloading each page:

[Screenshot: Crawl stats report]

Ideally, you want to minimize the time Googlebot spends downloading each page and the kilobytes it downloads, and maximize the number of pages it crawls per day. But what really matters is the trend.

If pages crawled per day decreases but time spent downloading a page increases, check your site for performance problems. Poorly compressed images, bloated code and server problems are a few issues that could be hurting Google’s ability to crawl your site.
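If the trend points that way, a quick timing pass over a few representative pages can help you spot the heavy or slow ones before you dig into server logs. A rough sketch: it assumes the third-party requests library, the URLs are made up, and a single request is only a coarse indicator of real-world performance.

```python
import requests

# A few representative pages to spot-check (hypothetical URLs).
pages = [
    "http://www.example.com/",
    "http://www.example.com/category/widgets",
    "http://www.example.com/blog/latest-post",
]

for url in pages:
    resp = requests.get(url, timeout=30)
    seconds = resp.elapsed.total_seconds()   # time from sending the request until the response headers arrived
    kilobytes = len(resp.content) / 1024.0   # body size after any gzip/deflate decoding
    print(f"{url}: {resp.status_code}, {kilobytes:.1f} KB in {seconds:.2f}s")
```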

CREDIT: 9 Step SEO Checkup Using Google Webmaster Tools
