Secondhand SEO: How to Optimize a Site Already “Optimized” by Someone Else. Part 1

As the manufacturer of a popular SEO software suite, from time to time we are asked how best to optimize a completed Web site that someone else has already attempted to optimize. Disappointed with the results of some self-proclaimed ‘SEO Specialists,’ Web site owners often decide to optimize and market their sites themselves.

That’s where we come in. In this issue of SEO MixTour, we offer key recommendations on which aspects of your site should be checked carefully for mistakes that might be impeding your site’s search engine visibility.

First, check if your site is crawlable. In other words, you need to find out if search engine spiders can access all of your site pages that you want presented in search engine results. How?

  • Open your robots.txt file (located at the root of your domain) and see if it contains any incorrect instructions for robots (for example: ‘User-agent: *’ followed by ‘Disallow: /’, which blocks all robots from your entire site). If your site has no robots.txt file, then all robots are able to crawl your entire site, which is OK unless you want certain specially protected site areas to be hidden from random visitors.
  • On all important pages of your site, check for the presence and content of a Meta robots tag. On pages you want indexed, the Meta robots tag should not contain the ‘noindex’ or ‘nofollow’ directives.
  • On all important pages of your site, check for the presence of a Meta refresh tag. Google doesn’t recommend them, so remove any Meta refresh tags and redirect visitors to a different URL with a server-side 301 redirect.
  • Check if your site has a valid Sitemap. Typically, a Sitemap is located at the root of your domain (e.g. sitemap.xml), but it may be placed in a different location. If you have no Sitemap, create one; see Google’s Webmaster Guidelines for details. You can perform all of the above checks using your Web CEO Editor and Optimization tools.
  • If you have a pure Flash-based site, you can’t hope for a good site ranking unless your Flash movies contain enough descriptive, keyword-rich text for search engines to read. Find more information about Flash site optimization on Adobe’s site and in the previous issue of SEO MixTour. You might also want to create keyword-rich static alternatives of your most important pages.
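The first two checks above (robots.txt rules and Meta robots tags) can be sketched in Python using only the standard library. The robots.txt rules, domain, and HTML snippet below are illustrative stand-ins, not taken from any real site:

```python
# Sketch: check whether a page is crawlable, assuming you already have
# the site's robots.txt text and page HTML in hand.
from urllib import robotparser
from html.parser import HTMLParser

def robots_allows(robots_txt: str, url: str, agent: str = "*") -> bool:
    """Return True if the given robots.txt text permits `agent` to fetch `url`."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)

class MetaRobotsParser(HTMLParser):
    """Collect the directives of any <meta name="robots"> tag on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives += [d.strip().lower() for d in a.get("content", "").split(",")]

# A blanket "Disallow: /" blocks every compliant crawler from the whole site:
blocked = robots_allows("User-agent: *\nDisallow: /", "http://example.com/page.html")

# A 'noindex' meta tag keeps an otherwise crawlable page out of the index:
parser = MetaRobotsParser()
parser.feed('<html><head><meta name="robots" content="noindex, nofollow"></head></html>')
```

Running this against real pages would only require fetching robots.txt and each page’s HTML first; the checks themselves stay the same.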

Second, make sure there are no technical mistakes that may be costing you good site ranking.

  • Ensure all your links have a consistent URL syntax, i.e. your links always use either the www form of your domain or the non-www form, never a mix of both. Make sure your server translates requests for the non-preferred form to the preferred form (which Google calls the ‘canonical URL’) with the help of a server-side 301 redirect. You can also create a Google Webmaster account and use its tools to tell Google your preferred domain syntax.
  • Check all of your important pages for repetitive titles. The content of your title tags should be unique and contain the key phrases for which each particular page is optimized. This is especially important if you have a CMS-based Web site. The same check can be performed to detect repetitive content in your Meta description tags, although this is not as critical these days as it was in the past.
  • If your site pages are dynamically generated, make sure your site uses a solution for creating SEO-friendly page URLs that contain keywords. Typically, this is done with the .htaccess file and mod_rewrite, replacing opaque query strings with meaningful names (product, article or book titles and the like) in the URL structure. If you have no idea what a more readable static alternative to a dynamic URL would look like, the Wikipedia article on clean URLs gives a general understanding. With a static site, it is even easier for you to create keyword-rich page URLs.
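On an Apache server, both the canonical 301 redirect and the keyword-rich rewriting of dynamic URLs are commonly handled in the .htaccess file. A sketch, assuming mod_rewrite is enabled; the domain, script name and URL pattern are hypothetical examples:

```apache
# Hypothetical .htaccess sketch; adjust the domain and patterns to your site.
RewriteEngine On

# 301-redirect the non-www form to the preferred www form (the canonical URL).
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Serve a keyword-rich, static-looking URL from the real dynamic script,
# e.g. /product/blue-widget is handled by /product.php?name=blue-widget
RewriteRule ^product/([a-z0-9-]+)$ /product.php?name=$1 [L]
```

Note that the rewrite rule keeps the clean URL in the visitor’s address bar, while the 301 rule deliberately changes it, telling search engines which domain form to index.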
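The repetitive-titles check above is easy to automate. A minimal Python sketch, assuming you have already crawled your pages into a URL-to-HTML mapping (the page data below is an illustrative stand-in):

```python
# Sketch: flag repetitive <title> tags across a set of pages.
from collections import defaultdict
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Extract the text of the page's <title> tag."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        if self.in_title:
            self.title += data

def find_duplicate_titles(pages: dict) -> dict:
    """Map each title to the URLs sharing it, keeping only the duplicated ones."""
    seen = defaultdict(list)
    for url, html in pages.items():
        p = TitleParser()
        p.feed(html)
        seen[p.title.strip()].append(url)
    return {t: urls for t, urls in seen.items() if len(urls) > 1}

pages = {
    "/index.html":   "<html><head><title>Acme Widgets</title></head></html>",
    "/about.html":   "<html><head><title>Acme Widgets</title></head></html>",
    "/widgets.html": "<html><head><title>Blue Widgets | Acme</title></head></html>",
}
dupes = find_duplicate_titles(pages)
```

The same pattern works for Meta description tags: swap the title parser for one that reads the ‘description’ meta content.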
