What is Spam 3.0?

Google’s users have complained about the declining quality of its search results.  Numerous causes are cited for this decline, from increasingly successful underhanded marketing tactics to Google deliberately propagating poor results for its own financial benefit, but the unifying theme of the clamor is clear:  these days, there’s more spam in the SERPs.

Spam was originally thought of as trickery in the SERPs, and the concept subsequently morphed to encompass notions of relevancy.  Spam defined as honest, relevant, but suboptimal results marks the third phase in the evolution of search spam:  spam 3.0.

What is Spam?

Spam is fundamentally “something lacking merit” that has achieved high search engine visibility as a result of deliberate manipulative effort (in the context of this discussion – just to be clear – I am not referring to content delivered by mechanisms outside of search, such as email).

Google has traditionally referred to this as “web spam” (Matt Cutts is “the head of Google’s Web spam team“), but I think it is increasingly limiting to think of spam exclusively as links in search results that point to websites (for example, Google may well now return content in a rich snippet that is delivered to it from a data feed, and doesn’t reside on a website per se).
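To make the data-feed point concrete, here is a minimal sketch of the kind of product record a merchant might deliver to a search engine without that content ever living on a web page.  The field names and values are hypothetical and don’t follow any particular feed specification:

```python
import json

# Hypothetical product record, of the sort a merchant might push to a
# search engine through a data feed rather than publish on a web page.
product = {
    "title": "Acme Widget, Model 3",
    "price": 19.99,
    "currency": "USD",
    "availability": "in stock",
    "link": "https://www.example.com/widgets/model-3",
}

# Serialized, a record like this could surface in a rich snippet without
# the searcher ever being pointed at a conventional web page result.
print(json.dumps(product, indent=2))
```

Whatever the delivery mechanism, search engine spam has four defining characteristics: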

  • Intentional.  Search engine spam must, by definition, be directed at improving a resource’s visibility in the search engines (in times past, one might have said “improving the rankings of a website”).  However appalling an included result, if it was not designed to perform well in search, it got there by chance.
  • Monetarily-focused.  For a spammer, the purpose of achieving high visibility in the search engines is to make money.  Exceptions exist where search engine spam has been propagated for reasons other than cash, but they are rare.
  • Cost-effective.  Search engine spam is at least designed to bring in more money than it costs to produce, though search engine spam has traditionally had an eye on an astronomical, rather than simply acceptable, ROI (see the back-of-the-envelope sketch after this list).
  • Suboptimal.  A top spam result is never the same as the top result that would be returned by a user conducting a thorough, independent survey of available resources.
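As an illustration of those last two characteristics, here is a sketch of the ROI arithmetic a spammer might run.  Every figure is hypothetical:

```python
# Hypothetical figures: what a spam campaign costs to produce
# versus what the resulting search visibility brings in.
content_cost = 500.0      # e.g. bulk-produced articles
promotion_cost = 1500.0   # e.g. purchased links
revenue = 30000.0         # income attributable to the rankings gained

cost = content_cost + promotion_cost
roi = (revenue - cost) / cost  # standard ROI: net gain over cost

# Traditional spam chased astronomical returns, not merely acceptable ones.
print(f"ROI: {roi:.0%}")  # -> ROI: 1400%
```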

The Evolution of Spam

The table below summarizes what I think has typified search spam through different phases, as well as the different ways search engines have responded to spammers’ efforts.  There’s a lot of overlap between categories, and certainly spam and spam counter-measures are not mutually exclusive to each phase (for example, spammers are still keyword stuffing, and Google is still working at neutralizing the impact of keyword stuffing).
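To illustrate, a crude keyword-stuffing check might look something like the sketch below.  This is a toy heuristic of my own devising, not Google’s actual method, and the 5% threshold is an arbitrary assumption:

```python
def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in text that are exactly keyword (case-insensitive)."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# Arbitrary threshold: flag pages where a single keyword dominates.
STUFFING_THRESHOLD = 0.05

page = "cheap widgets cheap widgets buy cheap widgets today cheap widgets"
if keyword_density(page, "cheap") > STUFFING_THRESHOLD:
    print("possible keyword stuffing")
```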

Does Google Support Spam 3.0?

Some commentators have gone so far as to suggest that Google is knowingly taking a laissez-faire attitude toward low-quality content in its results in order to encourage clicks on Google ads in the SERPs that seem more relevant than the organic results, or to encourage clicks on Google ads accompanying the low-quality content itself, or both.

Google understands where its bread is buttered.  By consistently providing better results than any competing search engine, Google will retain its enormous customer base, in turn enabling a steady flow of cash from its search-associated products like AdWords.  It is for reasons of corporate self-interest, not corporate altruism, that Google has seemed to largely adhere to the first two tenets of its stated corporate philosophy:  “Focus on the user and all else will follow” and, especially, “It’s best to do one thing really, really well.”

Spam 3.0 and the Future of SEO

Spam 3.0 has been made possible by the return on investment of search engine optimization.  Once upon a time, not that long ago, big business was skeptical of the value of SEO (or, at least, of the value of spending money to achieve a high degree of visibility in the search engines).  In realm after realm that ROI has now become statistically demonstrable, and increasingly companies are starting to get an idea of just how much they can invest in pleasing the search engines and still turn a profit.  At a time when search query volume remains high and the total value of online transactions continues to grow, the attraction of a robust search engine presence only increases.

In that not-so-long-ago age, the bulk of marketing dollars might have been spent on building brand equity through traditional mass media advertising, traditional public relations exercises and brick-and-mortar promotions.  Not only has an effective presence in search itself proven to be a brand-booster but, as brand-building has focused more and more on social media, those same web activities which build brand also help with search engine visibility.  The decision is no longer between spending 100K on TV ads and 100K on SEO, but between spending 100K on TV ads and 100K on things like blogs, review mechanisms, Facebook pages and Twitter accounts:  expenditures that promote brand awareness and improve that brand’s presence in search.

An increasing willingness to invest in search has led to an evolution in the tactics and strategies of search engine optimization.  There has been a relentless reworking of SEO “best practices” in enterprise environments, where large investments in content and technical infrastructure have produced stellar results, spurring yet further investments.  The noble efforts of individual SEOs crafting perfect title tags or acquiring that gold mine of a link from Sally’s Crochet Blog are being rendered increasingly ineffective by big players with deep pockets who are willing to spend, and spend big.

An excellent example of this is the rise of big multi-product retailers in the SERPs, particularly for product categories (as opposed to individual product listings, though the two are not mutually exclusive).  At one time it was manufacturers or resellers that specialized in a particular product area that had the greatest visibility in the SERPs.  Now, virtually regardless of the query, the same huge players turn up again and again in the search results.  Amazon.  Overstock.  Target.  Sears.  Google hasn’t suddenly discovered that these are good places for its users to buy stuff:  these companies have realized that their websites are good places for Google’s users to buy stuff.  They’ve got the thousands – or tens of thousands, or hundreds of thousands – of dollars that it takes to consistently achieve this sort of ascendancy in search.  Extensible search-friendly site architecture that requires sophisticated custom code and endless maintenance.  Complex review and rating mechanisms requiring extensive moderation, and even the support of sentiment analysis algorithms.  Product information delivered in a dizzying array of data types, from simple XML sitemaps to specialized structured data based on complex ontologies.  Marketing teams producing content, engaging on social networks, monitoring discussions, creating campaigns.  This all takes cash, and lots of it.
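To give a sense of the simplest end of that spectrum, here is a minimal sketch of sitemap generation using Python’s standard library.  The URLs are hypothetical, and a real retailer’s sitemap pipeline would be vastly more elaborate:

```python
import xml.etree.ElementTree as ET

# Hypothetical product URLs; a large retailer would generate these
# (and split them across many sitemap files) from a product database.
urls = [
    "https://www.example.com/products/widget-1",
    "https://www.example.com/products/widget-2",
]

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)
for url in urls:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url

# Writes a minimal sitemap.xml to the working directory.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```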

The evolution of SEO for multi-topic content sites is analogous to the evolution of SEO for multi-product retailers.  Demand Media didn’t simply have some good ideas about how to attract long-tail queries, they invested in them.  Demand has raised some $355 million in funding, and while its content network is not the totality of its business, this capital has allowed it to develop search-focused algorithms, editorial processes and websites that have resulted in a staggering amount of traffic from organic search.  Critics of Demand frequently make reference to its “cheap content” but, from a corporate perspective, that content has been anything but cheap to produce.

Ultimately the search success of big multi-topic content sites might prove more fragile than that of big multi-product ecommerce sites.  Good ecommerce SEO basically entails giving consumers what they want:  comprehensive and accurate information about products.  Good content SEO does not necessarily entail providing comprehensive and accurate information about topics.  Google might eventually be able to use signals (like an author’s reputation profile) to better assess the quality of an article, but the fact that store A sells item Y for amount Z fulfills a user demand in a very non-subjective way.  So where large online retailers can get away with being product generalists (especially if they enlist their customers as product specialists), multi-topic content peddlers may not always get away with being content generalists.  The exception to this may be collaborative multi-topic content sites supported by topic specialists, most notably Wikipedia.  In this way Wikipedia is analogous to Amazon, with volunteer contributors taking the place of volunteer product reviewers.
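If Google did move in that direction, one can imagine article quality being estimated as a weighted blend of such signals.  The sketch below is pure speculation; the signals and weights are invented for illustration:

```python
# Invented signals and weights; purely illustrative of the idea that
# subjective content quality might be estimated from multiple inputs.
def article_quality(author_reputation: float,
                    citation_score: float,
                    reader_engagement: float) -> float:
    """Weighted blend of hypothetical quality signals, each in [0, 1]."""
    weights = {"author": 0.5, "citations": 0.3, "engagement": 0.2}
    return (weights["author"] * author_reputation
            + weights["citations"] * citation_score
            + weights["engagement"] * reader_engagement)

print(article_quality(0.9, 0.4, 0.6))  # -> 0.69
```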

This does not mean, however, that “low quality” content will necessarily be supplanted in the SERPs by better quality content from a broader variety of sources.  As long as it remains profitable to do so, enterprising businesses will continue to feed the beast what it most wants to devour.  While this content might be “better,” it will still be intentionally designed for maximum visibility in search, and to achieve the maximum return on the substantial investment required for that search success.  Some might even call it spam.

CREDIT: SEOSKEPTIC
