
The Risks Of Relying On Search Results

by Bob Sakayama

14 April 2011

(This is the longer version of the presentation made at the London seminar. Ray Snoddy's article on the topic is posted here)

Example Of Widespread, Unrecognized Risk:

I realize that the risk of a search engine penalty, or more specifically a Google penalty, is not well understood, even within the community of website businesses. So I'd like to start by demonstrating how this risk may be present for your enterprise very early in the process, long before any commerce even takes place on the new website.

Imagine this scenario:

Your enterprise has acquired the worldwide rights to market a hugely successful new product called Super Magic Widgets. This product has such incredible potential that the decision is made to create a separate internet business to directly market it, rather than distributing it through existing venues.

Of course, the very first action is to search for a suitable domain on which to launch the internet commerce platform for the marketing effort. The selection of the domain is very important, because if it carries the correct semantics, it can make a huge difference in the rank performance of the website. Search engines respond to the relevance of the domain's semantics, so if you're selling widgets, it's a very smart move to make sure the domain name includes the term "widgets." To your amazement, your marketing team finds that SuperMagicWidgets.com is available and buys it (with the blessing of your SEO team).

Right at this point, if you do not vet this domain for search viability before pouring resources into the implementation, you could create an existential risk for this new business. Because if this domain had a previous owner who nuked it with a toxic optimization strategy, it will never perform in the natural search. This happened to me in 2007, and since then I have prevented many clients from wasting resources on a dead domain.

Domains have now existed long enough that many in the pool of available names have had previous lives. Their previous owners were either unsuccessful at building businesses on them, lost interest, or created such a mess that the domain was 'nuked' so it could never rank. So prior to purchase, the enterprise must be able to know whether the domain is viable in the search. As more and more domains are abandoned, this problem grows with time. Also, when looking at available domains, the more generic and desirable the name, the more likely it was pre-owned.

In order to protect the enterprise, the domain name vet must proceed no matter where the domain is purchased - whether from the publicly available pool, or from an individual or business selling the domain. I am aware of one case where a domain sold by its previous owner for $20,000 turned out to be penalized and not recoverable.
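To make the vet concrete, here is a minimal sketch of one early step: querying the Internet Archive's Wayback Machine CDX API for prior captures of a candidate domain. The endpoint and parameters below are the Archive's public API; the candidate domain and the decision logic are hypothetical, and an archive history only flags a previous life - it does not by itself prove or rule out a penalty.

# Minimal sketch: check the Wayback Machine CDX API for evidence that a
# candidate domain had a previous life before you buy it.
# Assumption: supermagicwidgets.com is a hypothetical candidate domain.
import requests

def prior_captures(domain, limit=20):
    """Return (timestamp, url) pairs for archived captures of the domain."""
    resp = requests.get(
        "https://web.archive.org/cdx/search/cdx",
        params={
            "url": domain,
            "matchType": "domain",        # include captures anywhere on the domain
            "output": "json",
            "fl": "timestamp,original",   # only the fields we need
            "limit": limit,
        },
        timeout=30,
    )
    resp.raise_for_status()
    rows = resp.json() if resp.text.strip() else []
    return [tuple(r) for r in rows[1:]]   # first row is the field-name header

if __name__ == "__main__":
    captures = prior_captures("supermagicwidgets.com")
    if captures:
        print(f"{len(captures)} archived captures found; earliest: {captures[0][0]}")
        print("Domain is pre-owned: review the archived content and vet for penalties.")
    else:
        print("No archive history found; lower (but not zero) chance of a previous life.")

A full vet would go further: checking whether the domain is currently indexed, reviewing its backlink profile, and ideally confirming that a test page on the domain can rank for obscure terms before committing resources.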

One interesting side note: non-.com domains may have a lower risk of previous ownership while providing the same semantic edge. For example, .co domains became universally available only in the past year, so there should be a low probability of a previous owner having triggered penalties on them.

Most of the penalties that we work on are not related to domain purchases, but rather are applied to once-thriving websites. While there may be many variations on exactly how these sites are penalized, the one thing they all have in common is that they are financial emergencies - at least one major revenue stream is lost. In some cases the penalty is applied across a network of sites. These are existential threats to the businesses involved - I have seen businesses forced to shut down as a result of the revenue loss. Given this, it is critical to begin to understand the risks taken on when businesses choose strategies that depend upon performance in the natural search.

A Devastating, Yet Largely Unknown & Misunderstood Risk:

The enterprise thrives by making smart decisions, especially in the area of risk management. Internet businesses must contend with a special set of risks that accompany their heavy reliance on technology. The reliability of the website's functionality and the security of these systems are two obvious examples of well-known risks that have solutions in place.

But for internet businesses that rely on the natural search to provide sales and leads, there are significant risks that remain very poorly understood. These risks are present because the search ranks of your business are not within your control - you rely on a third party, the search engine, to provide the ranking mechanism. This dependency inserts a huge unknown into the pool of risks. One such risk is the possibility of devastating rank loss due to a search engine penalty.

According to a survey mentioned here, only 11% of the website owners who responded were aware of the possibility of a search engine penalty impacting their business. Given what we already know from our client work about the devastating consequences of Google penalties, this is shocking. We have seen $100 million businesses watch their revenue streams shut down overnight because all their previously productive sites could no longer be found in the search.

One explanation for the lack of knowledge about search penalties is that a penalty can only impact an organization whose websites have already achieved productive organic search ranks - in reality, that means positions 1-5 on page one of a Google search, but let's count any site on page one. The businesses that hold those positions are a small club compared to all others. Most businesses BUY their search engine traffic, via ads that appear alongside the organic results. So the number of sites being harmed by search penalties is relatively small.

Another reason we don't know more: Most of our clients who have been penalized expect confidentiality, so the damage is essentially kept secret, and that just serves to insulate the larger community from awareness of the risks.

The small number of victims and the privacy concerns of those compromised businesses keep us ignorant. This ignorance creates an environment in which most successful search-dependent businesses do not understand the potential for catastrophic rank loss.

Now imagine the impact on a longstanding, search-dependent business of a severe penalty that degrades all of its productive ranks back to page 5 or worse. In many instances, that would be an existential threat to the enterprise. And it happens more often than most realize.

Why Websites Get Penalized

A large part of our work has been an attempt to understand why sites get penalized, and to gain insight into the restoration process. We have intentionally penalized sites, then unwound those penalties in order to discover where the red lines are. We also keep many domains in penalties as platforms for other experiments. This is really the only way to observe how penalties are handled by Google, and to gain an understanding of the impact on related sites. So, for example, we know that, for most compliance-based penalties, links from penalized sites do not cause harm - we can see that from many perspectives on our own sites.

Google would like us to believe that sites get penalized for falling outside their published "guidelines" - their rule set for search behavior. And for the most part this is probably true - we see many penalties imposed for obvious breaches of the guidelines. And their ranks return when the breach is remedied - i.e., when search compliance is restored.

Many retail SEOs believe that all rank issues can be remedied with optimization. We view the process of obtaining productive high ranks as broken into two distinct parts: compliance and optimization. Search compliance means making sure the site is implemented within the guidelines and is indexable, or viable for the bots. Optimization is the strategy or technique used to push ranks higher. Attempting to optimize a non-compliant website will either fail (no rank improvement) or trigger an event (penalty). Maintaining search compliance is critical to avoiding penalties, and it is something to which automation can be applied.
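As a minimal sketch of what that automation might look like, the snippet below checks a short list of pages for two basic indexability problems: being disallowed by robots.txt and carrying an accidental noindex meta tag. The page URLs are placeholders, and a real compliance monitor would cover far more signals (X-Robots-Tag headers, canonical tags, redirects, duplicate content, and so on).

# Minimal sketch of an automated indexability check: flag pages that are
# blocked by robots.txt or carry a noindex meta tag. The URLs are placeholders.
import urllib.robotparser
from urllib.parse import urlparse

import requests

def robots_allows(url, user_agent="Googlebot"):
    """True if robots.txt permits the given user agent to fetch the URL."""
    parsed = urlparse(url)
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(f"{parsed.scheme}://{parsed.netloc}/robots.txt")
    rp.read()
    return rp.can_fetch(user_agent, url)

def has_noindex(url):
    """Crude check for a noindex robots meta tag; a real audit would parse the
    DOM and also inspect X-Robots-Tag response headers."""
    html = requests.get(url, timeout=30).text.lower()
    return 'name="robots"' in html and "noindex" in html

if __name__ == "__main__":
    pages = [
        "https://www.example.com/",
        "https://www.example.com/products/",
    ]
    for page in pages:
        problems = []
        if not robots_allows(page):
            problems.append("blocked by robots.txt")
        if has_noindex(page):
            problems.append("noindex meta tag")
        print(page, "->", ", ".join(problems) if problems else "looks indexable")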

The world would be a much less risky place if the "published rules" compliance model were the only explanation for penalties. But it turns out that the published guidelines are only that - guidelines. The actual rules are secret. And constantly changing.

We know from first-hand experience that optimization strategies that performed fabulously in 2001 can now get you penalized. In fact, if you look at the list of "don'ts" from the guidelines (examples below), all of them were once effective strategies.

- Avoid hidden text or hidden links.
- Don't use cloaking or sneaky redirects.
- Don't send automated queries to Google.
- Don't create multiple pages, subdomains, or domains with substantially duplicate content.
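To make the first of those "don'ts" concrete, here is a minimal, assumed sketch of the kind of check a compliance audit might run for hidden text. It only catches the crudest case - text hidden with inline display:none or visibility:hidden styles; hiding via external stylesheets, off-screen positioning, or text colored to match the background would require rendering the page in a headless browser. The target URL is a placeholder.

# Minimal sketch: flag text hidden with inline CSS, one of the guideline
# breaches listed above. The target URL is a placeholder.
import re

import requests
from bs4 import BeautifulSoup

HIDDEN_STYLE = re.compile(r"display\s*:\s*none|visibility\s*:\s*hidden", re.I)

def hidden_text_blocks(url):
    """Return snippets of text inside elements hidden by inline styles."""
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    snippets = []
    for tag in soup.find_all(style=HIDDEN_STYLE):
        text = tag.get_text(" ", strip=True)
        if text:
            snippets.append(text[:80])
    return snippets

if __name__ == "__main__":
    for snippet in hidden_text_blocks("https://www.example.com/"):
        print("Hidden text found:", snippet)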

Google updates their algorithm very frequently - evidence of reordering sometimes appears daily. This constantly changing rule set, combined with zero transparency from the search engines, creates an environment where the strategies used by many SEO agencies lag behind current best practices - so much so that uninformed SEOs are one of the main causes of search penalties.

We also know that Google's infrastructure is vulnerable to attack, and that third parties are able to take advantage of these vulnerabilities to hack their index, inserting information that leads to high ranks being hijacked. Sometimes the frailties of the algorithm create rank issues by themselves. Google is usually very quick to address these issues when they are pointed out, but the fact that the frailties exist at all creates a risk.

Another consideration is that most large sites are complex structures, and their complexity often explains how they might fall outside compliance inadvertently, and how that might escape detection until a penalty is imposed. Automation can be your best friend when it's working for you, and your worst nightmare when it triggers an unintended non-compliance. Most penalties are triggered by human error or oversight, and at the enterprise level this includes compliance breaches caused by everything from poorly coded automation to mistakes made by a copywriter. Ignorance is behind almost all penalties triggered by non-compliance.

Because businesses are harmed or destroyed by penalties, and because of how these penalties are imposed, ethics enters the conversation about search. We would hope that Google, as the dominant player, would be sensitive to the ethical quandaries it creates as a matter of course, and would in some way act to mitigate these risks. But this is not happening - in fact, the ethical issues seem to grow larger with time. Consider the ethical appropriateness of:

-1- imposing harsh punishments based on secret rules
-2- an unwillingness to even acknowledge when a site is penalized
-3- being the sole arbiter of justice with a stake in the decision
-4- permitting no recourse
-5- creating victims through Google's own frailty
-6- penalizing sites in Google for the actions of 3rd parties
-7- rules that change during the game
-8- the absence of an effective warning mechanism


Managing The Risks

So the risks of relying on the search are very real, but how are they being addressed? This is a question we all need to take seriously, because the current solutions are not viable. For example, a check of your errors and omissions policies probably rules out an insurance claim. A Google penalty is treated more like an act of God than an understood market risk.

Even our law enforcement agencies are unclear as to how sites harmed in the search can seek justice. I was once asked by a client to report an attack on their ranks to the FBI. This was an instance in which my client was losing half a million dollars in sales per day as a result of a Google penalty triggered by the actions of a third party. My client's sites were not hacked - Google's index was - so the attack on my client's revenue was indirect, but the impact was severe and instantaneous. After explaining several times what had taken place, and the losses caused by it, the FBI agent asked me, "What's the crime?" There was no way for them to take any kind of enforcement action because there was no obvious violation of law.

Even though our work is focused on performance in the search, we strongly recommend that clients move away from complete dependence on search ranks. This is the only way to mitigate this risk. Many companies create numerous websites and feel that this diversifies their exposure to a penalty, or to any kind of rank loss. But we often find penalized environments in which all of the websites are penalized simultaneously. Other online channels that can act independently of the search include a robust social media presence, targeted email campaigns, other advertising, and promotion.

Finally, there is one solution that is not within the grasp of us as individual businesses, but may be available if we act in concert. And that is to work to force the search engines to act more responsibly when it comes to their enforcement actions. Some form of regulation is clearly warranted, but exactly how this will come about is unclear. There are some tiny first steps in this direction that I would advocate. For example, simply notifying sites that they are penalized.

Every major industry that developed in the US - railroads, telephone, petroleum, steel - evolved in such a way that eventually a monopoly was created. And regulation was required to force ethical standards and fair play into those markets. We are at such a point now with search, but without the regulation.


Related notes - short-term solutions for penalized sites:

AdWords
Affiliates
Email marketing
Social media
Other search engines
