Google Penalties & Enterprise Search Compliance

The list of Google penalties includes issues that impact both large and small sites, but most of them have a component that especially impacts authority and multi-site enterprises. We added to Bob Sakayama's big list of Google penalties and retooled the discussion toward enterprise solutions.

-1- domain level redundancy
In small web businesses, these Google penalties are most often caused by webmasters who 'clone' sites - pointing the DNS of many different domains at the same directory so that each domain displays the exact same site - or who launch multiple interlinked domains. Of all the Google penalties, these two are really Bozo-level problems. If your enterprise suffers from a cloning issue or has multiple sites interlinked, you need to hire more robust developers. But for technically robust, large web entities, domain level redundancies can be triggered by other causes, including automation, shared functionality, syntax errors in robots.txt, and poor compliance oversight. The usual consequence is at least a trademark suppression, one of the more severe Google penalties.
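
A minimal Python sketch of one way an internal audit might flag cloned domains - the domain list is hypothetical - by fetching each homepage and comparing content hashes:

import hashlib
import urllib.request

# Hypothetical list of domains the enterprise controls.
DOMAINS = ["example.com", "example.net", "example-brand.org"]

def homepage_hash(domain):
    # Fetch the homepage and return a hash of the response body.
    with urllib.request.urlopen(f"http://{domain}/", timeout=10) as resp:
        return hashlib.sha256(resp.read()).hexdigest()

clones = {}
for domain in DOMAINS:
    clones.setdefault(homepage_hash(domain), []).append(domain)

# Any hash shared by more than one domain is a candidate clone.
for body_hash, domains in clones.items():
    if len(domains) > 1:
        print("Possible domain-level redundancy:", ", ".join(domains))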

-2- content redundancy
This happens when the same content is duplicated across multiple pages or sites, or when someone copies your site or content. Automation can do this quite well by accident. Generally you lose the content authority of the affected pages - sometimes of your homepage, if there are enough duplicates - and those pages will not rank #1 even for searches using unique snippets of text taken from the page.
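
One rough way to test a pair of pages for redundancy before Google does it for you is to compare word shingles. A sketch - the URLs are hypothetical and the 0.8 threshold is just a starting point:

import re
import urllib.request

def fetch(url):
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="ignore")

def shingles(text, k=5):
    # Lowercase word 5-grams; crude, since the raw HTML is not stripped.
    words = re.findall(r"[a-z0-9]+", text.lower())
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(url_a, url_b):
    # Jaccard similarity of shingles - a rough near-duplicate signal.
    a, b = shingles(fetch(url_a)), shingles(fetch(url_b))
    return len(a & b) / len(a | b) if a and b else 0.0

# Hypothetical pair of pages suspected of sharing copied or templated text.
if similarity("https://example.com/page-a", "https://example.com/page-b") > 0.8:
    print("Pages look redundant - consolidate, rewrite, or canonicalize.")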

-3- multi-domain
Once a valid strategy - buying many domains, each addressing a separate keyword target - this is now a trigger of Google penalties. You can of course still do it if you pay careful attention to enterprise search compliance and build legitimate sites.

-4- link seller
If you get flagged as a link seller, your link juice goes away. Once your links no longer pass PageRank, they are worthless, even when used on your own site. Unless selling links is your core business, we recommend you not do this.

-5- young site
Referred to as the 'sandbox,' this is one of those Google penalties that is questioned in the forums. That's because you don't see this penalty unless you attempt to seo the site with a semantic build. Any effort to push hard on a site less than six months old using content will run into the disadvantages of youth. It's a real penalty. Every enterprise should be holding a set of backup domains, parked live, so the clock is running on them, just in case you have to abandon your original. And buy domains the moment the discussion of a new business gets underway, just to get the aging process going as soon as possible.

-6- intermittent ranks
An issue created when large data sites fluctuate in and out of the index. Usually caused by structural and masking issues. This is one issue where size can work against you. Certain kinds of filename masking protocols on large sites (100,000+ pages) are able to create confusion within Google, causing rank oscillations. In this case, you are a victim of Google.

-7- bad neighborhood
If you appear to be supporting sites selling porn, gambling, Viagra... It's interesting how we've never been able to find a casino, poker, or porn bad neighborhood that reliably triggers a penalty. The toxic pages we find (and there's evidence it's usually pages, occasionally entire sites) are on directory or blog sites probably owned by seos, link sellers, or former link sellers. And while you want to avoid porn or gambling, we're convinced it's something else that's triggering the penalty. And what if you ARE the bad neighborhood? If your enterprise is a casino, welcome to the rabbit hole - sites that ARE bad neighborhoods play by different (better) rules, because they aren't harmed by others in their bad neighborhood. This whole area needs focused research to determine exactly what the standards are (if any) for marking a bad neighborhood. Our results conflict with the general thinking here: we aren't finding toxic links with bad neighborhood tools. This category should probably be renamed 'toxic links.'

-8- spam
Do we even need to comment? But then again, are you sure your automation is not pumping out pages or files, running scripts, or chaining multiple redirects and getting them indexed?

-9- homeland security
Some sites' businesses make them targets of our new security infrastructure - like sellers of fake IDs, chemicals, etc. You may not rank well if finding your service or product is perceived to create a threat to the government or to security.

-10- canonical glitch
If your site does not resolve to a single canonical host - for example, the www subdomain - it is vulnerable to a problem whereby your own content can be seen as redundant with itself. This is Google's fault and they have it more under control now, but it's still showing up in our client base, especially with large sites.
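
A quick way to verify canonical resolution is to request the bare domain and confirm it redirects to the www host. A minimal sketch, with hypothetical hostnames:

import http.client

# Hypothetical host pair: the bare domain should 301 to the canonical www host.
BARE = "example.com"
CANONICAL = "www.example.com"

conn = http.client.HTTPConnection(BARE, timeout=10)
conn.request("GET", "/")
resp = conn.getresponse()
location = resp.getheader("Location") or ""

if resp.status in (301, 308) and CANONICAL in location:
    print("OK: bare domain redirects to the canonical host.")
else:
    print(f"Potential canonical glitch: {BARE}/ returned {resp.status} "
          f"(Location: {location or 'none'}).")
conn.close()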

-11- proxy hack
When a 3rd party is able to hijack your Google ranks by using a proxy and some cloaking expertise, you end up being penalized by Google. Another one that is Google's fault for not being robust enough. We've seen great damage done by this brilliant, nefarious technique. Bob has consulted with the FBI on behalf of clients who were harmed by organized attacks on valuable ranks. The enterprise needs to know more about this.

-12- other third party penalties
This includes both innocent and intentionally harmful tactics. You can be harmed when someone steals your content, or when one of your affiliates uses content off your site, or when a competitor intentionally places links to you from identified bad neighborhoods.

-13- Google algorithm
Their algorithm is far from perfect and is in fact quite broken. So when it fails it's the equivalent of a penalty if you're on the receiving end.

-14- masking issues
When dynamic sites use mod_rewrite to make pages appear static, many create ranking issues in the process. If you don't mask files uniquely, if you mask without filename extensions, or if you mask the filename but keep a dynamic (php or asp) extension, you may be creating an issue for your enterprise regarding rank, security, or both.
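
One simple check is to confirm that the underlying dynamic URL does not resolve as a second, indexable copy of the masked page. A sketch with a hypothetical URL:

import urllib.error
import urllib.request

# Hypothetical dynamic URL behind a masked static filename: it should not
# also resolve as a separate, indexable 200 page.
DYNAMIC = "https://example.com/product.php?id=123"

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Returning None makes urllib raise HTTPError instead of following the hop.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)
try:
    status = opener.open(DYNAMIC, timeout=10).status
except urllib.error.HTTPError as err:
    status = err.code

if status == 200:
    print("Dynamic URL answers 200 alongside the masked URL - possible duplicate.")
elif status in (301, 308):
    print("Dynamic URL redirects to its masked form - masking looks consistent.")
else:
    print("Dynamic URL returned", status)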

-15- subdomain issues
If your subs are not implemented as if they were separate domains, are used only to push rank, and/or are interlinked inappropriately (e.g., they share the main site's nav), they're eventually going to trigger a trademark suppression penalty. You can't treat subdomains as if they are pages of the main site - they must be given their own nav structure and content. (See the discovery tip in #16 below.)

-16- https leakage
If you permit the https side of your site to get indexed, you can create a redundant copy of every page within Google's index. This happened to many sites when Google started indexing forms. Check the index using the following searches: "site:domain.com inurl:https" and "site:domain.com -'www'" to find instances of both https and subdomains in the main index.
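
Beyond the index searches above, you can spot-check individual pages directly. A sketch with a hypothetical path - it flags pages where http and https both resolve independently with identical content:

import hashlib
import urllib.request

# Hypothetical page: if http and https both resolve independently (neither
# redirects to the other) and serve identical content, both can get indexed.
PATH = "example.com/products/widget"

def final_url_and_hash(url):
    with urllib.request.urlopen(url, timeout=10) as resp:
        # geturl() reflects any redirect that urlopen followed.
        return resp.geturl(), hashlib.sha256(resp.read()).hexdigest()

http_final, http_hash = final_url_and_hash(f"http://{PATH}")
https_final, https_hash = final_url_and_hash(f"https://{PATH}")

if http_final != https_final and http_hash == https_hash:
    print("http and https serve the same content at different URLs - possible leakage.")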

-17- 404 error correction issues (The GoDaddy penalty)
Bob has a client whose site has been suffering continuing rank suppression after receiving a penalty triggered by systemic error correction that was not sending a 404 in the header. We call it the GoDaddy penalty, because we first discovered it on a site hosted there. If you have a GD hosting account and you set your error correction to "show homepage," you may be vulnerable to this trigger. If a page does not exist, you see the homepage and the non-existent URL returns 200 OK. The result is that your homepage gets indexed for every instance of a page that should be a 404 - a huge problem if you've deleted any pages (you don't see the problem until you've deleted a file). Every deleted page becomes a redundant copy of your homepage. While we realize this is NOT a typical enterprise problem (since most enterprise operations don't use GD hosting), we're including it because the basis of this penalty is a flawed 404 strategy, something every large enterprise needs to be aware of. Make sure your deleted files either redirect properly or show 404 in the header, so you can either delete them from Google's index or permit them to age away gracefully.
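
A quick spot check: request a path that cannot exist and confirm the server answers with a real 404 rather than a 200 copy of the homepage. The site URL below is hypothetical:

import urllib.error
import urllib.request
import uuid

# Hypothetical site: request a path that cannot exist and confirm the server
# answers with a real 404, not a 200 copy of the homepage.
# Note: urlopen follows redirects, so a 301 to the homepage also shows as 200 here.
SITE = "https://example.com"
bogus_url = f"{SITE}/{uuid.uuid4().hex}"

try:
    with urllib.request.urlopen(bogus_url, timeout=10) as resp:
        print(f"{bogus_url} returned {resp.status} - a soft 404; deleted pages "
              "may be indexed as copies of whatever page is served instead.")
except urllib.error.HTTPError as err:
    if err.code == 404:
        print("OK: missing pages return a genuine 404 in the header.")
    else:
        print(f"Missing page returned {err.code} - review the error handling.")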

-18- unnatural links
From 2007 to 2014 this manual action was the most common penalty, because most seo agencies were using link schemes until Google cracked down. If you paid someone to post links to your site, it probably worked for a while. And some who really bought into it posted enough links to nuke the domain - it may not be worth attempting a recovery if the cost of link removal is high enough.

-19- too many ads above the fold
If your content is hidden behind ads and promotional material, this update may react badly to your pages.

-20- exact match domain
If your site was riding high because of an exact match domain, and you did not create the content to support that high rank, this update set things straight. Not a real penalty, more a closing of a loophole.

-21- Penguin
This automated penalty (you don't receive a manual action notice) addresses algorithmically what the manual action does for unnatural links. The problem is that you have to wait for the next update to recover, and since there's no manual action, you can't file for reconsideration. However, you can recover by using the disavow tool - you don't have to remove the links.
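
The disavow file itself is plain text: one "domain:" line or full URL per line, with # comment lines. A minimal sketch for generating one from an audit's findings - the domains and URLs are hypothetical:

# Hypothetical lists produced by a link audit.
BAD_DOMAINS = ["spammy-directory.example", "paid-links.example"]
BAD_URLS = ["http://blog.example/comment-spam-page.html"]

with open("disavow.txt", "w") as f:
    f.write("# Links identified during the link audit\n")
    for domain in BAD_DOMAINS:
        f.write(f"domain:{domain}\n")  # disavows every link from the domain
    for url in BAD_URLS:
        f.write(f"{url}\n")            # disavows a single linking page

print("Wrote disavow.txt - upload it through Search Console's disavow links tool.")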

-22- Pay Day Loan update
Addresses specific sites targeting search terms for this industry, a response to the large amount of spam and malfeasance in this marketplace.

-23- doorway pages
Orphaned pages designed only to rank for keywords and funnel traffic one way into a site are now flagged as a cause for manual action.

-24- Panda
An automated content evaluation and page layout update that zeros in on content presentation in addition to thin, automated, or copied content. This is the first algorithm update focused on user experience. Sites that were harmed recover by incorporating page-specific attributes that engage and hold the user's attention and improve readability: images, media, page-specific links to content, links out to authority sites, reviews, comments, etc.

-25- Pirate
Specifically addresses sites that repeatedly infringe on other sites' content.

-26- captive review
Addresses sites that appear to review products or services but forward all traffic to a single vendor.
