The list of Google penalties includes issues that affect both large and small sites, but most of them have a component that especially impacts authority sites and multi-site enterprises. We borrowed these 17 points from Bob Sakayama's big list of Google penalties and retooled the discussion toward enterprise solutions.
-1- domain level redundancy
In small web businesses, these Google penalties are most often caused by webmasters who 'clone' sites - pointing the DNS of many different domains into the same directory so that each domain displays the exact same site - or who launch multiple interlinked domains. Of all the Google penalties, these two are really Bozo-level problems. If your enterprise suffers from a cloning issue or has multiple interlinked sites, you need to hire more robust developers. But for technically robust, large web entities, domain-level redundancies can be triggered by other causes, including automation, shared functionality, syntax errors in robots.txt, poor compliance oversight, etc. The usual consequence is at least a trademark suppression, one of the more severe Google penalties.
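A quick way to audit for accidental cloning is to fetch each corporate domain's homepage and compare the HTML. Here's a minimal sketch; the domain names and HTML below are hypothetical, and in practice the strings would come from live fetches (e.g. with urllib.request):

```python
# Sketch: flag domains serving byte-identical homepages (a cloning signal).
# Domain names and HTML are placeholders, not real sites.
import hashlib
from collections import defaultdict

def group_clones(pages):
    """Group domains by an MD5 digest of their homepage HTML.

    pages: dict mapping domain -> homepage HTML string.
    Returns only groups with more than one domain (likely clones).
    """
    buckets = defaultdict(list)
    for domain, html in pages.items():
        digest = hashlib.md5(html.encode("utf-8")).hexdigest()
        buckets[digest].append(domain)
    return [sorted(doms) for doms in buckets.values() if len(doms) > 1]

if __name__ == "__main__":
    sample = {
        "example-a.com": "<html>same site</html>",
        "example-b.com": "<html>same site</html>",
        "example-c.com": "<html>different site</html>",
    }
    print(group_clones(sample))  # the two cloned domains group together
```

An exact-hash comparison only catches byte-identical clones; near-duplicates need the fuzzier comparison discussed under content redundancy.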
-2- content redundancy
Happens when you duplicate the same content across multiple pages or sites, or when someone copies your site or content. Automation can do this quite well by accident. Generally you will lose the content authority of those pages (sometimes even your homepage, if there are enough dupes), and these pages will not rank #1 for searches using unique snippets of text taken from the page.
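One way to audit for content redundancy before Google finds it is to compare pages using word shingles and Jaccard similarity. A minimal sketch; the k=4 shingle size and whatever threshold you apply are illustrative assumptions, not Google's actual dedup logic:

```python
# Sketch: estimate textual overlap between two pages with word shingles
# and Jaccard similarity. k=4 is an arbitrary choice for illustration.
def shingles(text, k=4):
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

def jaccard(a, b):
    """Return overlap between two texts in [0.0, 1.0]."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0
```

Pages scoring near 1.0 against each other are the ones at risk of losing content authority to each other.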
-3- multiple domains
Once a valid strategy involving the purchase of many domains, each addressing a separate keyword target; now a trigger of Google penalties. You can of course still do this if you pay careful attention to enterprise search compliance and build legitimate sites.
-4- link seller
If you get flagged as a seller, your link juice goes away. When your links no longer pass PageRank, they are worthless, even when used on your own site. Unless selling links is your core business, we recommend you not do this.
-5- young site
Referred to as the 'sandbox,' this is one of those Google penalties that is questioned in the forums. That's because you don't see this penalty unless you attempt to seo the site with a semantic build. Any effort to push hard on a site less than 6 months old using content will reveal the disadvantages of youth. It's a real penalty. Every enterprise should be holding a set of backup domains, parked live, so the clock is running on them, just in case you have to abandon your original. And buy domains the moment the discussion of a new business gets underway, just to get the aging process going as soon as possible.
-6- intermittent ranks
An issue created when large data sites fluctuate in and out of the index. Usually caused by structural and masking issues. This is one issue where size can work against you. Certain kinds of filename masking protocols on large sites (100,000+ pages) are able to create confusion within Google, causing rank oscillations. In this case, you are a victim of Google.
-7- bad neighborhood
If you appear to be supporting sites selling porn, gambling, Viagra... It's interesting how we've never been able to find a casino, poker, or porn bad neighborhood that reliably triggers a penalty. The toxic pages (and there's evidence it's usually pages, occasionally entire sites) we find are on directory or blog sites probably owned by seos, link sellers, or former link sellers. And while you want to avoid porn or gambling, we're convinced it's something else that's triggering the penalty. And what if you ARE the bad neighborhood? If your enterprise is a casino, welcome to the rabbit hole - sites that ARE bad neighborhoods play by different (better) rules, because they aren't harmed by others in their bad neighborhood. This whole area needs focused research to determine exactly what the standards are (if any) for marking a bad neighborhood. Our results conflict with the general thinking here. We aren't finding toxic links with bad-neighborhood tools. This category should probably be renamed 'toxic links.'
-8- spam
Do we even need to comment? But then again, are you sure your automation is not pumping out pages or files, running scripts, or chaining multiple redirects, and getting them indexed?
-9- homeland security
Some sites' businesses make them targets of our new security infrastructure - like the sellers of fake ids, chemicals, etc. You may not rank well if finding your service or product is perceived to create a threat to the government/security.
-10- canonical glitch
If your site does not resolve to the www subdomain, it is vulnerable to a problem whereby your own content can be seen as redundant with itself. This is Google's fault and they have it more under control now, but it's still showing up in our client base, especially with large sites.
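A simple compliance check is to confirm that the bare domain issues a permanent redirect to the www host, so Google only ever sees one canonical copy. A minimal sketch of the decision step, assuming the status code and Location header have already been captured from a request against the bare domain (www.example.com is a placeholder):

```python
# Sketch: decide whether bare-domain requests canonicalize correctly.
# In practice, status and location come from an HTTP request (e.g. made
# with urllib.request) to http://example.com/ with redirects disabled.
def canonical_ok(status, location, www_host="www.example.com"):
    """True if the bare domain 301s to the www host."""
    return status == 301 and location is not None and www_host in location
```

Anything other than a 301 to the www host (a 200, a 302, or no redirect at all) leaves the site exposed to the self-redundancy problem described above.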
-11- proxy hack
When a 3rd party is able to hijack your Google ranks by using a proxy and some cloaking expertise, you end up being penalized by Google. Another one that is Google's fault for not being robust enough. We've seen great damage done by this brilliant, nefarious technique. Bob has consulted with the FBI on behalf of clients who were harmed by organized attacks on valuable ranks. The enterprise needs to know more about this.
-12- other third party penalties
This includes both innocent and intentionally harmful tactics. You can be harmed when someone steals your content, or when one of your affiliates uses content off your site, or when a competitor intentionally places links to you from identified bad neighborhoods.
-13- Google algorithm
Their algorithm is far from perfect and is in fact quite broken. So when it fails it's the equivalent of a penalty if you're on the receiving end.
-14- masking issues
When dynamic sites use mod rewrites to make pages appear static, many create ranking issues in the process. If you don't mask files uniquely, if you mask without filename extensions, or if you mask the filename but keep a dynamic (php or asp) extension, you may be creating rank issues, security issues, or both for your enterprise.
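The advice above - mask uniquely, with a static extension - looks like this as a minimal .htaccess sketch, assuming Apache mod_rewrite and hypothetical product.php / product-NNN.html names:

```apache
# Sketch (hypothetical filenames): give every dynamic page its own
# static-looking .html URL instead of reusing one dynamic script URL.
RewriteEngine On
# /product-123.html is rewritten internally to /product.php?id=123;
# the visitor and Google only ever see the unique static filename.
RewriteRule ^product-([0-9]+)\.html$ /product.php?id=$1 [L]
```

The key property is one static URL per page: no shared filenames, no bare extension-less masks, and no dynamic extension left visible.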
-15- subdomain issues
If your subs are not implemented as if they were separate domains, are used only to push rank, and/or are interlinked inappropriately (e.g. they share the main site's nav), they're eventually going to trigger a trademark suppression penalty. You can't treat subdomains as if they were pages of the main site - they must be given their own nav structure and content. (See the discovery tip below in #16.)
-16- https leakage
If you permit the https side of your site to get indexed, you can create a redundancy of every page within Google's index. This happened to many sites when Google started indexing forms. Check the index using the following searches: "site:domain.com inurl:https" and "site:domain.com -'www'" to find instances of both https and subdomains in the main index.
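The two searches above can feed a simple classifier for whatever URLs they surface. A sketch, with example.com standing in for your domain; it mirrors the inurl:https and -'www' checks and assumes the http/www host is the canonical one, as this article does:

```python
# Sketch: sort indexed URLs into the two leakage types the site: searches
# above look for. Assumes http://www.example.com is the canonical host.
from urllib.parse import urlparse

def leakage_type(url, root="example.com"):
    p = urlparse(url)
    if p.scheme == "https":
        return "https leakage"
    on_root = p.hostname == root or (p.hostname or "").endswith(f".{root}")
    if on_root and p.hostname != f"www.{root}":
        return "subdomain"
    return "ok"
```

Every URL classified as leakage is a potential redundant copy of a canonical page sitting in Google's index.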
-17- 404 error correction issues (The GoDaddy penalty)
Bob has a client whose site has been suffering continuing rank suppression after receiving a penalty triggered by systemic error correction that was not sending a 404 in the header. We call it the GoDaddy penalty, because we first discovered it on a site that was hosted there. If you have a GD hosting account, and you set your error correction to "show homepage," you may be vulnerable to this trigger. If a page does not exist, you see the homepage and the non-existent url shows 200 ok. The result is that your homepage gets indexed for every instance of a page that should be 404 - a huge problem if you've deleted any pages (you don't see the problem until you've deleted a file). Every deleted page becomes a redundant copy of your homepage. While we realize this is NOT a typical enterprise problem (since most enterprise operations don't use GD hosting), we're including it because the basis of this penalty is a flawed 404 strategy, something that every large enterprise needs to be aware of. Make sure your deleted files either redirect properly, or show 404 in the header, so you can either delete from Google's index, or permit them to age away gracefully.