1 July 2010 : Bob Sakayama
There's a disturbance in the force. And it ain't welcome.
We noticed that a lot of our clients had huge changes occurring in the metrics on external links within Google Webmaster Tools.
Clearly some very large change just occurred in the way links are reported within WMT, and it's not for the better. It was never a robust area, and the links G indexed never came close to the actual numbers, but at least we had some granular information we could act on.
One of the compliance checks we always perform when checking sites for ranking issues is a link evaluation - something that relies on access to this data. It also used to be one way to gauge the effectiveness of any link building campaign, or social marketing efforts. And it's the only way to identify a link attack by a 3rd party - although G says it can't happen, we know otherwise. And now we just lost the only avenue for discovery.
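One reason the full download mattered: you could diff two successive exports to see exactly which inbound links appeared or vanished between snapshots. Here's a minimal sketch of that comparison, assuming the exports have been read into plain lists of URLs (the hypothetical function name and the simple list format are illustrative; WMT's actual export format differed):

```python
def compare_link_exports(before, after):
    """Compare two snapshots of external linking URLs (e.g. two
    successive 'Download all external links' exports) and report
    which links were gained and which were lost in between."""
    before_set, after_set = set(before), set(after)
    return {
        "gained": sorted(after_set - before_set),  # new inbound links
        "lost": sorted(before_set - after_set),    # links that disappeared
    }

# Example: between two exports, example.org/a dropped off and
# example.org/c appeared.
diff = compare_link_exports(
    ["http://example.org/a", "http://example.org/b"],
    ["http://example.org/b", "http://example.org/c"],
)
```

With only a 13-link sample, a diff like this tells you essentially nothing about a campaign's effect.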
By moving to this new, truncated system, we once again lose something we once had (remember when the supplemental results used to be labeled in the results?). This change makes some very important forensic work impossible. If you can't see the links that Google is using to evaluate you, you can't act to address issues caused by them.
This action should be interpreted as an intentional effort to reduce Google's accountability to the sites that rely on it for ranks - you can't question something you can't prove. And website owners can't improve something they can't see.
We realize we should be grateful for whatever information Google does provide. And clearly, showing only a tiny fraction of the link information they used to show us saves resources that can be allocated for other purposes. But this change is drastic and very harmful for the management of large sites.
Here's just one example: according to WMT, there are 114,000+ links to a url on one client's site. But instead of permitting access to that data, the heading on the page now reads, "This table lists a sample of 13 external pages that link to (url)." That's roughly 0.01% of the actual number of links.
Google gets negative respect for this one, and we hope to see a reversion when they recognize the extent of the blinders they just put on us all. We haven't seen any major complaints yet, so make noise if you read this and agree.
13 July 2010 - I just posted this question to the Webmaster Central Help Forum - Bob Sakayama
Is the removal of inbound link metrics from WMT a permanent degradation of that tool?
Why have you removed the most valuable tool for link discovery used by large websites from Webmaster Tools? Until a couple of weeks ago we could download the entire list of indexed inbound links, but now, even though the link on that page says "Download all external links," we get only the sample. One client's site has over 117,000 links pointing to a url, but the "sample" is only 15. ONLY 15 out of 117,000!
Anyone attempting to keep a large website search compliant is hugely and negatively impacted by this change.
A white hat link building campaign can no longer be effectively evaluated from within WMT. Since Google is going to penalize sites that link inappropriately or buy links, doesn't it make sense to maintain the ability to discover and remediate these kinds of issues? A site penalized because a former SEO agency bought links can no longer do the forensics necessary to discover and remove them. The withdrawal of these valuable metrics is taking Google in the OPPOSITE direction from transparency and openness - where we hoped you were going. Clearly, forcing us to wear blinders is not welcome.
Is it really necessary to eliminate this most valuable resource?
To post a question in this forum, sign in and go here:
Posted as a comment to Matt Cutts' blog 11 July, but by 14 July it was still 'awaiting moderation' - comments by others, posted after this one, were already showing live. Why would this comment be restricted and not pushed live? 15 July - now live.
Bob Sakayama July 11, 2010 at 9:38 pm
Would really be useful to reveal more forensic information for sites having difficulty, especially for sites that are penalized. We help in the recovery process and know that many sites that get penalized are actually themselves victims - of other SEOs, their own ignorance, etc. - and that no malfeasance is intended. I understand that revealing too much is not wise, and that there are bad players. But for those who actually intend to be good web citizens but inadvertently triggered a Google penalty, there must be a better way.
I'm particularly alarmed at the recent pullback on information available from within Webmaster Tools on inbound links. We used to be able to download the full list, and now we're restricted to a tiny sample.
Given Google's emphasis on relevant, organic links, and the fact that Google penalizes sites with inappropriate linking, why would you intentionally neuter one of the most valuable tools used for the discovery of rank issues triggered by links? When attempting to diagnose a site's ranking or penalty issues, we are no longer able to see the links that you are crediting the site with, only a minuscule sample, many of which are often multiple links from only a few domains, making this information useless.
I strongly encourage you to revert the discovery metrics within WMT regarding inbound links, plus 2 other useful discovery items:
Redirect Detector - sites get harmed by chained redirects, but most don't even know they're doing it. It would be great to be able to see that from within WMT
Penalty Confirmation - I know this is asking a lot. But it shouldn't be. Can we at least know, from within WMT, are we penalized, yes or no?
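The redirect-chaining problem above is easy to fall into precisely because each hop looks harmless on its own. A minimal sketch of the detection logic, operating on a url-to-destination mapping (in practice you'd build that mapping by issuing HEAD requests and recording each 301/302 Location header; the mapping here is a stand-in for that):

```python
def redirect_chain(redirects, start, max_hops=10):
    """Follow a url -> redirect-target mapping and return the full
    chain starting at `start`. URLs absent from the mapping are
    treated as final destinations; loops and over-long chains stop
    the walk so it always terminates."""
    chain = [start]
    seen = {start}
    while chain[-1] in redirects:
        nxt = redirects[chain[-1]]
        if nxt in seen:
            chain.append(nxt)  # redirect loop detected
            break
        if len(chain) >= max_hops:
            break  # give up on absurdly long chains
        chain.append(nxt)
        seen.add(nxt)
    return chain
```

A chain longer than two entries means the site is daisy-chaining redirects, which is exactly the condition a WMT report could surface.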
Post a comment to Cutts' blog: