When you have a huge number of products or sites, the world is a very different place. The standards regarding web pages, robots directives, and so on remain the same, yet you're tasked with delivering productive search performance across a list of pages that keeps growing.
And you face some very restrictive limitations. For one, the typical page can comfortably target only about 30-35 keywords. And we know from our experiments, and from comments by Google engineers, that a page with thousands of links passes so little PageRank through each one that it's unproductive from a ranking perspective.
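The dilution effect is easy to see with the classic PageRank formula, in which a page passes roughly (damping × PR) ÷ outlink-count through each link. This is only a toy model of the original algorithm, not how Google's production systems work today, but it illustrates why thousands of links on one page dilute the value of each:

```python
def pr_passed_per_link(page_pr: float, outlinks: int, damping: float = 0.85) -> float:
    """Approximate PageRank flowing through each individual link,
    per the simplified classic PageRank model (not Google's live system)."""
    if outlinks <= 0:
        raise ValueError("page must have at least one outlink")
    return (damping * page_pr) / outlinks

# A page with 35 links passes a meaningful share through each one...
print(pr_passed_per_link(1.0, 35))    # ~0.024 per link
# ...while a page with 5,000 links passes a vanishing sliver per link.
print(pr_passed_per_link(1.0, 5000))  # ~0.00017 per link
```

In this toy model, the 5,000-link page passes each target less than one percent of what the 35-link page does, which is the intuition behind keeping link counts per page modest.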
And what if you've got a hundred sites - what about the guideline that each enterprise should be represented by only one website? We all know there are enterprises out there successfully managing hundreds of sites in natural search. Why don't they get penalized?
So you can see that the world of large numbers in search is a different one from the world with which most businesses have to contend.