
It's also possible they have a huge amount of content that is sparsely accessed by a large number of users. But in general I agree.

Edit: SmugMug seems to fall into this category: http://don.blogs.smugmug.com/2010/07/15/great-idea-google-sh...

Also interesting:

And if you think about it, the robots are much harder to optimize for – they’re crawling the long tail, which totally annihilates your caching layers. Humans are much easier to predict and optimize for.
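The cache-busting effect of long-tail crawler traffic can be sketched with a toy simulation: an LRU cache serving skewed "human" requests (a few popular items dominate) versus near-uniform "crawler" requests across the whole catalog. All the parameters here (catalog size, cache size, the Pareto skew) are made up for illustration, not taken from SmugMug:

```python
import random
from collections import OrderedDict

def hit_rate(requests, cache_size):
    """Replay a request stream through an LRU cache; return fraction of hits."""
    cache = OrderedDict()
    hits = 0
    for key in requests:
        if key in cache:
            hits += 1
            cache.move_to_end(key)  # mark as most recently used
        else:
            cache[key] = True
            if len(cache) > cache_size:
                cache.popitem(last=False)  # evict least recently used
    return hits / len(requests)

random.seed(0)
N_ITEMS, N_REQS, CACHE = 100_000, 50_000, 1_000  # hypothetical sizes

# "Human" traffic: heavy-tailed popularity -- most requests hit a few hot items.
human = [min(int(random.paretovariate(1.2)), N_ITEMS - 1) for _ in range(N_REQS)]
# "Crawler" traffic: roughly uniform over the entire catalog (the long tail).
robot = [random.randrange(N_ITEMS) for _ in range(N_REQS)]

print(f"human hit rate: {hit_rate(human, CACHE):.2f}")
print(f"robot hit rate: {hit_rate(robot, CACHE):.2f}")
```

With a cache holding 1% of the catalog, the skewed stream gets a high hit rate while the uniform stream hovers near the cache-to-catalog ratio, which is the "annihilates your caching layers" effect in miniature.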


