
Googlebot won't follow the link because it's listed in robots.txt.


Googlebot caches robots.txt for a long time. If you disallow a directory, it may take months for the entire Googlebot fleet to start honoring the new rule. Google's official stance is that you should manage URL removals through Webmaster Tools rather than relying on robots.txt alone.


Yes, but Google may still index the URL itself (without crawling its content) if other pages link to it.
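To make the distinction concrete: a Disallow rule tells a compliant crawler which paths it may not fetch. A minimal sketch using Python's standard urllib.robotparser (the user agent, domain, and paths are hypothetical examples):

```python
# Sketch: how a compliant crawler interprets a Disallow directive.
# Paths, domain, and user agent here are illustrative only.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# A disallowed path may not be fetched...
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
# ...but other paths are fair game.
print(rp.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
```

Note that this only governs fetching: nothing in the protocol stops a search engine from indexing a disallowed URL it discovered via links elsewhere.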



