
Everyone needs to put the following in their robots.txt to give LinkedIn a taste of their own medicine.

    User-agent: LinkedinBot
    Disallow: /

    # Notice: If you would like to crawl us,
    # please email whitelist-crawl@domain.com to apply
    # for white listing.

*taken from linkedin.com/robots.txt
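If you want to confirm the directive actually blocks the bot before deploying it, Python's standard-library robots.txt parser can check it. A minimal sketch (the example.com URLs are placeholders, not real endpoints):

```python
# Verify the robots.txt rule with the stdlib parser (urllib.robotparser).
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: LinkedinBot
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# LinkedinBot is disallowed everywhere; agents with no matching
# record fall through to the default, which is "allow".
print(rp.can_fetch("LinkedinBot", "https://example.com/any/page"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/any/page"))    # True
```

Note this only tells you what a compliant crawler would do; nothing forces a crawler to honor it.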



HN messes up whitespace, so you'll have to indent this to make it preserve the newlines.


Ah yes, surely they will always respect the holy robots.txt



