Not having a robots.txt just means Google falls back to its default crawl behavior, which is reasonable for most sites. A hastily configured robots.txt usually causes problems rather than helping SEO.
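For reference, a minimal sketch of how a crawler interprets a site's robots.txt, using Python's standard urllib.robotparser (the domain here is a placeholder, not the actual site):

    import urllib.robotparser

    # Placeholder domain; substitute the site being debugged.
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()  # a 404 on robots.txt is treated as "allow everything"

    # Check whether Googlebot may fetch a given landing page.
    print(rp.can_fetch("Googlebot", "https://example.com/landing-page"))

This mirrors the point above: a missing robots.txt is equivalent to allowing all crawling, while a botched one can silently disallow the pages you most want indexed.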
The indexed pages also omit most of the 'important' landing pages and other major pages, so most likely some kind of automated spam detection was triggered.
My ISP connection seems to time out on the domain, but SSHing into various servers and curling from them returns the page.
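A quick way to compare vantage points is to run the same timed fetch from the ISP connection and from each server; a minimal sketch (the URL is a placeholder):

    import urllib.request

    # Placeholder URL; substitute the affected domain.
    URL = "https://example.com/"

    try:
        # A short timeout makes hangs obvious; from a network that can
        # reach the site this should return quickly with a status code.
        with urllib.request.urlopen(URL, timeout=10) as resp:
            print(f"status {resp.status}, {len(resp.read())} bytes")
    except Exception as exc:
        print(f"failed from this network: {exc}")

If this succeeds from the servers but hangs from the ISP connection, that points at a routing or DNS issue on the ISP side rather than the site itself.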