
Best of luck to you. I think there is a targetable niche that could use this.

Having thought quite a bit about the search space, I think a whitelist approach is going to be the next big thing in search, because advertising and BS sites have corrupted SEO too far.
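
A toy sketch of what I mean, assuming a hand-curated domain list (the domains and the allowed() helper below are placeholders for illustration, not anyone's actual implementation):

    # Toy allowlist filter: keep only results whose host is on a
    # hand-curated list. The domains here are placeholders.
    from urllib.parse import urlparse

    ALLOWLIST = {"stackoverflow.com", "docs.python.org"}

    def allowed(url: str) -> bool:
        host = (urlparse(url).hostname or "").lower()
        # Accept the domain itself or any of its subdomains.
        return any(host == d or host.endswith("." + d) for d in ALLOWLIST)

    results = [
        "https://stackoverflow.com/questions/11227809",
        "https://spammy-seo-farm.example/top-10-whatever",
    ]
    print([u for u in results if allowed(u)])  # keeps only the first URL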

I'm reminded of the site-index directories of the early internet. Curation, if done properly (based on quality of content rather than the other factors that currently play too heavy a role in the SEO algorithm black boxes), seems to be how we adapt to the information tsunami we are all dealing with.

A long time ago I decided I would even pay for such a service, just as I am willing to pay for a good news source (the FT for me; not cheap, but worth it). I'm not positive the $10 mark is low enough, but for the sake of the general landscape I hope it is.

Just don't forget to keep "don't be evil" more than a catchphrase. In particular, please be transparent about what user data you collect and how you use it.



> I think a whitelist approach is going to be the next big thing in search

It almost has to be. Spammers, growth hackers, et al. are just too numerous and too good.

> I'm not positive the $10 mark is low enough, but for the sake of the general landscape I hope it is.

I saw enough people mention $10 that I decided to go for it. To be honest, $10 is already probably too low to be sustainable because of the huge amount of resources search consumes and the high cost of development.

My gut feeling is that it's economically impossible to build a good search engine that isn't loaded with ads and spyware. But I spent so long complaining about Google that I decided to prove it to myself one way or the other.


> because of the huge amount of resources search consumes

I’ve been intermittently working on much the same idea as the OP, and I suspect this is actually a lot less of a problem than it seems, since they’re focusing on a niche. Indexing everything the way Google does requires a lot of resources, but indexing the majority of useful material in a specific domain takes a lot less. (My ElasticSearch index for the entirety of StackOverflow is a mere 40 GB, for example.)
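
Querying such an index is cheap too. A minimal sketch with the official Python client (the index and field names here are made up for illustration, not my actual schema):

    # Minimal query against a single-domain Elasticsearch index.
    # Assumes the 8.x Python client and a local node; the index and
    # field names ("stackoverflow", "title", "body") are assumptions.
    from elasticsearch import Elasticsearch

    es = Elasticsearch("http://localhost:9200")

    resp = es.search(
        index="stackoverflow",
        query={
            "multi_match": {
                "query": "list comprehension performance",
                "fields": ["title^2", "body"],  # boost title matches
            }
        },
        size=10,
    )
    for hit in resp["hits"]["hits"]:
        print(hit["_score"], hit["_source"]["title"])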

By far the more expensive part is likely to be paying market rates for a developer (you need a decent number of users paying $10/mo to cover a mid-market salary), but in theory that cost is relatively independent of userbase size.
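
Back-of-envelope, with all the numbers assumed purely for illustration:

    # Subscribers needed to cover one developer at market rates.
    # Salary and overhead figures are assumptions, not real data.
    monthly_fee = 10        # USD per subscriber per month
    salary = 120_000        # USD per year, rough mid-market figure
    overhead = 1.3          # taxes, benefits, etc.

    subscribers = salary * overhead / (monthly_fee * 12)
    print(f"~{subscribers:.0f} subscribers")  # ~1300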

Edit: I’ve just noticed I’m replying to the OP, who’s mentioned downthread that they’re using BigQuery and spending $200/week. I’ve gone the marginalia.nu route and run everything on a computer in my living room, which changes the calculus somewhat—it’s a lot cheaper, but probably involves more development time.
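
For comparison, running the OP's stated spend through the same arithmetic (the $200/week and $10/mo figures are from this thread; the rest is simple division):

    # The OP's stated spend: ~$200/week on BigQuery.
    monthly = 200 * 52 / 12     # ~$867/month
    subscribers = monthly / 10  # ~87 subscribers at $10/mo
    print(f"${monthly:.0f}/mo -> ~{subscribers:.0f} subscribers just to cover compute")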

For me it’s mainly about the learning experience, but I’d be interested to hear your thoughts on the tradeoff.



