
They are also a spam filter. It's not just an index of what's relevant, but a removal of what maliciously appears to be relevant at first glance.


This. Everyone's missing the point of a search engine.

We're talking about billions of pages, and if they aren't ranked (authority is a good heuristic), filtered (de-ranked), and so on, then good luck finding valuable information, because everyone is gaming the system to improve their ranking.

I think this is part of the reason you get so much fake news on social media. It's a constant stream of information (essentially, a time dimension has been added to the ranking) that needs to be ranked, and even with humans in the loop there's no easy way to do this without filtering out noise and outright malicious content.


I disagree that there isn't a way; it's just that nobody has tried a good one yet.

Take Reddit, for example. It should be easy to establish a few voters who make "good" decisions, then extrapolate their good decisions via people with similar voting patterns. It would combine a million monkeys with typewriters with expert meritocracy: if you want different sorting, sort by different experts until you get the results you want.

It seems every platform is too busy fighting noise to focus on amplifying signal, or is focused on teaching machines to do the entire task, instead of using machines to multiply the efficiency of people with taste who can make a good judgment call about whether something is novel or pseudo-intellectual. Not to pick on them, but I'd expect a human expert to be better at de-ranking Aeon/Brain Pickings-style clickbait than an erudite-seeming AI, if only because humans can still more easily tell whether someone is making an actual worthwhile point versus repeating a platitude, conventional wisdom, or something hollow.


It should, but if anyone knows who these kingmakers are, it's probably just a matter of time before they accrue enough power that it becomes worth someone's while to track them down and manipulate their decisions (bribes, blackmail, sponsorships, free trials, targeted marketing/propaganda campaigns, etc.).


Who says it even has the same kingmakers every day? Slashdot solved that part with meta-moderation two decades ago.

A person might be an expert in cars but not horses. A car expert might be superseded. The set of seed-data creators could be a fluid thing.


This is a technocracy. No one wants this but Hacker News.


Let's say you have a subreddit like /r/cooking. You think exposing a control in the user agent (browser, app, UI) that lets you sort recipe results by lay democracy, professional chefs' taste, or restaurant critics' taste is a technocracy?

Are Consumer Reports and Wirecutter less valuable than Walmart's best sellers? Is techmeme.com worse than Hacker News by virtue of being a small cabal of voters? Should I dismiss longform.org and aldaily as elitist because they don't determine priority solely from the larger population's preferences? Is Facebook's news algorithm better because it uses my friends to suggest content?

Is it a technocracy that Metacritic and Rotten Tomatoes show both user and critic scores? I'm proposing an additional algorithm that compares critic scores with user scores to find like voters and extrapolates how a critic would score a movie they have never seen. I think that would be useful without diminishing the other, true scores. I'd find it useful to choose my own set of favorite Letterboxd or REDEF voters and see the results it predicts they would recommend, despite their never having actually voted on a movie or article. Instead of seeding a movie recommendation algorithm with my own ratings, I could input others' already well-documented opinions to speed up the process.
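The extrapolation described above is essentially user-based collaborative filtering: correlate each user's voting history with the critic's, then predict the critic's score for an unseen item as a similarity-weighted average of votes from users who rate like them. A minimal sketch (all names and scores here are hypothetical illustrations, not real data):

```python
# Sketch: predict how a critic would score an item they never rated,
# using votes from users whose past scores correlate with the critic's.
from math import sqrt

def pearson(a: dict, b: dict) -> float:
    """Pearson correlation over the items both raters scored."""
    common = a.keys() & b.keys()
    if len(common) < 2:
        return 0.0
    ma = sum(a[i] for i in common) / len(common)
    mb = sum(b[i] for i in common) / len(common)
    num = sum((a[i] - ma) * (b[i] - mb) for i in common)
    den = (sqrt(sum((a[i] - ma) ** 2 for i in common))
           * sqrt(sum((b[i] - mb) ** 2 for i in common)))
    return num / den if den else 0.0

def predict(critic: dict, users: dict, item: str) -> float:
    """Similarity-weighted average of votes from critic-like users."""
    weighted = total = 0.0
    for votes in users.values():
        if item not in votes:
            continue
        sim = pearson(critic, votes)
        if sim <= 0:  # skip dissimilar or anti-correlated voters
            continue
        weighted += sim * votes[item]
        total += sim
    return weighted / total if total else 0.0

critic = {"A": 9, "B": 2, "C": 8}               # never scored "D"
users = {
    "u1": {"A": 8, "B": 3, "C": 9, "D": 7},     # votes like the critic
    "u2": {"A": 2, "B": 9, "C": 1, "D": 2},     # votes unlike the critic
}
print(round(predict(critic, users, "D"), 1))    # → 7.0
```

Only u1's vote counts here, since u2's history anti-correlates with the critic's, so the prediction lands on u1's score for "D".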

This idea would work better if people voted without seeing each other's votes until after they had voted. It might be hard to extrapolate Roger Ebert's preferences if voters formed their opinions of movies based on his reviews: you'd end up with a false positive that mimics his past votes but poorly predicts his future ones.


The reverse is a problem too: Google filtering things out based on their political leanings in an attempt to shape public opinion.


I haven't seen any examples that were anything other than the runaway persecution complexes of people who found their worldview was less popular than they believed, claims greeted with exasperation by testifying engineers who had to explain how absurdly unscalable it would be to do this manually.




