
> The popularity of the service makes human moderation impossible, creating a need for inevitably-flawed robots

That's true, but we might be able to improve things with a bit more human moderation.

For instance, Facebook is insanely profitable. They could probably increase their moderation staffing by a pretty decent multiple and still be very profitable.

So the current state of moderation is not strictly a matter of need; it's also a matter of greed, in that Facebook wants to automate away jobs it could pay people to do. And given the state of online discourse, it's a decision we're all paying for.



Then you have the problem of bias in human reviewers. I don't think you can solve the problem of censorship with just a few more hires.


In the past, for a game-modding project, we sort of open-sourced reports. Users (meeting certain criteria) could visit a page, view 5 seconds of video and statistics for an alleged/detected cheater, then select positive/negative/inconclusive. Once enough users (IIRC 5) had voted and the majority said positive/negative, a ban/unban would be issued. Because the reports were random and the usernames were hidden, there was no obvious bias.
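
The flow described there is simple enough to sketch. Rough Python below; the names (Report, record_vote) and the 5-vote simple-majority threshold are my own placeholders based on the description, not the project's actual implementation:

    # Sketch of the crowd-review scheme described above (names/thresholds assumed).
    from collections import Counter
    from dataclasses import dataclass, field

    VOTES_REQUIRED = 5  # per the comment: roughly 5 reviewers per report (IIRC)

    @dataclass
    class Report:
        clip_id: str                      # 5-second clip, username hidden from reviewers
        votes: Counter = field(default_factory=Counter)

        def record_vote(self, verdict: str) -> str | None:
            """Record one reviewer verdict; return a decision once enough votes are in."""
            assert verdict in ("positive", "negative", "inconclusive")
            self.votes[verdict] += 1
            if sum(self.votes.values()) < VOTES_REQUIRED:
                return None  # still collecting votes
            top, count = self.votes.most_common(1)[0]
            if top == "positive" and count > VOTES_REQUIRED // 2:
                return "ban"
            if top == "negative" and count > VOTES_REQUIRED // 2:
                return "unban"
            return "no_action"  # inconclusive or no clear majority

    # Example: five reviewers, three say "positive" -> ban is issued
    r = Report("clip-123")
    for v in ["positive", "positive", "inconclusive", "negative", "positive"]:
        decision = r.record_vote(v)
    print(decision)  # -> "ban"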


This works for obvious cases like cheating, kind of like the legal system. It won't work for political speech, and we know this: most legal systems are set up to avoid having courts judge political speech in most cases.

Even then, it will probably only keep working until there's a rift in your community. Say people start arguing over trans rights (weirdly common on Discord), and then users get mass-reported and mass-voted into bans by an activist minority.



