
> It's a general-purpose tool. The politics is in what you use it for.

That's exactly the view they are arguing against. There is no hard border between a tool and the decision to use it; the infrastructure will always shift the context of what's possible and alter society.

Once nuclear weapons become possible, a cold war and arms race become imminent. Once a massive, uncensorable communication network becomes pervasive, the old social institutions that guide public communication become irrelevant, and we either build new ones or face massive misinformation and propaganda campaigns. It's not that we choose to use social media to propagate fake news; it's the inevitable political consequence of their technical structure.

This is not an argument against building things, but an argument for taking responsibility for what you build and acknowledging that building technology is a political move: it strongly affects the inner life of the polis.



If a smith makes a hammer for nails, makes the blueprint available for everyone and someone uses that hammer model to kill someone, is the smith evil? Should the smith no longer make the blueprint available because hammers can be used for evil?

Has the smith made a political move by making the hammer available to everyone? All the smith wanted to do was share his knowledge with everyone, so everyone can smith their own hammers.

Ultimately the smith is not morally responsible for what people do with the hammer IMO. The entire intent behind all actions of the smith was to do good, can we call him evil for that?

Similarly, I don't think simply building tools is a political or moral move. Rather, it's what you encourage people to do with it, and what people actually do with it, that is political and moral. If I encourage my tool to be used for the betterment of society, am I evil if a small number of people abuse it?

Don't get me wrong, I still think SV is a den of abuse. But that is because social networks like Twitter and Facebook aren't the hammer. They're not neutral tools. Both profit from moral abuse of the platform; it is enabled and encouraged. Google as a search engine is making its tool with the sole intent of manipulating and abusing users for ads.

There are definitely tools that exist in a moral vacuum (just think of self-hosted radio show software: it can host a pro-LGBT show and one about 9/11 conspiracies, should the author be responsible for either?). But social networks and the tools you mention are largely not it. Those tools have been made with the purpose of doing evil.


Google and co aren't the smiths, they are the ones that hire them


In this example, they hire smiths to do evil, and the smiths comply by making tools for evil. You can equate Google and friends with the smiths, since the outcome is the same.


> Has the smith made a political move by making the hammer available to everyone?

It depends. Is the "hammer" a set of instructions to build an airborne pathogen starting from the HIV genome?

Technology is power and power is always political.


Even airborne pathogens could be used for good: the HIV-based one could be re-engineered to transport an HIV cure while only showing mild symptoms of the common cold. Or to cure cancer.

Technology is not power. It enables power. If you engineer the technology to enable only a specific kind of power, then you the engineer, and the technology you made, can be called evil.

Otherwise I would like you to point me to the particle of malice in the hammer and pathogen. The atom of injustice and evil that it's made of.


> Technology is not power. It enables power. If you engineer the technology to enable only a specific kind of power, then you the engineer and the technology you made can be called evil.

That's a clear contradiction. On one hand you acknowledge the choice of the technologist to be a power of evil; on the other you surmise that, as long as they are willfully ignorant of how the tech is used, they are merely a conduit of political power. It sounds like denial of an uncomfortable truth.


Not necessarily. I acknowledge that a technologist with the goal of doing evil can use technology to achieve that goal.

But a technologist with the goal of doing good, or something neutral, will be able to use the same technology to achieve that goal.

Technology itself is ignorant of how it is used; it's not a human. The tools, the technology, remain ignorant of how they're used even when it's for evil, because ultimately it's the human wielding the tool that does evil.

You may call technology only made for evil purposes evil if you want (I did say "can be called evil", not "must be called evil").

Or otherwise: did the technology choose to be evil? Was it asked, and did it consent to be evil? Where is its atom of injustice in the tool? Or for software, where is the bit that is evil?

If I make a thousand hammers and use one to murder, are all hammers evil?

I find it dangerous to trace evil through chains of causation. Consider a doctor saving a life, and then the patient murders someone. Is the doctor evil? Are the tools of the doctor evil? Should doctors consider whether a patient might do evil?

The same process should be applied to tools as well. If a tool is used to save a life and then a life is taken, is the tool evil? What if it doesn't save a life beforehand, is it now evil? How many lives must it save before doing evil to be neutral?

That is why I consider technology neutral (even if you may call it evil, it remains neutral in nature, if not in use). Once you allow technology to become moral and political, you open the can of worms of considering whether evil is anything that enables evil in any possible form or shape.


> There is no hard border between tool and decision to use it, the infrastructure will always shift the context of what's possible and alter society.

Technology always changes the landscape, but that isn't the thing anybody ever complains about. Nuclear weapons are only terrible when they're actually used against a city, not when they cause a cold war instead of a kinetic one. Which is the political decision.

When you build a system to affect mass desires and use it to get people to conserve water during a drought, nobody calls you a monster who is destroying society. You'll find serious people arguing that it's irresponsible not to do that.

People don't object to the tools, they object to the uses. Especially when somebody else uses them in a way that disadvantages the allegedly aggrieved party's political goals.

> Once a massive, uncensorable communication network becomes pervasive, the old social institutions that guide public communication become irrelevant and we build new ones or face massive misinformation and propaganda campaigns. It's not that we choose to use social media to propagate fake news, it's the inevitable political consequence of their technical structure.

Fake news isn't a controversy because it's a new thing. The "old social institutions that guide public communication" are the things that brought us three centuries of racist propaganda, gave birth to "yellow journalism", celebrated every desecration of the justice system in the Drug War, swallowed the party line on the War in Iraq and fanned outrage for the sake of ratings so hard that the current President of the United States is Donald Trump.

The reason fake news is now a big controversy is that now there are a hundred cell phone videos of the event in question, ordinary people have easier access to primary sources and the barrier is much lower for someone who objects to the prevailing narrative to find an audience. So when the latest line of bull makes the rounds, it's more often followed by an angry mob decrying its falsity to anyone who will listen and backing their objections with evidence. And then it seems like there are a lot more lies, not because there are actually more lies but because there are all the same lies and many more people pointing them out.

The people to watch out for aren't the people who create this kind of technology. It's the people who think only they should be in control of how people can use it. Which includes both Facebook and the people who think Facebook should be regulated instead of smashed into a million tiny pieces.


I find these two statements pretty interesting:

> Once a massive, uncensorable communication network becomes pervasive

> the old social institutions that guide public communication become irrelevant and we build new ones or face massive misinformation and propaganda campaigns

You're basically saying state-run propaganda is the one you want... Which is "fine", but still propaganda and misinformation, just of a different nature (apparently you agree with that one).


"Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize."

https://news.ycombinator.com/newsguidelines.html


I don't see at all how your conclusion ("state run propaganda is the one you want") follows from what you quoted, or from that comment as a whole.


Like a communist who calls genetics a "bourgeois science", or a libertarian who calls global warming a "government conspiracy": the point is to apply an ideology to an uncomfortable reality, and get labels that terminate any further thought.


I'm basically saying there is no "fake news"; it's all "fake", or whitewashed with the politics of the presenter.

By claiming that a technical innovation led to the political problem, you ignore the fact that it's a political problem caused by a political problem (namely, that everyone is biased by politics).

I think it's naive to think technical innovations cause the political issues. Even in the cold war example, it's politics that caused it; the technology just raised the stakes. Politics created the issue, not the engineering.


I certainly did not say that official state propaganda is better than online propaganda by other political actors (including foreign states). It's that modern society has developed institutions to deal with the innate tendency of political actors to control the public discourse. The free press, democracy and public accountability of leaders evolved in a different technological context, and the existence of those institutions was sometimes defended with blood.

When you disrupt the economic basis of the free press, you create a fundamental vulnerability, at least in the short run until those institutions adapt, like an organism exposed to a new mutation of a pathogen. And it might well take blood again to recalibrate. As a technologist, you don't get to ignore the political effects of what you create. You can't say, "well, people have been killing each other on this planet for millions of years; I've merely created a device to automate killing a whole country, a neutral technology." It's not neutral. It has immediate political effects: it gives power to those who have it, to the detriment of the rest.

There is no debate that technology can change the world for the better. You just can't presume that any technology, introduced at any moment in time, to any audience, will be by definition positive. And once you start to nuance that position of absolute amorality for technology, you are engaging in politics. Technology is always power, and power is always political.




