Well, that's what I'm arguing. I think algorithms have less power than people think they do, even the computer scientists. The social sciences and economic theories that study how crowds work all say that crowd behavior is driven by information. You don't need an algorithm for that; you just need someone in your group to share the information with you, and that is what is actually happening once you look past the data and charts.
People are leaving these algorithm-run platforms in droves, especially now, because they already believe that the government is working with these companies to track their movements and radical plans. They're already moving to decentralized networks, and once that happens you will not be able to control the radicals, just like you couldn't control the pirates after they switched to decentralized piracy networks. This isn't about stopping someone from downloading a couple of Metallica songs. This is about stopping the proliferation of fascist/totalitarian thought and the foreign actors stoking it. It's much more dire, and the world of John Perry Barlow is dead. Not to mention that the last few right-wing terrorist attacks weren't planned on Facebook, nor were the attackers radicalized there. Neither was QAnon. That all came from 4chan/8chan/etc. There are no algorithms there, no AI, and anyone could code something of that complexity in a day.
Information should not be free. It should have limits. Once information threatens the existence of a free society and pulls people back towards totalitarianism, the slippery-slope argument of "first they take away the press and then we get Hitler" falls flat on its face, because large swaths of the population freely gravitating towards Hitler gets you the exact same effect. On top of that, a society trying to regulate information for its own general health and a fascist dictator doing the same are completely different things.
> I think algorithms have less power than people think they do, even the computer scientists.
Algorithms are powerful because they can cause tiny, incremental and often completely unnoticeable changes of opinions and perception. These changes add up when they affect large enough populations. I agree that people who gather in groups and share information make up the bulk of this equation – algorithms do not operate in a vacuum.
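To put rough numbers on "tiny changes add up", here's a toy random-walk sketch in Python. Every constant in it is an assumption I made up for illustration – it models no real platform – but it shows how a per-exposure nudge far too small to notice, with a slight 55/45 directional bias, still drags a whole population's average opinion a long way:

    import random

    # Toy sketch of "tiny changes add up". All constants are assumptions
    # made up for illustration; this models no real platform.
    EPSILON = 0.002    # one nudge on a -1..1 opinion scale, imperceptible
    BIAS = 0.55        # slight chance a nudge points the "promoted" way
    EXPOSURES = 1000   # recommendations seen per user
    POPULATION = 10_000

    def final_opinion():
        opinion = 0.0  # everyone starts neutral in this toy world
        for _ in range(EXPOSURES):
            step = EPSILON if random.random() < BIAS else -EPSILON
            opinion = max(-1.0, min(1.0, opinion + step))
        return opinion

    opinions = [final_opinion() for _ in range(POPULATION)]
    print(f"average drift: {sum(opinions) / POPULATION:+.3f}")  # about +0.2
    print(f"users past +0.1: {sum(o > 0.1 for o in opinions)} of {POPULATION}")

Each individual step is invisible; the aggregate drift isn't.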
I do think that the negative effects caused by algorithms are largely unintended consequences. Profit is the motive, not malice.
> People are leaving these algorithm-run platforms in droves, especially now, because they already believe that the government is working with these companies to track their movements and radical plans.
A tiny fraction of the technically literate people escape the big platforms; the rest stay on even though they are somewhat conscious of the negative effects and ruthless business practices.
> That all came from 4chan/8chan/etc. There are no algorithms there, no AI, and anyone could code something of that complexity in a day.
Many people joining extreme communities have been nudged there by algorithms, especially YouTube's. What happens from there is usually just plain old groupthink and tribalism, the comfort of finding a community to belong to. The point is that those nudges from suggested videos gently push you down a rabbit hole you otherwise would not have explored in such a rapid and captivating manner.
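As a thought experiment of what that nudging looks like mechanically, here's a toy sketch. The engagement model and every number in it are my own assumptions, not YouTube's actual recommender – but it shows how a system that only ever maximizes the predicted next click, with no agenda at all, still walks a user step by step toward the edgier end of the catalog:

    import random

    # Toy rabbit-hole sketch. The engagement model and all numbers are
    # assumptions for illustration, not YouTube's actual recommender.
    random.seed(1)
    CATALOG = [random.random() for _ in range(5000)]  # "edginess" per video

    def predicted_engagement(user_pos, item):
        # Assumed: interest peaks slightly beyond what the user last
        # watched and falls off with distance from that sweet spot.
        sweet_spot = min(1.0, user_pos + 0.05)
        return 1.0 - abs(item - sweet_spot)

    user_pos = 0.1  # starts out on mild content
    for step in range(51):
        page = random.sample(CATALOG, 20)  # one page of suggestions
        chosen = max(page, key=lambda v: predicted_engagement(user_pos, v))
        user_pos = 0.8 * user_pos + 0.2 * chosen  # taste drifts toward it
        if step % 10 == 0:
            print(f"step {step:2d}: watching edginess {user_pos:.2f}")

Each suggestion is only marginally edgier than the last, which is exactly why the drift feels natural from the inside.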
Like they say in the film: algorithms are not evil on their own; they just tend to enable and amplify some of the worst tendencies in people who know how to exploit this tool for their own gain, political or otherwise.
> Information should not be free. It should have limits.
I disagree. What I do think is that the many "information outlets" should be held responsible for their editorializing. Newspapers, TV, and social media platforms all editorialize; only social media has left the task to algorithms – which is rather careless and naive. These companies have indeed moved fast and are now breaking things.
> Algorithms are powerful because they can cause tiny, incremental and often completely unnoticeable changes of opinions and perception.
Stafford Beer predicted back in 1972 that increasing a system's variety without a matching increase in regulatory variety would send society towards catastrophic collapse, because the number of possible states in the social system explodes until it becomes as complex as things like weather and wave formation.
But we got the American Skinner Box model, mixed with heavy doses of hipsterism instead.
> A tiny fraction of the technically literate people escape the big platforms
I suggest you look into it a bit more and read about the associations between people like Nick Land, Milo Yiannopoulos, Steve Bannon, etc. While it's true some of it was done over social media, conservative media has always been a close-knit juggernaut of a hype machine, ever since Rush Limbaugh appeared in the depths of AM radio hell. To them, it's merely a faster way of organizing the way they have for years, because there isn't any cost: they no longer have to print 5,000 copies of something. It's low-hanging fruit. Most of the alt-right are ex-Ron Paulers who were already into things like Bitcoin through the Libertarian Party since around 2012. They never "escaped the big platforms"; the platforms were always recruitment points for normies, which doesn't even require them to do anything but promote mainstream conservatism and then say "go here for more stuff" somewhere Facebook's AI can't track. Maybe your grandma hasn't "escaped", but your grandma probably isn't on the streets shooting black people or lefties. According to the Mueller report, Bitcoin was also transacted between these Russian and American groups.
> Many people joining extreme communities have been nudged there by algorithms, especially YouTube's. What happens from there is usually just plain old groupthink and tribalism, the comfort of finding a community to belong to.
The algorithms merely run a Bayesian filter over a giant database of what people like and feed it back to them for increased engagement. These people would have chosen such content on their own. The content plays to their real views (not the views they show in public), views that have existed for hundreds if not thousands of years and are passed down through bloodlines (if your mom is a Dem or a Rep or KKK, you're going to be a Dem or a Rep or KKK 80% of the time); the fact that such content exists in the first place legitimizes the worst of human behavior, behavior that used to be socially unacceptable.
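For what it's worth, here's a minimal sketch of what that kind of filtering amounts to: a naive-Bayes-style ranker over made-up tags (hypothetical data, not any platform's real model). The point is that the model has no views of its own; it can only score candidates against the click history it's given, so it mirrors whatever preferences are already there:

    from collections import Counter

    # Minimal naive-Bayes-style ranking sketch. The tags and history are
    # hypothetical, made up for illustration (no platform's real model).
    clicked = [  # posts one user already engaged with
        {"guns", "politics"}, {"politics", "immigration"},
        {"guns", "hunting"}, {"immigration", "politics"},
    ]
    tag_counts = Counter(tag for post in clicked for tag in post)
    total = sum(tag_counts.values())

    def score(post_tags):
        # P(engage | tags) ~ product of per-tag engagement frequencies,
        # with add-one smoothing so unseen tags aren't scored as impossible.
        p = 1.0
        for tag in post_tags:
            p *= (tag_counts[tag] + 1) / (total + len(tag_counts))
        return p

    candidates = [{"politics", "immigration"}, {"cooking", "baking"}, {"guns"}]
    for post in sorted(candidates, key=score, reverse=True):
        print(post, f"{score(post):.4f}")

It amplifies what's already in the history; it doesn't invent it.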
But algorithms don't do that; unmoderated communication does. You aren't going to get a Trump supporter to start watching CNN or MSNBC, and if the satellite company stopped carrying their channel they would cancel their subscription and go somewhere else. So if the Internet offers media that matches their views, they will seek it out, and if the algorithms amplify it by sharing what other people in their social circles say, even better as far as they're concerned. Saying these people were innocent and normal before Facebook came along and made them radicals is just flat-out wrong. Now they can say the n-word to each other alongside millions of people, instead of having to hide it among close friends like they're smoking weed or something.
This polarization was happening long before Facebook and the Internet even existed – especially during the Clinton administration, with the things that drove Tim McVeigh to bomb Oklahoma City: Ruby Ridge, the Branch Davidians, and the assault weapons ban. That's who these people are and always will be, and giving them a communication platform without strong information control is asking for trouble.
You aren't going to get away from business interests, religious interests, or racial interests; it's basically impossible. You just need to create a sense of civil society on the web by ejecting bad actors from the public square. I know that if I walked into a gay bar and started yelling homophobic slurs, I probably wouldn't make it out of there alive, and if I did, I'd be kicked out for life with a nice bouncer at the door to greet me if I tried again. The Internet has no such protections.
People and groups with strongly held beliefs are not the ones most affected by algorithms, as you are pointing out. The worst-case scenario is that their echo chamber becomes even more impenetrable, or that they become even more extreme.
Algorithms do the most damage to people who don't currently hold strong beliefs about a given topic but do have some vague leanings one way or the other. These people can be swung, and their views amplified and radicalized, without much effort or financial input, as the film The Great Hack makes a good case for.
Google, YouTube, Facebook, and Twitter (which all offer fine-grained ad targeting) can be weaponized to push "normal" people towards the extremes, pit groups against each other, and affect democratic processes in a big way. This was Cambridge Analytica's business model, and it worked really well.
I think a lot of people confuse the appearance of a lack of strong beliefs with simple social filtering. There is no social filtering on the web because you don't have to worry about your reputation.
The thing people try hardest to avoid is social isolation and being an outcast; people will lie all the time just to get along with the crowd. You wouldn't know who held strongly held beliefs unless you found out what was in their library. The web just removes those filters and lays everything bare for the whole of society to see, and it's ugly – it always has been.
"In 1981, former Republican Party strategist Lee Atwater, when giving an anonymous interview discussing Nixon's Southern Strategy, said:[28][29][30]
You start out in 1954 by saying, "Nigger, nigger, nigger." By 1968, you can't say "nigger" – that hurts you. Backfires. So you say stuff like forced busing, states' rights and all that stuff. You're getting so abstract now, you're talking about cutting taxes. And all these things you're talking about are totally economic things and a byproduct of them is [that] blacks get hurt worse than whites. And subconsciously maybe that is part of it. I'm not saying that. But I'm saying that if it is getting that abstract, and that coded, that we are doing away with the racial problem one way or the other. You follow me – because obviously sitting around saying, "We want to cut this," is much more abstract than even the busing thing, and a hell of a lot more abstract than "Nigger, nigger."[31]"
This is a great point. At the same time, I observe a night-and-day difference in how some people close to me behaved before and after they had access to Facebook and YouTube. Pre-social media, I could have a long and rather nuanced discussion on controversial topics, and we both left the conversation with slightly altered opinions and some new perspectives. Today, all nuance is gone and the conversation is scattered all over the place, spiced up with whatever conspiracy theory they "discovered" on YouTube lately.
Maybe one of the effects of algorithms has been to push people to extremes to such a degree that social filtering is discarded? People I know seem almost apologetic in their approach, as if their lives depended on convincing people of their views, with no care for the social consequences of constantly "preaching". This form of polarization is not healthy – basic respect for other people is lessened, and listening to counterarguments is considered a weakness.
This is just an anecdote from my life. From what I read and hear from others, this experience is not unique.