What about the fact that Facebook has an algorithm they made deciding which posts are presented to each user, apparently tailored to drive engagement? They can feature all the controversial posts that stir people up for the clicks. YouTube is similar. One could make a case that these algorithms cause these problems, promoting conspiracies/etc for the clicks.
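To make the concern concrete, here's a toy sketch (not Facebook's actual system; the weights and field names are invented) of what pure engagement-driven ranking looks like. The key property is that it's reaction-blind: an outraged comment counts the same as a thoughtful one, so inflammatory content naturally floats to the top.

```python
def engagement_score(post):
    # All interactions count toward the score, regardless of whether
    # the reaction is agreement or outrage. Weights are made up.
    return post["likes"] + 2 * post["comments"] + 3 * post["shares"]

def rank_feed(posts):
    # Highest predicted engagement first.
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"title": "Cute cat photo",        "likes": 50, "comments": 5,  "shares": 2},
    {"title": "Inflammatory hot take", "likes": 20, "comments": 80, "shares": 30},
]

for p in rank_feed(posts):
    print(p["title"])  # the hot take outranks the cat photo
```

Under this scoring the hot take wins (score 270 vs. 66) even though far fewer people actually liked it, which is the dynamic being described above.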
Unfortunately, the whole concept of "growth and engagement" (and its biggest implementations - Facebook, YouTube, etc) supports so much of our society today that I don't expect either mainstream media or politicians to attack it.
The reason we're attacking Parler and not the underlying evil is because Parler is an easy target while the other big implementation of said evil (Facebook) underpins the careers and livelihoods of many of the people who are in a position to ban it or reform our laws.
Like I said, the conspiracy theories are a virus that exploits the algorithm, which is otherwise harmless and serves a very different purpose. Youtube recommends children's videos to me because sometimes I let my son watch children's videos, which is a pretty reasonable proposition. The problem is that the very same system can become harmful when it starts recommending more and more misinformation after a person watches one conspiracy theory video; Google has been trying to address this by displaying truthful information when certain topics are detected, but obviously there is work left to do.
The real problem here is that we are focusing on the way that these algorithms can send people into rabbit holes of misinformation, without stopping to consider what the same algorithms do in general or the fact that people actually like recommendations (which are in most cases harmless to society). Again, the response to "HIV propagates via the immune system" should not be "we should get rid of the immune system to prevent the spread of HIV."
I grant that it's nice to have relevant content presented. But I'm not in favor of profit-driven companies controlling social discourse with their secret algorithms.
Couldn't there be a way for users to have more control over this? Perhaps recommendations from friends, user ratings, things like webrings, etc.
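As a rough illustration of what "more control" could mean, here's a hypothetical sketch where the user, not the platform, chooses how much weight each signal gets - friend recommendations, their own ratings, raw popularity. All the names, fields, and weights are invented for the sake of the example.

```python
def score(post, weights):
    # User-visible, user-tunable scoring: no hidden engagement term.
    return (weights["friends"]    * post["friend_recs"]
          + weights["my_ratings"] * post["avg_user_rating"]
          + weights["popularity"] * post["global_clicks"] / 1000)

# This user trusts friends a lot and raw popularity almost not at all.
my_weights = {"friends": 5.0, "my_ratings": 3.0, "popularity": 0.1}

posts = [
    {"title": "Webring find",  "friend_recs": 4, "avg_user_rating": 4.5, "global_clicks": 200},
    {"title": "Viral outrage", "friend_recs": 0, "avg_user_rating": 2.0, "global_clicks": 90000},
]

posts.sort(key=lambda p: score(p, my_weights), reverse=True)
print([p["title"] for p in posts])
```

With these weights the friend-recommended find beats the viral post; someone who wanted a pure popularity feed could just set the other weights to zero. The point is only that the tradeoff becomes a visible dial instead of a secret.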
Even the fact that these algorithms are secret gives me the creeps. The political problems we've seen have been accidental; what happens when someone uses these systems to manipulate everyone on purpose?