I wish they had presented more solutions. The way I see it, infinite scroll feeds generated by algorithms should be outlawed. Same with Like counts. (It seems Instagram has begun testing this https://techcrunch.com/2019/11/14/instagram-private-like-cou...)

Facebook wasn't that bad when you had to visit each person's wall or when the feed was in chronological order.



IMO a bigger problem is the effect these platforms have on children. There should be systems in place that extend the regulations that exist on TV to the social media experience.


I am thinking a lot about this problem space and trying to integrate some of my ideas into my project (https://www.confidist.com). I am also interested in solutions and alternatives.

Centralization of power

I think this is what separates social media platforms that serve as a good way to share information with friends and family from major platforms that dictate culture through self-interest. Taxing data collection and storage, an idea presented in the film, is a good one. I believe social media companies should stay smaller in size and scope. These companies should make guarantees about data usage, adopt direct-to-user business models, and hold each other accountable through regulation and industry organizations.

The attention economy

Again, I think some regulation is needed here. In addition to looking closely at infinite scroll feeds and the status information that is displayed publicly, we also need to look very closely at notification management and data ownership. Taking email as an example, there must be an accessible way to unsubscribe from everything at once. In parallel, all platforms should be required to offer an accessible, straightforward "unsubscribe from all notifications" toggle, and the obvious attention loophole of creating new notification categories to keep alerting users needs to be closed. Lastly, on data ownership: all social platforms that collect data not only need to be accountable for the money trail your data takes, but should also allow users to export their data in a usable format. That means I should be able to export all my friends and connections from Facebook and delete my account. This would defuse the psychological gotcha that plays on the fear of losing what you have built on a given platform. It is also anti-competitive toward other social platforms if data is not easily exportable; all of these companies know this and make it very difficult.
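
To make the "usable format" point concrete, here is a minimal sketch in Python of what a portable connections export could look like. The field names and data shapes are hypothetical, not any platform's real API; the point is just that a machine-readable and a human-readable copy are both cheap to produce:

    import csv
    import json

    # Hypothetical shape of the connection data a platform already holds.
    connections = [
        {"name": "Alice Example", "email": "alice@example.com", "connected_since": "2018-03-02"},
        {"name": "Bob Example", "email": "bob@example.com", "connected_since": "2019-07-19"},
    ]

    # JSON export: easy for another service to import programmatically.
    with open("connections_export.json", "w") as f:
        json.dump(connections, f, indent=2)

    # CSV export: readable by the user in any spreadsheet tool.
    with open("connections_export.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["name", "email", "connected_since"])
        writer.writeheader()
        writer.writerows(connections)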

Intermittent Variable Rewards

You hit on this with the infinite scroll, but we can likely do more. We need to help people be patient again. Guess what? Notifications can help manage our time and attention rather than hurt it, if they are used narrowly and as a service to the user. For example, if I make this post on Hacker News and have notifications disabled, I'll probably keep checking back to see if any updates occur; that is the intermittent variable reward at work. However, if I know very clearly that I'll receive a notification when an update occurs, I can respond exactly when I need to. Companies like Apple and Google, which manage device notifications, should be responsible for their intermittent variability as well. We should require the option of "notification digests" instead of immediate notifications. We know notifications can cause traffic accidents, but they also waste a lot of time and are rarely all that important.
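
As a rough illustration of what a "notification digest" could mean in practice, here is a minimal sketch in Python. The class and method names are made up for this example; the only idea it demonstrates is batching notifications and delivering them on a schedule instead of interrupting immediately:

    from dataclasses import dataclass, field
    from datetime import datetime, timedelta
    from typing import List

    @dataclass
    class DigestQueue:
        interval: timedelta = timedelta(hours=4)                    # at most one push every 4 hours
        pending: List[str] = field(default_factory=list)
        last_delivery: datetime = field(default_factory=datetime.now)

        def notify(self, message: str) -> None:
            # Hold the message for the next digest instead of pushing it immediately.
            self.pending.append(message)

        def flush_if_due(self) -> None:
            # Called periodically by a scheduler; sends at most one summary push per interval.
            if self.pending and datetime.now() - self.last_delivery >= self.interval:
                self.deliver(f"{len(self.pending)} updates: " + "; ".join(self.pending))
                self.pending.clear()
                self.last_delivery = datetime.now()

        def deliver(self, summary: str) -> None:
            # Placeholder for the platform's actual push mechanism.
            print(summary)

A reply to this thread would then surface in my next digest rather than triggering an instant buzz, which makes the timing predictable instead of intermittent.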

Echo Chambers

You spoke of the like counter. We all know status is a huge part of social media, and it is especially important to young people. Viewpoint diversity and acceptance play off of status as well. So how do we help remove the perception of our social media status? In the film, they spoke of setting minimum ages under the law before social media is allowed. I think that is a good first step until we understand how to make real improvements through the user experience; I am requiring users to be 18 and over.

What I am doing with my platform is to privatize as much interaction as I can, balanced by moderation and a rating system that is not displayed back to users. Sometimes good old-fashioned one-on-one communication is the way to go. If that can be used more, in a distributed way, as a substitute for one-to-many interactions, I think it goes a long way toward relieving the social pressure of the group.

I think special attention also needs to be paid to each type of media: pictures, videos, text, virtual reality, augmented reality, and more. I isolate interactions to text only, similar to here on HN. This removes social pressures and "shiny objects" from easily manipulating our attention, and it prevents every contact with another person from being about appearance and prejudice. But it also lets users fill in the unknowns with blanket assumptions: if you are just a username, and you said something I don't agree with, maybe you are everything I dislike in the world wrapped into one anonymous internet persona? To try to bring a little more empathy into the picture, I am experimenting with showing the attributes users have in common as part of an introduction when they interact. In a virtual reality social setting we could show users' emotions in a more effective way without revealing their identity, which is pretty neat.

Pictures, especially of users, enable targeted objectification and the hyped-up, unrealistic comparison with peers that falls hardest on young women. We should have regulations requiring a visible indication on pictures and videos whenever a filter was used. I think letting young people share pictures and videos of themselves will become less of a problem the more decentralized social media becomes. We need to change the interaction from "wow, did you see the <media> of x on y?" to "Hey, I posted x on y, the niche platform I use, check it out." Sharing pictures and videos needs to be more of a niche hobby and less the thing people must do to be relevant.

Echo chambers, I think, also relate directly to growth and growth hacks. People interact more, and enjoy interactions more, when they are validated. Think of growth targets like "8 friends within 11 days." That gets users focused on the status number of friend count, or maybe connection count, which exploded LinkedIn. Guess who people mostly know, and who can help grow the platform? People who are very similar to them and who generally like one another. This is great for a niche platform where we stay connected with close friends and family, but it is a disaster for a platform that aims to connect everyone and also serves as a way to share news, opinions, and so on. So again, scale and scope matter. What about platforms that aim to form communities or groups based on a common interest or identity? People love forming their echo chambers and I don't think there is anything we can do about that human desire. What we can do as system engineers is use our algorithms to promote diversity of thought and perspective within groups and among groups, as in the sketch below.
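
Here is one hedged sketch in Python of what that could look like. It assumes each candidate post carries a hypothetical group tag and an engagement score; the re-ranker simply reserves a share of feed slots for posts from outside the user's own groups rather than ranking purely by engagement:

    from typing import Dict, List, Set

    def rerank_for_diversity(posts: List[Dict], user_groups: Set[str],
                             outside_share: float = 0.3) -> List[Dict]:
        # Split candidates into the user's own groups vs. everything else,
        # each ordered by whatever relevance score the platform already computes.
        inside = sorted((p for p in posts if p["group"] in user_groups),
                        key=lambda p: p["score"], reverse=True)
        outside = sorted((p for p in posts if p["group"] not in user_groups),
                         key=lambda p: p["score"], reverse=True)

        period = max(1, round(1 / outside_share))   # e.g. every 3rd slot at 0.3
        ranked = []
        for slot in range(len(posts)):
            # Reserve roughly `outside_share` of slots for out-of-group perspectives.
            take_outside = bool(outside) and (not inside or slot % period == period - 1)
            ranked.append(outside.pop(0) if take_outside else inside.pop(0))
        return ranked

The exact mixing policy here is a placeholder; the design point is that the diversity target becomes an explicit parameter instead of a side effect of engagement optimization.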

On my platform Confidist, I am working on a design for a connection system called "Orbits". Instead of encouraging connections with just about anyone you encounter, I want to leverage the data users have volunteered to promote the diversity and health of your Orbit, similar to how a dating platform matches people. That might mean maintaining multiple strong foundational connections, say other introverts from rural communities, while prioritizing attributes you have not yet been exposed to, such as someone who is socially liberal. Additionally, I created a spotlight event system tied to experience-based rewards; as the system designer, I can cross-promote these events from one community to another. The site can also encourage a spirit or culture of openness.
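
A rough sketch of how that Orbit scoring could work, assuming each user volunteers a set of attribute tags. The weights and attribute names below are placeholders for illustration, not the actual Confidist design:

    from typing import Set

    def orbit_candidate_score(user_attrs: Set[str], candidate_attrs: Set[str],
                              orbit_attrs: Set[str]) -> float:
        # Reward a shared foundation, but reward more strongly any exposure to
        # attributes that neither the user nor their existing Orbit covers.
        common_ground = len(user_attrs & candidate_attrs)
        new_exposure = len(candidate_attrs - user_attrs - orbit_attrs)
        return 1.0 * common_ground + 2.0 * new_exposure

    # Example: an introverted rural user whose Orbit so far mirrors their own views.
    me = {"introvert", "rural", "fiscally_conservative"}
    orbit = {"introvert", "rural", "fiscally_conservative"}
    candidate = {"introvert", "urban", "socially_liberal"}
    print(orbit_candidate_score(me, candidate, orbit))  # 1*1 + 2*2 = 5.0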

I did not touch on solutions and alternatives to the advertising model; however, plenty of people are experimenting, and I am interested in hearing what others think are the most successful alternatives. To me, the key is staying relatively small in size and scope, prioritizing the user directly with your business model, and being very conscious about how you grow and regulate your system. Don't make something you don't want your kid to use.


It reads like you have put a lot of time and energy into thinking through the issues and possible solutions. Here is a tip: as a first-time potential user, the site gives off an awfully complex vibe, and it's hard for me to get the layout, what I should do next, etc. Compare that to the Facebook/Instagram sign-up / onboarding / first-steps experience. I suggest enlisting a UX researcher/designer who has done this before to tackle the problem.


Agreed, thanks for taking the time to check it out and give some critical feedback. Much appreciated. Any UXer who wants to help, please email me at info@confidist.com.



