
For the past decade or so, I've found myself in a constant battle with UI/UX team members.

A big part of the battle seems to be a shift from power/tool user to consumer design paradigms. 20 years ago, most software was tools for professionals, now most software (on a usage level at least) is consumer software (literally consumers of video, social media etc.) Designers can't seem to switch modes and keep trying to apply "content consumption" rules and techniques to "jobs to be done" projects.

A favourite is removing visible tools with 3-dot menus in the name of "cleaner" or "more intuitive" user interface that turns one-click actions into two clicks: 2 clicks hundreds of times a day for the tool users. Tables that show 10 items with 4 details each in the same screen real estate that used to show 20 items with 6-8 details. Dashboards that show 4 graphs instead of 10. Menus that take up the top 10-20% of the app. Massive H1 titles.

One of my aunts changes around her furniture constantly, not because she has an idea of a better layout but because she's bored with the current layout. This seems to be a common personality trait in PMs and/or UX people. Adding a 4th table view to the product? ... "we should use this entirely different style" not because it's "better". Since COVID, I've run a weekly Google meet with friends ... it changes in some way pretty much every time we get together, most recently the request to join process has changed, again, and not for the better.



The best part is that if you (junior or senior) bring this up, you're either shut down by management because you don't have the right expertise (how dare you, especially if, like me, you mostly work on the backend), or by the UX guy saying 1) it's a best practice, and 2) when they ran research or a survey, they found it makes "almost no difference" - the major pain points are xyz (something genuinely worse that requires immediate work).

At a certain point just accept the way it is, and wait until the better way of doing things becomes the new best practice...


Not just a front-end problem. As a customer of Facebook, Netflix, HBO, Prime, Disney+ etc., it continually amazes me that these expensive devs can't make the backend remember what I have and haven't watched. Combined with an "engaging" front end that hides what I'm interested in and shifts the interface randomly, the experience is pretty poor.


I think they do it on purpose just to give you the feeling of having a lot of content. Hate it, too.


They intentionally change cover art for exactly this purpose


Yup, it's another great example of the sort of "late stage" problems you run into with capitalism, the professionalization of business management (MBA degrees), and fundamentally just about any system with rules and "(semi-)autonomous agents" trying to maximize (or at least perform well enough on) the value of some metric or function. Here, profit and/or even just staying in business.

Capitalism and the other features of the system currently used in many of the most developed countries are great during the development of markets. The performance is excellent, which makes perfect sense.* You run into problems when markets reach saturation. And markets have never been remotely as saturated as now. Now, you're literally talking about how finely you can slice the sum total of ATTENTION, period. Think "TikTok".

Basically, you see a lot more "zero sum" behavior at such points. I.e., it's much more just Darwinian** ... some winning, some losing.

One example of this type of strategy that I find distasteful is the "retail reset". Every 6 months or so, the retail reset fairy visits the supermarket I shop at most frequently, forcing me to re-find at least some of what I want to buy. It rarely causes me to buy anything unusual or new to me, but I understand why they do it. It's not "terrible", but there's a feel to it of a certain ... lack of morals / ethics ... it feeds into a pervasiveness of that feel ... I believe it to be subtly but moderately bad for societies overall...

Couldn't find a more authoritative source / less markety site quickly, but here's at least some of the description:

https://canadasbeststorefixtures.com/why-a-store-reset-boost...

Of course, this kind of thinking is taught and exchanged among the business-oriented. Makes "the disease" worse (more pervasive) ...

* A "command economy" makes very little sense if you actually care about overall performance. It's the problem anyone faces when trying to model [e.g., "data assimilation" techniques, very powerful but enough data is essential] - not enough data! Of course, in today's world, it might be increasingly possible to make a more "command" style work, from the data side ... if you want to go for that "panopticon look" ... China has headed back that way ... again, in some sense. But it's really still not good ... you're up against all sorts of other human factors, even if you HAD the data.

** Edit: or noticeably Darwinian (in a more common usage sense). This behavior of systems regarding "niches" and competition applies at all times. It's when the "empty space(s)" or "low density space(s)" are occupied that the more directly / actively competitive behavior w/ more clear negatives for various 'participants' tends to emerge.


Almost every company internally works as a command economy. I doubt that it can scale much beyond that.


Netflix used to work like that, once upon a time. They intentionally changed it.


I think what happened was that a lot of UIs were really bad in the 90s. A bunch of books were researched and written pointing out the problems, people made an effort to fix them, and UX became a skillset.

But it seemed like the trends that were intended to fix the original problems just kept getting followed and amplified, with nobody really noticing they had overshot. Or, based on some comments here, new designers without the context of the old UIs kept reapplying the fixes to already-fixed UIs?

Instead of the pendulum coming back - it kinda reached escape velocity.


UX was often pretty poor in the 90s too. There are some rose-tinted glasses here. This was the era in which app theming was a huge fad. Relatively few companies attempted to create consistent and usability-tested approaches to their UI, and when they did, it was always operating system companies that were writing their own UI toolkits and lots of little bundled applets. They wanted these to be consistent, and they wanted to understand how to make things like file or hardware management easy.

This then dovetailed with some other trends: the software industry was moving to GUIs very fast, competitive advantage often came from how quickly you could release a Windows port back then, and the industry was relatively poor. Many big name software firms were like 20 devs and a few admin assistant types at most. So they all used the built-in APIs because that was the only way to actually get a decent GUI to market fast enough, and it outsourced all the thinking and design leadership to successful companies. If you tried to reinvent all that stuff, not only would you arrive late to the GUI party but your app would probably not fit on floppy disks anymore.

Towards the end of the 90s the internet started getting good, but OS makers were also getting fat and lazy. They dropped the ball on internet distribution and connectivity, then dropped it on managed code as well. People started writing these newfangled "DHTML web apps" along with Java applets as a solution, and also perhaps one day to escape the Microsoft hegemony. But the web was extremely under-designed. It barely had GUI widgets at all, let alone the large usability-tested sets of UX conventions that Windows and macOS had. Java had those widgets, but it came from a server company that was just aping whatever the desktop companies had done; they didn't really "get" it, and their tech back then wasn't good enough to make it fly. For a while people tried to emulate desktop widgets in JS and DHTML, but the results always felt second class.

But the combination of easy distribution, a mainframe style app model, open standards and free-beer end user clients was more valuable to people than the classical Windows/Mac OS offering. Not to everyone by any means, and there was much wailing and gnashing of teeth as developers started abandoning native apps. But distribution dominates everything and OS vendors didn't respond, so, we got the web.

With the web most of the old Xerox PARC desktop idioms died due to lack of proper browser support. Menu bars, context menus, files, folders, customizable toolbars, multiple windows ... none of these worked properly on the web. So people learned to make UI within those constraints. Then mobile came along and with it, a new generation of native app idioms. OS leadership was re-established and people started following that, but at the cost of it being inappropriate for productivity apps.


> UX was often pretty poor in the 90s too. There's some rose tinted glasses here.

Agreed. The thing that is new to me, though, is PMs and UX/UI designers pushing for things that are so obviously out of whack with what the actual users are asking for AND that aren't being driven by external forces. Literally no one is asking for all this white space; it's coming from the team responsible for designing the app.


This tracks, thanks for a really interesting take.


> removing visible tools with 3-dot menus in the name of "cleaner" or "more intuitive" user interface

It just came to me what this truly means: professional designers tasked with designing the aesthetics of man-machine interfaces inherently develop an allergic fear response through overexposure to user interface elements such as buttons and scrollbars. It is annoying and too obvious to them that EACH and every BUTTON and SCROLLBAR tells them ALL the TIME, on screen, CONSTANTLY, what COULD be DONE but is NOT being DONE; at some point enough is enough, to them personally, so they must remove it all.

I wonder at what point UI trends will return to the amount of skeuomorphism that average users, not designers, actually need - apparently the web link paradigm is just too difficult for older and/or less experienced users, so sooner or later it'll start showing in data that no one can get around.


Thanks for putting into words exactly how so many of us have been feeling for the past few years. Last week Slack's UI changed, again; I lost precious time having to find the buttons I'm used to, again. It's not better or worse from my point of view, it's just different, and that's very annoying.

> This seems to be a common personality trait in PMs and/or UX people.

They are also paid to do a job, and their job is to change the UI/UX... if you were paid to maintain a piece of software that was pretty much done, you'd likely spend your time refactoring and fixing minor issues. Except not everything we do is visible to the user.


One of the most wonderful developments of the last decade is the "Command Palette" UX pattern. You can have a tool driven predominantly by keyboard and also make all the keyboard shortcuts discoverable (display them next to the command in the dropdown). Maybe just let the UX guys do whatever they want, just so long as I can press Ctrl-K and type the first letters of the command I want to do.
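The core of the pattern is tiny: a flat list of (command, shortcut) pairs filtered as you type, with the shortcut displayed next to each hit so it stays discoverable. A minimal sketch in Python (the command names and shortcuts here are made up for illustration, not taken from any particular app):

```python
# Hypothetical command registry: (display name, keyboard shortcut).
COMMANDS = [
    ("Open File", "Ctrl+O"),
    ("Toggle Sidebar", "Ctrl+B"),
    ("Find in Files", "Ctrl+Shift+F"),
    ("Format Document", "Shift+Alt+F"),
]

def palette_search(query):
    """Return (name, shortcut) pairs whose name contains the query,
    case-insensitively, with word-prefix matches ranked first."""
    q = query.lower()
    hits = [(name, key) for name, key in COMMANDS if q in name.lower()]
    # Rank hits where some word starts with the query above mid-word hits.
    hits.sort(key=lambda hit: min(
        (i for i, word in enumerate(hit[0].lower().split())
         if word.startswith(q)),
        default=len(hit[0])))
    return hits
```

Rendering each hit as `name` plus its shortcut in a dropdown is what makes the shortcuts discoverable: every time you reach for Ctrl-K you're also shown the faster key you could have pressed.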


I use Excel's "/, Q, <search>" frequently, because I'm often searching for a feature whose name I remember, but whose place or existence in some menu or panel of the Ribbon is ephemeral or not memorable, often changing with window size.

If only activating the search box weren't so slow!

It feels like I'm starting a moderately large app and waiting for it to load (while still being faster than clicking through Ribbons and opening various subpanels by clicking on teensy "arrow in a corner" widgets).

I'd expect it to be a tiny array of text held in memory for instant access, with no perceived delay between when I finish typing "/Q" and the search interface appearing.

But then, I'd also expect the first search result to be selectable by just hitting <Enter>. Having to use an arrow key to navigate down one, then press <Enter>, is unfortunate, especially on an MS Surface keyboard with its half-size up/down arrow keys.
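The "tiny array of text held in memory" expectation above is reasonable: a linear substring scan over even a few thousand feature names finishes in well under a millisecond, so any perceptible delay must come from elsewhere (process startup, IPC, UI rendering). A rough illustration, with made-up feature names standing in for a real menu index:

```python
import time

# Hypothetical in-memory feature index; real apps might hold a few
# thousand entries, precomputed in lowercase once at startup.
FEATURES = [f"Feature {i} option setting" for i in range(5000)]
LOWERED = [name.lower() for name in FEATURES]

def find(query):
    """Naive case-insensitive substring search over the whole index."""
    q = query.lower()
    return [FEATURES[i] for i, name in enumerate(LOWERED) if q in name]

start = time.perf_counter()
hits = find("4999")  # worst case: scans all 5000 entries
elapsed_ms = (time.perf_counter() - start) * 1000
```

Even this unoptimized scan is far below the threshold of perceived delay, which supports the comment's point that the search box's slowness is an implementation artifact rather than a cost inherent to searching.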



