
Indeed, they're talking about the opposite extreme from the usual problem we all bemoan here: JS devs determined to use the newest, shiniest thing as soon as it's announced, instead of sticking with what they've always used and waiting until the new stuff works across all browsers. This article really surprised me with how far some are apparently going in the opposite direction. I'm very surprised the baseline mentioned is ES3 rather than ES5 or ES6.
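
For context, a rough sketch of what an ES3 baseline means in practice (illustrative only): ES5 additions like Array.prototype.map/filter and the built-in JSON object are off limits, so even trivial transforms fall back to indexed loops.

    // No Array.prototype.map in ES3 (it arrived in ES5),
    // so transforms are written as plain indexed loops.
    var names = ["alpha", "beta", "gamma"];
    var upper = [];
    for (var i = 0; i < names.length; i++) {
        upper.push(names[i].toUpperCase()); // toUpperCase itself is fine: it's ES1
    }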

The GP's comment - that we have to upgrade our hardware because devs are "anorexically obsessed with lean code, and find complex dependencies too confusing/bothersome" - is surely the exact opposite of reality? We have to upgrade to faster hardware because the bloat slows everything down!


Fair, but personally I’d absolutely prefer slower bloated code with twice the lifespan to faster code that forces me to buy new hardware I can’t afford. But I’m a nearly extinct type of consumer who happily clings to pre-subscription-era software (e.g., Photoshop 7, Sketchup 2017). I understand and begrudgingly accept that businesses couldn’t survive by tending to the desires of folks like me.


A nitpick to add to the sibling comment, more a minor personal annoyance than anything: No throttling is a menu button that, when clicked, gives you a dropdown menu - not a "combobox". A combobox is a text input element that has an associated dropdown menu.

I see this mistake very often from people whose UI learnings came via Visual Studio, because it didn't have a separate UI element named "dropdown menu" or similar. You instead had to add a combobox and configure an option to turn it into a plain dropdown list (e.g. set Style to "2 - Dropdown List" in VB6, or DropDownStyle to DropDownList in VB.NET).
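
To make the distinction concrete, here's a minimal DOM sketch (the element choices and option labels are just for illustration): a dropdown menu only lets you pick from a fixed list, while a combobox is a free-text input with an attached list of suggestions.

    // Dropdown menu: a <select> - the user can only pick from the list.
    const dropdown = document.createElement("select");
    for (const speed of ["No throttling", "Fast 4G", "Slow 4G"]) {
        dropdown.append(new Option(speed));
    }

    // Combobox: a free-text <input> with an associated <datalist> -
    // the user can type anything OR pick a suggestion.
    const combo = document.createElement("input");
    combo.setAttribute("list", "speeds");
    const suggestions = document.createElement("datalist");
    suggestions.id = "speeds";
    for (const speed of ["Fast 4G", "Slow 4G"]) {
        suggestions.append(new Option(speed));
    }
    document.body.append(dropdown, combo, suggestions);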


The BBC continually tries to convince the government that their problems are due to illegal action that must be stopped.

They do everything in their power to distract from the real issue - that the landscape of television has changed beyond recognition since the tax was brought in.

It's completely clear to everybody that the TV licence is an outdated model that makes no sense in today's world of competing commercial streaming services, but they're desperate to control the narrative to avoid losing their income stream. Which is understandable I suppose, from their narrow point of view. But from the country's point of view, we need a politician with balls to step up and reform the system. But I'm not sure those even exist anymore.


The BBC obviously wants to avoid losing their income stream, and the current UK government has stated clearly that it not only wants the BBC to keep that income stream, but also wants a change to a more sustainable and enforceable model for it. The BBC has not argued that the current licence fee is the only possible model, but they have argued that if this is the model that is going to be used, something about it needs to change if they are to have the income they need.

It also isn't clear to me that the TV licence is an outdated model in its entirety. The notion that a country would levy a fee on more or less any instance of an activity in order to fund a non-commercial institution related to that activity doesn't seem strange to me at all. What is true is that the nature of the activity and the enforceability of the fee have both changed, and that therefore something probably does need to be done.


They will probably end up following the Australian model where it is funded directly by taxation. Of course this will undermine the BBC's supposed quasi-independence.

The BBC is ridiculously slow to pick up on trends. BBC pop radio (Radio 1) only came in as a response to pirate radio. Its streaming services aren't as good as they could be, and they have the paradox of showing the same content over and over (such as "Dad's Army", made before I was born) while keeping a lot of classic content unavailable.


The Windows 3.1 UI example screenshots are a reminder of how primitive 3.1 felt compared to other OSes of the time.

The need for instructions in that Search dialog is appalling from a usability perspective.

When Win95 was released, it was widely seen as Microsoft finally catching up with its rivals. They had at last added features that Mac, NeXTSTEP, Amiga, etc. had had for years.


Those numbers are UI only. 12 just to design it, another 12 to build it. That's not counting the vastly larger number of developers who built all the various elements of the underlying codebase.

Team bloat is a real issue, but I don't think this case is an example of it.


> UI taste that is stuck in the 2000s

> UIs that are barely usable... like Windows 2000

Words fail me.

Perhaps it's that well-known psychological effect where people self-report higher productivity when using an interface they find more visually appealing, whereas actual measurement shows the opposite.


Just a few examples of what makes Windows 2000 barely usable for me (and pretty much anyone who grew up with later UIs):

No central place to search for software, files, or settings. You have to dig through layers of menu trees like an idiot.

No visual preview to find the right open window. You have to alt-tab through a list of windows like an idiot.

No way of separating windows into workspaces/desktops (whatever you might call them). You have to either constantly kill windows or work your way through layers of them. The point above doesn't help with that.

This one has less to do with Windows 2000 specifically and more with the state of the art of the time: walls of icons and buttons, with not even a way to group them. Sometimes the entire window is just one wall of tiles; sometimes there's the toolbar of doom at the top.

On top of lacking usability, Windows 2000 is ugly, mostly because all the main UI elements like buttons are visually thrust into your face by faux 3D elevation. This had its place at the time, when most of your users would not have had experience with computer UIs in the first place. For those users, UI designers back then felt they needed to overemphasize visual cues from the real world. Nowadays you can show the user just a box, or something that looks like a link (because people are used to browsers now), and maybe give a cue by changing the emphasis on hover.

The other reason that comes to mind why Windows 2000 is so ugly is colors. Again, this is down to its time and the capabilities of graphics cards back then, which mostly didn't allow more subtle color differences.

I'm just using Windows 2000 as pars pro toto here. Pretty much all graphical UIs back then were lacking modern usability features and UI sensibilities, regardless of OS.

> Perhaps it's that well-known psychological effect where people self-report higher productivity when using an interface they find more visually appealing, whereas actual measurement shows the opposite.

You have your slightly condescending explanation for why we disagree, and I have mine. Let me give you a hint by quoting Douglas Adams:

"I've come up with a set of rules that describe our reactions to technologies: 1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works. 2. Anything that's invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it. 3. Anything invented after you're thirty-five is against the natural order of things."


No, they did not (or if they did, they didn't publish it). If I'm wrong, please give me some links because I'd genuinely love to see it.

Microsoft did those usability studies on the versions of Office that were current before the ribbon. The ribbon followed those studies as their attempt at a solution.

A few times over the years I've tried to search for usability studies of the ribbon interface because I've never got on with it myself. I find plenty of others asking the same thing online, and everybody points them to those same earlier studies from before the ribbon, while wrongly telling them it's a study of the ribbon.

Those studies are unable to tell us whether or not MS's attempt at a solution actually fixed the problems.

I believe the ribbon was a downgrade in usability terms (but people expect it in office suites, purely because it's seen as looking more modern). And I'd love to see real intensive research to tell me whether my belief is right or wrong.


The studies I can't point you to, but there were lots of blogs by the lead Office UX person at the time, Jensen Harris.

https://learn.microsoft.com/en-us/archive/blogs/jensenh/

Unfortunately those blog entries have been gutted: the images are no longer there.

I read all of them; there were at least 6-7, quite detailed, and I remember thinking that the thought process behind the ribbon was very solid.

https://learn.microsoft.com/en-us/archive/blogs/jensenh/the-...

https://learn.microsoft.com/en-us/archive/blogs/jensenh/ye-o...

etc - you can find all of them there plus many other related blog entries.


Yeah, that's exactly it - there were all those history blogposts, full of very interesting stuff, but all about before the ribbon was in active use. (Pity about the image rot.) No usability studies of the ribbon itself.

Parts of those blog posts were unintentionally revealing of the groupthink of an enclosed bubble of people who couldn't see the wood for the trees. A great example is this piece about moving menu entries around so you couldn't build muscle memory, and had to take the time to look for what you wanted:

> First, remember that we're analyzing this with 20/20 hindsight... there was a lot of excitement (not just at Microsoft) about "auto-customization"... to present exactly the right UI for the person at hand. Now, it's easy to say that today people are generally against this idea... but we know that mainly through trying... the adaptive UI in Office 2000

As I recall it, the vast majority at the time - users, reviewers, UI/UX writers - considered its downsides to be completely obvious and were firmly against it. Its designers were apparently the only ones who needed 20/20 hindsight to see that.

> I remember thinking that the thought process behind the ribbon was very solid

I agree, the historical research, and the work on identifying the problems, was very solid. But the massive criticisms of the ribbon suggest it was not an entirely successful attempt at a solution.

I've seen it said that there's no way Microsoft would have neglected to carry out major usability studies on such a major UI change, and that the fact that nothing's been published, after all the blogposts and talks beforehand, suggests they chose to bury a bad result. No idea whether there's any truth in that of course, but it does sound plausible.


As a techie with no horse in this race, I've always found the ribbon very usable. It has a layered shortcut system that is much more logical than the legacy one, it still supports the legacy shortcuts (Alt-d, f, f forever!), and the number of commands easily accessible is certainly higher than with the old menus.


Indeed, the basic point is fine - just 2 competitors standing up for their own choice - but the use of the words "and most open format" ruins the GP's point and perhaps is the reason for the downvotes. There's no way one can argue that Microsoft believes their format is the most open.


It's Lego Masters USA (Fox), rather than the Lego company itself, so I imagine they're being extra-careful with licensing.

I'm in the UK and it's geoblocked for me.


Not always, if we go back to the 1980s. But in very modern times, they've lost all the learnings from back then.


Old-school Apple design stubbornness: I remember they insisted on putting the grooves on the "D" and "K" keys instead of the "F" and "J" keys. So you had to find home base on the keyboard with your middle fingers on an Apple, rather than with your index fingers like on everything else. No, that place has always been a design shop run amok.


It made sense because the numeric keypad had the dot on the 5. Early IBM keyboards (Model F) didn't have home markers, IIRC. But the PC world standardized on F and J, and eventually everyone else did too.


echo "It made sense because the numeric keypad had the dot on the 5." | sed 's/had/has/'


lol, no, they sucked even more in the 1980s.

Did you ever notice that "About this software" is the first thing on the first menu of every application? Is that because people have to know what version of the software they're using every time they start it? It's still like that today, and it's very, very stupid. Other OSes get it right and put the version information on the last menu, where it doesn't clutter up the most prominent spot in the most-used menus.

Finder was crap in the 1980s. Still is crap, but it used to be crap too.

The window system in the '80s and '90s was also crap. You could not resize a window from any side or corner except the lower right. Windows has had resizing from any edge or corner since forever.

Apple "design" is just not as good as people seem to think it is.

They've also had plenty of weird and unloved hardware designs... the infamous trash can, the clamshell laptop, the weird anniversary macs, a mouse with a charging port on the bottom so that you can't use the mouse while it's charging, and the list goes on and on and on.


As someone who has switched from Windows to Apple recently, my God the Finder is terrible. I can't understand how people aren't flipping tables over how bad it is.


Finder has to be used with the Miller columns; otherwise, it doesn't make sense.

But since the switch to the new filesystem (APFS), it's kinda slow and annoying.

They have built some proprietary stuff around their filesystem to increase their walled garden height. Which is kind of stupid in the era of cloud computing, because you cannot use any of it if you share files/directories with other people who don't use Macs.


Because Mac OS X Finder has always been kinda terrible. There was a lot of talk about this in the early 2000s and it's just faded away since the people using macOS now probably never experienced the good old Mac OS 9 Finder.

And its Windows counterpart, Windows Explorer, has likewise gotten worse with each revision of Windows.


Oh... Finder is the name of the default file browser? I always thought it was the search results that popped down from the top right search area.

Last Mac I was on still had OSX on it.

Thank goodness for Dopus.


lol, directory opus? I was using that on the Amiga way back in the day. I tried it like a decade ago, but it didn't stick for me. It doesn't seem to run on Linux, and it costs $$$, so no chance I'll try it again.


I can't think of a better explanation for the ubiquitous worsening of local search than increasing ignorance of comp-sci fundamentals.

There's no reason a senior at undergrad level shouldn't be able to write an efficient, fast, deterministic, precomputed search function.

... and yet, professional developers at major companies seem completely incapable.

The minimum acceptance criterion for any search feature proposed for shipping should be "there is no file/object in the local system that fails to show up if you type its visible name", ffs.
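
For the sake of argument, a minimal sketch of what "efficient, deterministic, precomputed" could look like (the names and types here are hypothetical, not any real API): lowercase every visible name once at index time, then answer queries by plain substring match, so typing any part of an item's name can never fail to return it.

    type Entry = { name: string; path: string };

    // Precompute the lowercased name once per entry, at index time.
    function buildIndex(entries: Entry[]): Map<Entry, string> {
        const index = new Map<Entry, string>();
        for (const e of entries) {
            index.set(e, e.name.toLowerCase());
        }
        return index;
    }

    // Deterministic lookup: an entry matches iff the query is a
    // substring of its name, so an exact visible name always matches.
    function search(index: Map<Entry, string>, query: string): Entry[] {
        const q = query.toLowerCase();
        const hits: Entry[] = [];
        for (const [entry, lowered] of index) {
            if (lowered.includes(q)) hits.push(entry);
        }
        return hits;
    }

A linear scan over precomputed strings is already deterministic and fast enough for local file counts; anything fancier (tries, n-gram indexes) just has to preserve that guarantee.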


The whole window management system is an exercise in contrarianism. They basically chose to do things in the opposite manner to their competitor, and mostly against what intuition would dictate, for the sole reason of being different.

macOS is very frustrating to use without utility apps that provide the necessary improvements. But those are never as well integrated, cost money, or are a hassle to set up.

Apple just wins because they make good-looking, well-built hardware, and sometimes they win on some performance metrics (in the Apple Silicon era, it's mostly about efficiency and single-core speed, which is not as useful as some like to believe).


Apple only "wins" by charging exorbitant prices that idiots are willing to pay to have a digital status symbol. What they have not "won" is market share. They have always been an "also-ran" in market share.

Android (70%) beats iOS (30%). Windows (68%) beats macOS (13%).


Well, I agree with that if we are talking about the general population. But Apple does have some niches it serves very well that make the prices worth it for some. But of course, this is a very tiny minority of their customers.

For example, they have been focusing on video editing since the PPC days, starting with the iMac DV. And nowadays, Macs are still quite good for video editing; even when you factor in the price, it's not that bad a deal. Previously it was DTP and desktop graphics generally.

But it's always the same playbook: they're first to offer the possibilities of a new use case, but that comes at their high price; over time they lose competitiveness, and they end up switching to something else.

The question is always if the asking price is going to be worth it for whatever you try to accomplish with a computer at the moment. If you are doing work that doesn't require being on the bleeding edge, the answer is probably no.

However, in general, people buy Apple stuff for the status, very often as an ego trip (to prove they are better) and not infrequently out of ignorance/incompetence (it's crazy how much stupid shit Apple fans believe).


What makes you think the first menu is one of the most used menus?


Well it probably isn't because Apple doesn't put useful things there, which is completely stupid from a UX perspective.


Heh, you're going to mention a mouse without bringing up the puck?!


I'm a little surprised they never came out with some oversized mouse pad and a mouse that charges from it.

Always seemed like an Apple sort of idea.


> a mouse with a charging port on the bottom so that you can't use the mouse while it's charging

I'm surprised you went for that over the puck. At least when you unplugged it, you could use it. The puck was just terrible. And old.

