
Linux on the desktop isn't helped when people post truly delusional things like calling this 'modern.' Makes it look like people don't know what they don't know.


> Linux on the desktop isn't helped when people post truly delusional things like calling this 'modern.'

Not enough animation? Too easy to distinguish parts of the interface? Needs more ads? Do you have anything material to say?


Missing the basics of high-quality, high-precision, and high-fidelity rendering. We've learned how to do these things since 1993.


It should be a doable effort to implement high-res font rendering - maybe even just the font needs to be swapped out. I think it's great when good software gets maintained and renewed, which as far as I understand is the case here, even if there are still some things that could be improved. That said, I like the look & feel of CDE; good to see efforts to get this working on Linux.


> high-quality

What does that mean? Quality in what sense?

> high-precision

The UI is just rectangles and lines. Even "modern" UIs like Windows 10 are made up of just flat rectangles and lines. The precision needed to draw flat rectangles and lines hasn't changed over the decades.

> and high-fidelity rendering

The fidelity being high or low depends on the source material; GUIs draw themselves from scratch and at most copy existing resources (images, etc). Unless you use "high fidelity" as another way to write "good looking", which i hope you understand is highly subjective.


Look at the text. Can you see how blocky it is? And how awkward the gaps are between letters? That doesn't represent the definition of the fonts that are being used. It's a low-quality rendering in that it's literally a binary quantisation of the curves in the font. It's low-precision in that the output only considers pixel boundaries. It's low-fidelity in that it doesn't represent the creator's intent for the font.

None of that is subjective. In fact my point is that the rendering quality is objective rather than subjective (high-quality, high-precision, high-fidelity).

It's not just the text. Look at the dithering: low-precision rendering of colours. Look at the alpha-blending of the Firefox icon: low-fidelity of the icon file as it's been blended as an integer. Etc etc etc.

Given the mathematical input of the curves of the fonts, the colour gradients, the icon definitions, this is objectively a poor rendering as it doesn't match the declared intention.
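The compositing issue called out above (the icon "blended as an integer") can be seen in a toy sketch of Porter-Duff "over" blending - a hypothetical illustration, not CDE's actual code path:

```python
# Porter-Duff "over" compositing of one pixel channel.
# Doing the math entirely in 8-bit integers (truncating division), as an
# old toolkit might, loses precision vs. blending in floats and rounding once.

def over_int(src, dst, alpha):
    """Integer-only blend; the // truncation drops the fractional part."""
    return (src * alpha + dst * (255 - alpha)) // 255

def over_float(src, dst, alpha):
    """Float blend rounded once at the end: the higher-fidelity result."""
    a = alpha / 255.0
    return round(src * a + dst * (1.0 - a))

print(over_int(200, 11, 128), over_float(200, 11, 128))  # → 105 106
```

The one-unit discrepancy per channel is small, but it is exactly the kind of systematic divergence from the declared input being described.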


> Look at the text.

Many people (me included) prefer non-antialiased text as it looks sharper. What you describe is an issue with the font used (unless it is a bitmap font, in which case the look you'd get would be the creator's intent). You can get that with fonts that do not have proper hinting, but that can be fixed (assuming you dislike the result) by using another font. Regardless this isn't an issue with the entirety of CDE or the rendering it uses, but with the specific font used.

> None of that is subjective.

The very specific issue you mentioned isn't subjective indeed, again assuming that the font in question was not a bitmap font designed to look as it looks. And of course it is only about the font, nothing else.

> In fact my point is that the rendering quality is objective rather than subjective (high-quality, high-precision, high-fidelity).

TBH "high-quality, high-precision, high-fidelity" sounds like some motto without substance. My comment was about trying to be more specific about what your issues were.

> Look at the dithering: low-precision rendering of colours.

The dithering is intentional, not due to an inability to display additional colors. While back in the 80s and 90s dithering was used to work around a lack of color, nowadays it is almost always a stylistic choice. For example, personally i often make my own icons for Window Maker (my WM of choice) and i pretty much always lower the colors to introduce some ordered dithering that creates that common cross-hatching pattern which i personally find aesthetically pleasing (often i do not even care about the amount of colors, just whatever adds a bit of that pattern without losing much color info).
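The cross-hatch effect described here can be sketched with ordered (Bayer) dithering - a minimal illustration, not the commenter's actual workflow:

```python
# Ordered dithering with a 2x2 Bayer matrix: the source of the classic
# cross-hatch/checkerboard texture in reduced-color images.

BAYER2 = [[0, 2],
          [3, 1]]  # threshold map, values in 0..3

def dither(gray, w, h):
    """Reduce an 8-bit grayscale image (row-major list) to 1-bit."""
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            # scale the 0..3 threshold into the 0..255 range
            threshold = (BAYER2[y % 2][x % 2] + 0.5) * 255 / 4
            row.append(1 if gray[y * w + x] > threshold else 0)
        out.append(row)
    return out

# A flat 50% gray comes out as the classic checkerboard pattern.
print(dither([128] * 16, 4, 4))
```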

> Look at the alpha-blending of the Firefox icon: low-fidelity of the icon file as it's been blended as an integer

Yes, that is also an objective issue. Not sure how hard it'd be to make CDE use alpha blending, though the functionality is already provided by the X server, as many window managers use it.


> TBH "high-quality, high-precision, high-fidelity" sounds like some motto without substance.

I spent time explaining what I meant by each of these, relating them to the mathematical definition of the input and how the output matched that.

If you wanted to, you could probably write a function to quantise how the output for different renderers diverged from the declared input. CDE would not do well compared to modern systems like macOS.
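Such a divergence function could be as simple as RMS error between the ideal (declared) edge coverage and what a renderer output - a toy sketch with made-up numbers, not a real renderer comparison:

```python
# Toy divergence metric: RMS difference between the "declared" coverage of
# an edge and what a renderer produced. Both renderers here are made up.

def rms_divergence(ideal, rendered):
    n = len(ideal)
    return (sum((a - b) ** 2 for a, b in zip(ideal, rendered)) / n) ** 0.5

# Ideal coverage of a soft edge crossing a row of five pixels (0..1 gray).
ideal = [0.0, 0.25, 0.5, 0.75, 1.0]

binary = [0.0, 0.0, 1.0, 1.0, 1.0]       # 1-bit output: coverage snapped to 0/1
antialiased = [0.0, 0.3, 0.5, 0.7, 1.0]  # close to the declared coverage

# The binary render diverges more from the declared input.
print(rms_divergence(ideal, binary) > rms_divergence(ideal, antialiased))  # → True
```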

> Many people (me included) prefer non-antialiased text

Do you think CDE is making an intentional stylistic choice to render text in this way? Or do you think it doesn't have the functionality to render it with more fidelity to the font definition?

I think it's the latter. Which is why it isn't a modern system.


> I spent time explaining what I meant by each of these, relating them to the mathematical definition of the input and how the output matched that.

You only mentioned fonts, and what you mentioned is only true for vector fonts without any hinting information (which is what allows fonts to be displayed as the designer intended on a fixed-grid output like a pixel grid). It is not true for vector fonts with hinting information nor for bitmap fonts.

And what your issue really comes down to is the binary-only representation, which basically means that what you wanted is subpixel accuracy - ie. introducing antialiasing. This is something that is actually subjective, as many people (me included, as i already wrote) prefer the binary approach.

And indeed while you are right about the very specific case of how vector fonts are displayed without enough precision, even when a font renderer does allow for subpixel precision many people prefer to alter the output from what the designer would do for -in their opinion- better results: this is one of the oldest differences between people preferring Mac OS X's font rendering (no alterations) over Windows rendering (alterations to make fonts look sharper). Many font rendering setups you'll find on Linux provide control over this too and while you can objectively say which setup would be more "correct", what exactly looks better is down to the user.

Anyway, this was all about a specific case of font rendering and nothing else.

> If you wanted to, you could probably write a function to quantise how the output for different renderers diverged from the input. CDE would not do well compared to modern systems like macOS.

CDE is not a renderer, it uses Motif and Xlib (and perhaps xcb in places) for its graphics.

> Do you think CDE is making an intentional stylistic choice to render text in this way? Or do you think it doesn't have the functionality to render it with more fidelity to the font definition?

CDE uses Motif and Xlib, Motif does have the ability to render antialiased text via Xft which uses FreeType which is pretty much what most other toolkits use to perform font rasterization.


> The dithering is intentional, not due to an inability to display additional colors. While back in the 80s and 90s dithering was used to work around lack of color, nowadays it is almost always used as a stylistic choice.

I'm pretty sure the limited number of colours is exactly why CDE does this. At the time it was popular most X terms could only do a palette of 256 colours (from a larger number, I think 65535). Thus the dithering.

In fact if you opened up an image viewer like xv it would switch palettes and make your other windows look weird, just because it needed all the colours to display a picture decently.

24-bit + alpha channel greatness only became mainstream after CDE was already a thing of the past.


Yes, as i wrote in the part you quoted, back in the 80s and 90s this was a problem however nowadays the dithering is an aesthetic choice and not a CDE limitation.

You can even see in the screenshot the Firefox icon using true color output (it sadly doesn't use alpha blending though, so this might actually be a CDE or Motif limitation - it shouldn't be impossible to fix it as the X server can do alpha blending nowadays but it needs someone to figure out where the limitation is and implement it)


I agree with you, but I get a feeling that all points you've mentioned can be fixed and still maintain the same retro aesthetics. Isn't it an implementation issue rather than a design issue? Genuinely curious.


Yes agreed they could be fixed and maintain the retro aesthetic, and it is an implementation issue.

The implementation is not modern. You could have a modern implementation of the same aesthetic.


Not enough graphics?


I don't get this response to my comment. Why do you want more graphics? How many graphics there are is a decision of the application. The fidelity of the rendering is up to CDE, and the result is poor.


>Why do you want more graphics?

I don't. You seem to do. More as in better, not as in more triangles on screen or more graphic operations per second.

>The fidelity of the rendering is up to CDE, and the result is poor.

Hence my response: is this poor rendering "not enough graphics" for you (as in "not fancy enough")?

Because many could not care less for the "fidelity of the rendering" compared to the actual functionality.

It could be black and white dithered for all it matters (in fact, many would do with just some terminals).


> is this poor rendering "not enough graphics" for you (as in "not fancy enough")?

I never mentioned more graphics. Only you did! Nobody here wants more graphics. You’re arguing with an imaginary person saying different things to me.

I’m interested in rendering that is accurate, with fonts that are correctly hinted and rasterised and icons that are correctly composited, with a high-resolution for high-fidelity. This rendering is inaccurate and low-fidelity which makes it a poorer experience.

It’s not about being fancy; it’s about showing me the text I’m working with and the graphics that are there with as much precision as possible.

Compare the same fonts and icons rendered with a modern rasteriser at a modern DPI. It doesn’t compete.

> Because many could not care less for the "fidelity of the rendering" compared to the actual functionality.

Showing me the text and UI I’m working with is the functionality of a desktop environment. Without rendered text for me to work on… what am I doing in my desktop environment?


>I never mentioned more graphics. Only you did! Nobody here wants more graphics. You’re arguing with an imaginary person saying different things to me.

I've already explained what I meant in my follow up comment, so I'm not sure why you continue to protest being misinterpreted. Not sure what you understood as the meaning of my "not enough graphics?" comment. But I tried to make it clear in the follow up (e.g. that your issue is the poor rendering, etc.), and it seems you have the same thing in mind as being the problem.

The distinction I am making is between graphics and functionality.

For me "rendering that is accurate, with fonts that are correctly hinted and rasterised and icons that are correctly composited, with a high-resolution for high-fidelity" is not functionality, it's graphics.

It's how CDE looks.

Whereas what's important (functionality) is what it does as a desktop environment/launcher.

If we were talking about GIMP or Photoshop or Premiere or even Word, sure "fonts that are correctly hinted and rasterised" would be part of the functionality. For a desktop environment, they are secondary.

You might agree or disagree, but that was the point I was making.


Their UI looks like it's from the 90s. The technology may be modern, but the things users actually interact with and see is certainly not.


UI was best in the 90s anyway, so that is a plus.


I implore you to try to use a 90s-style UI on a touchscreen, and then let's see if you want to revisit that statement...


I do not use touchscreens for desktop environments (the post is for a desktop environment) and the overwhelmingly vast majority of touch screens are for mobile devices which need and have their own conventions and UX - trying to shoehorn one on another is a disaster (not that some do not try but this is also a major source of modern UI awfulness).

Though having said that even with touchscreens i'd rather use a stylus (i do that for my tablet actually) and styluses work fine with 90s style UIs.


Laptops are the new desktops, and many of them (unless Macs) ship with touch screens.


Ever seen those used in an office setting? I have, and it was hilarious. Everyone swore at their laptops for two or three days, and then they all scrambled for mice. No screens were touched afterwards.

It looks cute in ads but truth is spending eight hours a day poking at your screen all the time just makes your hands sore, and after an hour or so, you spend half that time poking at the Undo button because you poked somewhere else than you meant by mistake.

It's just a wet dream of executives who never bothered to ask their underlings how they do their puny underling work. Wherever touch screens provided considerable advantages -- e.g. in touch-operated software for medical, imaging, CNC, SCADA industries -- they've already displaced non-touch devices since the 1990s. Everywhere else, suits have been singing dirges and whining about the post-PC era for ten years now and yet lo, walk into any open-space office, and you'll find nothing but rows and rows of 24" LCDs hooked to laptops with pristine screens.


Yes, when at the desk they are plugged into the docking station and used as regular desktops, when going into meetings they get folded and used as tablets.


IMHO it's not at all a good approach to optimize UIs for that one hour of largely unproductive work instead of the other seven hours that actually make the dough.


Not everyone has your work schedule.


And not everyone has yours; maybe CDE is totally fine for plenty of people even if it'd be a pain for you.


It really isn't though. I'm honestly getting very exhausted reading these HN comments suggesting that the situation with using Linux on the desktop is okay. I feel like I have been reading those comments for like 10 years (and on Slashdot for another 10 years before that) and nothing has changed and still nobody actually wants to use Linux on the desktop besides programmers and geeks. It's messed up enough if you use one of the popular desktops, never mind if you use an obscure desktop from the 90s like CDE.


Laptops also come with touchpads and mice and laptops with touch screens are a niche in the first place. Forcing everyone to use an interface that is optimized for a subcase of a subcase is the worst way to tackle UI design.

Besides even then i'd use a stylus - in fact i have a tablet which is essentially a PC without a keyboard and i use it with one instead of rubbing my fingers against the device. Both cleaner and more precise.


Okay, just for you I went and installed CDE on my pinephone. It's too small (not really a touch issue; it'd probably be fine if I could figure out how to adjust the DPI), but even so everything seems to work fine. Window bars are big enough to move around, dock icons work fine, menus are okay (albeit, yes, cramped on that screen). What did you expect to break?


Well, for starters, anti-aliased text?


Not everyone likes that, personally i always disable antialiasing. Even on Windows 10 i use a registry setting (since Microsoft thought it wise to hide the UI option) to disable text antialiasing since i prefer the sharp text.

Sadly many fonts nowadays do not have hinting information at all, but at least so far Firefox, etc allow me to override sites' font choices with fonts that look good with my setup.


Anti-aliasing looks great on a screen with high enough DPI so you can barely see the pixels. On a low-DPI screen it does look fuzzy.

On my 4K 24" turning anti-aliasing off makes it look less crisp though because the letters don't look just right.


Yeah, maybe. I do not have a very high DPI monitor and TBH i find even my 27" 1440p monitor to be too high DPI for my taste; i just couldn't find something smaller with the other specs i wanted (VA for having decent blacks and with a high refresh rate). I'll probably replace it with a ~23" 1080p monitor at some point, though, if i find a high-refresh-rate VA that is not curved (because for some reason 99% of them are curved). Either way i'll stick with the non-antialiased fonts :-P.


Well I would find 27" 1440p too high DPI too if I ran it at 100% scaling :) I run my 24" 4K at 200% scaling so it's effectively 1080p. Just super crisp which I like.

We have super-crisp screens on our phones these days, why not on the computer?


There are too many micro-issues with high DPI and TBH while the text is sharper i do not see the benefit for everything else while they do add additional strain on the CPU and the GPU (especially for games), so i just avoid it.

Some time ago I tried 150% scaling on my laptop (which has a 1080p monitor at a small size) and i just reverted to 100% because a lot of things, from websites to applications, etc didn't work properly. Note that this was on Windows but i do not expect things to be better in other OSes anyway (except perhaps macOS but it has a ton of other much worse drawbacks IMO).

Perhaps if all i used was terminal apps and pure text editors i'd have a different opinion but i use a lot of GUI apps with images, games, etc.


I don't know, but it seems like you're bending over backwards to explain that CDE's weaker rendering is a good thing. Maybe it suits you, but I think it's a clear fact that the reason it's weaker is that they haven't developed it, not that they were able to but made a conscious decision not to.

If CDE could render better but let you opt out that'd be great... but it can't render more accurately in the first place. It's not choosing to be worse.


> seems like you're bending over backwards to explain that CDE's weaker rendering is a good thing

You'd be wrong.

What i write isn't about CDE but about antialiased text. As i wrote a few parent posts above i even have antialiased text disabled on Windows 10 via a registry setting and Windows can certainly do antialiased text rendering.

> If CDE could render better but let you opt out that'd be great... but it can't render more accurately in the first place. It's not choosing to be worse.

CDE can do antialiased text rendering, or more precisely Motif can (CDE doesn't render its own fonts) if configured to do so via Xft, which uses FreeType for font rasterization - the same library that other toolkits (e.g. Qt) use. So its font rasterization can be as good as other toolkits'. You need to do it via X resources, and it is opt-in instead of opt-out.
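For the curious, Motif 2.3's Xft support is enabled through render-table X resources roughly along these lines (an illustrative fragment only; the exact resource names CDE expects are documented on its wiki):

```
! Illustrative ~/.Xresources fragment using Motif render tables.
! Names here are Motif 2.3 rendition resources; consult the CDE wiki's
! FontsWithXFT page for the exact resources CDE itself uses.
*renderTable:  rt
*rt.fontType:  FONT_IS_XFT
*rt.fontName:  Sans
*rt.fontSize:  10
```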

And besides i do not even use CDE myself, last time i used it was years ago out of curiosity. Personally on Linux i prefer Window Maker (which can also optionally use Xft for antialiased text rendering).


> So its font rasterization can be as good as other toolkits.

Can someone configure CDE to have accurate fonts and icons and 320 DPI if that’s what they prefer? If not why is that not possible?


My guess is that, beauty being in the eye of the beholder, you make the assumption that "antialiased=good looking, non-antialiased=bad looking", but in my comments throughout this entire thread i claim that i prefer non-antialiased text. And i'm going to guess that anyone who would use CDE would also like it to use non-antialiased text.

Anyway, here is a page on the CDE wiki, with a couple of screenshots, that says how to enable it[0]. Unsurprisingly the only comment there is from someone saying that they prefer the non-antialiased version.

Apparently there are some issues with the terminal emulator, though i guess xterm would be a better choice anyway.

https://sourceforge.net/p/cdesktopenv/wiki/FontsWithXFT/


The pixels in that screenshot are absolutely enormous. Modern screens have far higher resolutions. If you can’t even see that then I don’t know what else to say.


Sorry, but what you write makes zero sense: pixel size has nothing to do with screen resolution, and pixel sizes are not even something you can judge from a screenshot. And even ignoring all that, the topic was text antialiasing; pixel sizes (whatever you may mean by that) and resolutions have absolutely zero to do with it.


It depends which OS and desktop you use of course. Mac does it perfectly. Windows still has many issues (sadly). Gnome is pretty great at it, KDE too but there are some minor issues (the mouse pointer doesn't scale and has to be manually enlarged). Still, I use KDE this way and it's fine for me. Every version brings improvements too.

Overall I still think it's worth it. But YMMV of course!


It must be truly depressing being a Linux FOSS developer when any time you try to make something quality and up to today's standards, the Linux user base whinges and insults you for years.

I have no idea why people still work on gnome/wayland right now. I would have left FOSS entirely if subject to that behaviour.


If you've got a bare-bones, basics, simple desktop-environment then embrace that and tell us!

If you're calling it 'modern' when everyone can see in two seconds that it isn't, then you're just setting up both yourself and us for disappointment.

Don't blame the reaction - blame the preventable mistakes that caused the reaction.


I’m not calling CDE modern. I’m saying that the Linux user base has terrible preferences when it comes to software which hold back actually modern and good software like Gnome/Wayland/systemd.


What I don't like about Gnome is the lack of options. It's all so barebones, things have to work just one way and if you don't like it you have to resort to a load of plugins that often don't work right with the latest versions and/or cause issues when working together. I don't understand why lack of choice is meant to be 'modern'. It's not an iPad I just use for facebook. It's a computer. Apple has put the opinionated software paradigm on the map but it's not the be-all-end-all. Especially for desktops. For screen-constrained mobiles it works a lot better.

Systemd I don't like because it's so heavy. I really like lightweight systems. I use FreeBSD for that reason though I have other reasons too. And Alpine when I need Linux. Alpine is working on an init system based on s6 that will be more lightweight and I will give that a chance. I'm not against progress. I just like having options.

And Wayland, I don't mind it too much but I do miss the network transparency badly.


Every boolean option you add to a system doubles the possible configurations of the system, and the possible interactions and bugs. Add just ten new flag options and you've increased the configuration state by 2^10, or 1024, times.
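The arithmetic, sketched out (each flag doubles the state space, so growth is exponential rather than linear):

```python
# Each independent boolean option doubles the number of possible
# configurations, so the state space grows exponentially with flag count.
from itertools import product

flags = 10
configs = list(product([False, True], repeat=flags))
print(len(configs))  # → 1024
assert len(configs) == 2 ** flags
```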

That's why options are a bad thing.


For developers maybe but not for users :) And these are the ones you're developing for.

And look at other systems accomplishing this very well. Like KDE which has a ton of options and they don't suffer this problem.


KDE suffers a quality problem that Gnome does not. I had countless bugs and random issues last time I tried KDE but I have pretty much never had an issue with the core Gnome desktop experience (the bundled programs are another matter).

At first the lack of options on gnome bothered me but then I just accepted the default config and found that it isn't actually important to be able to tweak every single bit of layout and the default config gnome ships is actually really good.


That’s abusive to users. They constantly change random things, and optimize the desktop for mobile regardless of whether I’m on mobile or not.

I don’t want to be stuck with a crappy desktop environment that I don’t like if I left windows for the same problems, and I doubt most people want to be forced into someone else’s narrow workflow as well.


>I had countless bugs and random issues last time I tried KDE

Would you mind saying when this was? This is important, as the KDE crew is delivering new features and bug fixes at a rapid cadence, so your experience from a year ago might not be valid today.

Also, a lot of the issues KDE suffers from are out of the hands of the KDE devs, as some issues stem from GNOME devs enforcing their agenda on the Linux DE community as the de facto standard on how things should work, leaving the KDE devs to fix the jank they create.


The statement about booleans is a total fallacy. In real life this MAY be the case for some combinations (introducing a bug) and totally harmless for others. It depends on the context.

Even then, you cannot make the omelet if you don't break the eggs.

That said, GNOME is one nice whole egg.


NHF, but I must ask:

Are you a GNOME developer?


> actually modern and good software like Gnome/Wayland/systemd.

Well, many people disagree that those are modern and good software - and while you could claim that those are "terrible preferences" the exact same can be claimed about your preferences. The point of Free Software is to allow for everyone to be in control of their own terrible choices :-P


Everyone is entitled to use what they want, but this goes way beyond choosing something different. FOSS devs are receiving real insults and threats constantly for simply developing something. In the end the Linux desktop is being severely held back and it takes over a decade to implement improvements like wayland which are critical for Linux to keep up with Windows and MacOS.


> this goes way beyond choosing something different. FOSS devs are receiving real insults and threats constantly for simply developing something.

As long as people become emotional about these things, they will react in emotional ways - which works both ways. I don't know if you realized it but here:

> In the end the Linux desktop is being severely held back and it takes over a decade to implement improvements like wayland which are critical for Linux to keep up with Windows and MacOS.

...you are basically accusing others of holding Linux back (ie. not letting it "keep up" with Windows and MacOS) because they do not see wayland (or gnome or whatever your preferences are) as an improvement. People, especially if they feel powerless to do much, can react to this as an attack. And i do not think you are making this easier by trying to play an emotional card like "FOSS devs are receiving real insults and threats constantly for simply developing something" - this can work wonders if you are among people who agree with you, but it can also backfire badly if you are not.


I think the lack of development of Wayland is more due to its corporate sponsors. Nvidia has been really slow to embrace it which hindered adoption. RedHat is putting a lot of effort in but desktop is not their main game so it doesn't have priority. And Ubuntu has been trying to play the "Not Invented Here" game for a long time and trying to get their own intellectual property on the map which didn't really have any discerning feature from Wayland other than being their own. But it did fragment the attention it received.

But this is mainly a corporate game, not one played by independent developers. Remember, also X11 itself was a corporate invention. It came from DEC, MIT and IBM. I don't think it's fair to blame the users for its lack of progress.

But I still think choice is good. I don't want to be forced to use Gnome, I don't agree with their UI ideas (in particular the opinionated software paradigm). Having multiple desktop options is a great thing to have, and it lead to new ideas like tiling WMs such as i3. Personally I use KDE.

I don't think "Linux on the Desktop" should ever be a mainstream thing anyway. It would suffer from the drawbacks of the other options, such as too much corporate control. For something like Wayland it doesn't matter as it's just in the background doing its thing. But for a desktop environment it matters a lot. If Linux had only one option it would be just one of the other not-ideal options around. Just like ChromeOS has become, for that matter.

Also, it's not really realistic to expect all Linux users to 'get with the program' and use the biggest option available to strengthen Linux as a whole. Most of them use it because they want something different. Not because they want to promote Linux.


One of the problems of open source is top down imposition by distros. Perhaps software should be developed with a certain regard to the preferences of the userbase? I suspect software not liked by its users is not going to be very successful in the marketplace.

Aside: Gnome is far worse than KDE. systemd is an improvement on what was there previously, albeit not the only choice. I have enough experience with X to know how difficult it was to work with, a lot there needed to go.

However, Wayland is too minimal, we're going to have lots of unnecessary differences between Gnome-wayland, KDE-wayland, and wlroots-wayland thanks to wayland's bad focus. Speaking of this, want to look at stuff that held back Linux? Maybe have some words for people who prioritized irrelevant features over basic support for remote work.


What’s your definition of modern? I don’t agree with it being modern looking unless a GUI is sufficient to make it so. LXQt was pretty, do you consider it modern?



