Hacker News

I regret moving from macOS to Linux.

While the tooling and system stability are better, the desktop and app experiences are far worse. It’s buggy, half-baked, and crash-prone.

I regularly hop distributions and do clean installs to see if that will fix it, but they’re all the same — pop!_os might be slightly more stable in that regard.

I’ve been using Linux since 1992. There were multi-year stretches in the ’90s and early 2000s when Linux was all I used.

I can ditch GNOME, KDE, and XFCE and go back to twm or i3 or fvwm or Window Maker, but I’ve moved beyond those experiences. I want a more functional desktop.

But, I see now why I paid more to put up with Apple.



Can you please explain what exactly you get from the Apple desktop you don't with any of those options? I'm currently forced onto a mac for work and I find it lacking in almost every way other than an ever so slight advantage in beauty.

So besides the DPI-scaling issues Linux is known for, what’s the draw? Some hotkey that does a thing? Some visual candy? Smoothness in transitions?

Whatever it is, it's hard for me to imagine it being worth the tradeoff. I can't even name desktop workspaces on a Mac!

I'm reviewing the "new features" section on Monterey: https://www.apple.com/macos/monterey/features/ and I'm not seeing much other than a few known areas (multi-device support if you bought fully into the apple ecosystem, etc).

With most "I tried linux desktop" people it's a problem of DE/WM, but you seem to be beyond that so I'm genuinely curious.


> Can you please explain what exactly you get from the Apple desktop you don't with any of those options?

A more cohesive experience across the desktop and applications, e.g.:

- The look and feel is uniform. Admittedly somewhat less so these days, now that the Apple Human Interface Guidelines are largely ignored.

- Mac apps are generally more stable.

- Mac apps generally have more features.

- The Mac apps I use generally tend to tie into the hardware (hw acceleration, drivers, etc) much better. The benefit of controlling the vertical, I suppose.

- The keyboard shortcuts both exist and are the same across apps.

- The design patterns.

- Automation (AppleScript, x-callback-url, etc.).

- Third party apps generally tend to fit well into the above.

On Linux, almost everybody seems to have their own ideas on how an app should be arranged. Full desktop environments like GNOME and KDE lessen this, but it's still really, really common.

That said, my money is still on Linux and other open source OSes, because I don't want a single company telling me what I can and can't do.


yea, like Mac not supporting Vulkan... or that Safari browser we have to make constant workarounds for, it's the new IE5


You mean Safari, that browser that does not kill your battery?


MoltenVK is integrated into Vulkan now so Vulkan works fine on Mac


Not OP but I have extensive experience as a user of all three major desktop systems. Aqua, the macOS desktop, is by far the most stable and consistent, and in my experience the most usable and powerful as well.

A few things that stand out about it to me vs GNOME 3, KDE Plasma, Xfce, LXDE, and whatever the Windows interface is called:

- The consistency of the menu system, and being able to search and use the menus of any app from the keyboard with shift-cmd-?. This is like having Emacs M-x or Sublime/VSCode shift-cmd-P in every desktop app.

- Being able to assign custom keyboard shortcuts for any menu option in any app.

- Emacs keybindings for editing text in every text field (including on the web). I believe GNOME Tweaks is supposed to do this, but I could never get it to work reliably and universally like macOS.

- Native app ecosystem. Third-party Mac software is generally the most polished, though not quite as much exists as for Windows.

- System animations. This is a small one, but it makes things more fun and makes the whole system feel fluid and “organic,” for lack of a better term.
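On the Emacs-bindings point above: those bindings come from Cocoa's text system, which also lets a user extend or override them per-account via a key-bindings dictionary. A minimal, hypothetical example (the file path is the standard location; the binding itself is just an illustration):

```
/* ~/Library/KeyBindings/DefaultKeyBinding.dict */
{
    /* Ctrl-W: delete the word before the cursor, Emacs-style */
    "^w" = "deleteWordBackward:";
}
```

Native text fields pick this up after the app is relaunched; non-Cocoa apps (e.g. most Electron apps) ignore it.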



>When people say Linux just works fine for them I genuinely think like am I crazy or cursed or something?

People who say that generally don't use GNOME, but i3/sway/xmonad and a multitude of tools that put you in control of what happens.


Linux works fine for me, but I don’t use a desktop. I totally agree with you that they are disasters, with poor documentation and multiple conflicting ways to make settings.

Firefox crashing is probably a Firefox problem.

You can set the scaling to be anything you want on X, just set the Xft.dpi number in your .Xresources file.
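For illustration, a minimal ~/.Xresources fragment — the 144 here is just an example value, giving 1.5× scaling relative to the X default of 96 DPI:

```
! ~/.Xresources
Xft.dpi: 144
```

Apply it with `xrdb -merge ~/.Xresources`, or log out and back in, since most display managers load it at session start.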

One problem with your Hackintosh install is that it’s closed source, no? As far as you know it’s exfiltrating your financial information to China.


Desktop Linux is easily the least-stable OS I use with any regularity. It was my main OS from something like '01-'10, but after I finally gave OS X a try, and since Windows got its shit together some time late in the WinXP service pack cycle (or, arguably, Win2k, but that wouldn't run my games) and stopped crashing all the time, it's really hard to justify using desktop Linux.

Does the OS hard lock or completely crash? No... unless you have graphics driver issues, which isn't unlikely. Then, oh man, yes, lots. X/Wayland crashes that restart the window server? Yep. Applications crashing pretty regularly or glitching out so badly they have to be restarted, including the basic applications distributed with the heavier DEs? Yep.

And it turns out that your windowing environment crashing, or the main program you're currently using crashing, is really close to as bad as the whole machine blue-screening in Windows, from the perspective of the user. Using Linux makes me anxious, even though I very much know WTF I'm doing with it. macOS and even (spits) Windows don't do that to me any more. Now that I've experienced not feeling that way, I can't go back. I go years between work-losing crashes of any sort at all on macOS. On Linux, one month without such a thing would be miraculous.

If you build up from almost nothing and keep things very minimal and have very boring and stable old hardware, and do as much as possible from the command line, it can be kinda OK, but it's a lot of work to set something up like that, and ongoing effort every time you do something manually that'd be automatic or trivial on a more full-featured GUI desktop. If you start with something like Ubuntu or Fedora standard desktop installs, though, there's just too much that can go wrong, and it will, with some frequency.

I could tolerate some hardware or workflows not working and things generally being a little less convenient, maybe, if it were rock solid, but it's very far from that. The main problems seem to be that its entire graphics stack is incredibly fragile (Wayland doesn't seem to have done much, if anything, to fix that) and it's way too easy for a glitchy driver to screw up the whole system.


I switched to Linux from macOS about 2 years ago, frustrated by memory consumption, slowdowns, and freezes. After trying a couple of distributions/DEs, I settled on Manjaro (Arch-based) and KDE. Delighted with the flexibility, features, and general performance and stability. Whereas before I was constantly tinkering with the environment and changing stuff, and there was always "something missing", I've found myself not needing to touch configuration or change anything in my workflow for more than a year now.

I did have some glitches and CPU usage issues when playing YT videos on Intel graphics, but have since been using Lenovo laptops with Ryzen 4000 and 5000, and it's been flawless without any tinkering. The P14s Gen 2 AMD (5850U + 32GB RAM) is the best laptop I have ever used (software development), and I change them A LOT...


AMD graphics on Linux nowadays are vastly more stable than anything else. NVIDIA is proprietary crap, and Intel graphics still has cross-platform bugs so often it's almost comical.


The experience you are describing is more akin to what I remember being Linux in the '00s than nowadays. I run a machine with an arguably buggy Ryzen 1st gen motherboard (it's quite buggy on Windows too), and I have a very stable desktop experience with Arch Linux and KDE Plasma. The real game changer has been AMDGPU; I've yet to have any sort of graphical issue with an RX 580 on Linux, and Plasma is arguably quite solid nowadays.

The only issues I've had recently were audio problems, due both to my buggy sound card and to a bug in PipeWire. Excluding those, I've had almost no issues with desktop Linux since 2017.


I think your experience is not universal. I would say Windows has been far less reliable than Linux on my ThinkPad, but overall I have to say that I have yet to use an OS that never crashed on me; all software of this complexity is buggy.


macOS still crashes, has weird bugs and every 6 months they go and change a bunch of shit that either deprecates functionality, causes conflict with how things work or takes another handful of your privacy.

Grass isn't always greener.

Edit for an example, I'm stuck on Catalina because they tried deprecating functionality my firewall uses. And the window shifter shortcut key program I use won't work in Big Sur.


Sure, macOS still crashes, I didn't claim it doesn't. But macOS crashes with a lot less regularity for me.

I use Linux as my daily driver, and the apps I use (GNOME Terminal, other terminals, Spotify, Slack, Brave and Firefox, and the GNOME desktop itself) crash multiple, sometimes several, times a day on my two (work, personal) machines.

This happens across diverse machines, as well as across multiple distributions (though again, pop!_os so far seems much less crashy for whatever reason).


what are you doing to cause these crashes, or what hardware?? i use manjaro and pop!_os; makes no sense, since i get fewer crashes than on windows and my computer is on for days without anything crashing, other than me writing a bug and crashing my own stuff :) i do lots of work and gaming on them so i don't know why your experience is so different.


> and every 6 months they go and change a bunch of shit that either deprecates functionality, causes conflict with how things work or takes another handful of your privacy.

No, that's Microsoft Windows. Apple only does it annually.


Windows has some of the best backwards compatibility around. Stuff from the '90s still works on Windows. Meanwhile macOS dropped support for all 32-bit programs.


Yes, and many of the major Linux distros have discontinued or plan to discontinue official 32-bit/multilib support. Why? A lot of 32-bit libraries have serious vulnerabilities that no one is fixing. Essentially, most of the world has transitioned to 64-bit, except for a few key areas that are still trying to run libraries that are decades old.


The kinds of things the post above complained about breaking on macOS—third-party software that messes with system internals in unsupported ways—also break frequently on Windows. To say nothing of the first-party breakage of Microsoft constantly moving around settings, re-enabling or re-installing annoyances you turned off, and removing your ability to easily get rid of them, and adding new ones behind your back.

Microsoft really only tries to preserve basic compatibility for well-behaved applications that aren't tightly integrated with any OS components or specific hardware/drivers and aren't doing anything that Microsoft disapproves of. Anything that strays outside those boundaries will run into trouble, and a lot of software ends up falling outside those boundaries even if it didn't really need to.

Both operating systems are pretty bad at letting you wield control over your own computer. But macOS doesn't try as hard to obscure the BS or obstruct your attempts to tame it, and macOS actually lets you refuse updates that you don't want.



