5 min read

I Just Want My Computer to Work

This could be a “look back on 30 years of using computers” but I’ve reduced it in scope. Here goes: I still build my own computer.

I have a Late 2012 15” MacBook Pro, and it’s still perfectly usable, but I can’t pass up AMD’s recent ~$250 Ryzen CPUs with 8 cores and 16 hardware threads. It’s too much power to have in a reasonably priced laptop, but you can build a tower PC with one of them and 16GB of RAM for around $1,000 - that’s just bonkers. Especially coupled with my 21:9 34” monitor. And I don’t carry my “computer” around for serious work - I have a sit/stand IKEA desk where I do all of my work at home. And that’s really my flow, with the phone and ageing tablet covering the late-night-browsing-in-bed needs. Did I mention the 34” 21:9 monitor? It’s hard to lug around.

But this is an awesome PC only on paper, because in reality it’s been crappy. I don’t blame the hardware; it’s just that the operating system landscape these days is quite… grim.


Windows

I don’t hate on MS these days. I’ve grown up since my ~1997 days of bashing “M$” on LinuxQuestions, running Red Hat 4.1 Vanderbilt on my P166 MMX (rocking KDE 2 and Gnome 1.x, which was shiiiiiiit). But it’s still not an OS that I can trust, especially Windows 10. And it’s a pain to run Windows 7, the mythical “last great release” for some, since some hardware drivers no longer support it. Ultimately, it comes down to trust. I don’t have it for Windows.


macOS

I’ve always been a fan of macOS, ever since it was called System 7. As I got a bit more technical about OSs, I couldn’t believe the clusterfuck that memory management on Macs was up to Mac OS 9. Total amateur hour. And so are HFS and HFS+ - if you have valuable data on HFS+, may the gods bring you fortune.

But Mac OS X has strong technical underpinnings, and it usually “just works”, which is easy to do since Apple only supports a small pool of hardware. And even though their QA effort lately is crap, it’s still a solid OS. Still, I don’t buy Apple anymore, for many reasons, so macOS is out for me.

Yes, I could run a Hackintosh, as I actually did once at work for almost a year, but the dread of point releases breaking everything and the fact that it only runs (reliably) on Intel hardware are too much for my goal of just getting shit done.


BeOS

I loved BeOS. It was so badass. But the open-source recreation effort, Haiku, has been in alpha for about a decade now, and it’s missed out on the momentum. It’s cool to play with the latest nightlies, but it’s effectively dead. Maybe, maybe if Linux gets crappy or toxic enough, and if Haiku gets a fucking release out, more devs will move over and it will be a thing. But I doubt it.

*BSD (Free, Open, Net)

I like BSD on paper. Different rock-solid kernels, saner package trees, one way to do things, text driven… it all sounds good. In practice, I’ve been spoiled by GUI installers and, more fundamentally, do not believe an “installer” in late 2017 should be an ncurses- or shell-driven affair.

Still, I could overlook all that and use the experimental installers for FreeBSD, or even bite the bullet and install it anyway. Then come the year-old Gnome packages, the new tools that don’t get packaged for *BSD, the hardware support lagging behind Linux, the elitist communities… no.


Linux

Which brings us to the only “real” choice - Linux, since I won’t even entertain the “x86 Android” notion.

Linux is great, and this whole blog post is a rant about how angry I am with it while I still use it, but it just pisses me off how much it has improved and, at the same time, how much it hasn’t. Bear in mind, much of my anger is directed at manufacturers.

My shiny new AMD Ryzen only functions in a stable manner because I keep running kernel 4.12, even though Fedora 26’s current kernel is 4.13 - a non-option for me, since it freezes every 15 minutes. I also have to disable SMT (using only 8 cores instead of the 16 hardware threads) and ASLR, which doesn’t play well with AMD CPUs of this family, potentially making my PC less secure. AMD could fix these issues in the kernel, but they probably already have all of the resources they’re “willing to commit” to open-source platforms taken up by their GPU driver efforts. So these awesome CPUs, which sold like hot cakes just a few months ago, are all going to Windows users, since the Linux experience is so crappy.
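For the record, the workarounds above look roughly like this on my box. A hedged sketch: the `/proc` path and sysctl are standard Linux interfaces, but kernel pinning is distro-specific (Fedora keeps older kernels in the GRUB menu), and on kernels of this era SMT has no runtime toggle, so it has to be switched off in the BIOS/UEFI setup.

```shell
# Check which kernel is actually booted (I stay on 4.12):
uname -r

# ASLR state lives in /proc: 2 = full randomization (the default),
# 0 = disabled. To disable it, run (as root):
#   sysctl -w kernel.randomize_va_space=0
# and drop the same setting into /etc/sysctl.d/ to persist it.
cat /proc/sys/kernel/randomize_va_space

# SMT gets disabled in the firmware setup on 4.12-era kernels;
# verify the result (thread count vs. cores) with lscpu:
lscpu | grep -E '^(CPU\(s\)|Thread)'
```

None of the commands above change anything by themselves; the actual disabling steps are the root-only sysctl and the firmware toggle noted in the comments.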

And why is this? Linux kernel development is driven by corporations these days - the day of the “lone hacker” contributing significant chunks of work is pretty much over, because the kernel has ballooned in complexity, and big companies have armies of coders supplying code for major subsystems that take up all of the coordinating effort’s attention. AMD just doesn’t see the money in getting Ryzens to work for end users on Linux, since 99% of users will be on Windows - which becomes true precisely because of their lack of effort on Linux, and so on. Maybe with the EPYC server-grade CPUs they’ll start committing patches to the kernel to fix their thread scheduling and whatnot, and maybe some of that will trickle down to Ryzen users, but so far I’ve gone from kernel 4.11 to 4.13 and it’s still crap.

And that pisses me off about Linux as a whole. When I installed it for the first time in 1997, it was crappy, unreliable and just overall not a very good OS for desktop use. It has since matured into the #1 OS in datacenters across the world, and, in Frankensteined fashion as Android, it runs on billions of phones. It has won the OS wars by a number of metrics.

Then how come it’s still crappy, with a number of caveats, as a development workhorse for a power user with 20 years of experience with it? How long until it breaks free from these crappy market forces and becomes “a thing” on people’s PCs?

What the hell else will it take?