I just installed EndeavourOS on an HP Spectre x360 that's roughly two years old. I'm honestly surprised at how easily it went. If you google it, you'll get a lot of "lol good luck installing Linux on that" type posts, so I was ready for a battle.
Turned off secure boot and tpm. Booted off a usb stick. Live environment, check. Start installer and wipe drive. Few minutes later I’m in. Ok let’s find out what’s not working…
WiFi? Check. Bluetooth? Check. Sound? Check (although a little quiet). Keyboard? Check. Screen resolution? Check. Hibernates correctly? Check. WTF, I can't believe this all works out of the box. The touchscreen? Check. The stylus pen? Check. Flipping the screen over into tablet mode? Check. Jesus H.
Ok, everything just works. Huh. Who’d have thunk?
Install programs, log into accounts, and jeez, this laptop is snappier than it was on Windows. Make things pretty for my wife and install some fun games and stuff.
Finished. Ez. Why did I wait so long? Google was wrong - it was cake.
Yes: as long as your computer didn't literally come out this year, you don't have two separate graphics cards, and you don't need HDR or specific Windows-only software, Linux generally just works.
And sometimes the Windows-only software is only nominally "Windows only" and works with Wine.
Windows 3D Builder, though, is firmly in the Windows-only category. Which is a bummer, because in my experience it's the best at repairing 3D models for 3D printing that have errors like holes, redundant geometry, inverted faces, etc.
However, some older programs may actually behave better in Wine than say on Windows 11.
Oh, it also supports ancient 16-bit programs, which Windows doesn't anymore.
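If anyone wants to try that, last I checked the Win16 stuff needs a 32-bit prefix on a 64-bit system, so something along these lines (the prefix path is just an example):
WINEARCH=win32 WINEPREFIX=~/.wine16 wine SETUP.EXE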
I didn’t know about the 16-bit support, which is really cool to say the least
I see myself as still somewhat of a noob to Linux
Surely there are alternatives
Lychee Slicer (slicer used for resin printing) is usually pretty good but sometimes it’ll still fail
Which basically means I'd have two choices: go in there manually with Blender, or fire up Windows 3D Builder and let it work its magic.
I haven't fully given up on trying to find a way to get it to work on Linux, but I've had to take a break from trying, purely due to frustration.
The dual GPU problem has actually for the most part also been solved; Optimus rarely poses a problem these days
Yup. Fedora on my laptop defaults to the internal GPU, and you can run any program with the dedicated card with a right click. Pretty nice compared to last year, when I had to throw my laptop across the room 😂
Hopefully HDR can get crossed off that list soon
HDR in games is the last frontier keeping me from totally dumping Windows.
It looks like it works in KDE 6, albeit a bit janky. Might be worth seeing if it works now, and if not come back in a year or so. https://wiki.archlinux.org/title/HDR_monitor_support
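If you want to check what your setup currently reports, I think
kscreen-doctor -o
will list your outputs, and on recent Plasma it should also show whether HDR is available or enabled on each one.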
Yeah I’m using 6, it works well for desktop but not in most games yet
You should be able to get most games to work with some extra tinkering.
Got Armored Core running in HDR with this.
Also, I found it was enough to run just the game in gamescope; no need to run the entirety of Steam in a gamescope window. Just set the launch options for the game you want to enable HDR on.
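For example, something like this (the resolution flags are just what I use, adjust for your display; the %command% part is what tells Steam to actually wrap the game):
gamescope -W 3840 -H 2160 -f --hdr-enabled -- %command%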
Yeah, I can get HDR to enable with gamescope, but it looks way off in stuff I've tested like Elden Ring or Tekken 8. Gets kinda blown-out looking.
You probably won't be able to run an LTS kernel on a brand-new PC that just hit the market. But using the most recent kernel on Arch or a derivative like EndeavourOS should work after a week at most.
I did have an issue like this on Ubuntu, and it's what made me actually start distro hopping, since it worked fine on Fedora and Arch using the latest kernels.
I experienced this when installing my AMD Radeon RX 7600 XT; it had been released two weeks before I installed it, and Linux Mint and games in it were clearly running off software rendering. Turns out LM uses a more tried-and-true LTS kernel by default. Luckily it easily allows you to switch or manage kernels through the GUI updater, so I got that fixed easily.
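For anyone who prefers the terminal over the GUI tool: on a Mint release based on Ubuntu 22.04, something like this should pull in the newer HWE kernel (double-check the exact package name for your base release):
sudo apt install linux-generic-hwe-22.04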
There are plenty of laptops with two separate graphics cards (mine included), and I'd say it's the ideal experience if you need an NVIDIA card. Everything related to your system is done on the integrated Intel/AMD GPU (which works perfectly), and games and GPU-intensive work (like CUDA) get done on the NVIDIA one.
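If you ever want to push something onto the NVIDIA card from a terminal instead of a menu, the NVIDIA render offload environment variables do the same thing; glxinfo here is only to confirm which GPU answers, so swap it for the actual program once you see the NVIDIA card named (on Arch the nvidia-prime package ships a prime-run wrapper that sets these for you):
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep "OpenGL renderer"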
And HDR has been working for me for over 6 months with Plasma 6. I wish people wouldn’t upvote this stuff that gives the wrong idea.
Pretty sure HDR is "working" in the sense that KDE went ahead and implemented unfinished specs, so the very few apps that also went ahead with it can do HDR, but only on Wayland (which breaks other things that are behind), and it often requires very recent versions and specific obscure parameters just to enable HDR support?
Yeah, it's a great step forward and great for enthusiasts, but unless I'm very behind on the state of HDR myself, it's still something I'd consider "coming soon", not something I'd proclaim is just "working for me". It certainly feels like a "year from now" kind of thing: something to anticipate, not try to force just yet.
I don't know when you last used Wayland, but in Plasma 6 I wouldn't say it "breaks other things." Before Plasma 6 I had plenty of problems and stayed on X11, but now it's great. So give it another try if you haven't recently. Every issue I used to have with it a year ago is gone.
As for the obscure parameters, as of Plasma 6.1 all you have to do for games is add
gamescope --hdr-enabled -- %command%
to the launch options for the games that need it. I don't think that's particularly difficult or obscure. You can also set up Steam itself to run in gamescope with --hdr-enabled, and then every game will have it. For HDR movies/TV/YouTube you can copy/paste the necessary options into your mpv.conf and then forget about it. It's a one-time thing and then it works forever.
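For reference, the mpv options in question are basically these, assuming a reasonably recent mpv built with the Vulkan/gpu-next backend (the exact lines people use vary a bit between guides):
vo=gpu-next
gpu-api=vulkan
target-colorspace-hint=yes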
The biggest place HDR is missing is Firefox, but Firefox doesn't have HDR on Windows either, so that's not a Linux thing, that's a Firefox thing.
In my opinion, HDR on the desktop isn’t really there yet in general. Not just on Linux but on computers as a whole. HDR right now is really only for enthusiasts. The only monitors that properly support HDR1000 are $500+ for the entry level ones and $800+ for the decent ones. And you have to choose between miniLED with local dimming that don’t have enough zones yet or OLEDs that get burn-in after a year.
I use Wayland exclusively, and I'm on up-to-date Arch. I'm talking about issues like screen sharing problems in some software, the XDG desktop portal screenshare randomly breaking, Steam notifications that started positioning themselves wrongly, and Steam's search that stopped working (not 100% sure those last two are Wayland's fault)…
I also tried running a game in gamescope with HDR enabled, experimenting with options and env vars I found online, but it just didn't work. It was a sample size of one, but it was the one game I wanted to play with friends, so I gave up in favor of just playing.
I also don’t use MPV - I tried testing HDR with it, and it probably worked fine, but I don’t have the right media to test it. (Side note: I should try mpv more seriously, but I haven’t needed a video player much in general)
An extra annoyance is that the SDR colors are quite off with HDR enabled on Plasma. I suspect this is the fault of the display or its configuration, but it's still something I'd have to spend time researching and fixing, only to barely get any use out of it.
I haven’t tried setting up steam itself in gamescope, but wouldn’t it be limited to one window then? Could try it just to experience an HDR game, but otherwise it’s a bit of a deal breaker.
You might be right about it being for enthusiasts in the first place, but I feel like there are a lot of people who will just pay up for a good screen that includes HDR, and on Windows I'd imagine you can just turn it on and start getting HDR from various sources. That will surely become possible on Linux too, but it will take a while longer.
All that said, I'm not saying this to shit on Wayland or the developers' work on HDR. Not long ago HDR was something that just wasn't possible, and people were whining that it would take another 10 years at this rate. I'm excited to see the next update on this, as well as stable, wider adoption, but that's the thing: it's something I'm anticipating, not something I'm going to be using now.
To be fair I don’t play a lot of games so I have only used HDR in Baldur’s Gate 3 and Elden Ring but it worked perfectly in both so I am 2 for 2.
Plasma is supposed to be able to display SDR content correctly while HDR is enabled (which Windows 10 can’t even do) but I can’t actually test that properly because my monitor doesn’t allow you to disable local dimming while in HDR mode so desktop stuff is completely unusable anyway. But if it doesn’t look right it is probably something you can fix in your monitor’s OSD.
I actually suspect the colors are correct and your normal colors are the incorrect ones. If your monitor has a wider gamut than sRGB you need to either A) set it to sRGB mode or B) use a calibrated ICC profile. If you aren’t doing one of those then all of your colors are oversaturated. When you switch into HDR they are correct but it looks dull in comparison because you’re used to them being wrong. It’s a pretty common thing people experience on Windows as well. Not a lot of people realize their colors are horribly inaccurate by default.
Also, most people only turn HDR on when it's needed. You can add a keybind for it in Plasma's shortcut settings. The commands are
kscreen-doctor output.1.hdr.enable
and
kscreen-doctor output.1.hdr.disable
You may need to change the output number to the correct one.

Yep, I don't like it honestly. It's just an option if you want to set it up once rather than on a per-game basis.
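Oh, and going back to the keybind: if you'd rather have a single toggle key than two commands, a dumb little wrapper script works; this is just a sketch that tracks state in a temp file and assumes output 1 is the HDR display:
#!/bin/sh
# toggle-hdr.sh: bind this to one key in Plasma's shortcut settings
STATE=/tmp/hdr-enabled
if [ -e "$STATE" ]; then
    kscreen-doctor output.1.hdr.disable && rm "$STATE"
else
    kscreen-doctor output.1.hdr.enable && touch "$STATE"
fi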
That’s the thing, even if you pay up there aren’t actually any “good” HDR monitors. At least not in the same way as there are good HDR TVs. That’s why some people use 48 inch TVs as monitors instead of actual monitors. There’s a few monitors that are “good enough” but I wouldn’t call any of them “good” right now. I am one of those people who considers anything below HDR1000 to not be real HDR. If you look at the rtings.com monitor table, out of 317 monitors they’ve reviewed only TWO of them actually hit the 1000 nits of real scene brightness needed for HDR1000. And both are miniLED with local dimming which have haloing and blooming because there’s not enough dimming zones.
I have a feeling that by the time genuinely “good” HDR monitors exist (maybe 2-3 more years) that will be enough time for Linux programs to seamlessly support it instead of requiring launch arguments.
I do have my screen set to sRGB, and it is possible it’s simply incorrect in SDR - when I enable HDR, everything looks greenish IIRC. As for color profiles, I think there might’ve been a built-in profile that was automatically enabled in the settings? It’s possible I’m looking at horrible colors and not realizing, but at least I’m not doing things like a friend, who “optimized” his colors to improve gaming performance, and keeps complaining about colors being weird 😅
Color management is annoying, since you need a correct reference to verify anything, and I never looked into that.
As for the monitors, I specifically meant good screens, not screens with good HDR - I feel like if you go for a good screen these days, it’ll likely have some HDR support, letting people simply try it out with little effort on Windows.
Oh if you have it set to sRGB mode then they should be accurate enough. That means it’s something else. My previous monitor also had a green tint in HDR and that was just because that monitor’s HDR was awful. If you want to check if it’s the monitor itself, you could try it with Windows or attach a Roku/Chromecast/Firestick type device that can output HDR. If it’s still green it’s the monitor’s fault and if it looks fine then it’s Plasma’s fault.
And yeah plenty of monitors have “some HDR support” it’s just not real HDR unless it gets bright enough (and dark enough). The whole point of having a High Dynamic Range is that the range is well… high. Black should be black and extremely bright things should be extremely bright. A lot of monitors advertise “HDR400” or “HDR600” but don’t have local dimming and only go to like 450 nits. At that level it’s barely going to look different from SDR which a lot of people run at 300-400 nits anyway. The overall range of brightness is around 0.2-450 when it should be 0-1000. That 0.2 doesn’t seem like a lot but if you’ve ever seen a completely black image in a dark room you know how not-black that is. Which is why OLED and local dimming are so important for HDR.
What is the issue with laptops this year? I was planning to upgrade.
If you follow general newbie advice and install Mint, the kernel is older than your laptop and may not support everything.
Fedora, EndeavourOS or Manjaro would be a better choice then.
I use EndeavourOS, so I'm fine I guess.
You can always install a newer kernel, or move to something Fedora or Arch based. My son has ZorinOS on 6.8
I know. But I wouldn’t consider that “just works”.
It would mean installing the most popular beginner distro, finding out it doesn't work, and then first having to google what a kernel even is…
True. PopOS has pretty current kernels and is very beginner-friendly. What I mean is that there are options regardless of hardware (unless you're on an M3 Crapple chip).
They are fine with newer kernels
"Generally" is the key word. I've been a Linux user since Slackware on diskettes. My daily driver is Mint, because lazy. I have two VMs with Kali and Kinoite.
A couple of days ago a kernel update borked my install. A problem with the Ryzen graphics driver.
For me it was trivial: boot into the previous kernel, Timeshift rollback, and back in business. But I can see how a newbie would go into a panic.
A satisfied “customer” will recommend you to a friend. A pissed off one will tell 10.
I just did that with HDR on an Alder Lake N95. Easy as hell with NixOS.
Or a Mac, in my experience. I tried to run Mint on a 2016 Intel MacBook Pro and it was a disaster. I got it up and running, but the Touch Bar didn't work, the Wi-Fi didn't work, all kinds of issues.
That’s because Apple doesn’t release drivers for all those components.
Running anything but macOS on a Mac is a nice pet project, but you can't expect Linux to work.
It depends. I installed mint on a 2011 MBP a couple of years ago and it was a breeze. I installed arch on it recently and the only snag was having to install the proprietary Broadcom driver to get wireless. It runs great though — which is just as well because it would actually be more difficult to install OSX on the bloody thing, seeing as they no longer support it.
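For anyone attempting the same: the Broadcom bit was just a matter of installing the proprietary driver from the repos, something like the following, though the exact package depends on your kernel:
sudo pacman -S broadcom-wl-dkms linux-headers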
A 2016 MBP is still a bit recent, but, as a general rule of thumb, by the time a Mac stops getting software updates, Linux will be ready for it.
You should check out what the Asahi Linux project has been able to do with the ARM Macs already, it’s pretty impressive.
I do check in on it every now and again, and it is impressive! I reckon they'll be able to offer a seamless transition once Apple stops servicing M1 Macs, which is really good going. But, depending on your use case, making the leap now would mean sacrificing some functionality.
I hate to break it to you friendo, but 8 year old hardware isn’t recent. It may still be usable, but that doesn’t make it recent. It’s ok though grandpa, let’s get you back to bed
Learn to read.
I can read, and a 2016 MacBook Pro is not even a bit recent; it's from 8 years ago :-)
Just a bit of light-hearted leg pulling, nothing to get worked up over
8 years is recent if it’s apple hardware and you’re expecting Linux to work flawlessly out of the box. Maybe things were different back in your day though
It was a lighthearted jab at calling 8 years ago recent; not a political statement about Apple or operating systems.
8 years is a ton of time in tech, CPUs from 2016 are ancient. Single-core CPU performance has doubled in Intel’s laptop chips since then, and modern laptop CPUs from Intel are often 12-core, versus the top end 2016 MacBook Pro having 4 cores.
Not trying to start any fights, was just poking fun at the choice to call 2016 recent
NixOS on an M2 Air here. Works fine, other than the fingerprint reader.
I know that now but I had a bunch of people encourage me to do it as if it was a reasonable thing for a novice to crack lol
The 2016-2017 MBP are unusually bad. Devices on either side of that? You’re fine. But the 2016-2017 devices? No wifi (except in some extremely unusual cases) is the big problem. Even then, it amazes me how much does work, with zero configuration, with a simple graphical install. The problem with this vintage MBP isn’t that it’s hard to get running–it’s that it’s (almost) impossible, but the parts that aren’t impossible are as smooth as they can be.
Yes, that’s cold comfort. But I’m speaking from the POV of an owner of a 2017 MBP who desperately wanted to keep it going.
The coda to the story is that my wife used it for a while with her business but it fell victim to an absolutely bizarre heat issue where the heat sink vents hot air directly across the controller cable for the display, leading to inevitable failure. Again: not an issue on either side of this model year. It’s sad because it could’ve served for another 4-5 years, making the initial purchase price substantially more tolerable.
Apple doesn’t support Linux
I mean, I got Mint running, so it depends on what you mean by "support."