Not a lot. Even when it isn't a flatpak, Windows software running on Linux can't interact with the system anywhere near as deeply as it can on Windows.
Couldn't you just use any wireless mouse? It's not like the deck is limited to only controllers.
Or is the idea that you want the left controller for movement, rather than using the left controls on the deck or a full controller?
That said, I'm sceptical that the joycon mouse experience is any good on surfaces other than a table. Or even then, considering the ergonomics of the thing when used as a mouse.
Even if the sensor in it is a good one, it's going to be bluetooth, and bluetooth mice have always had painfully noticeable latency in my experience.
Oh, for sure. If you wait a month, the bigger update can be a lot more trouble.
But look at it like this: if a rolling distro has a problem once a week, which is fixed within 24 hours, updating daily all but guarantees you will run into it.
Updating weekly, on the other hand, means your chance is only about one in seven. By the time you update, the fix is more likely to already be in the repos, so you'll be jumping right over the problematic update.
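To put rough numbers on that intuition, here's a quick simulation. It's a sketch, not a rigorous model: it assumes exactly one bad update per week, live for exactly one day, and an updater on a fixed cadence.

```python
import random

random.seed(42)  # deterministic, for reproducibility

def hit_rate(update_interval_days: int, weeks: int = 100_000) -> float:
    """Fraction of weeks in which an updater on the given cadence
    pulls packages on the one day the bad update is live."""
    hits = 0
    for _ in range(weeks):
        bad_day = random.randrange(7)  # day of the week the broken update lands
        # days of the week this user runs an update
        update_days = range(0, 7, update_interval_days)
        if bad_day in update_days:
            hits += 1
    return hits / weeks
```

A daily updater (`hit_rate(1)`) hits the bad day every single week; a weekly updater (`hit_rate(7)`) lands on it only about one week in seven.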
I share your desire for a system that always, 100%, every time, is there and ready to be used.
At the same time, I really like arch and the convenience of the AUR.
Hence, I bootstrap reliability onto my system through btrfs snapshots.
The setup is extremely simple (provided your install is grub+btrfs): just install timeshift plus the auto-snap services, configure it, and forget it.
Next time something breaks, instead of spending time on troubleshooting, you timeshift back to a known good point and then just get on with using your system.
With the auto-snap package installed, every update also creates a restore point you can roll back to.
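On Arch specifically, the whole setup is a couple of packages. Treat this as a sketch: it assumes an AUR helper like yay, and package names and availability differ on other distros.

```shell
# Install Timeshift and the auto-snap package, which hooks into pacman
# to take a snapshot before every update. timeshift-autosnap is in the AUR.
yay -S timeshift timeshift-autosnap

# Optional: grub-btrfs adds a boot menu entry for each snapshot,
# so you can boot straight into a known-good state after a bad update.
yay -S grub-btrfs
```

After that, every `pacman -Syu` gets a pre-update restore point automatically, and Timeshift's own scheduled snapshots cover everything else.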
In addition to that, I started updating my system less frequently. The logic being that the more often you update a rolling release install, the more likely you are to catch it at a time when something is broken but not yet fixed. Still regularly, but instead of every other day, I now have an update notification that goes off once a week.
The result has been zero time spent troubleshooting my system. If it worked yesterday, it'll work today. If it worked last week, but doesn't today, I'm a reboot away from a known good snapshot.
And that's completely normal. Every modern game has multiple versions of the same asset at various detail levels, all of which are used. And when you choose between "low, medium, high" that doesn't mean there's a giant pile of assets that go un-used. The game will use them all, rendering a different version of an asset depending on how close to something you are. The settings often just change how far away the game will render at the highest quality, before it starts to drop down to the lower LODs (level of detail).
That's why the games aren't much smaller on console, for example. They're not lugging around unnecessary assets for different PC graphics settings. The detail levels are all part of how modern games work.
"Handling that in the code" would still involve storing it all somewhere after "generation", same way shaders are better generated in advance, lest you get a stuttery mess.
And it isn't how most games do things even today. Such code does not exist, at least not yet. Human artists produce better results, and hence games ship with every version of every asset.
Finally automating this is what Unreal's Nanite system has only recently promised to do, and even it has run into snags.
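Purely as an illustration of the LOD selection described above: the function name, thresholds, and quality scales here are made up, not taken from any real engine, but the shape of the logic is roughly this.

```python
# Distances (in arbitrary world units) at which quality drops one LOD step.
LOD_THRESHOLDS = (10.0, 30.0, 80.0)

# The "low/medium/high" setting just stretches those distances.
QUALITY_SCALE = {"low": 0.5, "medium": 1.0, "high": 2.0}

def pick_lod(distance: float, quality: str) -> int:
    """Return 0 for the most detailed asset version; higher means coarser."""
    scale = QUALITY_SCALE[quality]
    for lod, threshold in enumerate(LOD_THRESHOLDS):
        if distance < threshold * scale:
            return lod
    return len(LOD_THRESHOLDS)  # coarsest version for distant objects
```

Note that every LOD level ships with the game regardless; the graphics setting only changes *when* each version gets used. On "high", an object 50 units away uses LOD 1; on "low", that same object is already down to the coarsest LOD 3.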
Stuff like textures generally uses a lossless bitmap format. The compression artefacts you get with lossy formats, while unnoticeable to the human eye, can cause much more visible rendering artefacts once the game engine goes to calculate how light should interact with the material.
That's not to say devs couldn't be more efficient, but it does explain why games don't really compress that well.
I strongly disagree on their roguelite "bug" being something they need to drop.
Bastion didn't land for me, so I didn't play it, but Transistor would have shone as a roguelite. Its combat system is far too complex, with potential for far more than can be explored in one or two playthroughs.
The same goes for Cloudbank as a narrative setting.
Transistor, but with Hades' gameplay loop and storytelling style would be insane. It already felt like a roguelite, but without a gameplay or narrative reason to go in for multiple runs.
Supergiant hasn't caught a roguelite bug... They've found the perfect narrative and game format to match the gameplay systems and worlds they like to create.
And then an update comes along and breaks compatibility. News stories about this are frequent.
A proton update? Just use the last version.
If you mean game update, this dev is targeting proton. As in their "linux support" will take the form of making sure they don't break anything on their end.
It is also used for suspend-to-disk (hibernation).
Disabling swap will prevent a system from hibernating, which might be fine, but I use it.
And swap isn't some ancient relic. Sure, my 32GB desktop barely uses it, but my home server benefits greatly from having 64GB of swap in addition to 16GB of physical memory. It may not need to use much more than 16GB at any one time, but shit runs a lot better using a giant SSD swap with how many services I run.
System config is case by case, not "current year".
@Dave@lemmy.nz