AMD is generally a much better experience overall, but a handful of things are worse than on Nvidia (off the top of my head: ray tracing, AI, and emulators). AMD cards tend to have graphical glitches in emulators even on Windows; they can be mitigated and aren't universal, but they are an issue.
In my experience, AMD is the way to go. My old GTX 1080 was a beast and put in great work, but it just had too many nagging stability issues that constantly got in the way of enjoying it. Been really happy with my AMD.
Oversteer should be what you need. Just take note that you need an extra driver module for the T300RS.
Edit: if you meant the TS-PC, you may be out of luck. It looks like support for the TS-PC has an open request in the T300RS driver, but it isn't implemented yet.
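If it helps, and assuming the extra module in question is the hid-tmff2 out-of-tree kernel driver (that's my assumption, so double-check Oversteer's docs), the setup is roughly the usual build-and-install routine. A rough sketch, not a substitute for the project's README:

```
# Rough sketch, assuming the hid-tmff2 module; the authoritative steps are in its README.
git clone --recurse-submodules https://github.com/Kimplul/hid-tmff2.git
cd hid-tmff2
make                # build against your current kernel's headers
sudo make install   # or use the project's DKMS route so it survives kernel updates
# Then reload the module (or just reboot), replug the wheel, and Oversteer should detect it.
```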
I suppose, but in my mind, unless an absolutely revolutionary technology takes the world by storm, the industry wouldn't just up and abandon x86 and ARM unless compatibility was decent. We're talking about a world where businesses still use Windows XP because their software won't work on later versions.
Trial and error, lots of reading ProtonDB, wikis, etc. I only just recently got a decent handle on how to properly use wine prefixes to get mods and things working.
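The core trick is just keeping one prefix per game, so the game, its runtime dependencies, and any mod tools all live in the same isolated fake Windows install. A minimal sketch, with made-up paths and installer names:

```
# One self-contained prefix per game keeps mods and dependencies from stepping on each other.
export WINEPREFIX="$HOME/prefixes/skyrim"
wineboot -u                                     # create (or update) the prefix
winetricks -q dotnet48                          # example dependency a mod manager might need; check its docs
wine "$HOME/Downloads/ModOrganizer-Setup.exe"   # hypothetical installer; it runs inside this prefix
```

For Steam/Proton games the prefix already exists under steamapps/compatdata/<appid>/pfx, and protontricks is usually the easier way to poke at it.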
In general, use Steam when you can, then use Heroic for non-Steam games. Lutris is very powerful and super useful for games that aren't installed from a larger distributor, i.e. from a CD or direct from the devs, but I find the UI can be a bit spartan. Steam and Heroic have fewer features but are way more user-friendly.
Good luck. It can definitely be frustrating but remember that you have access to tons of resources and an excellent community if you encounter issues.
I've actually never tried on X11. I will admit, using VR seems to cause some issues with the rest of my desktop (Plasma occasionally needs to be reloaded). However, in the grand scheme, I can get past that for now, considering it doesn't cause any gameplay issues.
I think that just means not making any crazy technological decisions that will likely make games incompatible on future hardware. A great example was the PS3's Cell processor: it was excellent tech when used properly, but absolutely not "forward compatible".
We're on the exact opposite sides of this argument.
Being able to host your own servers means there's a much better chance of having servers located close to you, giving you much lower latency, and if there aren't any, you can host one yourself. This is great for people in, for example, Australia, who often get really poor server support in large games. That's not an issue when they can host as many as they want.
As for security, what's more secure than having a server with a password only me and my friends know? On top of that, when a server is my own, I know when it's going to be down. When the studio is the one controlling all the servers, you are at their whim.
As for games not needing to last decades... why? Do you want to be kicked off of a service you paid for, then expected to buy a new one that's basically the same thing (which you will also eventually be kicked off of)? Especially when the original still (in theory) functions perfectly?
Wanna know how to make that irrelevant? Make the server files available from the start. Wanna play with just your friends? Host a server. Wanna play with a dedicated group that actually bans cheaters effectively? Join a clan. Then, when the sequel comes out, who cares if the server tech is already known, because we can just host our own and collectively oust the cheaters ourselves. It's funny, because when multiplayer is handled this way, it stays active for decades. Look at the communities for the old Battlefield, SW Battlefront, Call of Duty, Unreal Tournament, and Quake games, etc. They're small, but they're all still active and not chock full of hackers, because they're community-led and community-maintained. That's a hell of a lot more consistent and reliable than trusting the studio to develop and maintain the server tech and squash cheating long term. Eventually that system will always fail (look at every old CoD on console, where you can't run your own servers: it's basically a coin flip whether you end up in a game with a hacker, and I guarantee the devs will never do anything about it).
This is definitely a huge unsung benefit of having larger corporations get their fingers into FOSS projects. Not just the funding, which is great, but the literal job security. Good luck bullying a Meta or Google employee into handing over control to a stranger.
I will say this: I have a newer laptop that required manually installing a Realtek wifi driver. I'm fine with that, but I know not everyone is, and I know it's already included in more up-to-date distros (Arch needed no setup on the same laptop, and I'd imagine it's the same story with Ubuntu, being more recent). So I get not wanting to go with Debian; I just used it as a base example of a "purer" OS. I guess Mint might have been a better alternative to use for my specific question.
Totally understandable; QOL and creature comforts are important. To be fair, I'm personally the type of user who prefers a spartan system that I can then tailor to my needs, rather than lots of features OOTB. To each their own, I suppose.
I think looking nicer is very subjective. I personally prefer default GNOME over Ubuntu's tweaks. However, silent GRUB makes complete sense; word vomit every boot does look very hack-ish if you aren't used to it.
Serious question, genuinely curious: beyond more recent package versions, why do people choose Ubuntu over plain Debian? Debian has been exceptionally stable for me, pushes no proprietary BS, and is as easy to install and set up as any other distro I've used. Plus, for the average computer user, all the packages are recent enough that things should work as expected.
As a part-time tiling window manager user, I love the workspaces. So much cleaner and easier to keep track of for me than simply alt+tabbing between numerous windows glommed into the same desktop.
Spending Easter weekend at my in-laws', meaning Folgers K-Cups in a Keurig. I'm incredibly spoiled with my fresh-ground robusta + French press + tablespoon of ghee, but honestly, after an incredibly long day of two family get-togethers (stopped in to see my own family en route), I was just happy to wake up the next morning and brew something bitter.
I mean, the Wii, Wii U, and DS consoles have reasonably busy homebrew scenes. You're right, they're pretty small compared to the other consoles mentioned, but they definitely exist, and I'm sure the Switch will get the same treatment when Nintendo moves on to the next console.