Yes, I run two instances of Radarr and Sonarr.
One caps out at 1080p, the other one only allows 2160p.
Jellyfin just has two separate libraries for them.
I'm mostly doing this to prevent unnecessary transcoding away from home, where streaming 4K HDR is unlikely.
At some point I will merge them but bandwidth for 4k streaming is not there yet and proper HDR tone mapping is still rare.
I mainly use it instead of googling and skimming articles: it gets me information quickly and allows follow-up questions.
I do use it for boring refactoring stuff though.
Those are also the main use cases for me.
Really good for getting a quick overview of a new topic, and also really good at proposing different solutions/algorithms when you describe a problem to it.
Doesn't always respond correctly but at least gives you the terminology you need to follow up with a web search.
Also very good for generating boilerplate code. Like here's a sample JSON, generate the corresponding C# classes for use with System.Text.Json.JsonSerializer.
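As a sketch of what that looks like (the JSON shape, property names, and class name below are all invented for illustration):

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;
using System.Text.Json.Serialization;

// Hypothetical sample JSON you'd paste in:
// { "name": "example", "count": 3, "tags": ["a", "b"] }
var json = "{ \"name\": \"example\", \"count\": 3, \"tags\": [\"a\", \"b\"] }";
var item = JsonSerializer.Deserialize<Item>(json);
Console.WriteLine($"{item?.Name}: count={item?.Count}");

// Generated class corresponding to the sample JSON above.
public class Item
{
    [JsonPropertyName("name")]
    public string Name { get; set; } = "";

    [JsonPropertyName("count")]
    public int Count { get; set; }

    [JsonPropertyName("tags")]
    public List<string> Tags { get; set; } = new();
}
```

By default JsonSerializer matches property names case-sensitively, so the [JsonPropertyName] attributes handle the mapping from the lowercase JSON keys to the PascalCase properties.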
Hopefully the hardware requirements will come down as the technology gets more mature or hardware gets faster so you can run your own "coding assistant" on your development machine.
That depends; we have quite a few images that are just a single shell script or a collection of shell scripts, which run as jobs or cronjobs. Most of them are used for management tasks like cleaning up, moving stuff around, or debugging.
It has the big advantage of being identical on each node, so you don't have to worry about keeping those shell scripts in sync via mounted volumes. It's very easy to just deploy a job with a debug image on every node to quickly check something in a cluster.
Of course, if the shell script "belongs" to an application you might as well add the shell script in the application container and override the start arguments.
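For the debug case, the kind of thing described above can be as simple as (image tags and names are just examples, not what we actually run):

```shell
# Interactive one-off debug pod; deleted when you exit the shell.
kubectl run debug --rm -it --restart=Never --image=busybox:1.36 -- sh

# Or as an unattended Job, e.g. for a cleanup script baked into the image:
kubectl create job cleanup-once --image=alpine:3.19 \
  -- sh -c 'echo "cleanup script would run here"'
```

For running something on every node at once, a DaemonSet is the more natural fit than a plain Job, which only schedules onto a single node.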
Anything interesting going on in the kernel log while connection doesn't work?
If so, you could maybe write a bug report at the amdgpu repo.
One thing I could imagine happening is that Linux chooses a higher chroma subsampling than Windows does. I had that issue before with a monitor that had a wrong EDID. Unfortunately, it's a real pain to change the chroma subsampling on Linux with AMD.
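To check the kernel log, something like this usually surfaces the relevant messages (the grep pattern is just a starting point):

```shell
# Kernel ring buffer, filtered for GPU/display-related messages;
# run this while or right after the connection fails.
sudo dmesg --ctime | grep -iE 'amdgpu|drm|edid'

# Same messages from the systemd journal for the current boot:
sudo journalctl -k -b | grep -iE 'amdgpu|drm|edid'
```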
That's strange, 6.6.14 is the same version that's on Fedora currently. My friend with a 7900 XTX is still on 6.5.0 so I can't get him to test that version right now.
Having a bleeding edge kernel can and will come back to bite you. There's a reason why many distros hold back kernel updates for so long: there are issues that can only be found through user feedback.
From experience, "stable" in the kernel world unfortunately doesn't mean much. I've already encountered dozens of issues across various kernel versions and different hardware, and it's the main reason I don't run rolling release distros on my main rig.
There have also been enough times when the latest Nvidia driver borked my test system at work, so I'm fine with just not running the latest kernel.
Only on Windows, on Linux it runs in user space.