
  • Only on Windows; on Linux it runs in user space.

  • Just as a side note, if you play on Linux there's currently no anti-cheat that runs in the kernel.

    It's all in user space and only has your user permissions.

  • SteamVR on Linux works out of the box if you have a Valve Index or an HTC Vive.

    Some other headsets work via ALVR, but I can't speak to that.

    Two caveats though:

    • Valve likes to break SteamVR for Linux with every third update and then takes weeks to fix it.
    • It works, but there are a lot of issues: incorrectly scaled UI, missing features, SteamVR Home not working for a year straight.

    Most of the time there are community workarounds but there's only so much they can do.

  • Yes, I run two instances of Radarr and Sonarr. One caps out at 1080p, the other one only allows 2160p.

    Jellyfin just has two separate libraries for them.

    I'm mostly doing this to prevent unnecessary transcoding away from home, where streaming 4K HDR is unlikely. At some point I will merge them, but the bandwidth for 4K streaming is not there yet and proper HDR tone mapping is still rare.
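
    A minimal sketch of that layout, assuming the instances run as Docker containers (the image, ports and paths below are made up for illustration; the 1080p/2160p caps themselves live in each instance's quality profiles, not in this file):

    ```yaml
    services:
      radarr-1080p:
        image: lscr.io/linuxserver/radarr:latest
        ports:
          - "7878:7878"
        volumes:
          - ./radarr-1080p/config:/config
          - ./media/movies-1080p:/movies   # first Jellyfin library points here
      radarr-2160p:
        image: lscr.io/linuxserver/radarr:latest
        ports:
          - "7879:7878"                    # different host port so both can run side by side
        volumes:
          - ./radarr-2160p/config:/config
          - ./media/movies-2160p:/movies   # second Jellyfin library points here
    # The Sonarr pair looks the same, just with port 8989.
    ```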

  • Due to time and energy I tend toward an out-of-the-box, non-OSS solution.

    Why not both? OPNsense and pfSense both sell official hardware.

    Both are pretty easy to configure but have pretty much no limit on how deep you can go.

    Unifi works great as well but you hit a ceiling fairly quickly if it needs to do anything advanced.

  • Same here, even asked the developer if the Steam Deck is supported but they couldn't tell me.

    Refunded it for now, might check back next sale if it works and the microtransactions are acceptable.

  • I mainly use it instead of googling and skimming articles, to get information quickly and ask follow-up questions.

    I do use it for boring refactoring stuff though.

    Those are also the main use cases I use it for.

    Really good for getting a quick overview of a new topic, and also really good at proposing different solutions/algorithms when you describe an issue to it.

    Doesn't always respond correctly but at least gives you the terminology you need to follow up with a web search.

    Also very good for generating boilerplate code. Like: here's a sample JSON, generate the corresponding C# classes for use with System.Text.Json.JsonSerializer (a rough sketch of what that looks like is below).

    Hopefully the hardware requirements will come down as the technology gets more mature or hardware gets faster so you can run your own "coding assistant" on your development machine.
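
    To illustrate the boilerplate point above, a minimal sketch of what such a request produces (the JSON shape and class names are made up):

    ```csharp
    using System;
    using System.Collections.Generic;
    using System.Text.Json;
    using System.Text.Json.Serialization;

    // Made-up sample JSON: {"id":42,"name":"Example","tags":["a","b"]}
    public class Item
    {
        [JsonPropertyName("id")]
        public int Id { get; set; }

        [JsonPropertyName("name")]
        public string? Name { get; set; }

        [JsonPropertyName("tags")]
        public List<string>? Tags { get; set; }
    }

    public static class Demo
    {
        public static void Main()
        {
            string json = "{\"id\":42,\"name\":\"Example\",\"tags\":[\"a\",\"b\"]}";

            // Round-trip the sample JSON through the generated class.
            Item? item = JsonSerializer.Deserialize<Item>(json);
            Console.WriteLine(item?.Name);                      // Example
            Console.WriteLine(JsonSerializer.Serialize(item));
        }
    }
    ```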

  • That depends; we have quite a few images that are just a single shell script or a collection of shell scripts, which run as jobs or cronjobs. Most of them are used for management tasks like cleaning up, moving stuff or debugging.

    Has the big advantage of being identical on each node, so you don't have to worry about keeping those shell scripts up to date via volumes. It's very easy to just deploy a job with a debug image on every node to quickly check something in a cluster.

    Of course, if the shell script "belongs" to an application you might as well add the shell script in the application container and override the start arguments.
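
    As a rough sketch of such a one-off job (the image name and script path are hypothetical):

    ```yaml
    apiVersion: batch/v1
    kind: Job
    metadata:
      name: cluster-debug
    spec:
      backoffLimit: 0
      template:
        spec:
          restartPolicy: Never
          containers:
            - name: debug
              # Image that contains nothing but the shell scripts,
              # so every node runs exactly the same version.
              image: registry.example.com/ops/debug-scripts:latest
              command: ["/bin/sh", "/scripts/check-cluster.sh"]
    ```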

  • Environments like Kubernetes only run containers, so you would deploy any shell script as a container as well.

  • Anything interesting going on in the kernel log while the connection doesn't work?

    If so, you could maybe write a bug report at the amdgpu repo.

    One thing I could imagine happening is that Linux chooses a higher level of chroma subsampling than Windows does. I had that issue before with a monitor that had a wrong EDID. Unfortunately it's a real pain to set the chroma subsampling on Linux with AMD.

  • Having a bleeding-edge kernel can and will come back to bite you. There's a reason many distros hold back kernel updates for so long: there are issues that can only be found through user feedback.

    From experience, "stable" in the kernel world unfortunately doesn't mean much. I've encountered dozens of issues across various versions and different hardware already, and it's the main reason I don't run rolling-release distros on my main rig.

    There have also been enough times when the latest Nvidia driver borked my test system at work, so I'm fine with just not running the latest kernel.

  • Over DisplayPort? That's interesting; I knew AMD can't do HDMI 2.1, but there shouldn't be a problem with DP.

    Might wanna try a proper new certified DP 2.1 cable, just to be safe.

    I "only" drive a AW3423DW but no issues at 3440x1440 with 165Hz.

  • Since people normally only report on negative experiences: I was lucky enough to get a reference AMD 6900 XT during the GPU shortages.

    Switched from Ubuntu to Fedora for it because Ubuntu didn't have firmware for it yet.

    Ever since then it has been a rock solid GPU. Never even had such a stable GPU under Windows.

    Have been running Fedora with Wayland for more than 2 years now and can count the crashes on one hand, and most of those were my fault.

    I'm sure once that issue is sorted out that GPU is going to ride along for years with minimal maintenance required.

    (You might want to downgrade your kernel until then though)

  • I would not depend on DNS records being private. On the off chance that one of the nameservers messes up, I would prefer that no subdomains get leaked.

    But you're correct, most of the time those leaks happen somewhere else.

  • You can keep your subdomains out of these scans by only using wildcards on your DNS entries and SSL certificates.

    Both of these are commonly used by bots to find new domains.
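
    A hedged sketch of what that can look like (domain, TTL, IP and the certbot approach are placeholders, not from the original setup):

    ```
    # Wildcard DNS record (zone file syntax), so individual hostnames never appear in the zone
    *.example.com.   300   IN   A   203.0.113.10

    # Wildcard certificate via the DNS-01 challenge, so only "*.example.com"
    # ever shows up in Certificate Transparency logs
    certbot certonly --manual --preferred-challenges dns -d "*.example.com" -d "example.com"
    ```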

  • Will have to try that, also a good way to one-up my neighbor with those CDs hanging outside. :)

  • Already have a few of those, always a good party gag for the ones that know.

  • Thanks for the artist's view on things. :)

    I mostly want something pretty to look at but adding a message to it is an excellent idea.