Switch 2 would arrive in the second half of 2024 and development kits are already out
It is so annoying how companies insist on dictating what we can or cannot do with the hardware we've already paid for. It's doubly annoying when you consider how it contributes to e-waste because it limits our ability to repurpose these things for other use cases. My old PS4 is just sitting in my closet gathering dust because I don't have an old enough firmware that lets me put Linux on it.
I guess the fact that consoles are usually loss leaders is a major contributing factor to why these devices are so locked down, but still... In this case, Microsoft is preventing emulation enthusiasts from running in retail mode and partaking in their game store! Maybe they just don't want the legal repercussions of dealing with Nintendo or something.
If emulation is your jam, I highly recommend just building a small form factor PC or going with a mini-PC instead. More expensive in the short term, but less trouble and more flexibility in the long term.
Parsec doesn't allow streaming from a Linux host.
Yes, the 0.18.3 changelog has "Fixing hot_ranks and scores to append a published sort" as one of the items.
It works out of the box for me. You may need to enable Steam Input for PlayStation controllers in the Steam settings in order to get it to work inside games.
This post is from 2 years ago and it's popping up in my feed today. Lemmy really needs to fix its algorithm...
While I myself am eagerly awaiting the day Linux just works on the M1 / M2 and I can plop Arch Linux onto a shiny new MacBook Pro, I would caution that if this is a work laptop, you should not wipe the OS without careful consideration and perhaps approval from your IT department. I don't know how chill your work environment is, but a lot of companies have data security measures in place that prevent confidential info from leaking if your laptop ever gets stolen. You should also consider whether it will impact your ability to work with other people when they're using software or other work pipelines that aren't compatible with Linux.
This is due to DMABUF being temporarily disabled on Nvidia due to bug 1788573. Without DMABUF, Firefox has to introduce extra buffer copies to get the WebGL content rendered to the screen instead of simply sharing the buffer.
You can, however, force-enable it until that bug is resolved by going to `about:config` and enabling `widget.dmabuf.force-enabled`. Or you can just wait for the Nvidia 545 driver, which should have this fixed.
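If you'd rather set it from a terminal than click through `about:config`, the pref can also go into your profile's `user.js`. A minimal sketch, assuming the default profile layout (the profile directory name below is a placeholder):

```
# Append the pref to user.js; Firefox reads it on startup.
# "xxxxxxxx.default-release" is a placeholder profile directory name.
echo 'user_pref("widget.dmabuf.force-enabled", true);' \
  >> ~/.mozilla/firefox/xxxxxxxx.default-release/user.js
```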
What to Look for in a NAS?
Sucks that this also happened to you. How did you end up recovering your data?
Agreed! If the application can handle these files (or other resources) disappearing for a while during network issues, then sure, they can be separate. However, if an application depends on a file for its core functionality, I do not think separating them is a good idea.
Mmm, not quite. I am not familiar with how picoshare works exactly, but according to the picoshare Docker README, it uses the `data` volume to store its application SQLite database. My original suggestion is that the Docker application and its application data (configs, cache, local databases, and other files critical to the functioning of the application) should be on the same machine, while the images and other files that picoshare shares can be remote.
Basically, my rule is to never assume that anything hosted on another machine is guaranteed to be available. If you think picoshare can still work properly when its SQLite database gets ripped out without warning, then by all means go for it. However, I don't think that's the case here. You'll risk the SQLite database getting corrupted or the application itself erroring out if there's ever a network outage.
For example, with the Jellyfin Docker image, I would say that the `cache` and `config` volumes have to be local, while `media` can be on a remote NAS. My reasoning is that Jellyfin is built to handle media files changing, appearing, or disappearing. It is, however, not built to gracefully handle its config files and caches disappearing in the middle of operation.
I wouldn't recommend running container volumes over network shares, mainly because network instability between the NAS and the server can cause some really weird issues. Imagine an application having its files ripped out from underneath it while it's running.
I would suggest keeping containers and their volumes together on the server, and putting stuff that's just pure data on the NAS. So for example, if you were to run a Jellyfin media server, the Docker container and its volumes would live on the server, while the video and audio files would be stored on the NAS and accessed via a network share mount.
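A rough sketch of what I mean, using the official `jellyfin/jellyfin` image (host paths are placeholders, and I'm assuming the NAS share is already mounted at /mnt/nas via fstab or a systemd mount):

```
# Config and cache live on the server's local disk; only media is remote.
docker run -d --name jellyfin \
  -v /srv/jellyfin/config:/config \
  -v /srv/jellyfin/cache:/cache \
  -v /mnt/nas/media:/media:ro \
  -p 8096:8096 \
  jellyfin/jellyfin
```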
You don't even need to automate it yourself. Certbot comes with a systemd timer called `certbot-renew.timer` which does this for you.
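Something like this should be all it takes (note the timer name varies with packaging; Debian/Ubuntu ship it as `certbot.timer` instead):

```
# See whether the timer is already active and when it fires next.
systemctl list-timers certbot-renew.timer
# Enable and start it if it isn't.
sudo systemctl enable --now certbot-renew.timer
```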
Permanently Deleted
Diablo IV definitely works on Linux. I played Sorc all the way to level 75 just fine. You can install Battle.net via Lutris, Bottles, or even directly via Proton Experimental in Steam, and then install Diablo IV from there.
> why people were so quick to adopt Chrome which was Google controlled from the start.
Because for a long time Chrome was just much faster. It wasn't until a couple of months ago that Firefox became performant enough for me to use as a daily driver. Even then, there are still issues with how long it takes Mozilla to implement new web technologies like WebGPU.
What to Look for in a NAS?
Whatever you do, don't buy a QNAP. I have no idea whether Synology is similar, but I am having a hell of a time recovering data off my dead QNAP TS-453 Pro because they make it almost impossible to repair without specialized tooling, and the on-disk format uses a custom version of LVM that's incompatible with standard Linux. It died because of a manufacturing defect in the onboard Intel J1900 chip that QNAP knew about, never recalled, and refused to provide proper support for. The disks themselves are fine; I am just forced to buy another QNAP to access the data. After this experience, I swore off turnkey NAS boxes. If your data is stored in an unrepairable proprietary box using a non-standard disk format, that data is not safe.
Now I have a self-built DIY NAS that I've set up with Arch Linux and OpenZFS, and I am pretty happy with the results. Sure, going SSD over HDD is an expensive choice, but given that I had to replace each hard drive in my QNAP 2-3 times over the span of 7 years, I think the cost balances out, and the extra performance is sooo worth it (80MB/s vs 700-1000MB/s).
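For anyone curious, the core of that setup is just a pool and a couple of datasets. A minimal sketch, with placeholder device IDs and pool/dataset names (mirror vs raidz depends on how many drives you have):

```
# Create a mirrored pool from two SSDs, using stable by-id paths.
zpool create -o ashift=12 tank mirror \
  /dev/disk/by-id/nvme-SSD_A /dev/disk/by-id/nvme-SSD_B
# A compressed dataset for the actual data.
zfs create -o compression=lz4 tank/data
```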
No, I was just saying that with Nvidia, the need for the latest Mesa and kernel is lessened somewhat, since you'll most likely be using the proprietary drivers instead. With AMD, it's pretty important to be on the latest Mesa and latest kernel, especially for newer AMD GPUs. On Ubuntu, this usually means adding a bunch of additional PPAs, whereas on other distros like Fedora and Arch, those driver updates just come through the regular system updates.
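On Ubuntu that usually looks something like this (the kisak-mesa PPA is just one commonly used option, and you should check it supports your release before adding it):

```
# Add a third-party PPA that tracks current Mesa releases, then upgrade.
sudo add-apt-repository ppa:kisak/kisak-mesa
sudo apt update && sudo apt upgrade
```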
On the subject of AMD vs Nvidia in general, it really depends on your use case. I feel like a lot of Linux users on Reddit and the Fediverse are really biased towards AMD while being blind to the cons of owning an AMD card. It basically boils down to:
AMD Pros
- Better performance / dollar (for rasterized graphics only)
- Wayland
- FOSS drivers that work out of the box
- Better support for hardware video acceleration in browsers
Nvidia Pros
- Much better raytracing performance
- DLSS
- CUDA / Optix
- Better video decoders and encoders (when they're supported by the software you use, at least)
- Better support for compute and AI workloads
- Better day one support for new hardware and usually adopts Vulkan extensions faster
Corporate loyalty is stupid and should be left on Reddit. Make your own decision based on your personal needs. Anecdotally, I own both AMD (Vega 7 and Radeon 680M) and Nvidia (RTX 3090) hardware. AMD tends to be less stable in my experience, but I know others have experienced the opposite.
I believe Matrix already supports `olm`, which is the same encryption technique used by Signal. The main issue with Signal becoming federated is that to make federation work, a lot of metadata would leak, and that could be a cause for concern when using Signal as a private messenger for important things like whistleblowing.
Permanently Deleted
Eh, the mobile aspect of this makes me completely uninterested. It also means it will never actually reach the graphical fidelity of Unrecord, and the comparison just sets the project up for failure.
The biggest problem I have with full disk encryption is that there's still no way to include /boot in the BTRFS root partition for snapshotting. Having your kernel images separate from your system snapshots makes rolling back massively painful.
It would be interesting if they made the jump to x86, but in addition to what you mentioned, I don't think they'd go for it due to the extra power and thermal requirements. I think people would bash Nintendo if the Switch 2 were way beefier in form factor than the Switch 1.