I've seen so many bots on Lemmy summarising the contents of websites, and I blocked all of them for exactly this reason. They are not reliable, and I still caught myself reading those summaries. I don't even want to know how many summaries sitting in post bodies were just generated by an LLM.
If someone comes to me I'm more than happy to answer questions and help, but I won't bring it up. People don't like being told that their tool of choice is "bad", "not optimal", or anything like that. Even if it's only their choice because they grew up with it or don't want to learn anything new. And they do still need to learn if they want to do more than browse the web.
Also, I really don't want to be the one they come running to once something doesn't work the way they expected, or doesn't work at all. I have neither the time nor the inclination to be tech support for my family and half of my friends.
Better check: you almost certainly already have a firewall running, since Docker needs iptables for its NAT rules. A fresh Debian has, as far as I know, nftables and iptables-nft installed.
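If you want to see for yourself, the rules Docker installs are easy to inspect. This assumes the default iptables-based Docker setup; the DOCKER chain name is Docker's default:

```shell
# Check which iptables backend is in use -- on an nftables-based system
# this prints something like "iptables v1.8.x (nf_tables)".
iptables --version

# List the NAT rules Docker creates for container port mapping (needs root).
sudo iptables -t nat -L DOCKER -n -v
```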
What firewall are you using? Docker doesn't play well with non-iptables firewalls, and more than once I've combed through my nftables config, and really the whole networking stack, just to figure that out.
I have an Ubuntu server VM that had some iptables save/restore unit enabled which kept messing with my rules; that was fun to debug.
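If you suspect something similar, hunting for persistence units is a quick first check. Unit names differ between distros, so the grep pattern below is just a guess at the common ones:

```shell
# Look for firewall save/restore units that might overwrite rules at boot,
# e.g. netfilter-persistent.service or iptables.service.
systemctl list-unit-files | grep -Ei 'iptables|nftables|netfilter' || echo "none found"
```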
Is anything keeping you from just reinstalling the system and mounting your home partition into it again (chances are the majority of your customisations live in /home anyway)? That feels like a lot less hassle than copying files around.
In principle you should be able to restore your system by just copying all of the relevant files from the backup back to their correct partitions. If it doesn't work, you're no worse off than before.
For the future: a backup is only any good if you know how to restore it and have tested that the restore actually works.
Regarding the permissions: if you do a cp fileA.txt fileB.txt, fileB.txt will normally be owned by the user running the command. So a sudo cp ... will create the files as root.
I would personally use rsync with a few additional options, archive mode among them. That way the filesystem is restored exactly as it was. But that only helps if the files were backed up the same way in the first place.
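Roughly what I mean (the paths are made up; assume the backup is mounted at /mnt/backup and the target root at /mnt/target):

```shell
# -a            archive mode: recursion, permissions, ownership, timestamps, symlinks
# -A / -X       also restore ACLs and extended attributes
# --numeric-ids keep raw uid/gid numbers instead of remapping by name,
#               which matters when restoring a whole system
sudo rsync -aAX --numeric-ids /mnt/backup/ /mnt/target/
```

The trailing slashes matter: with them, rsync copies the contents of the backup directory rather than the directory itself.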
Nouveau is stable and runs, but don't expect great performance. The official NVIDIA driver is less stable and lacks proper Wayland support, but has decent performance. I'd go with anything but an NVIDIA GPU.
I recently wrote a script which finds duplicate files and hard-links them. I can share it with you, though there are no guarantees of its safety. There are probably better, already established tools out there.
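The core of it boils down to something like this. This is a minimal sketch of the idea, with no safety checks (it ignores ownership, permissions, and files changing mid-run), so try it on a throwaway directory first:

```shell
# Hash every regular file under the current directory; whenever a hash
# repeats, replace the file with a hard link to the first one seen.
declare -A seen
while IFS= read -r -d '' f; do
    hash=$(sha256sum "$f" | cut -d' ' -f1)
    if [[ -n "${seen[$hash]}" ]]; then
        ln -f "${seen[$hash]}" "$f"   # -f: overwrite the duplicate in place
    else
        seen[$hash]=$f
    fi
done < <(find . -type f -print0)
```

Established tools like fdupes or rdfind do the same thing with far more care taken around edge cases.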
I couldn't even work if I had aliases in my muscle memory. Imagine SSHing into a server where every second command you issue doesn't exist, because it's some weird alias you only set up for yourself.
I'll stick with the "pure" command and use tab completion.
That's also part of the reason why I don't use some of the fancy new tools like ripgrep and exa.
You can place the .xpi file in a special folder. On my Linux system the system-wide one is /usr/lib/firefox/browser/extensions/. There are other folders which only affect the current user, though.
The per-user folder is $profile_dir/extensions/. To open the profile directory you can type about:profiles in your address bar and click Open Directory next to Root Directory in the default profile section.
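For the per-user route it comes down to this. PROFILE_DIR is a placeholder for your actual profile path, and as far as I know the .xpi file has to be named after the extension's ID for Firefox to pick it up:

```shell
# PROFILE_DIR is a placeholder -- substitute your real profile directory,
# which you can find via about:profiles.
PROFILE_DIR="$HOME/.mozilla/firefox/xxxxxxxx.default"

# The file name must match the extension ID, e.g. uBlock0@raymondhill.net.xpi.
mkdir -p "$PROFILE_DIR/extensions"
cp uBlock0@raymondhill.net.xpi "$PROFILE_DIR/extensions/"
```

Firefox will then ask you to confirm the sideloaded extension on the next start.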
I just recently updated ShutUp10 because of yet another Windows annoyance and was surprised that it didn't solve my problem right away. Even with ShutUp10 it's barely bearable.