New Debian release on the horizon?
Yeah, I did that on one system as well and it seems to work; for the others I'll have to wait for the final release, they're too critical. I'm one of those guys who runs a lot of Debian because the risks of a distro like Ubuntu Server are way beyond what I can afford to be exposed to.
Actually I'm waiting on Debian 13 to get Incus 6.0 LTS! Current machines with LXD 5.0 are starting to annoy me.
Jesus, people analyzing Debian releases as if it were the stock market :D
Apple carefully made it in a way that complies with the law but doesn't really change anything, and imposed small charges and requirements here and there to make it totally impractical.
You don't understand, because sideloading on iOS doesn't work like it does on Android - not even close. Sideloading on iOS is full of restrictions: apps that only work for 3 days and require reinstall, etc. And it all still requires binaries to be signed with a valid certificate approved by Apple.
Yeah, this is unfortunate.
a system with the ability to run ANY generic non-signed binary if the user chooses to do so
I guess you didn't read this part.
allow installing of other app stores
Because this is all good, but it is way simpler for Apple to allow us to run unsigned code than to come up with all the infrastructure so you can sign your apps without 3rd parties.
Sideloading is a pile of shit designed to make it appear that you have freedom and aren't locked into the store when, in fact, you are. They still charge bullshit fees, force app notarization, etc.
Apple should be forced by someone to turn iOS into what macOS currently is, that is, a system with the ability to run ANY generic non-signed binary if the user chooses to do so. They can bury it under the settings and force people to accept a bunch of warnings, but we should be able to run unsigned code.
The current state of things is bullshit. Apple is very good at sandboxing; they can keep the system signed, secure and intact while allowing unsigned code to run.
Performance optimization is hard because it's fundamentally a brute-force task, and there's nothing you can do about it
There is: "common sense" seems to accelerate that brute-force work. Also, some developers seem to be better at performance brute-force than others; some enjoy it, others hate it.
Simplicity also plays a very important role here. Most software is built by adding more features, and at some point you may be able to simplify things a lot and make it run a lot faster if you just rewrite it with all the use cases in mind.
I wouldn't. Those kinds of maps are very powerful, they provide accurate and constantly updated information from millions of users.
and can even use as live environment, don't even need to install (in Windows this is not easy to do)
Not true, Rufus creates bootable and persistent USB flash drives with one checkbox. You can do it manually also.
I was trying to illustrate a point: you may have your distro, your packages and what you think you need, but if we're talking about a post-apocalyptic scenario you'll probably need other stuff, and at that point you'll find Windows computers and Windows software installed, or installers available, pretty much everywhere, starting with your next door neighbor; with Linux, not so much.
I'm not saying it is impossible, I'm just saying you need to deal with a bunch of complexities that won't be pretty in a post-apocalyptic scenario.
Caches expire, eventually.
Did you ever see any fresh install of Windows not be able to display at least 800x600 on any GPU? You didn't. It works at the bare minimum; want more? Sure, grab an MSI and install the drivers.
AppImage suffers from the same problem that Flatpak does: the tools to work offline aren't really good/solid and won't save you for sure. It also requires a bunch of very small details to all align and be correct for things to work out.
Imagine the post-apocalyptic scenario: if you're missing a dependency to get something running, or a driver, or something specific to your architecture that wasn't deployed by the friend alongside the AppImage / Flatpak (e.g. a GPU driver), you're cooked. Meanwhile Windows has basic GPU drivers for the entire OS baked in, or you can probably fish around for an installer and fix the problem. It is way more likely that you'll find machines with Windows and Windows drivers / installers than Linux ones with your very specific hardware configuration.
you are just not being helpful
I am. When "shit hits the fan" you want to be as compatible and frictionless as possible, because at that point having a running computer will be a feat on its own and you probably won't have time/power to deal with software complexities and "ways around issues". You most likely want to boot a machine from whatever parts are available, get some data out of it (or maybe into it), and move on to hunting or farming. No time to be there fixing xyz package with broken dependencies and whatnot. If someone gives you a flash drive with data it follows the same logic: you want to get to something as quickly as possible.
On Linux there's also an over-reliance on web-based solutions that can be self-hosted on your system or a third-party one, but that, once again, just adds extra friction that you don't have with "simple" formats and binaries like pdf, docx and others that, at the end of the day, are self-contained and can be used as is without extra fuss nor cloud dependencies.
I'm all for Linux, alternatives and open-source, but in the situation described your last concern is whether you're running proprietary stuff.
This is going to be controversial but...
Linux is not really suited for the post-apocalyptic no-internet world; the way the repositories are built and software is packaged (almost nothing is static, a lot of dependencies on other packages everywhere) just makes it really impractical and hard to deal with in those scenarios. Flatpak / containers and friends make this situation even worse because you can't easily mirror the repositories and there's no straightforward way of exporting a Flatpak as a single, self-contained file that can be shared around and installed everywhere - the current tool for that doesn't account for architectures and dependencies very well (see the sketch below).
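For context, the closest thing to a single-file export right now is `flatpak build-bundle`. Here's a rough sketch of what that looks like and where it falls short - Python just to wrap the commands, and the app ID and repo path are assumed examples, not anything specific:

```python
#!/usr/bin/env python3
# Rough sketch, assumptions marked: export one installed Flatpak app as a
# single .flatpak file using `flatpak build-bundle`.
import subprocess

APP_ID = "org.gnome.Calculator"      # assumed example app
REPO = "/var/lib/flatpak/repo"       # default system-wide Flatpak OSTree repo
BUNDLE = f"{APP_ID}.flatpak"

# Produces one file you can carry around on a flash drive...
subprocess.run(["flatpak", "build-bundle", REPO, BUNDLE, APP_ID], check=True)

# ...but installing it on another machine (`flatpak install ./<bundle>`) still
# requires the matching runtime and architecture to already be available there,
# which is exactly the dependency/architecture gap described above.
```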
Windows, however, is a much more solid option. Yes, it's painful to hear this, but on Windows you can get an exe from a friend on a flash drive and it runs as is. Same goes for installers, reinstalling the OS, etc. There's only a couple of .NET Framework installers that will cover dependencies for 99.99% of stuff in a few MB. The same goes for macOS, however it depends on a lot of software signing nowadays, and certificates that can expire, and then you have a problem.
Yeah, some people don't like to run full repo mirrors but keep updated copies of the Debian ISOs that can be mounted as repositories at any point:
- https://tadeubento.com/2023/debian-iso-downloads-and-offline-archives/
- https://tadeubento.com/2023/debian-iso-images-as-apt-repositories/
It's essentially the same, but in another format.
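For anyone who wants the concrete steps, this is roughly what mounting one of those ISOs and pointing apt at it looks like - a minimal sketch; the ISO path, mount point and suite name are assumptions, adjust to the image you actually keep:

```python
#!/usr/bin/env python3
# Minimal sketch, run as root: loop-mount a Debian DVD image and register it
# as an offline apt repository. All paths and the suite name are assumed.
import subprocess
from pathlib import Path

ISO = Path("/srv/isos/debian-12-amd64-DVD-1.iso")   # assumed ISO location
MNT = Path("/media/debian-iso")                     # assumed mount point
SRC = Path("/etc/apt/sources.list.d/debian-iso.list")

MNT.mkdir(parents=True, exist_ok=True)

# Mount the image read-only so apt can read it like a local mirror.
subprocess.run(["mount", "-o", "loop,ro", str(ISO), str(MNT)], check=True)

# Point apt at the mounted image; [trusted=yes] skips signature checks for
# the local file: source, which is fine if you trust the image itself.
SRC.write_text(f"deb [trusted=yes] file:{MNT} bookworm main\n")

# Refresh the index; installs from the DVD now work with no network at all.
subprocess.run(["apt-get", "update"], check=True)
```

apt-cdrom can do roughly the same thing through its own mount point, but a plain file: source is easier to reason about when everything else is down.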
This is great news!
Let me rewrite a part:
Why is this important? Because intelligence agencies (in Europe) allegedly don't want to depend on Google (most likely NSA-controlled), might be using the data from OSM, and want to make sure it is good.
:)
Trust me, at that point there won't be any explaining possible :D
We've been burned by a lot of distros in the past and right now it all boils down to using Debian and RHEL; everything else mostly failed at some point or won't uphold the stability guarantees. Even containers with Alpine fucked us over once with the musl DNS issues and a few other missing parts...