Posts 1 · Comments 1,002 · Joined 2 yr. ago

  • Most of Apple's history, actually.

    Macs have a reputation for being expensive because people compare the cheapest Mac to the cheapest PC, or to a custom-built PC. That's reasonable if the cheapest PC meets your needs or if you're into building your own PC, but if you compare against a similarly equipped name-brand PC, the numbers shift a LOT.

    From the G3-G5 era ('97-2006) through most of the Intel era (2006-2020), if you went to Dell or HP and configured a machine to match Apple's specs as closely as possible, you'd find the Macs were almost never much more expensive, and often cheaper. I say this as someone who routinely did such comparisons as part of my job. There were some notable exceptions, like most of the Intel MacBook Air models (they ranged from "okay" to "so bad it feels like a personal insult"), but that was never the rule. Even in the early-to-mid 90s, while Apple's own hardware was grossly overpriced, you could buy Mac clones for much cheaper (clones were Macs built by licensed third parties, and they were far and away the best value in the pre-G3 PowerPC era).

    Macs also historically have a lower total cost of ownership, factoring in lifespan (cheap PCs fail frequently), support costs, etc. One of the most recent and extensive analyses of this I know of comes from IBM. See https://www.computerworld.com/article/1666267/ibm-mac-users-are-happier-and-more-productive.html

    Toward the tail end of the Intel era, let's say around 2016-2020, Apple put out some real garbage: butterfly keyboards, the aforementioned craptastic Airs, and so on. But historically those are the exceptions, not the rule.

    As for the "does more", well, that's debatable. Considering this is using Apple's 90s logo, I think it's pretty fair. Compare System 7 (released in '91) to Windows 3.1 (released in '92), and there is no contest. Windows was shit. This was generally true up until the 2000s, when the first few versions of OS X were half-baked and Apple was only just exiting its "beleaguered" period, and the mainstream press kept ringing the death knell. Windows lagged behind its competition by at least a few years up until Microsoft successfully killed or sufficiently hampered all that competition. I don't think you can make an honest argument in favor of Windows compared to any of its contemporaries in the 90s (e.g. Macintosh, OS/2, BeOS) that doesn't boil down to "we're used to it" or "we're locked in".

  • Chromium itself will. Other Chromium-based browser vendors have confirmed that they will maintain v2 support for as long as they can. So perhaps try something like Vivaldi. I haven't tried PWAs in Vivaldi myself, but it supports them according to the docs.

  • Debian still supports Pentium IIs. They axed support for the i586 architecture (original Pentium) a few years back, but Debian 12 (current stable, AKA Bookworm) still supports i686 chips like the P2.

    Not sure how the rest of the hardware in that Compaq will work.

    See: https://www.debian.org/releases/stable/i386/ch02s01.en.html

  • Probably ~15TB through file-level syncing tools (rsync or similar; I forget exactly what I used), just copying up my internal RAID array to an external HDD. I've done this a few times, either for backup purposes or to prepare to reformat my array. I originally used ZFS on the array, but converted it to something with built-in kernel support a while back because it got troublesome when switching distros. Might switch it to bcachefs at some point.

    With dd specifically, maybe 1TB? I've used it to temporarily back up my boot drive on occasion, on the assumption that restoring my entire system that way would be simpler in case whatever I was planning blew up in my face. Fortunately never needed to restore it that way.

  • They're big and scary enough today, and in the past they were even bigger and scarier.

    https://en.wikipedia.org/wiki/Haast%27s_eagle

    Relationship with humans

    Some believe that these birds are described in many legends of the Māori mythology, under the names pouākai, Hakawai (or Hōkioi in the North Island).[52][53] According to an account given to Sir George Grey—an early governor of New Zealand—Hōkioi were huge black-and-white birds with yellow-green tinged wings and a red crest. In Māori mythology, Pouākai would prey and kill humans along with moa,[54][55][56] which scientists believe could have been possible if the name relates to the eagle, given the massive size and strength of the bird.[52][57] However, it has also been argued that the "hakawai" and "hōkioi" legends refer to the Austral snipe—in particular the extinct South Island species.[58]

    For context, Haast's eagle was about twice the size of today's Harpy eagle, which itself looks like it came out of a nightmare. See photos at https://www.demilked.com/giant-bird-harpy-eagle/

  • Hopefully they have better defenses against legal action from Nvidia than ZLUDA did.

    In the past, re-implementing APIs has been deemed fair use in court (for example, Google v. Oracle a few years back). I'm not entirely sure why ZLUDA was taken down; maybe just to avoid the trouble of a legal battle, even if they could win. I'm not a lawyer, so I can only guess.

    Validity aside, I expect Nvidia will try to throw their weight around.

  • Do you remember which features specifically were missing? It might be something I haven't used. For me it was a pretty straightforward upgrade from version 12.

    The only problem I ran into was that adding widgets was a little funky when changing the grid size. I think I had to add the widgets before changing the grid size (I use a 5x10 grid) or they didn't align correctly. I encountered similar problems with many of the launchers I tried.

  • Exactly. My nightstand has a plain ol' USB 2.0 slow charger, which is plenty to get it fully charged overnight. On light days, that's all I need.

    I have a USB-PD charger on my desk and in my travel bag, which I'll use to get me through the day as needed, but my phone only takes 20W max IIRC. There've certainly been times while traveling when I wished it could charge much, much faster, because I don't have access to power for long stretches of time.

    I used to have a OnePlus phone, which used SuperVOOC instead of just USB-PD. Much faster and I never had heat issues. I'd love to see SuperVOOC adopted more outside of China. I think OnePlus is the only brand with SuperVOOC sold in my country.

  • Lawnchair is the best option I've tried that has a similar design to Nova.

    I've tried at least a couple dozen launchers since Nova got bought. Most of them are either half-baked or have a very different design (e.g. based on radial menus or text-only lists). If you're into minimalism, there are a lot of good options. If you want a full-featured icon grid that behaves more or less like Nova, Lawnchair is it.

    I'm running Lawnchair 14 Beta now. You can get it off GitHub. Last I checked, the version on Google Play was very old.

  • It's worth mentioning that with a large generational gap, the newer low-end CPU will often outperform the older high-end. An i3-1115G4 (11th gen) should outperform an i7-4790 (4th gen), at least in single-core performance. And it'll do it while using a lot less power.

  • I don't think there's any way to count years without rooting it somewhere arbitrary. We cannot calculate the age of the planet, the sun, or the universe to the accuracy of a year (much less a second or nanosecond). We cannot define what "modern man" is to a meaningful level of accuracy, either, or pin down the age of historical artifacts.

    Most computers use a system called "epoch time" or "UNIX time", which counts the seconds from January 1, 1970 (UTC). Converting this into a human-friendly date representation is surprisingly non-trivial (there's a small sketch at the end of this comment), since the human timekeeping systems in common use are messy and not rooted in hard math or in the scientific definition of a second, which was only standardized in 1967.

    Tom Scott has an amusing video about this: https://www.youtube.com/watch?v=-5wpm-gesOY

    There is also International Atomic Time, which, like Unix Time, counts seconds from an arbitrary date that aligns with the Gregorian calendar. Atomic Time is rooted at the beginning of 1958.

    ISO 8601 also aligns with the Gregorian calendar, but only as far back as 1582. The official standard does not allow expressing dates before that without explicit agreement of definitions by both parties. Go figure.

    The core problem here is that a year, as defined by Earth's revolution around the sun, is not consistent across broad time periods. The length of a day changes, as well. Humans all around the world have traditionally tracked time by looking at the sun and the moon, which simply do not give us the precision and consistency we need over long time periods. So it's really difficult to make a system that is simple, logical, and also aligns with everyday usage going back centuries. And I don't think it is possible to find any zero point that is truly meaningful and independent of wishy-washy human culture.
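
    To give a concrete feel for what "counting seconds from January 1, 1970" looks like in practice, here's a minimal sketch in TypeScript using only the standard Date APIs (the exact output depends on your time zone and locale):

    ```typescript
    // Unix/epoch time is a count of seconds since 1970-01-01T00:00:00 UTC.
    // JavaScript's Date works in milliseconds, hence the factors of 1000.
    const epochStart = new Date(0);
    console.log(epochStart.toISOString()); // "1970-01-01T00:00:00.000Z"

    // The current Unix timestamp, in whole seconds.
    const nowSeconds = Math.floor(Date.now() / 1000);
    console.log(nowSeconds);

    // Turning that count back into a human-friendly date pulls in time zones,
    // DST rules, leap years, and locale formatting (the messy part).
    console.log(new Date(nowSeconds * 1000).toLocaleString());
    ```

    The count itself is the easy part; all the messiness described above lives in that last conversion step (and in leap seconds, which Unix time simply ignores).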

  • Interesting. I'm not sure that's a Lemmy thing per se, maybe specific to your client, or some extension or something altering CSS?

    I just checked in my browser's inspector, and the italicized text's <em> tag has the same computed font settings as the main comment's <div> tag; a quick scripted version of that check is sketched at the end of this comment.

    FWIW, I'm using Firefox with my instance's default Lemmy web UI.
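
    Here's roughly what that check looks like as a script, in TypeScript. The selectors are hypothetical (Lemmy's actual class names may differ), so treat it as a sketch of the idea rather than something exact:

    ```typescript
    // Hypothetical selectors: substitute whatever your inspector shows
    // for the comment container in your client.
    const comment = document.querySelector<HTMLElement>("div.comment-content");
    const italic = comment?.querySelector<HTMLElement>("em");

    if (comment && italic) {
      const commentStyle = window.getComputedStyle(comment);
      const italicStyle = window.getComputedStyle(italic);

      // Font family and size should match between the <div> and the <em>;
      // only font-style (normal vs. italic) should differ.
      console.log(commentStyle.fontFamily === italicStyle.fontFamily); // expected: true
      console.log(commentStyle.fontSize === italicStyle.fontSize);     // expected: true
      console.log(italicStyle.fontStyle);                              // "italic"
    }
    ```

    If either comparison logs false, something (an extension, custom CSS, or the client itself) is overriding the font for emphasized text.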

  • YES.

    And not just the cloud, but internet connectivity and automatic updates on local machines, too. There are basically a hundred "arbitrary code execution" mechanisms built into every production machine.

    If it doesn't truly need to be online, it probably shouldn't be. Figure out another way to install security patches. If it's offline, you won't need to worry about them half as much anyway.