Posts 21 · Comments 603 · Joined 2 yr. ago

  • The ARM architecture does apparently (I'm no expert) have some inherent power-efficiency advantages over x86

    Well, the R in ARM stands for RISC, and x86 (and, by extension, x86_64) is a CISC architecture, so they are not even in the same "family" of designs.

    Originally, CISC architectures were more popular because they meant fewer instructions to write, read, and store, which is beneficial when hardware is limited and developers write assembly directly.

    Over time, the need for hand-written assembly faded, and in the 90s the CISC vs RISC debate resurfaced. By then most developers wrote code in C or C++, so the underlying architecture was losing relevance. It is also worth noting that because a RISC program is made up of a larger number of simpler instructions, the machine code is more granular, and as a result RISC code can often be optimised further: for example, where x86 can add a value straight from memory in a single instruction, a RISC core issues a separate load followed by a register-to-register add, and a compiler can schedule those steps independently. It also means the processor design is simpler than for CISC architectures, which in turn leaves more room for innovation.

    So, all else being equal, you'd expect Qualcomm to have an advantage in laptops with this chip, but all else isn't equal because the software isn't there yet, and no one in the PC market is quite in a position to kickstart the software development like Apple is with Macs.

    Now, a key consideration here is that the x86 architecture has dominated the personal computer market for close to half a century at this point, meaning that a lot of hardware and software accommodates it specifically (in terms of functionality, optimisation, etc.).

    Therefore, RISC architectures find themselves at a disadvantage: the choice of operating systems is limited, firmware and drivers are missing, etc. Additionally, switching to RISC means breaking legacy support, or going through emulation/translation (as Apple's M-series Macs do with Rosetta 2).

    However, in our modern ecosystem, the potential gain from switching to a RISC architecture is considerable: storage is cheaper than ever, RAM is cheap and fast, and hardly anyone writes assembly anymore (and those who do might enjoy the extra control that added granularity affords them, without having to do everything by hand, given the degree of assistance modern tooling offers). It will gradually become a necessity for every vendor.

    For now, however, the most popular desktop operating system worldwide performs poorly on ARM, and has no support for other RISC architectures (such as RISC-V) that I know of.

    The challenge here is in breaking a decades-long dominance that originated from a monopoly: if you have paid attention to what Apple has been doing, they built the core of their operating system, Darwin, out of the Mach kernel and large parts of FreeBSD, and then layered the rest of their system (such as Aqua) on top of it. That portable Unix foundation originally ran on PowerPC, and it is what allowed them to switch to Intel CPUs in 2006, and then to their own ARM-based M-series chips in 2020.

    The quality of their software (in large part derived from the quality of that free software, and from impressive design work) has allowed them to grow from a virtually negligible share of computer users to second place behind Windows.

    Now, other operating systems (such as Linux) have the same portability characteristics as FreeBSD, and could feasibly underpin a similarly viable commercial OS offering with support for several hardware architectures.

    "All" that is needed is a consistent operating system, based on whichever kernel fits, to supplement MacOS in the alternative offering to windows.

    Most software would be available, and a lot of firmware and driver support would too, thanks to mobile phones running almost exclusively on ARM, with most of them running a Linux kernel.

    Once we have one (or better, a few) Linux- or BSD-based operating systems with commercial support, consistent design, and acceptable UX for "normies", such CPUs will become a very valid offering.

  • AGI

    It depresses me that we have to find new silly acronyms to mean something we already had acronyms for in the first place, just because we are too stupid to use our vocabulary appropriately.

    AI is what "AGI" means. Just fucking AI. It has been for more than half a century, it makes sense, and it is logical.

    However, in spite of its name, the current technology is not really capable of generating information, so it isn't capable of actual "intelligence". It is pseudo-generation, achieved by sequencing and recombining input (AKA training) data: it does not generate new information, but rather new variations of existing information (a toy sketch of that kind of recombination follows at the end of this comment). Because of this, I would prefer the name "Artificial Adaptability" (or "AA", or "A2") to be used in lieu of "AI"/"Artificial Intelligence", on the grounds that the latter means something else entirely.

    Edit: to the people it may concern: stop replying to this with "Artifishual GeNeRaL intelligence". I know what AGI means. It takes all of 3 seconds to do an internet search, and it isn't even necessary: everyone has known for months. I did not bother to spell it out, because I did not imagine that anyone would be simple enough to take literally the first word starting with "g" from my comment and roll with it in a self-important diatribe about what they imagined I was wrong about. So if you feel the need to project what you imagine I meant, and then correct that, please don't. I'm sad enough already that humanity is failing; I do not need more evidence.

    Edit 2: "your opinion only matters if you have published papers". No. Also, it is a really stupid argument from authority. Besides, anyone with enough time on their hands can get papers published; it is not a guarantee of quality, merely proof that you LARPed in academia. The hard part isn't the writing, it is the thinking. And as I wrote before, I already know this, I need no more proof, thank you.
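
    To make the "variations of existing information" point concrete, here is a deliberately tiny sketch: a bigram Markov chain. It is nothing like modern models in scale or mechanism, and every name in it is made up for illustration, but it shows output that can only ever be a recombination of its input.

    js

    // Toy bigram "generator": every adjacent word pair in the output
    // already exists somewhere in the training text.
    function buildBigrams(text) {
      const words = text.split(/\s+/);
      const next = {};
      for (let i = 0; i < words.length - 1; i++) {
        (next[words[i]] = next[words[i]] || []).push(words[i + 1]);
      }
      return next;
    }

    function generate(next, start, length) {
      const out = [start];
      let word = start;
      for (let i = 0; i < length && next[word]; i++) {
        word = next[word][Math.floor(Math.random() * next[word].length)];
        out.push(word);
      }
      return out.join(" ");
    }

    const training = "the cat sat on the mat and the dog sat on the rug";
    console.log(generate(buildBigrams(training), "the", 8));
    // e.g. "the cat sat on the dog sat on the": a new sequence, but no new information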

  • Yeah, getting old sucks. I'm still 25 in my head, but my body constantly reminds me that I'm over 30, and that it is slowly falling apart like a decrepit abandoned car.

    I guess what I was trying to say is: we have crossposting between communities, what prevents us from crossposting between years as well? :)

  • The jwt is invalidated once you logout.

    Invalidated how?

    You can also change/reset your password to invalidate all login tokens for your account.

    OK. I was afraid this would not be the case. Thanks for confirming.
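
    For reference, here is a minimal sketch (plain JS, with made-up helper names, and not necessarily how Lemmy does it) of the two usual mechanisms behind what is described above: a denylist of token IDs for logging out a single session, and a per-user cutoff timestamp that invalidates every earlier token when the password changes.

    js

    // Assumes each JWT payload carries a unique id ("jti"), an issue time ("iat",
    // in seconds), and the user id ("sub"). In practice this state would live in
    // a database or cache rather than in in-memory structures.
    const revokedTokenIds = new Set();    // logout: denylist of individual tokens
    const tokensValidSince = new Map();   // password change: userId -> cutoff (seconds)

    function revokeOnLogout(payload) {
      revokedTokenIds.add(payload.jti);   // this one token stops being accepted
    }

    function revokeAllOnPasswordChange(userId) {
      tokensValidSince.set(userId, Date.now() / 1000);  // everything issued before now dies
    }

    function isTokenValid(payload) {
      if (revokedTokenIds.has(payload.jti)) return false;
      const cutoff = tokensValidSince.get(payload.sub) || 0;
      return payload.iat >= cutoff;       // reject tokens issued before the cutoff
    }

    Neither mechanism works unless the backend actually consults that state when verifying requests, which is why "invalidated how?" is the right question to ask.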

  • Honestly I don't think spez is too bright.

    He first taunted people some days ago by essentially admitting to having kompromat on most users... which is the number one thing not to do when you have kompromat. You make sure the target knows, but you also absolutely deny it to anyone else, so you won't be scrutinised and cannot be sued for blackmail... 🤷‍♂️

    But then he went ahead and allowed ads to look exactly like user-submitted content, which, again, is not something you want your users to know about, as it definitely destroys the last remaining shreds of trust that anyone might still have in reddit...

    And, finally, he goes ahead with an IPO, knowing full well that the most potent user-driven trading community out there comes from reddit, and that it has already successfully fucked with several established brokers and traders. All WSB needs to do at this point is set up a back-up community, probably on discord or some shit, and they will literally be able to own reddit.

    What is he, trying to make musk look smart?

  • That isn't real. It wouldn't pass peer review. Here is the actual code:

    js

    function GetCookieValue(x) {
      // JSON.stringify(false) returns the string "false"
      return JSON.stringify(x);
    }

    user.cookies.agreed = Boolean(GetCookieValue(true));

    if (!DarkPatternPopup()) {
      // Make sure we respect the user choice
      // (any non-empty string, including "false", is truthy, so this stays true)
      user.cookies.agreed = Boolean(GetCookieValue(false));
    }

    if (user.cookies.agreed) CollectData(user);