Posts 0 · Comments 209 · Joined 2 yr. ago

  • I still won't buy one just because of this news - they have done lots and lots of shitty things in the past: GameWorks, PhysX, the GeForce Partner Program, etc. While AMD is not exactly a saint when it comes to open sourcing, they still commit far more than Nvidia to open standards.

  • Debian is my go-to distro whenever stability is desired.

    I use Arch btw (on my desktop), but I would never run it on my server... I feel that I could easily ruin my database (Postgres) if I am not careful enough with the rolling release.

  • Rosetta certainly does emulate* x86. It can dynamically recompile x86 instructions to ARM instructions at runtime; otherwise, applications that include an x86 JIT wouldn't work at all on ARM Macs.

    I know people will be pedantic about this... but other emulators (Dolphin, PCSX2, etc.) have included recompilers for ages, and no one seemed to have a problem calling them emulators.
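
    To illustrate the idea, here's a toy Python sketch of the translate-and-cache loop that dynamic recompilers are built around - purely conceptual (the guest "ISA" is invented for this example), nothing like how Rosetta, Dolphin, or PCSX2 are actually implemented:

      # Toy translate-and-cache dynamic recompiler (conceptual only).
      # The guest "ISA" below is made up for this sketch; a real recompiler
      # emits host machine code, not Python closures.

      GUEST_PROGRAM = [
          ("addi", 5),    # acc += 5
          ("addi", 7),    # acc += 7
          ("jmp", 4),     # branch to guest PC 4
          ("addi", 99),   # never executed
          ("halt", 0),
      ]

      def translate_block(program, pc):
          """Translate one guest basic block (up to a branch) into host code."""
          ops = []
          while True:
              opcode, operand = program[pc]
              ops.append((opcode, operand))
              pc += 1
              if opcode in ("jmp", "halt"):
                  break

          def compiled_block(state):
              # "Host code" for the block: run straight through, then
              # return the next guest PC (or None to stop).
              for opcode, operand in ops:
                  if opcode == "addi":
                      state["acc"] += operand
                  elif opcode == "jmp":
                      return operand
                  elif opcode == "halt":
                      return None
              return pc  # fell through to the next block

          return compiled_block

      def run(program):
          state = {"acc": 0}
          code_cache = {}                  # guest PC -> translated host code
          pc = 0
          while pc is not None:
              if pc not in code_cache:     # translate only on first visit...
                  code_cache[pc] = translate_block(program, pc)
              pc = code_cache[pc](state)   # ...then reuse the cached block
          return state["acc"]

      print(run(GUEST_PROGRAM))  # 12

    The point being: the translation happens while the guest program runs, which is exactly what lets JIT-generated x86 code (which doesn't exist until runtime) work at all.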

  • Overall, I've found that everyone uses less data when there's a data cap.

    My ISP implemented data caps back then too (thankfully they're all removed now, but 60 GB was really bonkers!), and I just find it fascinating how much traffic I generate nowadays, when I don't have to care how much data I have left in the month.

    Anyway, data caps shouldn't be relevant anymore in 2023, when absolutely everything can handle gigabit speeds and more. It's interesting how American ISPs still impose them.

  • I sort of understand why data caps were implemented in the past. Some people hosted servers on their home connections, and their total internet traffic in a week would far exceed that of a normal user. Data caps were meant to force people to be conservative with their internet usage so this would not happen.

    But come on now, it's 2023. If your internet infrastructure cannot handle that amount of traffic, you are the laughing stock of ISPs.

  • If I follow your search terms, the first Google result is this. If the data is to be trusted, then most of the employees are paid far less than the "top positions".

    If you are taking the CEO's income literally (without considering their assets), then you are hopeless.

  • Where did you pull those numbers from, then?

    Edit: And yes, if you are not productive, you get paid less. That's the whole point. If you are not 100x as productive, you don't deserve 100x the wages of a regular employee either.

  • I am pretty sure we don't need to raise every employee's wages - some of the upper management who sit in their offices biting their nails, for example.

    The point is to reduce the wage inequality inside a company.

  • As an anecdote though, I once saw someone simply forwarding (i.e. copying and pasting) their exam questions to ChatGPT. His answers were just ChatGPT responses, paraphrased to make them look less GPT-ish. I am not even sure whether he understood the questions themselves.

    In this case, the only skill that is tested... is English paraphrasing.

  • People are running the KDE desktop on the VisionFive 2.

    Arch Linux has had a RISC-V port for quite a while now - FYI, just in case you don't know, Felix (the guy running the website I linked) is one of the Arch Linux package maintainers.

  • Then that means two major Wayland compositors (KDE's KWin and GNOME's Mutter) support per-monitor fractional scaling.

    Which makes me even more confused about the "global setting" problem mentioned by the previous commenter...

  • My laptop lasts about 6 to 7 hours on battery on Linux. That's about an hour shorter than on Windows, but nowhere near "drained fast" territory.

    Now... if I use X11, that's a whole other story! Somehow the battery life is cut in half because of higher GPU usage, and I still can't figure out what causes it.
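
    If anyone wants to compare sessions themselves, here's a rough Python sketch that logs the battery discharge rate from sysfs - assuming your laptop exposes a battery at /sys/class/power_supply/BAT0 with a power_now reading (the exact battery name and reported attributes differ between machines):

      # Rough battery-drain logger for comparing desktop sessions (e.g. X11 vs Wayland).
      # Assumes /sys/class/power_supply/BAT0 reports power_now in microwatts;
      # some laptops only expose current_now/voltage_now instead.
      import time
      from pathlib import Path

      BAT = Path("/sys/class/power_supply/BAT0")

      def read_power_watts():
          """Return the current discharge rate in watts, if the kernel reports it."""
          power_now = BAT / "power_now"
          if power_now.exists():
              return int(power_now.read_text()) / 1_000_000  # µW -> W
          # Fallback: P = V * I from current_now (µA) and voltage_now (µV)
          current = int((BAT / "current_now").read_text())
          voltage = int((BAT / "voltage_now").read_text())
          return (current * voltage) / 1e12

      if __name__ == "__main__":
          # Sample once a minute and keep a running average; run it for a while
          # under X11, then under Wayland, and compare the averages.
          readings = []
          try:
              while True:
                  watts = read_power_watts()
                  readings.append(watts)
                  print(f"{watts:.2f} W now, {sum(readings) / len(readings):.2f} W average")
                  time.sleep(60)
          except KeyboardInterrupt:
              pass

    A consistently lower average under one session, with the same kind of workload, would at least confirm where the extra drain is coming from.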