Posts 1 · Comments 634 · Joined 2 yr. ago

  • They were largely unaffected by the tariffs targeting China, because US trade policy distinguishes between mainland China and Taiwan. Problem was that Trump announced huge tariffs on everyone, including a 32% tariff on Taiwan.

  • I wonder what the use case is for 480W though. Gigantic 80" screens generally draw something like 120W. If you're going bigger than that, I would think the mounting/installation would require enough hardware and labor that running a normal outlet/receptacle to the mounting location would be trivial.
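
    Rough back-of-the-envelope math, assuming (purely as an illustration) that panel power scales with screen area and taking that 120W figure as the baseline:

    ```python
    # Rough sketch: estimate panel power if draw scales roughly with screen area.
    # The 120 W baseline for an 80" panel and the area-scaling assumption are
    # illustrative, not measured figures.
    baseline_diag = 80      # inches
    baseline_power = 120    # watts (assumed)

    for diag in (80, 100, 120):
        power = baseline_power * (diag / baseline_diag) ** 2
        print(f'{diag}" panel: ~{power:.0f} W (well under a 480 W budget)')
    ```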

  • This is pretty normal, in my opinion. Every time people complain about common core arithmetic there are dozens of us who come out of the woodwork to argue that the concepts being taught are important for deeper understanding of math, beyond just rote memorization of pencil and paper algorithms.
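
    For anyone who hasn't seen it, here's the kind of decomposition strategy ("making a ten") being taught, written out in code purely as an illustration:

    ```python
    # Illustrative only: the "make a ten" strategy for mental addition, e.g.
    # 48 + 37  ->  48 + 2 = 50, then 50 + 35 = 85.
    def make_ten_addition(a, b):
        to_next_ten = (10 - a % 10) % 10   # how much a needs to reach a multiple of 10
        step = min(to_next_ten, b)
        print(f"{a} + {b}: take {step} from {b} -> {a + step} + {b - step} = {a + b}")
        return a + b

    make_ten_addition(48, 37)
    make_ten_addition(8, 6)
    ```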

  • Therefore, I think they'd get out a microscope and oscilloscope and start trying to reverse-engineer it. That would probably speed up the development of computer technology quite a bit by giving them clues about which direction to go.

    Knowing what something is doesn't necessarily teach people how it was made. No matter how much you examine a sheet of printed paper, someone with no conception of a laser printer could not derive much about how something produced such precise, sharp text on a page. They'd be stuck thinking about movable metal type dipped in ink, not a laser fusing powdered toner onto the page.

    If you took a modern finFET chip from, say, the TSMC 5nm process node, and gave it to electrical engineers of 1995, they'd be really impressed with the three-dimensional physical structure of the transistors. They could probably envision how computers make it possible to design those chips. But they'd have no conception of how to generate EUV at the wavelengths necessary to make photolithography possible at those sizes. No amount of examination of the chip itself will reveal the secrets of how it was made: very bright lasers fired at an impossibly precise stream of liquid tin droplets, highly polished mirrors that focus the resulting EUV radiation through masks onto the silicon to form the two-dimensional planar pattern, then advanced techniques for lining up those two-dimensional features into a three-dimensional stack.
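
    For a sense of why the wavelength matters so much, a rough sketch using the Rayleigh criterion; the k1 and NA values are typical published figures, not any particular tool's specs:

    ```python
    # Rayleigh criterion: minimum printable feature ~ k1 * wavelength / NA.
    # k1 and NA values below are typical published figures, not exact tool specs.
    def min_feature_nm(wavelength_nm, numerical_aperture, k1):
        return k1 * wavelength_nm / numerical_aperture

    duv = min_feature_nm(193, 1.35, 0.30)   # ArF immersion DUV
    euv = min_feature_nm(13.5, 0.33, 0.40)  # first-generation EUV
    print(f"DUV immersion: ~{duv:.0f} nm per exposure, EUV: ~{euv:.0f} nm per exposure")
    ```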

    It's kinda like how we don't fully know how Roman concrete or Damascus steel was made. We can make better concrete and steel today, but we haven't been able to reverse engineer exactly how those materials were made in ancient times.

  • Do you have a source for AMD chips being especially energy efficient?

    I remember reviews of the HX 370 commenting on that. The problem is that chip was produced on TSMC's N4P node, which doesn't have a direct Apple comparator (the M2 was on N5P and the M3 was on N3B). The Ryzen 7 7840U was on N4, one year behind that. It just shows that AMD can't even get onto a TSMC node within a year or two of Apple.

    Still, I haven't seen anything really putting these chips through their paces and actually measuring real-world energy usage while running a variety of benchmarks. And benchmarks themselves only correlate with specific ways that computers are used and aren't necessarily supported on all hardware or OSes, so it's hard to get a real comparison.
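
    For what it's worth, on Linux the package-level energy counters exposed through the powercap (RAPL) interface are one way to get rough whole-workload numbers. A minimal sketch, assuming the CPU and kernel expose /sys/class/powercap/intel-rapl:0 and that you have permission to read it (newer kernels restrict energy_uj to root):

    ```python
    # Rough sketch: measure package energy for a workload via Linux's powercap
    # (RAPL) interface. This counts whole-package energy, background load included.
    import time

    RAPL = "/sys/class/powercap/intel-rapl:0"

    def read_uj(path):
        with open(path) as f:
            return int(f.read())

    def measure(workload):
        max_range = read_uj(f"{RAPL}/max_energy_range_uj")
        start_e = read_uj(f"{RAPL}/energy_uj")
        start_t = time.monotonic()
        workload()
        delta_t = time.monotonic() - start_t
        delta_e = read_uj(f"{RAPL}/energy_uj") - start_e
        if delta_e < 0:                # counter wrapped around
            delta_e += max_range
        return delta_e / 1e6, delta_t  # joules, seconds

    joules, seconds = measure(lambda: sum(i * i for i in range(10_000_000)))
    print(f"{joules:.1f} J over {seconds:.1f} s (~{joules / seconds:.1f} W average)")
    ```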

    SoCs are inherently more energy efficient

    I agree, but that's a separate issue from the instruction set. The AMD HX 370 is an SoC (well, technically a SiP, since the pieces are all packaged together rather than fabricated on the same piece of silicon).

    And in terms of actual chip architectures, as you allude to, the design dictates how specific instructions are processed. That's why the RISC versus CISC concepts are basically obsolete. These chip designers are making engineering choices about how much silicon area to devote to specific functions, based on their modeling of how that chip might be used: multithreading, different cores optimized for efficiency or performance, speculative execution, various specialized tasks related to hardware-accelerated video or cryptography or AI or whatever else, etc., and then deciding how that fits into the broader chip design.

    Ultimately, I'd think that the main reason why something like x86 would die off is licensing reasons, not anything inherent to the instruction set architecture.

  • it's kinda undeniable that this is where the market is going. It is far more energy efficient than an Intel or AMD x86 CPU and holds up just fine.

    Is that actually true, when comparing node for node?

    In the mobile and tablet space, Apple's A series chips have always been a generation ahead of Qualcomm's Snapdragon chips in terms of performance per watt. Meanwhile, Samsung's Exynos has always been even further behind. That's obviously not an instruction set issue, since all three lines are on ARM.

    Much of Apple's advantage has been a willingness to pay for early runs on each new TSMC node, and a willingness to dedicate a lot of square millimeters of silicon to their gigantic chips.

    But when comparing node for node, last I checked, AMD's lower-power chips designed for laptop TDPs have similar performance and power consumption to Apple's chips on the same TSMC node.
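
    The kind of comparison I mean, sketched with placeholder numbers; none of the scores or wattages below are real measurements, the point is just that chips only get compared against others on the same node:

    ```python
    # Placeholder sketch of a node-for-node perf-per-watt comparison.
    # Every score and wattage below is made up purely to show the normalization.
    chips = [
        {"name": "Chip A", "node": "N5", "score": 12000, "package_watts": 25},
        {"name": "Chip B", "node": "N5", "score": 11000, "package_watts": 28},
        {"name": "Chip C", "node": "N3", "score": 15000, "package_watts": 22},
    ]

    by_node = {}
    for c in chips:
        by_node.setdefault(c["node"], []).append(c)

    for node, group in by_node.items():
        print(f"--- {node} ---")
        for c in group:
            print(f'{c["name"]}: {c["score"] / c["package_watts"]:.0f} points/W')
    ```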

  • Yeah, Firefox in particular gave me the most issues.

    Configuring each app separately is also annoying.

    And I definitely never got things to work on an external monitor that was a different DPI from my laptop screen. I wish I had the time or expertise to contribute, but in the meantime I'm left hoping that the Wayland and DE devs find a solution that at least achieves feature parity with Windows or macOS.
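
    Configuring each app separately mostly comes down to juggling toolkit environment variables; a rough sketch of that workaround (GDK_SCALE, GDK_DPI_SCALE, and QT_SCALE_FACTOR are real variables, but the values here are just examples and behavior varies by toolkit version):

    ```python
    # Minimal sketch of the per-app workaround: launch individual apps with their
    # own scaling environment variables. GDK_SCALE / GDK_DPI_SCALE affect GTK apps,
    # QT_SCALE_FACTOR affects Qt apps; exact behavior depends on toolkit version.
    import os
    import subprocess

    def launch_scaled(cmd, gtk_scale=2, qt_scale=1.5):
        env = os.environ.copy()
        env["GDK_SCALE"] = str(gtk_scale)          # integer scaling for GTK
        env["GDK_DPI_SCALE"] = str(1 / gtk_scale)  # shrink fonts back down if doubled
        env["QT_SCALE_FACTOR"] = str(qt_scale)     # fractional scaling for Qt
        return subprocess.Popen(cmd, env=env)

    # e.g. launch_scaled(["firefox"]) or launch_scaled(["systemsettings"], qt_scale=1.25)
    ```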

  • I mean, that's basically the author's problem, then. I suspect the software support just isn't there for the hardware that ships on this particular laptop, to the point where manually setting some blurry non-native resolution is the least crappy solution.

  • What's the current state of Linux support for high-DPI screens? As of two years ago, I had some issues getting things to work right in KDE, especially with GTK apps, and ended up manually fiddling with system font sizes and button sizes before donating that laptop to someone else.

  • What’s tricky is figuring out the appropriate human baseline, since human drivers don’t necessarily report every crash.

    Also, I think it's worth discussing whether to include in the baseline certain driver assistance technologies, like automated braking, blind spot warnings, other warnings/visualizations of surrounding objects, cars, bikes, or pedestrians, etc. Throw in other things like traction control, antilock brakes, etc.

    There are ways to make human driving safer without fully automating the driving, so it may not be appropriate to compare fully automated driving with fully manual driving. Hybrid approaches might be safer today, but we don't have the data to actually analyze that, as far as I can tell.
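
    To make the baseline question concrete, here's a toy version of the rate comparison I mean; every crash count and mileage figure below is a placeholder, not a real statistic:

    ```python
    # Toy sketch: crash rates per million miles with exact (Garwood) Poisson CIs.
    # All counts and mileages are placeholders, not real crash data.
    from scipy.stats import chi2

    def rate_ci(crashes, miles, conf=0.95):
        alpha = 1 - conf
        lower = 0.5 * chi2.ppf(alpha / 2, 2 * crashes) if crashes > 0 else 0.0
        upper = 0.5 * chi2.ppf(1 - alpha / 2, 2 * crashes + 2)
        per_million = 1e6 / miles
        return crashes * per_million, lower * per_million, upper * per_million

    for label, crashes, miles in [("automated", 30, 50e6), ("human (reported)", 200, 100e6)]:
        rate, lo, hi = rate_ci(crashes, miles)
        print(f"{label}: {rate:.2f}/M miles (95% CI {lo:.2f}-{hi:.2f})")
    ```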

  • What's annoying, too, is that a lot of the methods that have traditionally been used for discounts (education, nonprofit, employer-based discounts) are now only applicable to the subscriptions. So if you do want to get a standalone copy and would ordinarily qualify for a discount, you can't apply that discount to that license.

  • Is it just me, or do new office features seem kinda pointless or unnecessary?

    I feel like almost all the updates of the last two decades have been:

    • Security updates in a code base that was traditionally quite vulnerable to malware.
    • Technical updates in taking advantage of the advances in hardware, through updated APIs in the underlying OS. We pretty seamlessly moved from single core, 32-bit x86 CPU tasks to multicore x86-64 or ARM, with some tasks offloaded to GPUs or other specialized chips.
    • Some improvement in collaboration and sharing, unfortunately with a thumb on the scale to favor other Microsoft products like SharePoint or OneDrive or Outlook/Exchange.
    • Some useless nonsense, like generative AI.

    Some of these are important (especially the first two), but the user experience shouldn't change much for them.

  • Rechargeable batteries weren't really a thing in the 70's. For consumer electrical devices, batteries were one use, and anything that plugged in needed to stay plugged in while in operation.

    Big advances in battery chemistry made things like cordless phones feasible by the 80's, and all sorts of rechargeable devices in the 90's.