
narc0tic_bird
Posts
1
Comments
1,215
Joined
2 yr. ago

  • What you're describing as "DisplayPort Alt Mode" is actually DisplayPort Multi-Stream Transport (MST). Alt Mode is the ability to pass native DisplayPort stream(s) over USB-C, which all M-chip Macs are capable of. MST is indeed unsupported by M-chip hardware, and macOS doesn't support it either way: even Intel Macs never got it, despite their hardware being capable of it.

    MST is nice for a dual WQHD setup or something (or dual UHD@60 with DisplayPort 1.4), but attempt to drive multiple (very) high-resolution, high-refresh-rate displays and you'll be starved for bandwidth very quickly. Daisy-chaining 6 displays might technically be possible with MST, but each of them would need to be set to a fairly low resolution by today's standards. Macs that support more than one external display can carry two independent/full DisplayPort 1.4 signals per Thunderbolt port (as per the Thunderbolt 4 spec), so with a proper Thunderbolt hub you can connect two high-resolution displays via one port, no problem.

    I agree that even base M chips should support at least 3 simultaneous displays (one internal and two external, or 3 external in clamshell mode). They should also add MST support, so you can connect to USB-C hubs that use MST to drive two (lower-resolution) monitors, and bring back proper sub-pixel font anti-aliasing for these low-DPI displays (which macOS was perfectly capable of in the past, but they removed it). Just for the convenience of being able to use any random hub you stumble across and have it "just work", not because it's necessarily ideal.

    But your comparison is blown way out of proportion. "Max" Macs support the internal display at full resolution and refresh rate (120 Hz), three external 6K 60 Hz displays, and an additional display via HDMI (4K 144 Hz on recent models). Whatever bandwidth is left per display when daisy-chaining 6 displays to a single Thunderbolt port on a Windows machine, it won't be anywhere near enough to drive all of them at these resolutions.
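    To put rough numbers on the bandwidth argument above, here's a back-of-the-envelope sketch. The effective HBR3 payload figure and the ~7% reduced-blanking overhead are my own approximations, not exact video-timing math:

```python
# Rough DisplayPort bandwidth sanity check.
# Effective DP 1.4 (HBR3) payload: 4 lanes x 8.1 Gbit/s x 0.8 (8b/10b encoding).
DP14_EFFECTIVE_GBPS = 4 * 8.1 * 0.8  # 25.92 Gbit/s

def display_gbps(width, height, hz, bpp=24, blanking=1.07):
    """Approximate uncompressed bandwidth for one display, in Gbit/s.
    `blanking` is an assumed ~7% overhead for reduced-blanking timings."""
    return width * height * hz * bpp * blanking / 1e9

dual_uhd = 2 * display_gbps(3840, 2160, 60)   # two UHD@60 panels (~12.8 each)
six_wqhd = 6 * display_gbps(2560, 1440, 60)   # six daisy-chained 1440p panels
six_fhd = 6 * display_gbps(1920, 1080, 60)    # six daisy-chained 1080p panels

print(f"dual UHD@60:  {dual_uhd:.1f} of {DP14_EFFECTIVE_GBPS:.2f} Gbit/s")
print(f"six WQHD@60:  {six_wqhd:.1f} of {DP14_EFFECTIVE_GBPS:.2f} Gbit/s")
print(f"six 1080p@60: {six_fhd:.1f} of {DP14_EFFECTIVE_GBPS:.2f} Gbit/s")
```

    Under these assumptions, dual UHD@60 just about fits in one DP 1.4 link and six 1080p@60 panels fit too, but six WQHD panels already blow past the budget, which is why a long MST chain forces low resolutions.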

  • welp

  • Many (most?) captchas I stumbled upon weren't case sensitive.

  • Max saying the car doesn't turn at all doesn't really fit Horner's statement about the high downforce setup, unless they just got the balance horribly wrong.

  • I personally can't really play at 30 FPS (anymore), especially in games where aim is important.

  • I'd pay for YouTube Premium Lite if it didn't state "Note: Ads will still show on music content and outside of videos." and if that'd make them stop harvesting all my data.

  • I agree, once you factor in a power supply (or PoE HAT), case, and storage, a Raspberry Pi really isn't all that cheap anymore. Unless you have a project that specifically benefits from the GPIO pins or the form factor, just get a cheap barebones mini PC, or a used one with RAM and SSD already included.

    This will get you a system that's way more powerful even if it's a couple of years old (the Pi's SoC is fairly weak), and in I/O throughput it's no contest: you'll normally have at least a dozen PCIe lanes to use for NVMe storage or 10-gigabit network cards, if you so desire.

  • Around 15 TB migrating to a new NAS.

  • That's the way the Fediverse works. When you post a comment, an update is distributed with your comment. Once you delete it, an update is distributed that you deleted your comment. Lemmy then still shows that there was a comment but it has been deleted. Federation delays could mean that they didn't see your comment as deleted instantly. In theory, a Fediverse instance could also just ignore the deletion update and keep showing the deleted comment.
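    As a rough illustration of that flow: ActivityPub (the protocol the Fediverse federates over) models deletion as its own activity that replaces the original object with a Tombstone. The payloads below are a simplified sketch with made-up actor and comment URLs, not actual Lemmy traffic:

```python
# Simplified ActivityPub-style payloads (field names follow the ActivityPub
# spec; the URLs are invented for this example).

# Posting a comment distributes a Create activity carrying the content.
create = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Create",
    "actor": "https://example.social/u/alice",
    "object": {
        "type": "Note",
        "id": "https://example.social/comment/123",
        "content": "my comment",
    },
}

# Deleting it distributes a separate Delete activity. Receiving instances
# are expected to swap the Note for a Tombstone, but a misbehaving instance
# can simply ignore this update and keep showing the original content.
delete = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Delete",
    "actor": "https://example.social/u/alice",
    "object": {
        "type": "Tombstone",
        "id": "https://example.social/comment/123",
    },
}

print(delete["object"]["type"])  # Tombstone
```

    The Delete references the same object `id` as the Create, which is also why a slow-to-federate instance can keep showing the comment until the second update arrives.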

  • Hardly surprising considering that Brave, Vivaldi, and Edge are all based on Chromium. The Brave and Vivaldi teams won't have the resources to maintain Manifest v2 support for each new Chromium version, and Microsoft doesn't have any reason beyond goodwill to keep supporting v2 in Edge.

  • Yes, but even then the Phoronix results seem to suggest a larger gap in performance.

  • Something is wrong with the Windows scheduler and these new chips. The Linux results aren't revolutionary, but they're about what you'd expect from what AMD marketed in terms of IPC uplift.

    More reviewers should benchmark hardware on multiple operating systems.

  • Regarding 2.): Disable "Enable GPU accelerated rendering in web views (requires restart)" under Settings > Interface in Steam. This should fix the hangs.

  • If there's no setting in the iOS Settings app to take away the camera permission (which isn't even given by default and the app has to ask for it), it can't access the camera (unless it exploits a potential vulnerability in iOS, which I highly doubt).

    It probably used data from motion sensors and the reason you saw your room was because of the glossy display. Or you have allowed the YouTube app to access your camera.

  • I've kind of given up waiting, and even though I own the PS3 and 360 versions, I'm now playing it on PC in an emulator, at 64 FPS (odd, I know, but that's where it caps) and an internal resolution of something like 5760x3240.

    A tad too late to sell me your overpriced "Remaster", Rockstar; by the time this arrives I'll probably have played through it. Had it been a proper, working port that does away with the FPS cap entirely and adds good keyboard-and-mouse support, I likely would've bought it.

  • I'm waiting to see how DeepComputing's RISC-V mainboard for the Framework turns out. I'm aware that this is very much a development platform and far from an actual end-user product, but if the price is right, I might jump in to experiment.

  • What I mean by that is that they will do their customers a huge disservice to avoid a slight financial inconvenience (packaging and validating an existing fix for different CPU series with the same architecture).

    I don't classify fixing critical vulnerabilities in products as recent as the last decade as "goodwill"; that's just what I'd expect to receive as a customer: a working product with no known vulnerabilities left open. I could've bought a Ryzen 3000 CPU (maybe as part of a cheap office PC or whatever) a few days ago, only to now learn it has this severe vulnerability with a WONTFIX label on it. And even if I bought it 5 years ago: a fix exists, port it over!

    I know some people say it's not that critical of a bug because an attacker needs kernel access, but it's a convenient part of a vulnerability chain for an attacker that once exploited is almost impossible to detect and remove.

  • That's so stupid, especially because they have fixes available for Zen- and Zen 2-based Epyc CPUs.

    Intel vs. AMD isn't "bad guys" vs. "good guys". Either company will take every opportunity to screw their customers over. Sure, "don't buy Intel" holds true for 13th and 14th gen Core CPUs specifically, but other than that it's more of a pick your poison.