Posts: 0 · Comments: 533 · Joined: 2 yr. ago

  • For no reason. No one is saying that it is different, only that it's impossible to prove one way or the other. Light traveling at the same speed in all directions, light traveling at 2x c away from an observer and instantaneously on the return trip, and every other alternative that averages out to c for the round trip are all indistinguishable by any experiment we can conduct.

  • Another interesting way to conceptualize it is that the speed of light is infinite and it's causality/information that is limited to c. You shine a light at the moon and it takes 1.3 seconds for the "fact" that the light was turned on to propagate that far.
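
    The 1.3 second figure falls straight out of the average Earth-Moon distance (~384,400 km); here's a quick sanity check in Python:

    ```python
    # One-way light travel time to the Moon at its average distance.
    distance_km = 384_400
    c_km_s = 299_792.458
    print(distance_km / c_km_s)  # ~1.28 seconds
    ```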

  • I haven't used the PlayStation or Xbox ones, but the Switch's mobile device media transfer is remarkably awkward to use. The Switch broadcasts an ad hoc SSID that you connect a device to, and you then hit a web server hosted by the Switch to get your pictures (sketched below). As if that weren't already awkward enough, lots of phones will fight you on connecting to the SSID: because it has no internet gateway, they consider it a bad connection and just automatically switch back to mobile data.
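
    Here's a hypothetical sketch of that flow in Python. The address and paths are illustrative assumptions on my part, not the console's documented API:

    ```python
    # Assumed flow: join the console's ad hoc SSID, then pull images from its
    # embedded web server over plain HTTP. The IP and paths below are made up.
    import requests

    BASE = "http://192.168.0.1"  # assumed address of the console's web server

    index = requests.get(f"{BASE}/index.html", timeout=5)  # page listing the shared media
    # ...parse index.text for image links, then fetch each one:
    img = requests.get(f"{BASE}/img/screenshot1.jpg", timeout=5)  # illustrative path
    with open("screenshot1.jpg", "wb") as f:
        f.write(img.content)
    ```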

  • When I heard that BIOS updates were going out automatically via Windows Update, I had just assumed the devices in question must be using an A/B update scheme to prevent the risk of accidentally bricking the system, because obviously they should (sketched below).

    Absolutely insane that's not the case.
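
    For what it's worth, the scheme I had in mind looks something like this. Slot names and verification details are my assumptions, not any vendor's actual implementation:

    ```python
    # A/B update sketch: write the new image only to the inactive slot, verify
    # it, and flip the boot pointer only after verification succeeds.
    import hashlib

    slots = {"A": b"current firmware image", "B": b""}
    active = "A"

    def apply_update(image: bytes, expected_sha256: str) -> str:
        global active
        inactive = "B" if active == "A" else "A"
        slots[inactive] = image  # the running slot is never touched
        if hashlib.sha256(slots[inactive]).hexdigest() != expected_sha256:
            return active        # bad flash: keep booting the old, known-good slot
        active = inactive        # commit only after verification
        return active
    ```

    A failed or interrupted write only ever hits the inactive slot, so the worst case is retrying the update rather than a bricked board.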

  • Hell, current plant-based alternatives would be doing great too if they weren't inexplicably more expensive. Impossible Meat has a lower environmental impact and requires fewer resources to make than beef? Great! Why does it cost more, then? I'm not even vegetarian, but I'd happily switch to fake burgers if they weren't double the price.

  • "This is manipulation to sell more copies and nothing more."

    ...yes? People who make games do things to make their games appeal to people. Framing that as negative or unusual is kind of weird. Literally everything any game developer does to make a game entertaining or appealing is "manipulation to sell more copies".

  • It's not really laziness. Storing saves as JSON solves or prevents a lot of problems you could run into with something bespoke and "optimally packed"; the tradeoff is needing more storage for it. Even then, the increased storage can be largely mitigated with compression. JSON compresses very well, as shown below.

    The problem is usually what they're storing, not how they're storing it. For example, The Witcher (the first one) has ~20MB save files. These are mostly a bespoke packed binary format, but they contain things like raw description strings in multiple localisations for every item being carried, and complete descriptors of game quests. Those should just be ID values that point to that data in the game files. It also leads with like... 13KB of zero-padding for some reason.
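
    To illustrate the compression point with some made-up save data (the structure here is purely illustrative):

    ```python
    # Repetitive JSON, like inventory lists, compresses very well.
    import gzip
    import json

    save = {"inventory": [
        {"id": i, "name": "Health Potion", "desc": "Restores a small amount of vitality."}
        for i in range(1000)
    ]}
    raw = json.dumps(save).encode()
    packed = gzip.compress(raw)
    print(len(raw), len(packed))  # the gzipped copy is a small fraction of the raw size
    ```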

  • True. Until you responded I had actually completely forgotten that you can selectively download torrents. It would be nice not to have to manage that manually at the user level, though.

    Some kind of bespoke torrent client that managed it under the hood could probably work without having to invent your own peer-to-peer protocol for it. I wonder how long it would take to compute the torrent hash values for 100PB of data? :D
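
    As a back-of-the-envelope answer to my own question, assuming a single machine that can read and hash at a sustained 1GB/s (both figures are assumptions):

    ```python
    # Time to hash 100 PB on one machine at 1 GB/s sustained read + hash.
    total_bytes = 100e15      # 100 PB
    throughput = 1e9          # 1 GB/s
    seconds = total_bytes / throughput
    print(seconds / (86400 * 365))  # ~3.2 years
    ```

    Piece hashing is embarrassingly parallel, though, so splitting the work across N machines cuts that roughly N-fold.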

  • Yes, it's a big ask, because it's a lot of data. Any distributed solution will require either a large number of people or a huge commitment of storage capacity per node. Both 100,000 people and 1TB per node are a lot to ask for, but that's basically the minimum viable level for that much data (the arithmetic is checked below). Ten million people each committing 50GB would be great, and would offer enough redundancy that you could lose 80% of the nodes before losing data, but that's not a realistic number of participants to expect.
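
    Checking that arithmetic:

    ```python
    # Total committed storage relative to a 100 PB data set.
    dataset = 100e15              # 100 PB
    minimum = 100_000 * 1e12      # 100,000 nodes at 1 TB each
    generous = 10_000_000 * 50e9  # 10 million nodes at 50 GB each
    print(minimum / dataset)      # 1.0 -> exactly one copy, zero redundancy
    print(generous / dataset)     # 5.0 -> five copies; ~80% of nodes could vanish
    ```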

  • That wouldn't distribute the load of storing it, though. Anyone seeding the torrent would need to set aside 100PB of storage for it, which is clearly never going to happen.

    You'd want a federated (or otherwise distributed) storage scheme where thousands of people could each contribute a smaller portion of the storage, while the whole data set stays accessible to any federated client. 100,000 clients each contributing 1TB of storage would be enough to hold one copy of the full data set with no redundancy. Ideally you'd have more than that, so that a single node going down doesn't mean permanent data loss. A toy sketch of the chunk placement is below.
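
    Names and the replication factor here are illustrative; a real system would add erasure coding and automatic repair:

    ```python
    # Rendezvous (highest-random-weight) hashing: each chunk is stored on the
    # nodes that score highest for it, so placement stays stable as nodes join.
    import hashlib

    def assign(chunk_id: str, nodes: list[str], replicas: int = 3) -> list[str]:
        score = lambda n: hashlib.sha256(f"{chunk_id}:{n}".encode()).digest()
        return sorted(nodes, key=score, reverse=True)[:replicas]

    nodes = [f"node{i}" for i in range(10)]
    print(assign("chunk-0001", nodes))  # the 3 nodes responsible for this chunk
    ```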

  • Depends on what your goal is. Strictly speaking, CC BY-SA is more permissive than attaching no copyright notice at all, since copyright applies automatically and the CC licenses grant permissions that default copyright withholds. It's just a fancy legalistic way of saying "please credit me if you use this, and continue to share it under the same terms". (Note that CC BY-SA does allow commercial use; it's the NC variants, like CC BY-NC-SA, that forbid it.)

    So if you want people to share your work, CC BY-SA makes sense.