Posts: 1 · Comments: 48 · Joined: 2 yr. ago

  • We secure your account against SIM swaps…with modern cryptography protocols.

    This just doesn't make ANY sense. SIM swaps are done via social engineering.

    See this for details. Their tech support people do not have the access necessary to move a line so there’s nobody to social engineer. Only the customer can start the process to move a line after cryptographic authentication using BIP-39.

    proprietary signaling protection

    If they wanted to be private, it would be Open source.

    I’m really tired of this trope in the privacy community. Open source does not mean private. Nobody is capable of reviewing the massive amount of code used by a modern system as complex as a phone operating system and cellular network. There’s no way to audit the network to know that it’s all running the reviewed open source code, either.

    Voicemails can hold sensitive information like 2FA codes.

    Since when do people send 2FA codes via voicemail? The fuck? Just use Signal.

    There are many 2FA systems that offer to call your number so the system can tell you your 2FA code.

    The part where I share your reaction to Cape is about identifying customers. This page goes into detail about these aspects, and it has a lot of things that are indeed better than any other carrier out there.

    But it’s a long way short of being private. They’re a “heavy MVNO”. This means their customers’ phones still use other carriers’ cell towers, and those carriers can still collect and log IMSI and device location information. Privacy researchers have demonstrated that it is quite easy to deanonymize someone with very little location data.

    On top of that, every call or text goes to another device. If it passes through another carrier’s core network, most call metadata is still collected, logged, and sold.

    If we accept all of Cape’s claims, it’s significantly better than any other carrier I’m aware of, but it’s still far from what most people in this community would consider private.
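    The BIP-39 authentication mentioned above works roughly like this: some entropy plus a short SHA-256 checksum is split into 11-bit indices into a fixed 2048-word list, giving a human-readable mnemonic. Here’s a rough sketch of just that entropy-to-mnemonic step; the placeholder wordlist and function name are mine (real BIP-39 uses a specific fixed English wordlist):

    ```python
    import hashlib

    def entropy_to_mnemonic(entropy: bytes, wordlist: list[str]) -> list[str]:
        # BIP-39: append a checksum of (entropy bits / 32) bits — the leading
        # bits of SHA-256(entropy) — then split the result into 11-bit indices.
        assert len(entropy) in (16, 20, 24, 28, 32) and len(wordlist) == 2048
        checksum_bits = len(entropy) * 8 // 32
        digest = hashlib.sha256(entropy).digest()
        bits = int.from_bytes(entropy, "big")
        bits = (bits << checksum_bits) | (digest[0] >> (8 - checksum_bits))
        total_bits = len(entropy) * 8 + checksum_bits
        indices = [(bits >> shift) & 0x7FF
                   for shift in range(total_bits - 11, -1, -11)]
        return [wordlist[i] for i in indices]

    # Placeholder 2048-entry wordlist; real BIP-39 mandates a fixed list.
    demo_words = [f"w{i:04d}" for i in range(2048)]
    mnemonic = entropy_to_mnemonic(bytes(16), demo_words)
    print(len(mnemonic))  # 12 words for 128 bits of entropy
    ```

    The point of the scheme is that only someone holding the original entropy (the customer) can reproduce the mnemonic, so the carrier’s support staff never have a secret they could be socially engineered into giving up.
    
    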

  • UI designs are rarely exactly the same as the final product. There are many tweaks that occur after the design is implemented. Sometimes doing exactly what the design requires is too difficult or requires too many resources.

  • I’ve presented a few WWDC sessions including two video sessions, though nothing as huge as the keynote or platform state of the union. I can answer most questions you have about the process.

    The screens shown in WWDC sessions are usually screen captures from real devices. Development of the slide decks starts with a template deck that has the styles, fonts, and color themes for that year’s sessions. It includes slides that look like the latest devices, with precise rectangles the right size where screen captures will fit. As people develop their sessions they use these slides as placeholders for screenshots, animations and videos.

    During development of the OSes the code branches for what will become the first developer seed. Before WWDC, one of the builds of this branch gets marked as ready for final screenshots/videos. The idea is that the UI is close enough to what will ship in the first developer seed that the OS and sessions will match.

    Once that build is marked, the presenters take their screenshots and those get incorporated into the slides.

    You wrote “It wasn’t just a screen recorder thing”. What makes you say that?

    You asked about specialized software. Apple OS engineers have to use what are called “internal variants” of the OSes during development. These have special controls for all sorts of things. One fun thing to look for in WWDC sessions: the status bar almost always has the same details, with the same time, battery level, Wi-Fi signal strength, etc. These are real screenshots, but the people taking the videos used special overrides in the internal variants to force the status bar to show those values rather than the actual values. That makes things consistent. I think it avoids weird things like viewers being distracted by a demo device with a low battery.

  • Part of that is the responsibility of the app developer, since they define the payload that appears in the APNs push message. It’s possible for them to design it such that the push message really just says “time to ping your app server because something changed”. That minimizes the amount of data exposed to Apple, and therefore to law enforcement.

    For instance, the MDM protocol uses APNs: the push tells the device that it’s time to reach out to the MDM server for new commands. The body of the message does not contain the commands.

    That still necessarily reveals some metadata, like the fact that a message was sent to a device at a particular time. Often metadata is all that law enforcement wants for fishing expeditions. I think we should be pushing back on law enforcement’s use of broad requests (warrants?) for server data. We can and should minimize the data that servers hold, but there are limits: if servers can hold nothing, we no longer have a functional Internet. Law enforcement shouldn’t feel entitled to all server data.
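    As a sketch of the minimal-payload approach described above: a background push that carries no content at all, just a wake-up signal. The payload shape is the standard APNs `content-available` form; the function name is mine:

    ```python
    import json

    def make_wakeup_payload() -> bytes:
        # A background ("content-available") push carries no user-visible
        # content: no alert text, no message body. The app wakes up and
        # fetches the actual data from its own server over TLS, so the
        # push provider only ever sees "something changed; go check".
        payload = {
            "aps": {
                "content-available": 1
            }
        }
        return json.dumps(payload).encode("utf-8")

    print(make_wakeup_payload().decode())
    ```

    With this design, a request for APNs records can reveal that *a* push went to a device at a given time, but nothing about what it was for.
    
    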

  • Side note: Any decent kid tracker thingies that respect privacy?

    Apple Watch works well as a kid tracker if they’re old enough to wear it safely, and I think the privacy aspects are very good. It uses the Find My network, and Apple can’t see the location. There’s a bunch of specifics here. Apple Watch used to require an iPhone, but Apple made it so you can add a kid’s watch to the family so it uses a parent’s iPhone instead.

  • The original paper about microplastics in the brain seems to have a serious methodological flaw that undermines the conclusion that our brains are swimming in microplastics.

    “False positives of microplastics are common to almost all methods of detecting them,” Jones says. “This is quite a serious issue in microplastics work.”

    Brain tissue contains a large amount of lipids, some of which have similar mass spectra as the plastic polyethylene, Wagner says. “Most of the presumed plastic they found is polyethylene, which to me really indicates that they didn’t really clean up their samples properly.” Jones says he shares these concerns.

    This is from other microplastics researchers. See this article. So before we panic about this, let’s wait for some independent replication and more agreement in the scientific community.

    Microplastics are a serious concern, and we need to deal with plastic pollution. Let’s just stick to high quality science while we do that.

  • Permanently Deleted

  • I haven’t seen any evidence that this is solvable. You can feed in more training data, but that doesn’t mean generative AI technology is capable of using that in the way you describe.

  • Passkeys are a replacement for passwords. Passwords don’t solve the problem of a lost password, and passkeys don’t solve the problem of a lost passkey. How a site deals with lost credentials is up to them. It doesn’t need to be password + 2FA.

  • The 1:1 matching and the porn detection were separate capabilities.

    Porn detection is called Communication Safety, and it only warns the user. If the device is set up in Screen Time as a child’s device, someone has to enter the parent’s Screen Time passcode to bypass the warning. That’s it. It’s entirely local to the device: the parent isn’t notified or shown the image, and Apple doesn’t get the image. It uses an ML model, so it can have false positives.

    CSAM detection was exact 1:1 matching using a privacy-preserving hashing system. It prevented users from uploading known CSAM to iCloud, and that’s it. Apple couldn’t tell whether there was a match or learn the hashes of the images being evaluated.

    Many people misunderstood and conflated the two capabilities, and often claimed without evidence that they did things that they were designed never to do. Apple abandoned the CSAM detection capability.
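    To illustrate what 1:1 matching means here (this is *not* Apple’s actual system, which used a perceptual NeuralHash plus a private set intersection protocol so neither side learned the other’s hashes — it’s just plain exact matching, with made-up names and values):

    ```python
    import hashlib

    # Hypothetical blocklist of known-bad image digests (illustrative only).
    KNOWN_HASHES = {
        hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
    }

    def matches_known_hash(image_bytes: bytes) -> bool:
        # Exact 1:1 matching: only an identical image produces the same
        # digest, so novel images — including a family's own photos —
        # can never match entries in the blocklist.
        return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

    print(matches_known_hash(b"known-bad-image-bytes"))  # True
    print(matches_known_hash(b"a-family-photo"))         # False
    ```

    The key property people missed: this kind of matching can only flag images already in the known set; it isn’t content analysis and can’t “detect” anything new.
    
    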

  • This is too techno-utopian. There’s also a place for governments. Comprehensive privacy legislation would also change the world for the better. Ignoring that is exactly what the largest invaders of privacy want.

  • Do you have any other resource hogs running in the background? Perhaps a poorly-coded VoIP or VPN extension could do that.

    If you have access to a Mac, you can use Console.app to see what log events there are about Voyager when you switch away. The log messages should show why it’s being killed.

  • There’s a difference between iOS killing an app and suspending it. When suspended, an app remains in memory but the OS doesn’t give it any opportunity to run code. When the user switches back, the app resumes without any change in its state.

  • Privacy @lemmy.ml

    Cops can force suspect to unlock phone with thumbprint, US court rules