[-] [email protected] 4 points 12 hours ago* (last edited 12 hours ago)

We secure your account against SIM swaps…with modern cryptography protocols.

This just doesn’t make ANY sense. SIM swaps are done via social engineering.

See this for details. Their tech support people do not have the access necessary to move a line so there’s nobody to social engineer. Only the customer can start the process to move a line after cryptographic authentication using BIP-39.
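For reference, BIP-39 itself only specifies how a mnemonic phrase is turned into a binary seed (PBKDF2-HMAC-SHA512, 2048 rounds, salted with "mnemonic" plus an optional passphrase). How Cape uses the resulting key material to authenticate the customer isn’t public, so this sketch shows only the standard derivation step:

```python
import hashlib

def bip39_seed(mnemonic: str, passphrase: str = "") -> bytes:
    """BIP-39 mnemonic-to-seed: PBKDF2-HMAC-SHA512 over the mnemonic,
    salted with "mnemonic" + optional passphrase, 2048 iterations."""
    return hashlib.pbkdf2_hmac(
        "sha512",
        mnemonic.encode("utf-8"),
        ("mnemonic" + passphrase).encode("utf-8"),
        2048,
    )

# The seed is 64 bytes, and the same phrase always derives the same seed,
# which is what lets the customer (and only the customer) re-authenticate.
phrase = "abandon " * 11 + "about"   # well-known BIP-39 test phrase
seed = bip39_seed(phrase)
print(len(seed))  # 64
```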

proprietary signaling protection

If they wanted to be private, it would be Open source.

I’m really tired of this trope in the privacy community. Open source does not mean private. Nobody is capable of reviewing the massive amount of code used by a modern system as complex as a phone operating system and cellular network. There’s also no way to audit the network to confirm it’s actually running the reviewed open source code.

Voicemails can hold sensitive information like 2FA codes.

Since when do people send 2FA codes via voicemail? The fuck? Just use Signal.

There are many 2FA systems that offer to call your number so the system can tell you your 2FA code.

The part where I share your reaction to Cape is about identifying customers. This page goes into detail about these aspects, and it has a lot of things that are indeed better than any other carrier out there.

But it’s a long way short of being private. They’re a “heavy MVNO”, meaning their customers’ phones still use other carriers’ cell towers, and those carriers can still collect and log IMSI and device location information. Privacy researchers have demonstrated that it’s quite easy to deanonymize someone with very little location information.

On top of that, every call or text goes to another device. If it goes through another core network, most call metadata is still collected, logged, and sold.

If we accept all of Cape’s claims, it’s significantly better than any other carrier I’m aware of, but it’s still far from what most people in this community would consider private.

[-] [email protected] 36 points 1 week ago

It sure revealed something about the person who used ChatGPT, so mission accomplished.

[-] [email protected] 16 points 1 week ago

It’s usually harder to do to admins, since they’re the ones who do the suspending.

[-] [email protected] 19 points 2 weeks ago

I’ve presented a few WWDC sessions including two video sessions, though nothing as huge as the keynote or platform state of the union. I can answer most questions you have about the process.

The screens shown in WWDC sessions are usually screen captures from real devices. Development of the slide decks starts with a template deck that has the styles, fonts, and color themes for that year’s sessions. It includes slides that look like the latest devices, with precise rectangles the right size where screen captures will fit. As people develop their sessions they use these slides as placeholders for screenshots, animations and videos.

During development of the OSes the code branches for what will become the first developer seed. Before WWDC, one of the builds of this branch gets marked as ready for final screenshots/videos. The idea is that the UI is close enough to what will ship in the first developer seed that the OS and sessions will match.

Once that build is marked, the presenters take their screenshots and those get incorporated into the slides.

You wrote “It wasn’t just a screen recorder thing”. What makes you say that?

You asked about specialized software. Apple OS engineers have to use what are called “internal variants” of the OSes during development. These have special controls for all sorts of things. One fun thing to look for in WWDC sessions: the status bar almost always has the same details, with the same time, battery level, Wi-Fi signal strength, etc. These are real screenshots, but the people taking the videos used special overrides in the internal variants to force the status bar to show those values rather than the actual values. That makes things consistent. I think it avoids weird things like viewers being distracted by a demo device with a low battery.

[-] [email protected] 51 points 4 months ago

The original paper about microplastics in the brain seems to have a serious methodological flaw that undermines the conclusion that our brains are swimming in microplastics.

“False positives of microplastics are common to almost all methods of detecting them,” Jones says. “This is quite a serious issue in microplastics work.”

Brain tissue contains a large amount of lipids, some of which have similar mass spectra as the plastic polyethylene, Wagner says. “Most of the presumed plastic they found is polyethylene, which to me really indicates that they didn’t really clean up their samples properly.” Jones says he shares these concerns.

This is from other microplastics researchers. See this article. So before we panic about this, let’s wait for some independent replication and more agreement in the scientific community.

Microplastics are a serious concern, and we need to deal with plastic pollution. Let’s just stick to high quality science while we do that.

[-] [email protected] 18 points 6 months ago

I’m wondering why clergy were consulted. I can’t imagine a worse place to go for insight into the ethics of human sexuality. Was it a Catholic hospital?

[-] [email protected] 74 points 8 months ago

When you first boot up a device, most data on that device is encrypted. This is the Before First Unlock (BFU) state. In order to access any of that data, someone must enter the passcode. The Secure Enclave uses it to recreate the decryption keys that allow the device to access that encrypted data. Biometrics like Face ID and Touch ID won’t work: they can’t be used to recreate the encryption keys.

Once you unlock the device by entering the passcode, the device recreates the encryption keys and uses them to access the data. It keeps those keys in memory. If it didn’t, you’d have to enter your passcode over and over again in order to keep using your device. This is the After First Unlock (AFU) state.

When you’re in AFU state and you lock your device, it doesn’t throw away the encryption keys. It just doesn’t permit you to access your device. This is when you can use biometrics to unlock it.
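The lifecycle above can be sketched like this. It’s a toy model, not Apple’s actual Data Protection design (which involves per-file keys and the Secure Enclave), but it shows why biometrics only work once the passcode-derived key is cached:

```python
import os
from hashlib import pbkdf2_hmac

class Device:
    """Toy model of BFU/AFU; the KDF parameters here are illustrative."""

    def __init__(self, salt: bytes):
        self.salt = salt
        self.locked = True
        self.key = None                  # no key in memory => BFU

    def unlock_with_passcode(self, passcode: str) -> None:
        # Only the passcode can (re)derive the data-protection key.
        self.key = pbkdf2_hmac("sha256", passcode.encode(), self.salt, 100_000)
        self.locked = False

    def lock(self) -> None:
        self.locked = True               # key stays cached: still AFU

    def unlock_with_biometrics(self) -> bool:
        if self.key is None:             # BFU: no cached key to re-expose
            return False
        self.locked = False              # AFU: just unlocks access to it
        return True

    def reboot(self) -> None:
        self.key = None                  # memory cleared: back to BFU
        self.locked = True

d = Device(os.urandom(16))
assert not d.unlock_with_biometrics()    # BFU: biometrics can't derive keys
d.unlock_with_passcode("1234")
d.lock()
assert d.unlock_with_biometrics()        # AFU: cached key, biometrics work
d.reboot()
assert not d.unlock_with_biometrics()    # BFU again after reboot
```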

In some jurisdictions a judge can legally force someone to enter biometrics, but can’t force them to give up their passcode. The legal distinction in the USA is that giving a passcode is “testimonial” because it requires revealing the contents of your mind, and forcing suspects to do that is not legal in the USA. Biometrics aren’t testimonial, so someone can be forced to use them, similar to how arrested people are forced to give fingerprints.

Of course, in practical terms this is a meaningless distinction because both biometrics and a passcode can grant access to nearly all data on a device. So one interesting thing about BFU vs AFU is that BFU makes this legal hair-splitting moot: biometrics don’t work in BFU state.

But that’s not what the 404 Media articles are about. It’s more about the forensic tools that can sometimes extract data even from a locked device. A device in AFU state has lots of opportunities for attack compared to BFU. The encryption keys exist, some data is already decrypted in memory, the lightning port is active, it will connect to Wi-Fi networks, and so on. This constitutes a lot of attack surface that hackers could potentially exploit to pull data off the device. In BFU state, there’s very little data available and almost no attack surface. Automatically returning a device to BFU state improves resistance to hacking.

[-] [email protected] 71 points 1 year ago

I still wouldn’t trust it because of homograph attacks.
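A quick illustration, using a hypothetical spoof of a well-known domain: the Cyrillic “а” (U+0430) is visually identical to the Latin “a” in most fonts, but the strings are different, so they resolve to different domains:

```python
real = "apple.com"                 # all Latin characters
fake = "\u0430pple.com"            # first letter is Cyrillic U+0430

print(fake)                        # renders just like "apple.com"
print(fake == real)                # False: different code points
# The IDNA/Punycode wire form exposes the trick:
print(fake.encode("idna"))         # b'xn--...' for the spoofed label
```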

[-] [email protected] 30 points 1 year ago

There’s a fatal flaw in the premise. It is impossible to fasten something to a cat.

[-] [email protected] 48 points 1 year ago

Authorities with a warrant can drill into a safe to get to its contents. That’s legally distinct from forcing someone to unlock the safe by entering the combination. It takes some mental effort to enter a combination, so it counts as “testimony”, and in the USA people can’t be forced to testify against themselves.

The parallel in US law is that people can be forced to unlock a phone using biometrics, but they can’t be forced to unlock a phone by entering a passcode. The absurd part here is that the actions have the same effect, but one of them can be compelled and the other cannot.

328
submitted 1 year ago by [email protected] to c/[email protected]

The legal situation is more complex and nuanced than the headline implies, so the article is worth reading. This adds another ruling to the confusing case history regarding forced biometric unlocking.

[-] [email protected] 17 points 2 years ago

This is a terrible idea. It’s negligibly better than writing down the passwords, because it’s trivially easy to try every password the card can represent. Once someone has the card, your only remaining secret is the two characters you memorized for the site. In effect, you have a 2-character password.
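To put numbers on it (assuming, for illustration, the card draws from the 95 printable ASCII characters):

```python
import math

charset = 95                        # printable ASCII characters
memorized = 2                       # characters the user actually keeps secret
guesses = charset ** memorized

print(guesses)                      # 9025 candidates to brute-force
print(math.log2(guesses))           # ~13.1 bits of entropy

# Compare: a random 12-character password from the same charset
print(math.log2(charset ** 12))     # ~78.8 bits
```

Nine thousand guesses is nothing; any online service that doesn’t rate-limit, let alone an offline attack, burns through that instantly.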

[-] [email protected] 28 points 2 years ago

You can’t fight brainwashing by providing more facts. It doesn’t work. Brainwashing gives the victim mechanisms to reject new facts that contradict the false beliefs. The false beliefs become part of a person’s identity, so they’re tied into self-esteem and confidence. So that’s how you have to approach it: find ways to challenge the false beliefs that don’t also challenge their sense of self. For adults this is very difficult.

But for children, it’s easier. During the teen years children try on identities like they’re trying on clothes. Give your child a look at a good, comfortable identity. It should make them confident, give them a community they feel comfortable in, and not make enemies of the ones they love.

I find that scientific skepticism does this by giving people the tools to think rationally about the world, to spot the ways the world tries to deceive them, and to understand why those deceptions are effective.
