HN.zip

FBI couldn't get into WaPo reporter's iPhone because Lockdown Mode enabled

508 points by robin_reala - 415 comments
bwoah [3 hidden]5 mins ago
nova22033 [3 hidden]5 mins ago
Remember...they can make you use touch id...they can't make you give them your password.

https://x.com/runasand/status/2017659019251343763?s=20

The FBI was able to access Washington Post reporter Hannah Natanson's Signal messages because she used Signal on her work laptop. The laptop accepted Touch ID for authentication, meaning the agents were allowed to require her to unlock it.

wackget [3 hidden]5 mins ago
Link which doesn't directly support website owned by unscrupulous trillionaire: https://xcancel.com/runasand/status/2017659019251343763?s=20
throwawayfour [3 hidden]5 mins ago
Good reminder to also set up something that does this automatically for you:

https://news.ycombinator.com/item?id=46526010

forgotTheLast [3 hidden]5 mins ago
I actually think it is fitting to read about a government agency weaponized by an unscrupulous billionaire going after journalists working for an unscrupulous billionaire on an unscrupulous trillionaire owned platform.
asadm [3 hidden]5 mins ago
[flagged]
pyrophane [3 hidden]5 mins ago
Maybe. I don't think we yet have a good understanding of how many deaths he will have caused as a result of DOGE so abruptly cutting off assistance to so many vulnerable people around the world, but I've heard estimates hover around 600,000.

Assuming that number turns out to be close to reality, how do you weigh so many unnecessary deaths against VTOL rockets and electric cars?

Perhaps a practitioner of Effective Altruism could better answer that question.

C6JEsQeQa5fCjE [3 hidden]5 mins ago
> I don't think we yet have a good understanding of how many deaths he will have caused as a result of DOGE so abruptly cutting off assistance to so many vulnerable people around the world

Nor how many deaths will be caused by his support for far right parties across Europe, when they start ethnic cleansings.

asadm [3 hidden]5 mins ago
I have FIRST-HAND seen corruption around USAID-style "assistance" back home. I fully support that work of his.
roboror [3 hidden]5 mins ago
I've seen corruption in the police. Government. Hospitals. Do you support immediately shuttering those offices with no replacements?
NoMoreNicksLeft [3 hidden]5 mins ago
>of how many deaths he will have caused as a result of DOGE so abruptly cutting off assistance to so many vulnerable people around the world

The US taxpayer has no moral obligation to send welfare "around the world". If you personally find this frustrating, you're welcome to donate that money yourself, directly. No one will stop you. If the world wishes to partake in the benefits of the American government, it should apply for statehood.

ceejayoz [3 hidden]5 mins ago
> The US taxpayer has no moral obligation to send welfare "around the world".

Sure. It's a transactional purchase of stability and goodwill, via which the US has benefited enormously.

mptest [3 hidden]5 mins ago
> The US taxpayer has no moral obligation to send welfare "around the world".

I mean, by way of the atrocities we've committed around the world, we kinda do.

Even if we buy your thesis, foregoing morals, geopolitics, and history, it's a useful soft power strategy...

I'm not saying fund USAID before healthcare for all in america. I'm saying of all the insane things our government wastes money on, USAID was far down on the list of most egregious.

asadm [3 hidden]5 mins ago
Correct. But it's also a band-aid (and a really ineffective one, i.e. ~99% lossy) on the real issues of that world.
Dylan16807 [3 hidden]5 mins ago
Even if his total contribution is positive, his current contribution is quite bad. And most of that bad has been tied directly to x.
asadm [3 hidden]5 mins ago
I can at least still voice opposition to the Israeli genocide there. I am good for now.
frereubu [3 hidden]5 mins ago
How many people do you think see those tweets, how many minds do you think you have changed, and at what mental cost to yourself?
asadm [3 hidden]5 mins ago
I see others' tweets. I don't think most are being shadowbanned. I am doing fine myself, and am actually pretty productive.
crumpled [3 hidden]5 mins ago
What's the point of these questions? Seems like, "what's the point of dissent if the cards are stacked against you?"
ebbi [3 hidden]5 mins ago
He was begging to go party with someone that spent time in prison for child exploitation.

That in itself should make you hate the dude.

asadm [3 hidden]5 mins ago
Yup. Hate him as a person. But he is still a net positive with his scientific/engineering contributions, is he not?

Wasn't Edison an asshole?

ebbi [3 hidden]5 mins ago
Dunno, I'd rather have unabused kids than the technological breakthroughs he has contributed to. Anyone being giddy to meet with a convicted pedo is very sus in my books, and deserves no respect, regardless of their prior contributions.

Children were exploited, and we're doing this net positive analysis on whether he should face the scorn. I'm not having a go at you - it's just frustrating to see very little happening after so much has been exposed, and I think part of it comes from this mindset - 'oh he's a good guy, this is a mistake/misstep' while people that were exploited as children can't even get their justice.

It's sickening.

JumpCrisscross [3 hidden]5 mins ago
> I'd rather have unabused kids than the technological breakthroughs he has contributed to

I'd rather have both. Hawthorne doesn't get nuked if Elon Musk goes to jail.

> Children were exploited

Abuse. Exploitation. CSAM. We're mushing words.

Child rape. These men raped children. Others not only stayed silent in full knowledge of it, but supported it directly and indirectly. More than that, they arrogantly assumed–and, by remaining in the United States, continue to assume–that they're going to get away with it.

Which category is Elon Musk in? We don't know. Most of the people in the Epstein files are innocent. But almost all of them seem to have been fine with (a) partying with an indicted and unrepentant pedophile [1] and (b) not saying for decades–and again, today–anything to the cops about a hive of child rape.

A lot of them should go to jail. All of them should be investigated. And almost all of them need to be retired from public life.

[1] https://web.archive.org/web/20220224113217/https://www.theda...

TylerLives [3 hidden]5 mins ago
Is there any evidence that Epstein was a pedophile?
JumpCrisscross [3 hidden]5 mins ago
Direct? No. That he was indicted for it? Yes [1].

(Clarification: I’m using the term colloquially. Whether Epstein had a mental condition is unclear.)

[1] https://www.justice.gov/usao-sdny/press-release/file/1180481...

andwhatisthis [3 hidden]5 mins ago
How so?
asadm [3 hidden]5 mins ago
nasa is fucked up. spacex is US’s only shot.
b8 [3 hidden]5 mins ago
They can hold you in contempt for 18 months for not giving your password, https://arstechnica.com/tech-policy/2020/02/man-who-refused-....
ElevenLathe [3 hidden]5 mins ago
Being held in contempt at least means you got a day in court first. A judge telling me to give up my password is different than a dozen armed, masked secret police telling me to.
C6JEsQeQa5fCjE [3 hidden]5 mins ago
> A judge telling me to give up my password is different than a dozen armed, masked secret police telling me to.

Yes, a judge is unlikely to order your execution if you refuse. Based on recent pattern of their behavior, masked secret police who are living their wildest authoritarian dreams are likely to execute you if you anger them (for example by refusing to comply with their desires).

noident [3 hidden]5 mins ago
That's a very unusual and narrow exception involving "foregone conclusion doctrine", an important fact missed by Ars Technica but elaborated on by AP: https://apnews.com/general-news-49da3a1e71f74e1c98012611aedc...
OGWhales [3 hidden]5 mins ago
> Authorities, citing a “foregone conclusion exception” to the Fifth Amendment, argued that Rawls could not invoke his right to self-incrimination because police already had evidence of a crime. The 3rd Circuit panel agreed, upholding a lower court decision.

I do not follow the logic here; what does that even mean? It seems very dubious. And what happens if one legitimately forgets? Do they just get to keep you there forever?

seanw444 [3 hidden]5 mins ago
You're delusional. When ICE starts executing people on the spot for not giving up iPhone passwords, I'll eat my words.
goda90 [3 hidden]5 mins ago
Remember that our rights aren't laws of nature. They have to be fought for to be respected by the government.
teejmya [3 hidden]5 mins ago
I previously commented a solution to another problem, but it assists here too:

https://news.ycombinator.com/item?id=44746992

This command makes your MacBook hibernate when the lid is closed or the laptop sleeps, so RAM is written to disk and the system powers down. The downside is that it increases the time it takes to resume.

A nice side benefit, though, is that a fingerprint is not accepted on first unlock; I believe secrets are still encrypted at this stage, similar to a cold boot. A fingerprint still unlocks from the screensaver normally, as long as the system does not sleep (and therefore hibernate).
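For reference, this kind of setup is usually done with `pmset` (a sketch; hibernate mode numbers vary by Mac model, so verify against `man pmset` before applying):

```shell
# hibernatemode 25: on sleep, write RAM to disk and power memory down
sudo pmset -a hibernatemode 25
# evict the FileVault key from memory on standby, so waking requires the password
sudo pmset -a destroyfvkeyonstandby 1
# confirm the current settings
pmset -g | grep -i -E 'hibernatemode|destroyfvkeyonstandby'
```

The `destroyfvkeyonstandby` setting is what gives the cold-boot-like behavior described above.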

patrickmay [3 hidden]5 mins ago
Is the knowledge of which finger to use protected as much as a passcode? Law enforcement might have the authority to physically hold the owner's finger to the device, but it seems that the owner has the right to refuse to disclose which finger is the right one. If law enforcement doesn't guess correctly in a few tries, the device could lock itself and require the passcode.

Another reason to use my dog's nose instead of a fingerprint.

parl_match [3 hidden]5 mins ago
I really wish Apple would offer a PIN option on macOS, for precisely this reason. Either that, or an option to automatically disable Touch ID after a short amount of time (e.g. an hour, or when my phone hasn't connected to the laptop).
fpoling [3 hidden]5 mins ago
You can set up a separate account with a long password on macOS and remove your user account from the list of accounts that can unlock FileVault. Then you can change your account to use a short password. You can also adjust various settings for how long the Mac has to sleep before FileVault must be unlocked again.
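A rough sketch of that setup using the built-in `fdesetup` tool (account names here are placeholders; create the long-password account first, and make sure at least one account can still unlock FileVault or you'll lock yourself out):

```shell
# see which accounts can currently unlock FileVault at boot
sudo fdesetup list
# remove your day-to-day account from the pre-boot unlock list,
# leaving only the separate long-password account able to unlock the disk
sudo fdesetup remove -user yourdailyaccount
# confirm the change
sudo fdesetup list
```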
AnonHP [3 hidden]5 mins ago
I didn’t understand how a user that cannot unlock FileVault helps. Can you please elaborate on this setup? Thanks.
xoa [3 hidden]5 mins ago
As another alternative, rather than using Touch ID you can set up a YubiKey or similar hardware key for login to macOS. Then your login does indeed become a PIN, with 3 tries before lockout. That plus a complex password is pretty convenient but not biometric. It's what I've done for a long time on my desktop devices.
NetMageSCW [3 hidden]5 mins ago
You can script a time out if desired.
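One way to script that on macOS is with `bioutil`, which ships with the OS — though the exact flag behavior here is an assumption, so check `man bioutil` on your version before relying on it:

```shell
# disable Touch ID for unlock for the current user (assumed: -w writes, -u 0 disables)
bioutil -w -u 0
# read back the current Touch ID settings to confirm
bioutil -r
```

Run the disable command from a launchd job or cron entry to have it kick in on a schedule.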
Wistar [3 hidden]5 mins ago
On my Macbook Pro, I usually need to use both touch and a password but that might be only when some hours have passed between log ins.
thecapybara [3 hidden]5 mins ago
There are only ten possible guesses, and most people use their thumb and/or index finger, leaving four much likelier ones.

Also, IANAL, but I'm pretty sure that if law enforcement has a warrant to seize property from you, they're not obligated to do so the instant they see you; they could have someone follow you and watch how you unlock your phone before seizing it.

z3phyr [3 hidden]5 mins ago
A 0.1 chance is already good odds on its own, and over n tries it only gets better. Also, most people register Touch ID for two fingers, which makes the real number closer to one half.
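The back-of-the-envelope numbers in the last two comments can be checked with a short sketch (assuming each guess is a distinct finger, drawn without replacement from ten):

```python
from math import comb

def p_unlock(enrolled: int, guesses: int, fingers: int = 10) -> float:
    """Chance that at least one of `guesses` distinct finger choices
    hits one of `enrolled` registered fingers (drawn without replacement)."""
    return 1 - comb(fingers - enrolled, guesses) / comb(fingers, guesses)

# One enrolled finger, three tries before a hypothetical lockout: 3/10
print(p_unlock(1, 3))
# Two enrolled fingers, three tries: 8/15, i.e. "close to half"
print(p_unlock(2, 3))
```

With two enrolled fingers and three attempts, the attacker's odds are indeed over 50%.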
notyourwork [3 hidden]5 mins ago
I don't get why I can be forced to use my biometrics to unlock but can't be forced to give up a PIN. Doesn't jibe in my brain.
joecool1029 [3 hidden]5 mins ago
> they can't make you give them your password.

Except when they can: https://harvardlawreview.org/print/vol-134/state-v-andrews/

p0w3n3d [3 hidden]5 mins ago
"Allowed to require" is very mild wording, which could cover torture or abuse of force...

https://xkcd.com/538/

mbil [3 hidden]5 mins ago
Reminder that you can press the iPhone power button five times to require passcode for the next unlock.
thecapybara [3 hidden]5 mins ago
Did you know that on most models of iPhone, saying "Hey Siri, whose iPhone is this?" will disable biometric authentication until the passcode is entered?
rconti [3 hidden]5 mins ago
hm. didn't work on my 17 pro :( might be due to a setting i have.
fragmede [3 hidden]5 mins ago
They disabled that in like iOS 18.
rawgabbit [3 hidden]5 mins ago
Serious question. If I am re-entering the US after traveling abroad, can customs legally ask me to turn the phone back on and/or seize my phone? I am a US citizen.

Out of habit, I keep my phone off during the flight and turn it on after clearing customs.

verall [3 hidden]5 mins ago
My understanding is that they can hold you for a couple of days without charges for your insubordination, but as a citizen they have to let you back into the country or officially arrest you, try to get an actual warrant, etc.
Analemma_ [3 hidden]5 mins ago
If you are a US citizen, you legally cannot be denied re-entry into the country for any reason, including not unlocking your phone. They can make it really annoying and detain you for a while, though.
paulsmith [3 hidden]5 mins ago
Alternately, hold the power button and either volume button together for a few seconds.
tosapple [3 hidden]5 mins ago
This is the third person advocating button squeezing. As a reminder: if a gun is on you, the jig is up; you can be shot for resisting or for reaching for a potential weapon. Wireless detonators do exist, so don't f around, please.
kstrauser [3 hidden]5 mins ago
Or squeeze the power and volume buttons for a couple of seconds. It’s good to practice both these gestures so that they become reflex, rather than trying to remember them when they’re needed.
regenschutz [3 hidden]5 mins ago
Sadly, neither of those works on Android. Pressing the power button activates the emergency call screen with a countdown to call emergency services, and power + volume either just takes a screenshot or toggles vibrate, depending on which volume button you press.
silisili [3 hidden]5 mins ago
Did you check your phone settings? Mine has an option to add it to the power menu, so you get to it by whichever method you use to do that (which itself is sad that phones are starting to differ in what the power key does).
thallium205 [3 hidden]5 mins ago
On Pixel phones, Power + Volume Up retrieves a menu where you can select "Lockdown".
rationalist [3 hidden]5 mins ago
Not on my Pixel phone, that just sets it to vibrate instead of ring. Holding down the power button retrieves a menu where you can select "Lockdown".
zerocrates [3 hidden]5 mins ago
On my 9 you get a setting to choose if holding Power gets you the power menu or activates the assistant (I think it defaulted to assistant? I have it set to the power menu because I don't really ever use the assistant.)
pkulak [3 hidden]5 mins ago
Oh wow, just going into the "should I shutdown" menu also goes into pre-boot lock state? I didn't know that.
duskwuff [3 hidden]5 mins ago
It doesn't reenter a BFU state, but it requires a passcode for the next unlock.
snuxoll [3 hidden]5 mins ago
It's close enough, because (most of) the encryption keys are wiped from memory every time the device is locked, and this action makes the secure enclave require PIN authentication to release them again.
overfeed [3 hidden]5 mins ago
> It's close enough

Not really, because tools like Cellebrite are more limited in BFU; hence the manual instructing LEO to keep (locked) devices charged, and the countermeasure of iOS forcefully rebooting devices that have been locked for too long.

CGMthrowaway [3 hidden]5 mins ago
There is a way now to force BFU from a phone that is turned on, I can't remember the sequence
fogzen [3 hidden]5 mins ago
In case anyone is wondering: in newer versions of macOS, the user must log out to require a password. Locking the screen no longer requires a password if Touch ID is enabled.
alistairSH [3 hidden]5 mins ago
Is that actually true? I'm fairly confident my work Mac requires a password if it's idle more than a few days (typically over the weekend).
raw_anon_1111 [3 hidden]5 mins ago
Settings -> lock screen -> “Require password after screen saver begins or display is turned off”
jen729w [3 hidden]5 mins ago
Shift+Option+Command+Q is your fastest route there, but unsaved work will block it.
neves [3 hidden]5 mins ago
I just searched for the case. I'm appalled. It looks like the USA doesn't have legal protection for reporters' sources. Or rather, Biden created some, but it was revoked by the current administration.

The real news here isn't privacy controls in a consumer OS or the right to privacy, but the USA, the leader of the free world, becoming an autocracy.

raw_anon_1111 [3 hidden]5 mins ago
As if the government is not above breaking the law and using rubber hose decryption. The current administration’s justice department has been caught lying left and right
TheDong [3 hidden]5 mins ago
I find it so frustrating that Lockdown Mode is so all-or-nothing.

I want some of the lockdown stuff (No facetime and message attachments from strangers, no link previews, no device connections), but like half of the other ones I don't want.

Why can't I just toggle an iMessage setting for "no link previews, no attachments", or a general setting for "no automatic device connection to untrusted computers while locked"? Why can't I turn off "random dick pics from strangers on iMessage" without also turning off my browser's JavaScript JIT and a bunch of other random crap?

Sure, leave the "Lockdown mode" toggle so people who just want "give me all the security" can get it, but split out individual options too.

Just to go through the features I don't want:

* Lockdown Mode disables the JavaScript JIT in the browser - I want fast JavaScript, I use some websites and apps that cannot function without it, and non-JIT JS drains the battery more

* Shared photo albums - I'm okay viewing shared photo albums from friends, but lockdown mode prevents you from even viewing them

* Configuration profiles - I need this to install custom fonts

Apple's refusal to split out more granular options here hurts my security.

Terretta [3 hidden]5 mins ago
The profiles language may be confusing: what you can't do is change them while in Lockdown Mode.
ectospheno [3 hidden]5 mins ago
Family albums work with lockdown mode. You can also disable web restrictions per app and website.
everdrive [3 hidden]5 mins ago
>* Lockdown Mode disables javascript JIT in the browser - I want fast javascript, I use some websites and apps that cannot function without it, and non-JIT js drains battery more

This feature has the benefit of teaching users (correctly) that browsing the internet on a phone has always been a terrible idea.

rantingdemon [3 hidden]5 mins ago
I'll bite. Why is it so terrible? I'm browsing this site right now on my phone and don't see the horror.
mghackerlady [3 hidden]5 mins ago
Phone networks, by design, track you more precisely than is possible over a conventional internet connection, in order to facilitate automatic handoff to the nearest available tower. For similar reasons, the phone network has to know that the phone is yours.
LoganDark [3 hidden]5 mins ago
You don't need to connect to the internet for that. It has nothing to do with web browsing at all.
jgwil2 [3 hidden]5 mins ago
I think that ship has sailed.
827a [3 hidden]5 mins ago
Is there an implication here that they could get into an iPhone with lower security settings enabled? There's Advanced Data Protection, which E2EEs more of your data in iCloud. There's the FaceID unlock state, which US law enforcement can compel you to unlock; but penta-click the power button and you go into PIN unlock state, which they cannot compel you to unlock.

My understanding of Lockdown Mode was that it babyifies the device to reduce the attack surface against unknown zero-days. Does the government saying that Lockdown Mode barred them from entering imply that they've got an unknown zero-day that would work in the PIN-unlock state, but not Lockdown Mode?

kingnothing [3 hidden]5 mins ago
It's relatively well known that NSO Group's Pegasus is what governments use to access locked phones.
zymhan [3 hidden]5 mins ago
Yes
nxobject [3 hidden]5 mins ago
Sadly, they still got to her Signal via her desktop, so her sources might still be compromised. It's inherent to desktop applications, but I'm sad that more people don't know that Signal for Desktop is much, much less secure against adversaries who have your laptop.
tadzikpk [3 hidden]5 mins ago
> I'm sad that a lot more people don't know that Signal for Desktop is much, much less secure against adversaries with your laptop

Educate us. What makes it less secure?

armadyl [3 hidden]5 mins ago
In addition to what the other reply said, and setting aside that iOS/Android/iPadOS are far more secure than macOS, laptops have significantly fewer hardware-based protections than Pixel/Samsung/Apple mobile devices do. So really the only way a laptop in this situation would be truly secure from LEO is if it's fully powered off when it's seized.
digiown [3 hidden]5 mins ago
My assumption is that the key in the desktop version is not always stored in the secure enclave (it definitely supports plaintext storage). Theoretically this makes it possible to extract the key for the message database, and a different malicious program can read it. But this is moot anyway if the FBI can browse through the chats; that isn't what failed here.
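For the curious, you can inspect how your own Signal Desktop install stores its database key (a sketch; the path is for macOS, and the field names reflect older plaintext storage vs. newer OS-keystore wrapping, so they may differ by version):

```shell
# Signal Desktop's settings file on macOS (assumed default location)
CONFIG="$HOME/Library/Application Support/Signal/config.json"
# older builds kept a plaintext "key"; newer builds keep an "encryptedKey"
python3 -c 'import json, sys; print(sorted(json.load(open(sys.argv[1])).keys()))' "$CONFIG"
```

If you see a bare `key` field, the SQLCipher database key is sitting on disk in the clear.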
anigbrowl [3 hidden]5 mins ago
Also, last time I looked (less than a year ago), files sent over Signal are stored in the clear, just with obfuscated filenames. So even without access to Signal it's easy to see what message attachments a person has received, and to copy any interesting ones.
stronglikedan [3 hidden]5 mins ago
If people don't have Signal set to delete sensitive messages quickly, then they may as well just be texting.
AdamN [3 hidden]5 mins ago
That's a strong statement. Also imho it's important that we use Signal for normal stuff like discussing where to get coffee tomorrow - no need for disappearing messages there.
CGMthrowaway [3 hidden]5 mins ago
Strong and accurate. Considering non-disappearing messages the same as texts is not the same thing as saying all Signal messages ought to be disappearing or else the app is useless.

Telegram allows you to have distinct disappearing settings for each chat/group. Not sure how it works on Signal, but a solution like this could be possible.

aschobel [3 hidden]5 mins ago
I'm weird, i even have disappearing messages for my coffee chats. It's kind of refreshing not having any history.
zikduruqe [3 hidden]5 mins ago
I'm an inbox zero person... I keep even my personal notes to disappear after 2 days. For conversations 1 day.
tptacek [3 hidden]5 mins ago
Not if you're using Signal for life-and-death secure messaging; in that scenario it's table stakes.
mrandish [3 hidden]5 mins ago
I would have thought reporters with confidential sources at that level would already exercise basic security hygiene. Hopefully, this incident is a wake up call for the rest.
NewsaHackO [3 hidden]5 mins ago
Yea, I also would want to question the conclusions in the article. Was the issue that they couldn't unlock the iPhone, or that they had no reason to pursue the thread? To my understanding, the Apple ecosystem means that everything is synced together. If they already got into her laptop, wouldn't all of the iMessages, call history, and iCloud material already be synced there? What would be the gain of going after the phone, other than to make the case slightly more watertight?
pbhjpbhj [3 hidden]5 mins ago
Did she have Bitlocker or FileVault or other disk encryption that was breeched? (Or they took the system booted as TLAs seek to do?)
bmicraft [3 hidden]5 mins ago
There was a story here the other day, bitlocker keys stored in your Microsoft account will be handed over.
MoonWalk [3 hidden]5 mins ago
breached
macintux [3 hidden]5 mins ago
> Natanson said she does not use biometrics for her devices, but after investigators told her to try, “when she applied her index finger to the fingerprint reader, the laptop unlocked.”

Curious.

QuantumNomad_ [3 hidden]5 mins ago
Probably enabled it at some point and forgot. Perhaps even during setup when the computer was new.
intrasight [3 hidden]5 mins ago
My recollection is that the computers do, by default, ask the user to set up biometrics.
NewsaHackO [3 hidden]5 mins ago
I want to say that is generous of her, but one thing is weird: if I didn't want someone to get into my laptop and they tried to force me to use my fingerprint to unlock it, I definitely wouldn't use the registered finger on the first try. Hopefully Apple locks it out and forces a password if you use the wrong finger "accidentally" a couple of times.
altairprime [3 hidden]5 mins ago
Correct. That’s why my Touch ID isn’t configured to use the obvious finger.
b112 [3 hidden]5 mins ago
Very much so, because the question is... did she set it up in the past?

How did it know the print even?

ezfe [3 hidden]5 mins ago
Why is this curious?
macintux [3 hidden]5 mins ago
There appear to be relatively few possibilities.

* The reporter lied.

* The reporter forgot.

* Apple devices share fingerprint matching details and another device had her details (this is supposed to be impossible, and I have no reason to believe it isn't).

* The government hacked the computer such that it would unlock this way (probably impossible as well).

* The fingerprint security is much worse than years of evidence suggests.

Mainly it was buried at the very end of the article, and I thought it worth mentioning here in case people missed it.

orwin [3 hidden]5 mins ago
My opinion is that she set it up, it didn't work at first, she didn't use it, forgot it existed, and here we are.

> Apple devices share fingerprint matching details and another device had her details

I looked into this quite seriously for Windows ThinkPads; unless Apple does it differently, you cannot share fingerprints. They live in a local chip and never leave it.

fragmede [3 hidden]5 mins ago
So how does TouchID on an external keyboard work without having to re-set up fingerprints?
piperswe [3 hidden]5 mins ago
Presumably the fingerprint data is stored in the Mac's Secure Enclave, and the external keyboard is just a reader
ezfe [3 hidden]5 mins ago
The reporter lying or forgetting seems to be the clear answer, there's really no reason to believe it's not one of those. And the distinction between the two isn't really important from a technical perspective.

Fingerprint security being poor is also unlikely, because that would only apply if a different finger had been registered.

dyauspitr [3 hidden]5 mins ago
She has to have set it up before. There is no way to divine a fingerprint any other way. I guess the only other way would be a faulty fingerprint sensor but that should default to a non-entry.
quesera [3 hidden]5 mins ago
> faulty fingerprint sensor

The fingerprint sensor does not make access control decisions, so the fault would have to be somewhere else (e.g. the software code branch structure that decides what to do with the response from the secure enclave).

giraffe_lady [3 hidden]5 mins ago
Could be a parallel construction type thing. They already have access but they need to document a legal action by which they could have acquired it so it doesn't get thrown out of court.

I think this is pretty unlikely here but it's within the realm of possibility.

tsol [3 hidden]5 mins ago
Seems like it would be hard to fake. The way she tells it, she put her finger on the pad and the OS unlocked the account. Sounds very difficult to do.
operator-name [3 hidden]5 mins ago
I think they mean: if they already had her fingerprint from somewhere else, plus a secret backdoor into the laptop, they could log in, set up biometrics, and pretend they got first access when she unlocked it, all without revealing their backdoor.
1vuio0pswjnm7 [3 hidden]5 mins ago
"Lockdown Mode is a sometimes overlooked feature of Apple devices that broadly make[sic] them harder to hack."

Funny to see disabling "features" itself being described as a "feature"

Why not call it a "setting"

Most iPhone users do not change default settings. That's why Google pays Apple billions of dollars for a default setting that sends data about users to Google

"Lockdown Mode" is not a default setting

The phrase "sometimes overlooked" is an understatement. It's not a default setting and almost no one uses it

If it is true Lockdown Mode makes iPhones "harder to hack", as the journalist contends, then it is also true that Apple's default settings make iPhones "easier to hack"

1vuio0pswjnm7 [3 hidden]5 mins ago
A "reduced attack surface" can also be a reduced surface for telemetry, data collection, surveillance and advertising services, thereby directly or indirectly causing a reduction in Apple revenues

Perhaps this could be a factor in why it's not a default setting

rick_dalton [3 hidden]5 mins ago
The intention behind Lockdown Mode is protection for a select few groups of people, such as journalists, who are at risk of having software like Pegasus used against them. It's meant to reduce the attack surface. The average user wouldn't want most of it as a default setting; for example, almost no message attachments are allowed, no FaceTime calls from people you haven't called, and Safari is kneecapped. Making this a default for most people is unrealistic and probably wouldn't help their cybersecurity anyway, as they wouldn't be targeted.
throwmeaway820 [3 hidden]5 mins ago
It seems unfortunate that enhanced protection against physically attached devices requires enabling a mode that is much broader, and sounds like it has a noticeable impact on device functionality.

I never attach my iPhone to anything that's not a power source. I would totally enable an "enhanced protection for external accessories" mode. But I'm not going to enable a general "Lockdown mode" that Apple tells me means my "device won’t function like it typically does"

jonpalmisc [3 hidden]5 mins ago
There is a setting as of iOS 26 under "Privacy & Security > Wired Accessories" in which you can make data connections always prompt for access. Not that there haven't been bypasses for this before, but perhaps still of interest to you.
H8crilA [3 hidden]5 mins ago
GrapheneOS does this by default - only power delivery when locked. Also it's a hardware block, not software. Seems to be completely immune to these USB exploit tools.
aaronmdjones [3 hidden]5 mins ago
It also has various options to adjust the behaviour, from no blocks at all, to not even being able to charge the phone (or use the phone to charge something else) -- even when unlocked. Changing the mode of operation requires the device PIN, just as changing the device PIN does.

Note that it behaves subtly differently from how you described it if the phone was connected to something before being locked. In that case data access will remain, even though the phone is now locked, until the device is disconnected.

Terretta [3 hidden]5 mins ago
> I would totally enable an "enhanced protection for external accessories" mode.

Anyone has been able to do this for over a decade now, and it's fairly straightforward:

- 2014: https://www.zdziarski.com/blog/?p=2589

- recent: https://reincubate.com/support/how-to/pair-lock-supervise-ip...

This goes beyond the "wired accessories" toggle.

pkteison [3 hidden]5 mins ago
It isn’t. Settings > Privacy & Security > Wired Accessories

Set to ask for new accessories or always ask.

sodality2 [3 hidden]5 mins ago
I have to warn you, it does get annoying when you plug in your power-only cable and it still nags you with the question. But it does work as intended!
neilalexander [3 hidden]5 mins ago
You might want to check that charger. I have the same option set to ask every time and it never appears for chargers.
mrandish [3 hidden]5 mins ago
> it has a noticeable impact on device functionality.

The lack of optional granularity on security settings is super frustrating because it leads to many users just opting out of any heightened security.

UltraSane [3 hidden]5 mins ago
Computer security is generally inversely proportional to convenience. Best opsec is generally to have multiple devices.
ur-whale [3 hidden]5 mins ago
> I never attach my iPhone to anything that's not a power source.

It's "attached" to the wifi and to the cell network. Pretty much the same thing.

boring-human [3 hidden]5 mins ago
Can a hacked phone (such as one that was not in Lockdown Mode at one point in time) persist in a hacked state?

Obviously, the theoretical answer is yes, given an advanced-enough exploit. But let's say Apple is unaware of a specific rootkit. If each OS update is a wave, is the installed exploit more like a rowboat or a frigate? Will it likely be defeated accidentally by minor OS changes, or is it likely to endure?

This answer is actionable. If exploits are rowboats, installing developer OS betas might be security-enhancing: the exploit might break before the exploiters have a chance to update it.

quenix [3 hidden]5 mins ago
Forget OS updates. The biggest obstacle to exploit persistence: a good old hard system reboot.

Modern iOS has an incredibly tight secure chain-of-trust bootloader. If you shut your device down to a known-off state (using the hardware key sequence), then on power-on you can be 99.999% certain only Apple-signed code will run all the way from secureROM to iOS userland. The exception is if the secureROM is somehow compromised and exploited remotely (this requires hardware access at boot-time so I don't buy it).

So, on a fresh boot, you are almost definitely running authentic Apple code. The easiest path to a form of persistence is reusing whatever vector initially pwned you (malicious attachment, website, etc) and being clever in placing it somewhere iOS will attempt to read it again on boot (and so automatically get pwned again).

But honestly, exploiting modern iOS is already difficult enough (exploits go for tens of millions of USD); persistence is an order of magnitude more difficult.
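The chain-of-trust idea above can be sketched as a toy model: each boot stage verifies the next stage's signature against a root of trust before handing control over, so a tampered stage halts the chain. Everything here (the key, the stage names, HMAC standing in for a signature) is illustrative; real verified boot uses asymmetric signatures rooted in fused hardware keys, not this simplification.

```python
import hashlib
import hmac

# Illustrative trust anchor -- real devices burn a public key into ROM.
ROOT_KEY = b"burned-into-secureROM"

def sign(blob: bytes) -> bytes:
    """Stand-in for a boot-stage signature (HMAC for simplicity)."""
    return hmac.new(ROOT_KEY, blob, hashlib.sha256).digest()

def verify_stage(name: str, blob: bytes, sig: bytes) -> bool:
    """Each stage checks the next stage's signature before handing off."""
    ok = hmac.compare_digest(sign(blob), sig)
    print(f"{name}: {'ok' if ok else 'REFUSING TO BOOT'}")
    return ok

# A clean chain boots stage by stage:
for name, blob in [("iBoot", b"bootloader code"), ("kernel", b"kernel image")]:
    assert verify_stage(name, blob, sign(blob))

# A tampered stage fails verification and the chain halts:
assert not verify_stage("kernel", b"tampered kernel", sign(b"kernel image"))
```

This is why persistence is so hard: anything an attacker writes into a boot stage fails verification on the next cold boot.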

doublerabbit [3 hidden]5 mins ago
It's why I keep my old iPhone XR on 15.x for jailbreaking reasons. I purchased a new phone specifically for the later versions and online banking.

Apple bought out all the jail breakers as Denuvo did for the game crackers.

noname120 [3 hidden]5 mins ago
> Apple bought out all the jail breakers

> Denuvo did for the game crackers

Do you have sources for these statements?

digiown [3 hidden]5 mins ago
Secure boot and verified system partition is supposed to help with that. It's for the same reason jailbreaks don't persist across reboots these days.
nxobject [3 hidden]5 mins ago
Re: reboots – TFA states that recent iPhones reboot every 3 days when inactive for the same reasons. Of course, now that we know that it's linked to inactivity, black hatters will know how to avoid it...
maldev [3 hidden]5 mins ago
You should read into iOS internals before commenting stuff like this. Your answer is wrong, and rootkits have been dead on most OSes for years, but ESPECIALLY iOS. Not every OS is like Linux where security is second.

Even a cursory glance would show it's literally impossible on iOS with even a basic understanding.

ramuel [3 hidden]5 mins ago
Can't they just use Pegasus or Cellebrite???
cdrnsf [3 hidden]5 mins ago
Given Cook's willing displays of fealty to Trump this time around I wouldn't be shocked if they were to remove lockdown mode in a future release.
KKKKkkkk1 [3 hidden]5 mins ago
What is she investigated for?
buckle8017 [3 hidden]5 mins ago
They're not actually investigating her, they're investigating a source that leaked her classified materials.
zozbot234 [3 hidden]5 mins ago
If they're not investigating her she doesn't have any 5th-amendment protection and can be compelled to testify on anything relevant, including how to unlock her devices.
jimt1234 [3 hidden]5 mins ago
Did the individual store the classified material in the bathroom at his beach-side resort?
mmooss [3 hidden]5 mins ago
Don't be idiots. The FBI may say that whether or not they can get in:

1. If they can get in, now people - including high-value targets like journalists - will use bad security.

2. If the FBI (or another agency) has an unknown capability, the FBI must say they can't get in or reveal their capabilities to all adversaries, including to even higher-profile targets such as counter-intelligence targets. Saying nothing also risks revealing the capability.

3. Similarly if Apple helped them, Apple might insist that is not revealed. The same applies to any third party with the capability. (Also, less significantly, saying they can't get in puts more pressure on Apple and on creating backdoors, even if HN readers will see it the other way.)

Also, the target might think they are safe, which could be a tactical advantage. It also may exclude recovered data from rules of handling evidence, even if it's unusable in court. And at best they haven't got in yet - there may be an exploit to this OS version someday, and the FBI can try again then.

coppsilgold [3 hidden]5 mins ago
I would not recommend that one trust a secure enclave with full disk encryption (FDE). This is what you are doing when your password/PIN/fingerprint can't contain sufficient entropy to derive a secure encryption key.

The problem with low entropy security measures arises due to the fact that this low entropy is used to instruct the secure enclave (TEE) to release/use the actual high entropy key. So the key must be stored physically (eg. as voltage levels) somewhere in the device.

It's a similar story when the device is locked, on most computers the RAM isn't even encrypted so a locked computer is no major obstacle to an adversary. On devices where RAM is encrypted the encryption key is also stored somewhere - if only while the device is powered on.
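The entropy gap described above is easy to quantify. A quick sketch, plain arithmetic and nothing device-specific:

```python
import math

# A 6-digit PIN can take only 10^6 values, i.e. ~20 bits of entropy.
# Full-disk encryption needs a ~256-bit key, so the PIN cannot derive
# the key by itself; it can only instruct the secure enclave to
# release the real high-entropy key it stores physically.
pin_space = 10**6
pin_entropy_bits = math.log2(pin_space)
key_bits = 256

print(round(pin_entropy_bits, 1))  # -> 19.9
print(key_bits - pin_entropy_bits)  # shortfall of over 236 bits
```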

aquir [3 hidden]5 mins ago
We need a Lockdown mode for MacBooks as well!
steve-atx-7600 [3 hidden]5 mins ago
Looks like it’s a feature: https://support.apple.com/en-us/105120
LordGrey [3 hidden]5 mins ago
To save a click:

* Lockdown Mode needs to be turned on separately for your iPhone, iPad, and Mac.

* When you turn on Lockdown Mode for your iPhone, it's automatically turned on for your paired Apple Watch.

* When you turn on Lockdown Mode for one of your devices, you get prompts to turn it on for your other supported Apple devices.

PlatoIsADisease [3 hidden]5 mins ago
A little too late for the 1000 people hacked by Pegasus.
ChrisArchitect [3 hidden]5 mins ago
Previously, direct link to the court doc:

FBI unable to extract data from iPhone 13 in Lockdown Mode in high profile case [pdf]

https://storage.courtlistener.com/recap/gov.uscourts.vaed.58...

(https://news.ycombinator.com/item?id=46843967)

mrexcess [3 hidden]5 mins ago
I trust 404 media more than most sources, but I can’t help but reflexively read every story prominently showcasing the FBI’s supposed surveillance gaps as attempted watering hole attacks. The NSA almost certainly has hardware backdoors in Apple silicon, as disclosed a couple of years ago by the excellent researchers at Kaspersky. That being the case, Lockdown Mode is not even in play.
chuckadams [3 hidden]5 mins ago
The NSA is not going to tip its hand about any backdoors it had built into the hardware for something as small as this.
ddtaylor [3 hidden]5 mins ago
It depends on if parallel reconstruction can be used to provide deniability.
chuckadams [3 hidden]5 mins ago
Even a parallel construction has limited uses, since you can't use the same excuse every time. The NSA probably doesn't trust the FBI to come up with something plausible.
kittikitti [3 hidden]5 mins ago
It sounds like almost all of our devices have security by annoyance as default. Where are the promises of E2E encryption and all the privacy measures? When I turned on lockdown mode on my iPhone, there were a few notifications where the random spam calls I get were attempting a FaceTime exploit. How come we have to wait until someone can prove ICE can't get into our devices?
davidfekke [3 hidden]5 mins ago
I guess they got a 404
UltraSane [3 hidden]5 mins ago
Samsung phones have the Secure Folder which can have a different, more secure password and be encrypted when the phone is on.
Itoldmyselfso [3 hidden]5 mins ago
Secure Folder uses, or is in the process of moving to, Android's native Private Space feature, which is available on all Android 15 phones.
delichon [3 hidden]5 mins ago
I use the Cryptomator app for this, it works as advertised. I keep ~60 GiB of personal files in there that would be an easy button to steal my identity and savings. I'm just hoping it doesn't include an NSA back door.
piperswe [3 hidden]5 mins ago
The NSA definitely has easier ways to steal your identity and savings if they wanted to anyways
vorticalbox [3 hidden]5 mins ago
you can check the github https://github.com/cryptomator/ios
delichon [3 hidden]5 mins ago
Even if I had the skills to confirm the code is secure, how could I know that this is the code running on my phone, without also having the skills to build and deploy it from source?
warkdarrior [3 hidden]5 mins ago
Also, you need to make sure that the installation process does not insert a backdoor into the code you built from source.
fragmede [3 hidden]5 mins ago
mandeepj [3 hidden]5 mins ago
For now! They’ll get something from open market like the last time when Apple refused to decrypt (or unlock?) a phone for them.
PlatoIsADisease [3 hidden]5 mins ago
Yeah, this is low stakes stuff; Pegasus historically breaks Apple phones easily. Bezos's nudes and Khashoggi know. (Not really, Khashoggi is dead.)
hnrayst [3 hidden]5 mins ago
[flagged]
boston_clone [3 hidden]5 mins ago
Both of your comments here, posted just one minute apart yet with completely different content, reek of LLM output.
dang [3 hidden]5 mins ago
Jensson [3 hidden]5 mins ago
People probably didn't see the other post, but both posts are several paragraphs and posted the same minute. No human would do that.

It's also a new account that only posted these two posts.

coldpie [3 hidden]5 mins ago
Good spot, thanks for pointing it out. I normally don't like the LLM accusation posts, but two posts from a brand new user in the same minute is a pretty huge red flag for bad behavior.

https://news.ycombinator.com/item?id=46886472

https://news.ycombinator.com/item?id=46886470

rob [3 hidden]5 mins ago
This is another bot I pointed out yesterday:

https://news.ycombinator.com/threads?id=Soerensen

Their comment got flagged, but looks like they made a new one today and is still active.

That account ('Soerensen') was created in 2024 and dormant until it made a bunch of detailed comments in the past 24-48 hrs. Some of them are multiple paragraph comments posted within 1 minute of each other.

One thing I've noticed is that they seem to be getting posted from old/inactive/never used accounts. Are they buying them? Creating a bunch and waiting months/years before posting?

Either way, both look like they're fooling people here. And getting better at staying under the radar until they slip up in little ways like this.

josefresco [3 hidden]5 mins ago
I wonder if it's actual users with dormant accounts who just setup their Moltbot?
hypfer [3 hidden]5 mins ago
Some, maybe, but that's just another nice layer of plausible deniability.

The truth is that the internet is both (what's the word for 'both' when you have three (four?) things?) dead, an active cyber- and information-warzone, and a dark forest.

I suppose it was fun while it lasted. At least we still have mostly real people in our local offline communities.

js2 [3 hidden]5 mins ago
Gives this old cartoon new meaning, I suppose.

https://en.wikipedia.org/wiki/On_the_Internet%2C_nobody_know...

datsci_est_2015 [3 hidden]5 mins ago
Old account, fresh comments - to make it more clear. Freaky.
bradley13 [3 hidden]5 mins ago
So what, if the content is good?

Also, some of us draft our comments offline, and then paste them in. Maybe he drafted two comments?

tonyedgecombe [3 hidden]5 mins ago
Posting sibling comments is unusual.
crazygringo [3 hidden]5 mins ago
Funny, you're definitely right -- I've done it probably just 2 or 3 times over a decade, when I felt like I had two meaningful but completely unrelated things to say. And it always felt super weird, almost as if I was being dishonest or something. Could never quite put my finger on why. Or maybe I was worried it would look like I was trying to hog the conversation?
xpe [3 hidden]5 mins ago
I don’t know about the particular claim about the new account — if true, based on what people have said, this would be consistent with an LLM bot with high probability … (but not completely out of the question for a person) … I’ll leave that analysis up to the moderators who have a better statistical understanding of server logs, etc.

That said, as a general point, it’s reasonable to make scoped comments in the corresponding parts of the conversation tree. (Is that what happened here?)

About me: I try to pay attention to social conventions, but I rarely consider technology offered to me as some sort of intrinsically correct norm; I tend to view it as some minimally acceptable technological solution that is easy enough to build and attracts a lowest common denominator of traction. But most forums I see tend to pay little attention to broader human patterns around communication; generally speaking, it seems to me that social technology tends to expect people to conform to it rather than the other way around. I think it’s fair to say that the history of online communication has demonstrated a tendency of people to find workarounds to the limitations offered them. (Using punctuation for facial expressions comes to mind.)

One might claim such workarounds are a feature rather than a bug. Maybe sometimes? But I think you’d have to dig into the history more and go case by case. I tend to think of features as conscious choices not lucky accidents.

skeptic_ai [3 hidden]5 mins ago
Still go to prison for not showing it. So until devices have multiple PINs for plausible deniability we are still screwed.

What's so hard about making 2-3 PINs, each giving access to different logged-in apps and files?

If Apple/Android were serious about it they would implement it, but from my research it seems someone is against it, as it's too good.

I don't want to remove my banking apps when I travel to or in "dangerous" places. If you're kidnapped you will be forced to send out all your money.

stouset [3 hidden]5 mins ago
Absolutely every aspect of it?

What’s so hard about adding a feature that effectively makes a single-user device multi-user? Which needs the ability to have plausible deniability for the existence of those other users? Which means that significant amounts of otherwise usable space needs to be inaccessibly set aside for those others users on every device—to retain plausible deniability—despite an insignificant fraction of customers using such a feature?

What could be hard about that?

gabeio [3 hidden]5 mins ago
> despite an insignificant fraction of customers using such a feature?

Isn't that the exact same argument against Lockdown mode? The point isn't that the number of users is small it's that it can significantly help that small set of users, something that Apple clearly does care about.

achierius [3 hidden]5 mins ago
Lockdown mode costs ~nothing for devices that don't have it enabled. GP is pointing out that the straightforward way to implement this feature would not have that same property.
stouset [3 hidden]5 mins ago
Lockdown mode doesn’t require everyone else to lose large amounts of usable space on their own devices in order for you to have plausible deniability.
PunchyHamster [3 hidden]5 mins ago
now I want to know what dirty laundry are their upper management hiding on their devices...
tosapple [3 hidden]5 mins ago
The 'extra users" method may not work in the face of a network investigation or typical file forensics.

Where CAs are concerned, not having the phone image 'cracked' still does not make it safe to use.

billfor [3 hidden]5 mins ago
Android phones are multi-user, so if they can do it then Apple should be able to.
Gud [3 hidden]5 mins ago
And how do you explain your 1TB phone that has 2GB of data, but only 700GB free?
deno [3 hidden]5 mins ago
The "fake" user/profile should work like a duress PIN with the addition of deniability. As soon as you log in to the second profile, all the space becomes free: just by logging in you would delete the encryption key of the other profile. The metadata showing what is free or not was encrypted in the locked profile, and is now gone.
tosapple [3 hidden]5 mins ago
Good idea, but this is why you image devices.
morkalork [3 hidden]5 mins ago
The same way when you buy a brand new phone with 200GB of storage that only has 50GB free on it haha
davidwritesbugs [3 hidden]5 mins ago
"Idunno copper, I'm a journalist not a geek"
heraldgeezer [3 hidden]5 mins ago
System files officer ;)
stouset [3 hidden]5 mins ago
That is about one fiftieth of the work that needs to go into the feature the OP casually “why can’t they just”-ed.
jb1991 [3 hidden]5 mins ago
This is called whataboutism. This particular feature aside, sometimes there are very good reasons not to throw the kitchen sink of features at users.
NitpickLawyer [3 hidden]5 mins ago
Truecrypt had that a decade+ ago.
ratg13 [3 hidden]5 mins ago
Not sure if you know the history behind it, but look up Paul Le Roux

Also would recommend the book called The Mastermind by Evan Ratliff

edm0nd [3 hidden]5 mins ago
imo Paul Le Roux has nothing to do with TrueCrypt
ratg13 [3 hidden]5 mins ago
He wrote the code base it is based on, in combination with code he stole. The name is also based on an early name he chose for the software.

Whether he was involved in the organization and participated in it is certainly up for debate, but it's not like he would admit it.

https://en.wikipedia.org/wiki/E4M

hackerfoo [3 hidden]5 mins ago
Maybe one PIN could cause the device to crash. Devices crash all the time. Maybe the storage is corrupted. It might have even been damaged when it was taken.

This could even be a developer feature accidentally left enabled.

greesil [3 hidden]5 mins ago
Android has work profiles, so that could be done in Android. iPhone still does not.
skeptic_ai [3 hidden]5 mins ago
Police ask: give me pass for work profile. If you don’t: prison.
reaperducer [3 hidden]5 mins ago
Android has work profiles

Never ever use your personal phone for work things, and vice versa. It's bad for you and bad for the company you work for in dozens of ways.

Even when I owned my own company, I had separate phones. There's just too much legal liability and chances for things to go wrong when you do that. I'm surprised any company with more than five employees would even allow it.

greesil [3 hidden]5 mins ago
What's the risk? On Android, the company can remotely nuke the work profile. The work profile has its own file system and apps. You can turn it off when you don't want work notifications.
PunchyHamster [3 hidden]5 mins ago
you're surprised corporations are cheap?
izzydata [3 hidden]5 mins ago
It doesn't seem fundamentally different from a PC having multiple logins that are accessed from different passwords. Hasn't this been a solved problem for decades?
paulryanrogers [3 hidden]5 mins ago
Apple's hardware business model incentivizes only supporting one user per device.

Android has supported multiple users per device for years now.

bsharper [3 hidden]5 mins ago
You can have a multiuser system but that doesn't solve this particular issue. If they log in to what you claim to be your primary account and see browser history that shows you went to msn.com 3 months ago, they aren't going to believe it's the primary account.
inetknght [3 hidden]5 mins ago
My browser history is cleared every time I close it.

It's actually annoying because every site wants to "remember" the browser information, and so I end up with hundreds of browsers "logged in". Or maybe my account was hacked and that's why there's hundreds of browsers logged in.

compiler-guy [3 hidden]5 mins ago
Multi-user has been solved for decades.

Multi-user that plausibly looks like single-user to three letter agencies?

Not even close.

izzydata [3 hidden]5 mins ago
Doesn't having standard multi-user functionality automatically create the plausible deniability? If they tried so hard to create an artificial plausible deniability that would be more suspicious than normal functionality that just gets used sometimes.
wtallis [3 hidden]5 mins ago
What needs to be plausibly denied is the existence of a second user account, because you're not going to be able to plausibly deny that the account belongs to you when it resides on the phone found in your pocket.
vlovich123 [3 hidden]5 mins ago
iPhone and macOS are basically the same product technically. The reason iPhone is a single user product is UX decisions and business/product philosophy, not technical reasons.

While plausible deniability may be hard to develop, it’s not some particularly arcane thing. The primary reasons against it are the political balancing act Apple has to balance (remember San Bernardino and the trouble the US government tried to create for Apple?). Secondary reasons are cost to develop vs addressable market, but they did introduce Lockdown mode so it’s not unprecedented to improve the security for those particularly sensitive to such issues.

achierius [3 hidden]5 mins ago
> iPhone and macOS are basically the same product technically

This seems hard to justify. They share a lot of code yes, but many many things are different (meaningfully so, from the perspective of both app developers and users)

ashdksnndck [3 hidden]5 mins ago
You think iPhones aren’t multi-user for technical reasons? You sure it’s not to sell more phones and iPads? Should we ask Tim “buy your mom an iPhone” Cook?
palmotea [3 hidden]5 mins ago
> Still go to prison for not showing. So until devices have multiple pins for plausible deniability we are still screwed.

> What’s so hard to make 2-3 pins and each to access different logged in apps and files.

Besides the technical challenges, I think there's a pretty killer human challenge: it's going to be really hard for the user to create an alternate account that looks real to someone who's paying attention. Sure, you can probably fool some bored agent in customs line who knows nothing about you, but not a trained investigator who's focused on you and knows a lot about you.

fluoridation [3 hidden]5 mins ago
But at that point it turns from "the person refused to unlock the device" to "we think the person has unlocked the device into a fake account".
skeptic_ai [3 hidden]5 mins ago
That’s what plausible deniability is. How can you even tell?
ashdksnndck [3 hidden]5 mins ago
Doesn’t matter if the agent believes you. Only matters if the court jails you on a contempt charge.
davidwritesbugs [3 hidden]5 mins ago
Background agent in the decoy identity that periodically browses the web, retrieves email from a banal account etc.?
stouset [3 hidden]5 mins ago
Even more complications for a “why can’t they just…”. It’s almost as if this kind of thing is difficult to do in practice.
palmotea [3 hidden]5 mins ago
> Background agent in the decoy identity that periodically browses the web, retrieves email from a banal account etc.?

No. Think about it for a second: you're a journalist being investigated to find your sources, and your phone says you mainly check sports scores and send innocuous emails to "grandma" in LLM-speak? It's not going to fool someone who's actually thinking.

skeptic_ai [3 hidden]5 mins ago
Just use an account for “regular” stuff. And only use the “secret” account as needed.
ryanmcbride [3 hidden]5 mins ago
It's more a policy problem than a phone problem. Apple could add as many pins as they want but until there are proper legal based privacy protections, law enforcement will still just be like "well how do we know you don't have a secret pin that unlocks 40TB of illegal content? Better disappear you just to be sure"

For as long as law enforcement treats protection of privacy as implicit guilt, the best a phone can really do is lock down and hope for the best.

Even if there was a phone that existed that perfectly protected your privacy and was impossible to crack or was easy to spoof content on, law enforcement would just move the goal post of guilt so that owning the phone itself is incriminating.

Edit: I wanna be clear that I'm not saying any phone based privacy protections are a waste of time. They're important. I'm saying that there is no perfect solution with the existing policy being enforced, which is "guilty until proven dead"

Cthulhu_ [3 hidden]5 mins ago
How does "go to prison for not showing" work when a lot of constitutions have a clause for a suspect not needing to participate in their own conviction / right to remain silent?

A detective can have a warrant to search someone's home or car, but that doesn't mean the owner needs to give them the key as far as I know.

SoftTalker [3 hidden]5 mins ago
It does mean that. You can't be forced to divulge information in your head, as that would be testimonial. But if there are papers, records, or other evidentiary materials that are e.g. locked in a safe you can be compelled to open it with a warrant, and refusal would be contempt.
Steltek [3 hidden]5 mins ago
They need to prove that those materials exist on the device first. You can't be held in contempt for a fishing expedition.
SoftTalker [3 hidden]5 mins ago
You need "probable cause to believe" which is not as strong as "prove" but yes, it can't be a pure fishing expedition.
lostlogin [3 hidden]5 mins ago
FaceID and TouchID aren’t protected by that as I understand it.
plagiarist [3 hidden]5 mins ago
That's correct, they are not. A complete failing of legislation and blatant disregard of the spirit of the 5th Amendment.

So do not have biometrics as device unlock if you are a journalist protecting sources.

SoftTalker [3 hidden]5 mins ago
They are considered to be more like keys to a safe than private knowledge. They also can't be changed if compromised. A sufficiently unguessable PIN or passphrase is better than biometrics.
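Back-of-the-envelope numbers illustrate the "sufficiently unguessable" point. The 10 guesses/sec rate below is a made-up assumption for illustration; real secure elements throttle attempts far harder and may wipe the device after repeated failures:

```python
# Time to exhaust a search space at an assumed rate of 10 guesses/sec
# (illustrative only -- not any real device's actual rate limit).
def seconds_to_exhaust(space: int, guesses_per_sec: float = 10.0) -> float:
    return space / guesses_per_sec

print(seconds_to_exhaust(10**4) / 60)               # 4-digit PIN: ~17 minutes
print(seconds_to_exhaust(10**6) / 3600)             # 6-digit PIN: ~28 hours
print(seconds_to_exhaust(62**12) / (3600*24*365))   # 12-char alphanumeric: ~1e13 years
```

The gap between the last line and the first two is why a passphrase beats any PIN, and why both beat biometrics, which cannot be changed once captured.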
parineum [3 hidden]5 mins ago
I know it seems like an incredibly dubious claim but the "I forgot" defense actually works here.

It's not really that useful for a safe since they aren't _that_ difficult to open and, if you haven't committed a crime, it's probably better to open your safe for them than have them destroy it so you need a new one. For a mathematically impossible to break cipher though, very useful.

jibe [3 hidden]5 mins ago
Hannah Natanson is not in prison though.
Zak [3 hidden]5 mins ago
Assuming the rule of law is still functioning, there are multiple protections for journalists who refuse to divulge passwords in the USA. A journalist can challenge any such order in court and usually won't be detained during the process as long as they show up in court when required and haven't tried to destroy evidence.

Deceiving investigators by using an alternate password, or destroying evidence by using a duress code on the other hand is almost always a felony. It's a very bad idea for a journalist to do that, as long as the rule of law is intact.

dboreham [3 hidden]5 mins ago
I think it's pretty clear at this point that rule of law isn't functioning. Perhaps it never was. It was just rule of law theater.
Blackthorn [3 hidden]5 mins ago
They are willing to kill people and then justify it by calling them terrorists. Plausible deniability is pointless.
jb1991 [3 hidden]5 mins ago
Uh, that escalated quickly.
ryanmcbride [3 hidden]5 mins ago
Actually it's been escalating pretty steadily for 250 years
cr125rider [3 hidden]5 mins ago
Fourth and Fifth amendments disagree
twelvedogs [3 hidden]5 mins ago
I don't think we're doing amendments any more
ddtaylor [3 hidden]5 mins ago
And if we are it will be a new one with a high number and it will be pure insanity
lm28469 [3 hidden]5 mins ago
Sure, but in the real world it can take months or years; Francis Rawls spent 4 years in jail because he didn't want to unlock hard drives.
kyrra [3 hidden]5 mins ago
People are jailed for contempt of court for failing to provide passwords.

https://reason.com/2017/05/31/florida-man-jailed-180-days-fo...

fluoridation [3 hidden]5 mins ago
Wow, so US judges are just making it up as they go along, huh? It's like every case is a different judgement with no consistent criterion.

>Doe vs. U.S. That case centered around whether the feds could force a suspect to sign consent forms permitting foreign banks to produce any account records that he may have. In Doe, the justices ruled that the government did have that power, since the forms did not require the defendant to confirm or deny the presence of the records.

Well, what if the defendant was innocent of that charge but guilty of or involved in an unrelated matter for which there was evidence in the account records?

eviks [3 hidden]5 mins ago
There is no plausible deniability here; that's only relevant in a rule-of-law type of situation, but then you wouldn't need it, as you can't be legally compelled to do that anyway. "We don't see any secret source communication on your work device = you entered the wrong PIN = go think about your behavior in jail."
AdamN [3 hidden]5 mins ago
Even if this worked (which would be massively expensive to implement) the misconfiguration possibilities are endless. It wouldn't be customer-centric to actually release this capability.

Better for the foreseeable future to have separate devices and separate accounts (i.e. not in the same iCloud family for instance)

snowwrestler [3 hidden]5 mins ago
“Plausible deniability” is a public relations concept. It doesn’t confer any actual legal protection.
dkarras [3 hidden]5 mins ago
It absolutely offers some legal protection. If it is implemented correctly, no legal framework for it is required. Government forces you to enter your password. You comply and enter "a" password. The device shows contents. You did what you were asked to do. If there is no way for the government to prove that you entered a decoy password that shows decoy contents, you are in the clear. Done correctly (in device and OPSEC) government can't prove you entered your decoy password so you can't be held in contempt. And that is the entire point. It is not like asking the government to give your "plausible deniability" rights. It is about not potentially incriminating yourself against people that abuse the system to force you to incriminate yourself.
snowwrestler [3 hidden]5 mins ago
> You comply and enter "a" password. The device shows contents. You did what you were asked to do.

No, you did something fake to avoid doing what you were asked to do.

> If there is no way for the government to prove that you entered a decoy password that shows decoy contents, you are in the clear.

But there are very effective ways to find hidden encrypted volumes on devices. And then you’ll be asked to decrypt those too, and then what?

This sort of thing is already table stakes for CSAM prosecutions, for example. Law enforcement can read the same blog posts and know as much about technology as you do. Especially if we are hypothesizing an advertised feature of a commercial OS!

dkarras [3 hidden]5 mins ago
>No, you did something fake to avoid doing what you were asked to do.

Yes, that is what plausible deniability is.

>But there are very effective ways to find hidden encrypted volumes on devices. And then you’ll be asked to decrypt those too, and then what?

I emphasized "done right". If existence of hidden encryption can be proven, then you don't have plausible deniability. Something has gone wrong.

My point was: OP claimed plausible deniability does not apply in legal cases, which is a weird take. If you can have plausible deniability, then it can save you legally. This does not only apply to tech of course, but encryption was the subject here. In all cases though, if your situation is not "plausible" (due to broken tech, backdoors, poor OPSEC in tech, and/or damning other evidence in other cases as well) then you don't have plausible deniability by definition.

Having ways of definitively detecting hidden encrypted volumes might be the norm today, might be impossible tomorrow. Then you will have plausible deniability and it will work legally as far as that piece of "evidence" is concerned.

pluralmonad [3 hidden]5 mins ago
I always wondered if this was the feature of TrueCrypt that made it such a big target. LUKS is fine, I guess, but TrueCrypt felt like actual secrecy.
lm28469 [3 hidden]5 mins ago
Yep, you need an emergency mode that completely resets the phone to factory settings, maybe triggered with a decoy pin. Or a mode that physically destroys the chip storing the keys
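The duress-PIN idea can be sketched like this (purely illustrative; the PINs, hashing scheme, and "wipe" action are my assumptions, not any phone vendor's actual mechanism):

```python
import hashlib
import hmac

SALT = b"device-salt"  # illustrative constant

def pin_hash(pin: str) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), SALT, 100_000)

# The device stores only hashes, so inspecting it doesn't reveal which
# entry is the duress PIN and which is the normal one.
NORMAL_PIN = pin_hash("1234")
DURESS_PIN = pin_hash("9999")

def handle_pin(pin: str) -> str:
    h = pin_hash(pin)
    if hmac.compare_digest(h, DURESS_PIN):
        # A real implementation would destroy the key-wrapping material
        # in the secure element, making all user data unrecoverable.
        return "wiped"
    if hmac.compare_digest(h, NORMAL_PIN):
        return "unlocked"
    return "denied"
```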
bitexploder [3 hidden]5 mins ago
You do not. We have this thing in our constitution called the 5th amendment. You cannot be forced to divulge the contents of your mind, including your pin or passwords. Case law supports this. For US citizens at least. Hopefully the constitution is still worth something.
lm28469 [3 hidden]5 mins ago
That's in the fantasy world of constitution maximalists. In the real world it doesn't work like that, and you might still lose money, time, and your sanity fighting a system that cares less and less about your rights.
bitexploder [3 hidden]5 mins ago
The case law on this specific topic is convincing. If you are ever in that situation it is usually going to be worth your time and money to assert the right and see it through. Case law supports this. The general maximum “penalty” is being held in contempt of court. And if the government is wrongly persecuting you, it is lose / lose if you divulge.
carlosjobim [3 hidden]5 mins ago
Do you think this is for fighting parking tickets? It is for journalists to not reveal their sources, who might be at risk of severe consequences including death.

That's a whole lot more to lose than your money and time.

lm28469 [3 hidden]5 mins ago
That's not what we're discussing here, you can't just say "I plead the fifth" and walk away if the people in charge decided you wouldn't walk away, no matter what's right or "legal"

Francis Rawls stayed 4 years in jail despite pleading the fifth all day long

bitexploder [3 hidden]5 mins ago
That case also established 18 months as an upper limit. If you are in that situation it is usually better to simply not divulge. Especially if there is incriminating evidence. Or you are a journalist being harassed by the DOJ. It can only bring you more pain. They will always find something.
lm28469 [3 hidden]5 mins ago
Yeah well that's what I'm saying... "just plead the fifth" is nice on paper, in practice you're going to suffer for a long time.
lostlogin [3 hidden]5 mins ago
> You cannot be forced to divulge the contents of your mind, including your pin or passwords.

Biometric data doesn’t need the password.

And good luck depending on the US constitution.

stackghost [3 hidden]5 mins ago
You're forgetting about the Constitution-Free Zone within 100 miles of all points of entry, including international airports, which covers essentially all of the lower 48.
Zak [3 hidden]5 mins ago
This is a misunderstanding. That's the area in which the border patrol has jurisdiction to conduct very limited searches of vehicles and operate checkpoints without individualized suspicion in order to enforce immigration law. It does not allow searches of electronic devices.

There is a separate border search exception at the point a person actually enters the country which does allow searches of electronic devices. US citizens entering the country may refuse to provide access without consequences beyond seizure of the device; non-citizens could face adverse immigration actions.

To be clear, I do think all detentions and searches without individualized suspicion should be considered violations of the 4th amendment, but the phrase "constitution-free zone" is so broad as to be misleading.

lostlogin [3 hidden]5 mins ago
With ICE on the prowl, I’d have thought ‘Constitution Free Zone’ a fitting description of how they operate.
bitexploder [3 hidden]5 mins ago
I am not. You can still assert your rights at border points. It is very inconvenient. I have done it. If you are returning from international travel there is little they can do. If you are trying to leave the country they can make that difficult to impossible. Otherwise your rights still apply.
eduction [3 hidden]5 mins ago
Completely separate decision with a higher legal bar for doing that.

It's one thing to allow police to search a phone. Another to compel someone to unlock the device.

We live in a world of grays and nuance and an "all or nothing" outlook on security discourages people from taking meaningful steps to protect themselves.

frogcommander [3 hidden]5 mins ago
Why are you on a website for programmers and software developers if you arent a software developer and you know nothing of the subject?
DamnInteresting [3 hidden]5 mins ago
> What’s so hard to make 2-3 pins and each to access different logged in apps and files.

I've been advocating for this under-duress-PIN feature for years, as evidenced by this HN comment I made about 9 years ago: https://news.ycombinator.com/item?id=13631653

Maybe someday.

pc86 [3 hidden]5 mins ago
Serious question: What are the "valid concerns" about people securing their computing devices against third parties?
hypfer [3 hidden]5 mins ago
This (I think) refers not to the people securing their devices against third parties but the vendors "securing" the devices against loss of profits.

Essentially, the question referenced here is that of ownership. Is it your device, or did you rent it from Apple/Samsung/etc. If it is locked down so that you can't do anything you want with it, then you might not actually be its owner.

___

_Ideally_ you wouldn't need to trust Apple as a corp to do the right thing. Of course, as this example shows, they seem to actually have done one right thing, but you do not know if they always will.

That's why a lot of people believe that the idea of such tight vendor control is fundamentally flawed, even though in this specific instance it yielded positive results.

For completeness, No, I do not know either how this could be implemented differently.

pbhjpbhj [3 hidden]5 mins ago
We don't know if they did the right thing here. With a previous case it seemed (to me) like Apple might have pushed an update to give access ... they presumably could do that, remotely copy all the data, then return the device to its former state. One can't know, and this sort of thing seems entirely plausible.

The FBI don't have to tell anyone they accessed the device. That maintains Apple's outward appearance of security; the FBI just use parallel construction later if needed.

Something like a hashed log (but an actually robust system), using an enclave, where the log entries are signed using your biometric, so that events such as network access where any data is exchanged are recorded and can only be removed using biometrics. Nothing against wrench-based attacks, of course.
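The hashed-log idea can be sketched as a simple hash chain (illustrative only; a real design would sign entries inside the enclave and gate deletion on the biometric). Each entry's digest covers the previous digest, so any later edit breaks verification:

```python
import hashlib

GENESIS = "0" * 64  # sentinel "previous digest" for the first entry

def append(chain: list[tuple[str, str]], event: str) -> None:
    # Each digest covers the previous entry's digest, chaining the log.
    prev = chain[-1][1] if chain else GENESIS
    digest = hashlib.sha256((prev + event).encode()).hexdigest()
    chain.append((event, digest))

def verify(chain: list[tuple[str, str]]) -> bool:
    # Recompute every digest; any edited or reordered entry breaks the chain.
    prev = GENESIS
    for event, digest in chain:
        if hashlib.sha256((prev + event).encode()).hexdigest() != digest:
            return False
        prev = digest
    return True
```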

GeekyBear [3 hidden]5 mins ago
> With a previous case it seemed (to me) like Apple might have pushed an update to give access

You're going to have to provide a cite here, since Apple has publicly stated that they have not and will not ever do this on behalf of any nation state.

For instance, Apple's public statement when the FBI ordered them to do so:

https://www.apple.com/customer-letter/

bigyabai [3 hidden]5 mins ago
> Apple has publicly stated that they have not and will not ever do this

Apple has also said that the US required them to hide evidence of dragnet surveillance: https://arstechnica.com/tech-policy/2023/12/apple-admits-to-...

  Apple has since confirmed in a statement provided to Ars that the US federal government “prohibited” the company “from sharing any information,” but now that Wyden has outed the feds, Apple has updated its transparency reporting and will “detail these kinds of requests” in a separate section on push notifications in its next report.
Apple statements are quite distinct from what they do behind the scenes.
GeekyBear [3 hidden]5 mins ago
Providing a copy of push notification data (or any data) that you host on your server in response to a warrant is not what we are talking about.

No company can refuse to do that.

hypfer [3 hidden]5 mins ago
I mean, arguably we do not even fully know whether, even if they did as claimed, they did the _right_ thing.

The underlying assumption we base our judgement on is that "journalism + leaks = good" and "people wanting to crack down on leaks = bad". Which is probably true, but also an assumption where something unwanted and/or broken could hide in. As with every assumption.

Arguably, in a working and legit democracy, you'd actually want the state to have this kind of access, because the state, bound by democratically governed rules, would do the right thing with it.

In the real world, those required modifiers unfortunately do not always hold true, so we kinda rely on the press as the fourth power, which _technically_ could be argued is some kind of vigilante entity operating outside of the system.

I suppose it's also not fully clear if there can even be something like a "working and legit democracy" without possibly inevitable functionally vigilantes.

Lots of stuff to ponder.

____

Anyway, my point is that I have no point. You don't have to bother parsing that, but it might possibly be interesting if you should decide to do so.

It might also confuse the LLM bots and bad-faith real humans in this comment section, which is good.

mschuster91 [3 hidden]5 mins ago
> Essentially, the question referenced here is that of ownership. Is it your device, or did you rent it from Apple/Samsung/etc. If it is locked down so that you can't do anything you want with it, then you might not actually be its owner.

Both goals actually are possible to implement at the same time: Secure/Verified Boot together with actually audited, preferably open-source, as-small-as-possible code in the boot and crypto chain, for the user, the ability to unlock the bootloader in the EFI firmware and for those concerned about supply chain integrity, a debug port muxed directly (!) to the TPM so it can be queried for its set of whitelisted public keys.

pbhjpbhj [3 hidden]5 mins ago
The TPM can be programmed (ie designed) to lie about the whitelist though.
nicoburns [3 hidden]5 mins ago
One valid concern about "locked down computing" is the potential for 3rd parties to secure computing devices against their owners.
zuminator [3 hidden]5 mins ago
In this case I think "valid concerns about locked down computing" refers to the owner's use of the phone being restricted: they can't download applications they want to use, they don't have unrestricted access to the filesystem, they are forced to pay an Apple commission to engage in certain forms of commerce, etc. These may be acceptable tradeoffs, but they're valid concerns nonetheless.
bayindirh [3 hidden]5 mins ago
I don't have to have any concern to be able to secure my device against third parties, it's just good operational discipline.

I don't do anything classified, or store something I don't want to be found out. On the other hand, equally I don't want anyone to be able to get and fiddle a device which is central to my life.

That's all.

It's not "I have nothing to hide" (though in fact I don't); I just don't want to put everything in the open.

Security is not something we should have to earn; we should have it at the highest level by default.

shaky-carrousel [3 hidden]5 mins ago
Corrupt government officials gunning down inconvenient people.
pc86 [3 hidden]5 mins ago
I'd love to hear what you think that has to do with this?
shaky-carrousel [3 hidden]5 mins ago
Sure you will.
nutjob2 [3 hidden]5 mins ago
If we've learned anything from this administration it is that the government can ignore the law longer than you can stay alive. Arming yourself against lawless government in every legal way is advisable.
pc86 [3 hidden]5 mins ago
I'm not even saying you're wrong, I'm saying what does that have to do with a valid search warrant being executed?
macintux [3 hidden]5 mins ago
There's a fair bit of dispute about whether this is valid. The active criminalization of journalism is worrisome.
pc86 [3 hidden]5 mins ago
It's signed by a judge, it's valid. What is in dispute, exactly?
macintux [3 hidden]5 mins ago
> The Justice Department failed to tell a magistrate judge about a 1980 law protecting journalists in its application materials for a warrant

https://www.nytimes.com/2026/02/02/us/politics/doj-press-law...

Previously:

> U.S. Magistrate Judge William B. Porter wrote in his order that the government must preserve any materials seized during the raid and may not review them until the court authorizes it

https://san.com/cc/judge-blocks-fbis-access-to-washington-po...

buckle8017 [3 hidden]5 mins ago
Lockdown Mode significantly affects the usability of the phone.

It completely disables JIT js in Safari for example.

pc86 [3 hidden]5 mins ago
"Don't secure your phone it might mess up JavaScript" is not something I had on my 2026 bingo card.
odo1242 [3 hidden]5 mins ago
JavaScript is actually the only reason that the iPhone has runtime code generation capabilities at all, so it kinda makes sense
buckle8017 [3 hidden]5 mins ago
I mean I tried it for a bit and I have to say it was a significant compromise.

All kinds of random things don't work.

Marsymars [3 hidden]5 mins ago
I find all kinds of random things already don't work on mobile Safari - the web is effectively unusable without an adblocker, and over the past few months I've seen an explosion in the use of sites using "AdShield" which, if they detect ad-blocking, breaks websites (and lies to the user about the cause). Desktop browsers are able to handle this still, but on mobile Safari it just results in a bunch of the web being broken.
prophesi [3 hidden]5 mins ago
You can choose to exclude Safari from these protections[0]. Honestly, looking at the list of "limitations" you'll have while running Lockdown mode, I'm surprised most of them aren't the system default.

[0] https://support.apple.com/en-us/105120 - under "How to exclude apps or websites from Lockdown Mode"

buckle8017 [3 hidden]5 mins ago
Sure, but disabling JIT JS and limiting the image/video decoders are, combined, basically all of the security in Lockdown Mode, so excluding Safari seems pointless.
prophesi [3 hidden]5 mins ago
I do wish it worked more like GrapheneOS, but the other protections outside of web browsing seem to make it worth enabling Lockdown Mode. Personally, I only read articles in my phone's browser, so I wonder if I'd be fine with disabled JIT and crippled decoders.
peterspath [3 hidden]5 mins ago
I do have it enabled and web browsing is still fine; the things I use are either websites or simple web apps that aren't JavaScript-heavy anyway...

when I want to do something for longer I will pickup my MacBook anyway.

blibble [3 hidden]5 mins ago
you can enable it for certain trusted websites
reactordev [3 hidden]5 mins ago
Pegasus.

Jedi.

SKyWIper.

Rogue Actors.

Rogue thief’s.

Rogue governments.

Your spouse.

Separating corporate IT from personal IT.

There’s plenty of reasons.

pc86 [3 hidden]5 mins ago
These are reasons to be able to secure your devices against third parties, not reasons you shouldn't be able to.
blitzar [3 hidden]5 mins ago
Oh, come on. Don't look at another man's Portal Gun history. We all go to weird places.
whynotminot [3 hidden]5 mins ago
I get so annoyed by this Socratic line of questioning because it’s extremely obvious.

Terrorist has plans and contacts on laptop/phone. Society has a very reasonable interest in that information.

But of course there is the rational counter argument of “the government designates who is a terrorist”, and the Trump admin has gleefully flouted norms around that designation endangering rule of law.

So all of us are adults here and we understand this is complicated. People have a vested interest in privacy protections. Society and government often have reasonable interest in going after bad guys.

Mediating this clear tension is what makes this so hard and silly lines of questioning like this try to pretend it’s simple.

anonymous908213 [3 hidden]5 mins ago
The better rational counter argument is that "privacy is a human right enshrined in international law". Society has zero business knowing anyone's private communications, whether or not that person is a terrorist. There is nothing natural about being unable to talk to people privately without your speech being recorded for millions of people to view forever. Moreover, giving society absolute access to private communications is a short road to absolute dystopia as government uses it to completely wipe out all dissent, execute all the Jews or whatever arbitrary enemy of the state they decide on, etc.

You do not get to dispense with human rights because terrorists use them too. Terrorists use knives, cars, computers, phones, clothes... where will we be if we take away everything because we have a vested interest in denying anything a terrorist might take advantage of?

whynotminot [3 hidden]5 mins ago
Who decided absolute privacy in all circumstances is a fundamental human right? I don’t think any government endorses that position. I don’t know what international law you speak of. You’re basing your argument on an axiom that I don’t think everyone would agree with.

This sounds like a Tim Cook aphorism (right before he hands the iCloud keys to the CCP) — not anything with any real legal basis.

anonymous908213 [3 hidden]5 mins ago
Article 12 of the United Nation's Declaration of Human Rights:

> No one shall be subjected to arbitrary interference with his privacy [...]

which has later been affirmed to include digital privacy.

> I don’t think any government endorses that position.

Many governments are in flagrant violation of even their own privacy laws, but that does not make those laws any less real.

The UN's notion of human rights were an "axiom" founded from learned experience and the horrors that were committed in the years preceding their formation. Discarding them is to discard the wisdom we gained from the loss of tens of millions of people. And while you claim that society has a vested interest in violating a terrorist's privacy, you can only come to that conclusion if you engage in short-term thinking that terminates at exactly the step you violate the terrorist's rights and do not consider the consequences of anything beyond that; if you do consider the consequences it becomes clear that society collectively has a bigger vested interest in protecting the existence of human rights.

whynotminot [3 hidden]5 mins ago
> No one shall be subjected to arbitrary interference with his privacy

“Arbitrary” meaning you better have good reasons! Which implies there are or can be good reasons for which your privacy can be violated.

You’re misreading that to mean your privacy is absolute by UN law.

anonymous908213 [3 hidden]5 mins ago
Admittedly "arbitrary" is something of a legal weasel word that leaves a lot of room for interpretation. I lean towards a strong interpretation for two reasons: the first is because it is logically obvious why you must give it a strong interpretation; if the people responsible for enforcing human rights can arbitrarily decide you don't have them, you don't have human rights. The second is because we have seen this play out in the real world and it is abundantly clear that the damage to society is greater than any potential benefits. The US in particular has made an adventure out of arbitrarily suspending human rights, giving us wonderful treats like Guantanamo Bay and the black sites across the Middle East. I don't know what part of that experiment looked remotely convincing to you, but to me they only reinforced how clearly necessary inviolable human rights are for the greater good of society.
pbhjpbhj [3 hidden]5 mins ago
>if the people responsible for enforcing human rights can arbitrarily decide you don't have them, you don't have human rights

But the "arbitrary" there is to account for the situation where the democratic application of the law wants to inspect the communications of suspected terrorists, and where a judge agrees there is sufficient evidence to grant a warrant.

Unfortunately, that law does nothing against situations like the USA/Russia regime where a ruler dispenses with the rule of law (and democratic legal processes too).

You can't practically have that sort of liberalism, where society just shrugs and chooses not to read terrorists' communications; those who wish to use violence make it unworkable.

danaris [3 hidden]5 mins ago
But if you want to make it possible for the Feds to break into a terrorist's secure phone, you have to make it impossible for anyone to have a secure phone.

That is arbitrary interference with all our privacy.

PatentlyDC123 [3 hidden]5 mins ago
Usually such "international laws" are only advisory and not binding on member nations. After decades of member nations flouting UN "laws" I can't see them as reliable or effective support in most arguments. I support the policy behind the privacy "laws" of the UN, but enforcing them seems to fall short.
anonymous908213 [3 hidden]5 mins ago
Enforcement mechanisms are weak, but they still exist to set a cultural norm and an ideal to strive towards. Regardless, I have also laid out an argument at length as to why society would logically want to have this be a human right for its own good, regardless of any appeal to existing authority.
Brian_K_White [3 hidden]5 mins ago
This means there are no valid concerns.

There are just things some people want and the reasons they want them.

So the question that you are so annoyed by remains unanswered (by you anyway), and so, valid, to all of us adults.

@hypfer gives a valid concern, but it's based on a different facet of lockdown. The concern is not that the rest of us should be able to break into your phone for our safety, it's the opposite, that you are not the final authority of your own property, and must simply trust Apple and the entire rest of society via our ability to compel Apple, not to break into your phone or it's backup.

pc86 [3 hidden]5 mins ago
At the risk of being kind of ass, which I've been trying to be better about lately, I'm going to offer some advice. If you can't even respond to a question about secure computing without bringing American presidential politics into things, perhaps you need to take a break from the news for a few weeks.

The reason I asked that question is because I don't think it's complicated. I should be able to lock down my device such that no other human being on the planet can see or access anything on it. It's mine. I own it. I can do with it whatever I please, and any government that says otherwise is diametrically opposed to my rights as a human being.

You are more likely to be struck by lightning while holding two winning lottery tickets from different lotteries than you are to be killed by an act of terrorism today. This is pearl-clutching, authoritarian nonsense. To echo the sibling comment, society does not get to destroy my civil rights because some inbred religious fanatics in a cave somewhere want to blow up a train.

Edit: And asking for someone to says "there are concerns!" to proffer even a single one is not a Socratic line of questioning, it's basic inquiry.

adleyjulian [3 hidden]5 mins ago
The line of reasoning is more like this: if you make and sell safe-cracking tools then it would not be unreasonable for the government to regulate it so only registered locksmiths could buy it. You don't want people profiting from the support of criminal acts.

The government could similarly argue that if a company provides communication as a service, they should be able to provide access to the government given they have a warrant.

If you explicitly create a service to circumvent this then you're trying to profit from and aid those with criminal intent. Silkroad/drug sales and child sexual content are more common, but terrorism would also be on the list.

I disagree with this logic, but those are the well-known, often cited concerns.

There is a trade-off in personal privacy versus police ability to investigate and enforce laws.

whynotminot [3 hidden]5 mins ago
This article is about the Trump admin seizing a reporter’s phone. The politics was here from the start.
hypfer [3 hidden]5 mins ago
> I get so annoyed by this Socratic line of questioning because it’s extremely obvious.

Yeah after seeing the additional comments, my gut also says "sea lion".

Truly a shame

handedness [3 hidden]5 mins ago
> ...the Trump admin has gleefully flouted norms around that designation...

One would have to hold a fairly uninformed view of history to think the norms around that designation are anything but invasive. The list since FDR is utterly extensive.

whynotminot [3 hidden]5 mins ago
I didn’t say he was the first to abuse powers. Indeed it’s kind of silly to even have to clarify “but other administrations…” because that’s fairly obvious to anyone old enough to have seen more than one president.

But the article is literally referencing the Trump administration seizing a reporter’s phone so the current administration’s overreach seems relevant here.

handedness [3 hidden]5 mins ago
But that's not what I said.

My point was that your stated assumption of what the norms are is inaccurate. If nearly every modern administration does it, that is literally the norm. The present administration, like many before it, is following the norm. The norm is the broader issue.

Which makes the rest of it (and your followup) come across as needlessly tribal, as both major parties are consistently guilty of tending to object to something only when the other side does it.

whynotminot [3 hidden]5 mins ago
Frankly I really don’t care about both sides-ism anymore. I can agree with you that a lot of administrations have been irresponsible on this point while also believing that the current administration is particularly dangerous in this area.

If I lose you here because of “needless tribalism” oh well.

Joel_Mckay [3 hidden]5 mins ago
Some platforms will side-load anything the telecom carrier sends.

It is naive to assume iOS can be trusted much more than Android. =3

pc86 [3 hidden]5 mins ago
Let's assume for the sake of argument you're making a valid point. What does that have to do with my question?
Joel_Mckay [3 hidden]5 mins ago
Location telemetry, listening devices, and exfiltration of protected sources.

A 3rd party locked down system can't protect people from what the law should. =3

ambicapter [3 hidden]5 mins ago
Think of the children
horacemorace [3 hidden]5 mins ago
The leaders of US government certainly do. Much too fondly.
ExoticPearTree [3 hidden]5 mins ago
> It's a real world example of how these security features aren't just for "paranoid people" but serve a legit purpose for people who handle sensitive info.

Because they're in the US, things might be easier from a legal standpoint for the journalist, but there is also precedent for forcing journalists to expose their sources: https://en.wikipedia.org/wiki/Branzburg_v._Hayes

In other parts of the world this applies https://xkcd.com/538/ when you don't provide the means to access your phone to the authorities.

It just depends on how much a government wants the data that is stored there.

nickff [3 hidden]5 mins ago
Which countries actually grant reporters immunity from having to reveal information related to criminal investigations (where others would be compelled to, and without criminal penalties)? Such immunity may be desirable (at least in some circumstances), but I am not aware of any jurisdiction that actually grants it.
jampekka [3 hidden]5 mins ago
At least in Finland there's a specific law about journalistic source protection (lähdesuoja) explicitly saying journalists have the right to not reveal sources.

In serious crime cases in some circumstances a court may order a journalist to reveal sources. But it's extremely rare and journalists don't comply even if ordered.

https://fi.wikipedia.org/wiki/L%C3%A4hdesuoja

Edit: the source protection has actually probably never been broken (due to a court order at least): https://yle.fi/a/3-8012415

nickff [3 hidden]5 mins ago
Thanks for the info & link! After some searching, I found this rather interesting study on source protection in many (international) jurisdictions, and it calls out Finland, though other countries have interesting approaches as well: https://canadianmedialawyers.com/wp-content/uploads/2019/06/...
Joel_Mckay [3 hidden]5 mins ago
Indeed, likely as secure as the VPNs run by intelligence contractors.

1. iOS has well-known poorly documented zero-click exploits

2. Firms are required to retain your activity logs for 3 months

3. It is illegal for a firm to deny or disclose sealed warrants on US soil, and it is up to a single judge whether to rummage through your trash. If I recall correctly, only around 8 out of 18,000 searches were rejected.

It is only about $23 to MITM someone's phone now, and it is not always domestic agencies pulling that off. =3

quesera [3 hidden]5 mins ago
> 1. iOS has well-known poorly documented zero-click exploits

PoC || GTFO, to use the vernacular.

If you're talking about historical bugs, don't forget the update adoption curves.

Joel_Mckay [3 hidden]5 mins ago
No one will hand over the several $1m 0-day as PoC for free, as there are grey-market products based on the same tired exploits.

"Not My Circus, Not My Monkeys" as they say. =3

quesera [3 hidden]5 mins ago
My understanding is that there is current consensus that active iOS 0days are not likely to be available at the LE level.
sigmoid10 [3 hidden]5 mins ago
With the US descending more and more into fascism (as this case highlights yet again), I wonder what will happen to these features in the future. Especially now that the tech moguls of silicon valley stopped standing up to Trump and instead started kissing his ass. Tim Cook in particular seems to be the kind of person that rather is on the rich side of history than the right side. What if the administration realizes they can easily make Apple et al. give up their users by threatening their profits with tariffs and taxes?
vincenzothgreat [3 hidden]5 mins ago
How is it turning into fascism?
text0404 [3 hidden]5 mins ago
- Concentration of power in the executive, dismantling checks and balances

- Hyper-nationalism and white supremacist messaging

- Scapegoating of minorities

- Attacks on the press

- Attacks on constitutional rights

- Militarization of police, violence normalized

- Expansion of surveillance state

- Combination of state and corporate power

- Strongman authoritarianism

- Historical revisionism

- Interference in elections

Cheers!

shermantanktop [3 hidden]5 mins ago
- State-aligned media outlets, where media consumption choice is a political act

- Grandiose architecture projects for historically important sites

- Obsession with massive monuments - the tallest, the most gold, the most expensive

- Military parades and lionization of the military, while demanding political support from military leadership

- A population which becomes keenly interested in whether something does or doesn’t benefit the leader personally

I think the terms fascism or authoritarianism are close enough to be helpful, even if some of the specifics don’t align perfectly. But the ones that do align are oddly specific sometimes.

pbhjpbhj [3 hidden]5 mins ago
It turned.
thatswrong0 [3 hidden]5 mins ago
js2 [3 hidden]5 mins ago
Appreciate the gift link.
learingsci [3 hidden]5 mins ago
Apple seems to strongly discourage the use of lockdown mode. Presumably it is in conflict with their concern over share price and quarterly earnings.
robot_jesus [3 hidden]5 mins ago
Citation needed?

Apple does a lot of things I don't agree with in the interest of share price (like cozying up to authoritarian governments) but this seems like a reach to criticize them for a feature they have put extensive effort into, rather than applauding that they resist spying and enhance customer privacy. Sure, it's an optional feature and maybe they don't push broad acceptance of it, but it's important for those that need it.

learingsci [3 hidden]5 mins ago
Indeed. It may be the best reason to use their products, but then why not make it the default, or do more to encourage its use?
groundzeros2015 [3 hidden]5 mins ago
Didn’t they make it?
learingsci [3 hidden]5 mins ago
Is it supported in iOS 18? They seem to suggest in their own documentation that very few people need or should use it. They could do much more to encourage and support its use. Even the naming “lockdown” vs “secure” is a big tell.
Analemma_ [3 hidden]5 mins ago
How do they discourage it? It’s a clearly-labeled button in the Settings app, which brings up one modal sheet explaining what will change if you turn it on, then one more button press and it’s on.
dist-epoch [3 hidden]5 mins ago
[flagged]
bob001 [3 hidden]5 mins ago
Do you disagree with the facts of the article? Or is it propaganda simply because the facts don't support your narrative and ideological inclinations?
summa_tech [3 hidden]5 mins ago
Selective amplification of true events, as well as selective reporting, are the bread and butter of modern propaganda. It works a lot better than outright falsehoods, which, in the long term, cause people to lose faith in everything you have to say. And there's always someone jumping to your defense: after all, you did not outright lie...
bob001 [3 hidden]5 mins ago
That is again a claim with no backing that can be applied to anything without actual data to back it up.

For example. I can just as equally state with the same data to back me up (ie: none as it stands right now) that you are a US government plant posting propaganda to encourage people to not use safer technologies and as a result make their data easier to spy on.

cromka [3 hidden]5 mins ago
> Selective amplification

You can't possibly know this is what happened here, it's an observational bias.

UltraSane [3 hidden]5 mins ago
Man, people are whiny about this on Hacker News when they should know better. There is no real computer security without hardware roots of trust and keystores.
hnrayst [3 hidden]5 mins ago
[flagged]
rob [3 hidden]5 mins ago
`hnrayst` seems to be another AI (?) bot account created in 2022 with only two comments, both being in this very thread we're in today:

https://news.ycombinator.com/threads?id=hnrayst

Something weird is going on at Hacker News recently. I've been noticing these more and more.

bob001 [3 hidden]5 mins ago
Takeaway is to not enable biometric unlock if you are concerned about your data being accessed by authorities.
littlecranky67 [3 hidden]5 mins ago
The trick is not to use your right index finger as a biometric unlock finger (the Touch ID button sits in the top right corner of the keyboard). If you are "forced" to unlock, the agents will guide your finger and will probably try that one first, 2-3 times. Two more failed tries and fingerprint reading gets disabled. Quite good odds.
Arubis [3 hidden]5 mins ago
This has long been true. In a pinch you can mash the power button 5+ times to require a passcode at the next unlock.
steve-atx-7600 [3 hidden]5 mins ago
Also, on iPhone, if you have face ID turned on, you can hold power+volume down (may differ depending on model) to force a passcode.
3pt14159 [3 hidden]5 mins ago
This doesn’t work for my iPhone that’s about three years old.
WorldMaker [3 hidden]5 mins ago
It's hold power+volume up (the "top two buttons" when reaching into a pocket or purse for the phone) until the phone vibrates (~2s).

If you can see the screen, it's the fastest shortcut gesture to the screen that has "Slide to Power Off", "Medical ID", and "Emergency Call". Any other way to get to that screen also works to require a PIN before next unlock.

ezfe [3 hidden]5 mins ago
If your phone has a home button, then you don't need to press the volume button. Otherwise, yes, it does work.
bawolff [3 hidden]5 mins ago
So in america, they can force you to use a biometric but they can't compel you to reveal your password?

I mean, I agree with you, but it's a really weird line in the sand to draw.

forgotaccount3 [3 hidden]5 mins ago
One is knowledge the user has, and the other is a physical key they own.

Providing your 'finger' to unlock a device is no different than providing your 'key' to unlock something. So you can be compelled to provide those biometrics.

A password, on the other hand, is not some *thing* you have but knowledge you contain. Being compelled to provide that knowledge is no different from being compelled to reveal where you were or what you were doing at some place and time.

afavour [3 hidden]5 mins ago
That is genuinely the current state of law, yes. There's no real logic at work, just attempts at clawing back control whenever a new gray area appears.
intrasight [3 hidden]5 mins ago
It is very logical, as revealing a password is considered testimonial and is protected by the fifth amendment.
afavour [3 hidden]5 mins ago
Right, and pressing your finger on a fingerprint sensor is also revealing a password, just via different means.
intrasight [3 hidden]5 mins ago
But it is not legal testimony.
afavour [3 hidden]5 mins ago
Right. Like I said, that's not logical, that's just legalese to gain access where you didn't have it before.
benterix [3 hidden]5 mins ago
> So in america, they can force you to use a biometric but they can't compel you to reveal your password?

I don't get it: forcing someone's finger onto a sensor is easy, but how do you compel someone to reveal their password?

rtkwe [3 hidden]5 mins ago
Put them in jail until they do, or charge them with whatever the local flavor of "obstruction" is. In places where the law allows requiring you to give up a password, refusing to do so once the proper steps are taken is usually its own crime, typically phrased as some sort of "obstruction" charge with its own sentence. And that's just places where the law and citizens' rights are a meaningful constraint on state power.
jon-wood [3 hidden]5 mins ago
Depending on the country and the willingness to comply with legal norms somewhere between putting you in prison until you give it up and hitting you with a stick until you give it up.
mock-possum [3 hidden]5 mins ago
And to be clear, in other words, that means you can’t be compelled. You can effectively resist giving up your password, you cannot effectively resist giving up your finger, gruesome though the prospect might be.
bob001 [3 hidden]5 mins ago
The UK simply puts you in jail for not doing so.
bawolff [3 hidden]5 mins ago
Tell us the password or we throw you in jail, shoot you, etc. The legal system is always ultimately backed by the state's monopoly on violence.
Arubis [3 hidden]5 mins ago
Pretty much.

Something you are: can be legally compelled.

Something you have: can be legally compelled.

Something you know: cannot be legally compelled.

zozbot234 [3 hidden]5 mins ago
You can still be legally compelled to provide testimony, the catch is merely that you have to be granted immunity from being charged with a crime on the basis of any derived evidence. In this case, it seems that the WaPo journalist could still be compelled to provide such information if she's not charged for any crime.
rtkwe [3 hidden]5 mins ago
Yes, the difference comes from a close parsing of the 5th Amendment: telling cops the password or code for a device or safe is pretty clearly compelled speech and adverse testimony, while allowing cops to gather fingerprints and DNA has long been held as permissible, so biometrics were analogized to that. It's also similar to the rule that cops can't force you to tell them the code to a safe but are allowed, with a warrant, to destructively open the safe (if it falls under the terms of the warrant). Combine those two legal threads and it's at least reasonable to see how that line gets drawn from previous rulings.
ExpertAdvisor01 [3 hidden]5 mins ago
Germany does the same thing. They can force you to unlock via Face ID/biometrics but can't force you to enter a password.
deaux [3 hidden]5 mins ago
It's interesting because the latest Cellebrite data sheets showed support for all iPhones, including e.g. unbooted ones, but apparently not Lockdown Mode. They also showed that GrapheneOS hadn't been cracked.
rdudek [3 hidden]5 mins ago
Wait, was this an oversight on her part about the biometric unlock? My MacBook's biometric unlock gets disabled after a while and requires a password if the lid has been closed for a substantial amount of time.
asimovDev [3 hidden]5 mins ago
Does anyone know if iOS in lockdown mode stops syncing mail, imessage, call history etc to your other apple devices? I am wondering if reporter's stuff was all synced to the non lockdown MacBook from the iPhone
supriyo-biswas [3 hidden]5 mins ago
They usually ask you to enable lockdown mode on all your devices for advanced protection, even though you can skip it if you want.
bilbo0s [3 hidden]5 mins ago
Yeah.

This reporter very likely knew who she was dealing with. For users like her, everything is likely locked down and she probably didn't do much sharing.

I'm thinking that, to her, her sources would be probably one of the most important things in her life to protect.

macintux [3 hidden]5 mins ago
https://support.apple.com/en-us/105120

Looks like lockdown mode is focused on blocking inbound threats, not the sharing of data from the device.

rtkwe [3 hidden]5 mins ago
I can't imagine it would. The accounts don't flow through the phone; you're just logged in to them on both devices.
Aurornis [3 hidden]5 mins ago
> (forced her finger on Touch ID per the warrant)

Can anyone link a source for this? I’ve been seeing conflicting claims about this part.

Aurornis [3 hidden]5 mins ago
I understand that it’s within the law. I’m looking for specific evidence that this is what happened in this specific case. Not conjecture.
JasonADrury [3 hidden]5 mins ago
> forced her finger on Touch ID per the warrant

She was not forced, and the warrant does not state that she could be forced. The warrant, almost certainly deliberately, uses far milder language.

rtkwe [3 hidden]5 mins ago
The warrant is the force; current jurisprudence largely says warrants do compel people to provide biometric unlocks because it's not speech the way giving up a password/passcode would be. Blocking or not complying with a signed warrant from a judge is its own crime, and the only safe way to fight one is with a lawyer in court, not with the officer holding the paper (and the gun/taser/etc., with the power of the state behind them).
_qua [3 hidden]5 mins ago
What do you think warrants are? You think they get a warrant and they say, "Can you put your finger on the device?" You say, "No," and that's it? If all they wanted to do was ask you, they would just ask you without the warrant.
JasonADrury [3 hidden]5 mins ago
I think you should simply try to read the warrant in question.
pc86 [3 hidden]5 mins ago
Perhaps you should? From pages 20 and 22:

> 52. These warrants would also permit law enforcement to obtain from Natanson the display of physical biometric characteristics (e.g., fingerprint, thumbprint, or facial characteristics) in order to unlock devices subject to search and seizure pursuant to the above referenced warrants

> 60. Accordingly, if law enforcement personnel encounter a device that is subject to search and seizure pursuant to the requested warrants and may be unlocked using one of the aforementioned biometric features, the requested warrants would permit law enforcement personnel to (1) press or swipe the fingers (including thumbs) of the Subject to the fingerprint scanner of the device(s); or (2) hold the devices in front of the Subject's face for the purpose of attempting to unlock the device(s) in order to search the contents as authorized by the warrants

So yes, law enforcement had the right to grab her hand and press it against the laptop to unlock it before seizing it, if that's what they had to do.

[0] https://www.rcfp.org/wp-content/uploads/2026/01/2026-01-30-I...

JasonADrury [3 hidden]5 mins ago
>From pages 20 and 22:

From pages 20 and 22 of ... not the warrant:

It'd certainly be a good first step to figure out how to identify whether or not the PDF you're linking to is in fact a warrant at all before trying to educate others on them.

pc86 [3 hidden]5 mins ago
So post a link to the warrant.

This document is specifically asking for the right to force biometric access. It seems based on reporting that biometric access was granted.

If you're claiming the warrant doesn't force biometric access despite it being requested, you need to substantiate the claim.

_qua [3 hidden]5 mins ago
"...the requested warrants would permit law enforcement personnel to (1) press or swipe the fingers (including thumbs) of the subject to the fingerprint scanner of the devices..."
JasonADrury [3 hidden]5 mins ago
You're citing an affidavit produced by a FBI agent, the author is most likely not even a lawyer.

They're merely presenting a wishlist to the judge.

cm2012 [3 hidden]5 mins ago
By definition a warrant is force backed by state violence
mock-possum [3 hidden]5 mins ago
You’re saying she complied willingly?
rtkwe [3 hidden]5 mins ago
If the police get the warrant you either allow them to take it or you face an obstruction charge. The only safe way to fight a warrant like that when signed is after the gathering is done in court or at trial.
JasonADrury [3 hidden]5 mins ago
You would at the very least make them guess which finger, there's no indication that happened here.

The court can compel you to make your fingers available; it cannot force you to disclose which finger, or the manner in which you touch that finger to the fingerprint sensor. Apple devices allow only a limited number of attempts.

If you're not being actively helpful, the investigators may end up in a rather awkward position.

rtkwe [3 hidden]5 mins ago
I'd be wary of trying this, as it reeks of "one neat trick" thinking applied to law, based on a small technicality, where law is often subject to the spirit rather than strictly hewing to the interpretation of the exact wording most favorable to the citizen. The warrant can simply state you're required to unlock the system, not merely "make your fingers available".

It's fun to try to find places where the rules seem to leave holes but it's important to remember the courts don't have to hew precisely to how you read the law. I see that a lot on tech centric boards where the law is treated like it's strictly, precisely, and impartially interpreted down to the exact words (though often not using the legal meaning of words which have decades of caselaw and interpretation informing their legal meaning).

JasonADrury [3 hidden]5 mins ago
Sounds like it, yeah.

Touch ID allows only limited attempts, so odds are the FBI wouldn't just try to wrestle her to attempt different fingers on the spot even if they were allowed to do so.

theragra [3 hidden]5 mins ago
[flagged]
digiown [3 hidden]5 mins ago
> full-drive encryption

Note that these are crackable unless you have a strong password (a random one will work). Unlike on phones, there is no hardware slowing down brute-force attempts, only the comparatively much weaker PBKDFs applied to your password. You want at least about 64 bits of entropy, and you should never use that password anywhere else, since they would basically run "strings" on your other data to seed the brute-force attempt.
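A rough back-of-the-envelope sketch of the point above; the ~64-bit threshold comes from the comment, while the offline guess rate is a made-up illustrative assumption, not a measured figure:

```python
import math

def entropy_bits(alphabet_size: int, length: int) -> float:
    """Entropy of a password drawn uniformly at random from the alphabet."""
    return length * math.log2(alphabet_size)

def years_to_crack(bits: float, guesses_per_second: float) -> float:
    """Expected years to find the password (half the keyspace on average)."""
    seconds = (2 ** bits / 2) / guesses_per_second
    return seconds / (3600 * 24 * 365)

# 10 random lowercase letters: ~47 bits, well short of the ~64-bit target
print(round(entropy_bits(26, 10)))   # 47
# 14 random chars from a 62-symbol alphabet clears it comfortably: ~83 bits
print(round(entropy_bits(62, 14)))   # 83
# Hypothetical attacker making 1e6 PBKDF-slowed guesses per second:
print(f"{years_to_crack(entropy_bits(26, 10), 1e6):.1f} years")  # ~2.2 years
```

At that same hypothetical rate, the 83-bit password would take on the order of 10^11 years, which is why a truly random password makes the relative weakness of the PBKDF irrelevant.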

ddtaylor [3 hidden]5 mins ago
Worse than that, most phones use secure-enclave-like chips protected by a 4-digit PIN, which can be voltage-glitched to try every combination without triggering a wipe.
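To illustrate why bypassing the attempt counter is fatal for a short PIN, a quick sketch (the one-attempt-per-second glitch rate is a made-up assumption for scale):

```python
# A 4-digit PIN has only 10**4 possibilities. If the retry counter can be
# bypassed (e.g. by glitching the enclave, as the comment above claims),
# exhaustive search is trivial even at a very slow attempt rate.
pins = 10 ** 4
rate = 1.0  # hypothetical: one glitched attempt per second
hours = pins / rate / 3600
print(f"{pins} PINs, worst case ~{hours:.1f} hours")  # 10000 PINs, worst case ~2.8 hours
```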
quenix [3 hidden]5 mins ago
> ---- All above is pure fantasy and never happened, as you probably have already guessed.

Ah, while I was a bit suspicious, I thought it might be real (it was weirdly worded). What exactly is the point of fabricating this? Is there a joke I'm blind to?

theragra [3 hidden]5 mins ago
No joke, it is just I don't like to leave any trail about law issues, even if it is hardly a menace. This last sentence is for law enforcement in the really hard to imagine case it might be relevant sometime.
PunchyHamster [3 hidden]5 mins ago
They just need to ask Apple to unlock it, and Apple can't really refuse under US law.
quesera [3 hidden]5 mins ago
They can refuse, and they have refused. See San Bernardino and the concept of "compelled work".
dec0dedab0de [3 hidden]5 mins ago
Every time something like this happens I assume it is a covert marketing campaign.

If the government wants to get in, they're going to get in. They can also hold you in contempt until you comply.

Don’t get me wrong, it’s a good thing that law enforcement cant easily access this on their own. Just feels like the government is working with Apple here to help move some phones.

Cthulhu_ [3 hidden]5 mins ago
Better to be held in contempt than to give up constitutional rights under pressure: most functioning democracies have and defend the right to a free press, protect press sources, and can't make you incriminate yourself.

Anyway, it's a good thing to be skeptical about claims that iPhones can't be hacked by government agencies, as long as it doesn't drive you to dodgier parties (as those are guaranteed honeypots).

pc86 [3 hidden]5 mins ago
"Government propaganda to help one of the richest companies in the history of the world sell 0.000000001% more phones this quarter" is quite frankly just idiotic.

You only said half the sentence anyway. The full sentence is: "If the government wants to get in they're going to get in, unless they want to utilize the courts in any way, in which case they have to do things the right way."

If this reporter was a terrorist in Yemen they would have just hacked her phone and/or blown up her apartment. Or even if they simply wanted to knock off her source they probably could have hacked it or gotten the information in some other illicit fashion. But that's not what is happening here.