iPhone encryption - John McAfee can break it, or he'll eat his shoe on TV



LHutton
February 20th, 2016, 08:59 AM
http://www.foxnews.com/tech/2016/02/19/john-mcafee-vows-can-break-iphone-encryption-promises-to-eat-his-shoe-on-live-tv-if-cant.html


John McAfee vows he can break iPhone encryption, promises to eat his shoe on live TV if he can't

Rare White Ape
February 22nd, 2016, 12:19 AM
Someone's gotta do it if Apple won't.

http://www.apple.com/customer-letter/

Bless them.

thesameguy
February 22nd, 2016, 08:55 AM
That guy is insane. Of course it's breakable, but the time involved makes doing so almost useless.

Dicknose
February 22nd, 2016, 03:30 PM
Some of these things are very hard to break when the encryption and keys are done in hardware. They even tend to put it inside a single chip, so you can't even spy on it by tapping onto the circuit board.
As for time, again I think the delay is done at the chip level, not the OS level, so it might not be possible to just jailbreak and load new code to bypass the delay between attempts.
This could make the time for a brute force attack on the order of many years.
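
Conceptually it's something like this toy Python sketch - not real firmware, and the thresholds and delays are invented for illustration:

import hmac, time

MAX_ATTEMPTS = 10                                # wipe threshold (made up here)
DELAYS = [0, 0, 0, 0, 60, 300, 900, 3600, 3600]  # escalating lockout, seconds

class ToySecureElement:
    """Counter, delays, and key all live inside one chip."""
    def __init__(self, passcode: str, key: bytes):
        self._passcode = passcode   # never leaves the chip
        self._key = key             # encryption key, destroyed on wipe
        self._failures = 0

    def try_passcode(self, guess: str):
        if self._key is None:
            raise RuntimeError("key destroyed - device is wiped")
        # The delay is enforced HERE, in the chip, so host software can't skip it
        time.sleep(DELAYS[min(self._failures, len(DELAYS) - 1)])
        if hmac.compare_digest(guess, self._passcode):
            self._failures = 0
            return self._key        # correct guess releases the key
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self._key = None        # too many misses: key erased forever
        return None

Since the attempt counter and the delay sit behind the same silicon as the key, new host code - jailbroken or not - can't reach around them.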

MR2 Fan
February 22nd, 2016, 04:59 PM
I think John McAfee is crazy enough to eat his own shoe either way

thesameguy
February 22nd, 2016, 10:29 PM
Some of these things are very hard to break when the encryption and keys are done in hardware. They even tend to put it inside a single chip, so you can't even spy on it by tapping onto the circuit board.
As for time, again I think the delay is done at the chip level, not the OS level, so it might not be possible to just jailbreak and load new code to bypass the delay between attempts.
This could make the time for a brute force attack on the order of many years.

Your only current option is brute-forcing 256-bit AES. The encryption key is unique to each device, and the key for the key is not readable directly by hardware or firmware. Once the drive is encrypted, it is protected on boot by the passcode delay and the wipe-after-attempts limit. If you don't get the passcode in five (?) attempts, the key erases itself and you're back to brute force. Unless McAfee knows a secret about AES, or has devised a way to read or write the UID that defies reason, that is. ;) Jailbreaking gets you nowhere, because in order to jailbreak the phone you'd already need to know the passcode - you obviously can't jailbreak a phone while it's locked. There are rumors someone has devised a way to beat the wipe-after-attempts thing, but you are still brute-forcing a six-digit passcode with a five-second delay (enforced in hardware) between tries - even if that tech exists, it's not an hour's work.
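
Back-of-envelope math on that, using the numbers above (six digits, five-second hardware delay - both rumored, so treat this as a sketch):

# Worst-case time to brute force a six-digit passcode at 5 seconds per try
attempts = 10 ** 6                   # 000000-999999: a million combinations
delay_s = 5                          # hardware-enforced gap between tries
print(attempts * delay_s / 86400)    # ~57.9 days worst case, ~29 on average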

But the notion that McAfee knows something every government in the world doesn't? That's nuts.

Kchrpm
February 23rd, 2016, 06:07 AM
The article states that he plans to do it through social engineering.


We will primarily use social engineering, and it will take us three weeks.

Random
February 23rd, 2016, 09:31 AM
That sounds like he's figuring out the passcode, not breaking the encryption.

Kchrpm
February 23rd, 2016, 09:39 AM
Which will be fun to try, since the passcode was recently reset by the employer at the FBI's request. You'd think if that's all it was, they'd have done that by now.

Random
February 23rd, 2016, 10:43 AM
Which will be fun to try, since the passcode was recently reset by the employer at the FBI's request. You'd think if that's all it was, they'd have done that by now.

If I'm understanding the articles correctly, the iCloud pwd was reset, but not the phone's pwd.

thesameguy
February 23rd, 2016, 10:52 AM
Getting someone to cough up their passcode isn't rocket science. People have social engineered the root passwords out of Nokia executives. But, grifting a passcode out of someone isn't science, it's luck and based on a very specific scenario. Touting "breaking encryption" when you're really talking about tricking someone into giving you their password. :down:

Kchrpm
February 23rd, 2016, 10:58 AM
Ok, so how is he going to get the phone password, using social engineering, from a dead guy?

Random
February 23rd, 2016, 11:02 AM
Birthdays, anniversaries, contents of the home (if he could get access, which is doubtful), etc.

Kchrpm
February 23rd, 2016, 11:08 AM
Well he has five chances!

thesameguy
February 23rd, 2016, 11:25 AM
I found out it's ten, but whatever. It's still madness, unless there are other details we don't know... like the phone is still running iOS 7, or there are iTunes/iCloud backups, or access to a computer that has a trust relationship with the phone. If that stuff exists, things get easier. Now that I know he is just trying to get the password and not break the encryption, the whole thing smells like lies and anything is possible. ;)

I am the only person in the world who knows my passcode, and it's based on nothing. I have no backups and the phone has never been plugged into iTunes so there is no keybag to gain access to. I want McAfee to "break the encryption" on my phone.

LHutton
February 24th, 2016, 08:12 AM
Which will be fun to try, since the passcode was recently reset by the employer at the FBI's request.
So it's now 'computer'?


I found out it's ten, but whatever. It's still madness, unless there are other details we don't know... like the phone is still running iOS 7, or there are iTunes/iCloud backups, or access to a computer that has a trust relationship with the phone. If that stuff exists, things get easier. Now that I know he is just trying to get the password and not break the encryption, the whole thing smells like lies and anything is possible. ;)

I am the only person in the world who knows my passcode, and it's based on nothing. I have no backups and the phone has never been plugged into iTunes so there is no keybag to gain access to. I want McAfee to "break the encryption" on my phone.
If he was that good, surely he'd be able to write decent anti-virus software.

thesameguy
February 24th, 2016, 11:24 AM
McAfee has been out of his company for years, and Intel now owns it anyway. It just bears his name... and back when he was involved, it was the best.

LHutton
February 25th, 2016, 02:47 AM
I was always told it just gave a load of false positives and missed the important stuff.

I guess the way to hack the iPhone encryption is to hack the Apple computers where the original design is stored, or reconstruct it from the info on the HDDs. I'm surprised the NSA hasn't just done that already. In fact, can't they just get a warrant ordering Apple to provide the data on the iPhone in question anyway?

LHutton
February 25th, 2016, 02:53 AM
Getting someone to cough up their passcode isn't rocket science. People have social engineered the root passwords out of Nokia executives. But, grifting a passcode out of someone isn't science, it's luck and based on a very specific scenario. Touting "breaking encryption" when you're really talking about tricking someone into giving you their password. :down:
Derren Brown could guess their password in 5 seconds.:)

Kchrpm
February 25th, 2016, 04:30 AM
In fact, can't they just get a warrant ordering Apple to provide the data on the iPhone in question anyway?
Yes, but the phone hadn't done a backup to iCloud for a while before the shooting, and can't now because of some password reset or something they did. But they still want to see if there's something on there that wouldn't have been sync'ed.

LHutton
February 25th, 2016, 07:06 AM
So same question again. Can't they just get a warrant calling for Apple to use their knowledge of the encryption to access the data on the actual phone? That way the FBI gets the terrorist's data and nobody else finds out how the encryption works? Or are we dealing with some kind of chain of evidence issue, where the FBI has to retain possession of the phone for legal reasons?

Kchrpm
February 25th, 2016, 07:47 AM
The court would shoot that down. And that's not what the FBI wants.

The FBI wants to set the precedent that, if a judge will sign the warrant, Apple is willing and able to "unlock" one of their phones so that the FBI can search it for the data inside. They have done this by using a very special case:

1) It involves a highly publicized case of homegrown terrorism, arguably one of the scariest things to the American people
2) The device in question, while secure, is not nearly as secure as the devices that have come after it, or the ones that are being developed.

So not only is Apple able to hack the phone in question (in a way that the FBI has detailed, including providing provisions so that the hack as requested will only work on this one particular phone), but public sentiment would default to being in the FBI's favor, because it's a matter of grave national security, not just some suspected pot dealer.

However, once there is a ruling on the books showing that, upon receipt of a warrant from the FBI for data on an encrypted phone, Apple has developed software to unlock that phone, every law enforcement organization in the country is going to use that as precedent. No matter how good the encryption is on iPhones going forward, Apple will be forced to develop backdoor software so that government agencies can get access to it, because they have done it before. And once an encryption backdoor is developed for one person, it's only a matter of time before it gets out there and everyone who wants it can get it.

Which is fine with law enforcement agencies. They don't want anyone's info to be encrypted, except their own. The only privacy that matters is their own.

LHutton
February 25th, 2016, 08:47 AM
I don't necessarily see it that way. And TBH surely a court should be able to consider every case on a case-by-case basis. So I wouldn't expect them to issue a warrant in the case of a suspected pot dealer, but in the case of a definite terrorist (which is what we're talking about here) they would. Hell, after Boston they did random house searches of innocent people; here we're talking about a terrorist's iPhone.

That said, I watched Deutschland 83 and things were much simpler back then, we banned the shipment of western electronics to the Soviet Union.

Kchrpm
February 25th, 2016, 09:39 AM
So I wouldn't expect them to issue a warrant in the case of a suspected pot dealer, but in the case of a definite terrorist
What about an indefinite terrorist? What about someone the FBI is SURE is a terrorist, they just need what's on the phone to prove it? What about someone the FBI is SURE is doing SOMETHING and they just need the phone to figure out what?

That's for a judge to decide, but at some point it's an avenue for abuse. And even if it isn't abused, you're telling someone making the toughest safe in the world that they have to leave a keyhole for the FBI to get into, just in case. Guess what the hacking community will be putting out bounties to crack.

thesameguy
February 25th, 2016, 11:23 AM
Yeah - if the system exists to beat encryption, it will be leaked. Or sold. And then the whole thing is worthless.

Rare White Ape
February 25th, 2016, 12:49 PM
It's… just too hard for the cops (not just FBI, but all levels) to do the job with the extensive laws they already have at their disposal.

So they lobby like crazy to change the rules.

It's like saying it's too hard to prove that string theory works so why don't we just change the way science works to allow some scientists to apply string theory to everything? How convenient!

As far as I know it's pretty well proven that these people killed members of the public. That's murder. It's already illegal.

Now they have to try and find proof of terrorism? Isn't that begging the question? Are they now embarrassed that the myriad surveillance and anti-terrorism laws that have been introduced in the west since 2001 haven't worked?

Kchrpm
February 25th, 2016, 12:56 PM
They're trying to find proof that they were contacting and planning with a terrorist organization, so they can track and go after said terrorist organization.

Now, considering these people had multiple other phones, all of which they burned while leaving this iPhone unharmed, it's unlikely that they would have left one phone intact with evidence against a co-conspirator while destroying the others, but it's possible, and the FBI is using that possibility.

LHutton
February 27th, 2016, 06:57 AM
What about an indefinite terrorist? What about someone the FBI is SURE is a terrorist, they just need what's on the phone to prove it? What about someone the FBI is SURE is doing SOMETHING and they just need the phone to figure out what?

That's for a judge to decide, but at some point it's an avenue for abuse. And even if it isn't abused, you're telling someone making the toughest safe in the world that they have to leave a keyhole for the FBI to get into, just in case. Guess what the hacking community will be putting out bounties to crack.
Surely it's not a problem if the hack stays with Apple employees who already know how to make the back door anyway. No extra vulnerability has been created, and no exploit is with anyone who didn't already know. If the keyhole wasn't effectively already there, it simply wouldn't be possible to make one. But the truth is, the people who made it know how to put it there, with or without this FBI request.

I'm as reluctant as you to concede privacy, but then I don't use a mobile for that very reason. I don't like the idea of a precedent being set to allow hacking on the basis of suspicion, and this precedent should therefore only apply to proven terrorists, and potentially murderers and serious organised criminals too.

Kchrpm
February 27th, 2016, 08:30 AM
Surely it's not a problem if the hack stays with Apple employees who already know how to make the back door anyway.
That's not how that works. If a back door is left, obvious or not, people will find it. If there is a weakness, someone will exploit it. That's how hacking works. And with people saving their credit cards on phones, and using them as their work phones, with access to sensitive information and emails, the interest in hacking phones is only going to get larger.

LHutton
February 27th, 2016, 08:37 AM
Hang on, I'm not getting this right. The hack is on one phone only, or maybe 2 if they both had one. No back door is being left out on some cloud right? The data is then removed and sent to the FBI for examination. I'm not seeing how this back door becomes any more public than it is right now.

Kchrpm
February 27th, 2016, 10:24 AM
Like I said, it sets a precedent. The head of the FBI made a statement saying it wasn't, but then had to recant it because it actually is: there are already a dozen other phones that they will request be unlocked once this one passes, and the NYPD has already said they will use this ruling as legal precedent to force Apple to hack phones for them as well.

This specific hack will be for one phone only, but the judge will be ruling whether Apple is legally forced to hack an iPhone when they receive a warrant. If the answer is yes, then Apple is not legally allowed to make a phone that they cannot hack. They must design all future devices so that they have a backdoor that can be accessed when they receive a warrant. The same backdoor which some other nefarious organization can figure out and use themselves.

Freude am Fahren
February 27th, 2016, 10:43 AM
So I'm with Apple on this based on the knowledge I have of it now.

The analogy for a physical room I keep hearing would only be relevant if the FBI could force lockmakers to unlock doors that they can't find their own way into. Can they?

I'm not so sure however that the precedent would make it illegal for Apple to make a phone that they themselves can't unlock. And assuming Apple admits they can unlock this phone, then isn't the damage already done, as they've confirmed it's doable? Actually doing it doesn't make it any easier for someone else to do, does it?

LHutton
February 27th, 2016, 10:56 AM
Like I said, it sets a precedent. The head of the FBI made a statement saying it wasn't, but then had to recant it because it actually is: there are already a dozen other phones that they will request be unlocked once this one passes, and the NYPD has already said they will use this ruling as legal precedent to force Apple to hack phones for them as well.

This specific hack will be for one phone only, but the judge will be ruling whether Apple is legally forced to hack an iPhone when they receive a warrant. If the answer is yes, then Apple is not legally allowed to make a phone that they cannot hack. They must design all future devices so that they have a backdoor that can be accessed when they receive a warrant. The same backdoor which some other nefarious organization can figure out and use themselves.
Well, a warrant is a warrant, but what's required is a law clarifying when such a warrant can be served demanding a hack like this. Or maybe a privacy law, saying when you relinquish your right to privacy - committing a proven terrorist offence would be top of the list.

If you know the encryption because you designed it, then there's always a back door, unless it's quantum encryption, or based on muscle memory or something.

What happens if you forget your password, is that basically a dead phone then?

LHutton
February 27th, 2016, 11:05 AM
The analogy for a physical room I keep hearing would only be relevant if the FBI could force lockmakers to unlock doors that they can't find their own way into. Can they?
Pretty sure they can given a similar situation.



I'm not so sure however that the precedent would make it illegal for Apple to make a phone that they themselves can't unlock. And assuming Apple admits they can unlock this phone, then isn't the damage already done, as they've confirmed it's doable? Actually doing it doesn't make it any easier for someone else to do, does it?
You wouldn't have thought that breaking their own encryption on one phone makes a difference to anyone else. I think it's already known that it's doable, because Apple designed it, and if it wasn't then there wouldn't be this debate, because the FBI surely can't force them to do the impossible. So the fact we're in a debate is a tacit admission that it is possible.

Kchrpm
February 27th, 2016, 11:34 AM
What happens if you forget your password, is that basically a dead phone then?
For this particular phone, you'd have the password reset using your iCloud password, in the case of iOS. The FBI tried this and screwed up at some point in the process; I forget the details. That's why they're coming to Apple for help. The normal solution for most people would be to factory reset the phone and restore all of your information from iCloud, but the FBI thinks there are things stored on the phone that haven't been backed up to iCloud that they want.

Apparently the most secure physical objects we have, safes, are designed so that, even if there is a known way to physically crack them, it would take 3-5 days and make a lot of noise, and therefore not be useful to criminals in most situations. A software hack for a phone is unlikely to have either of those limitations.

It is indeed possible to hack iOS as it is on the phone they're using; the FBI detailed exactly how they wanted it done and were correct in the way they asked for it. However, future iPhones were not designed the same way. They have a separate internal encrypted section (the secure element), which is physically separated, not just a different partition. FWIU it's designed so that, even if someone hacks the main OS of your phone, they would still have to hack into that separately.

Edit: here's an article that actually says they can't reset your phone passcode remotely as of iOS 8.
http://arstechnica.com/apple/2014/09/apple-expands-data-encryption-under-ios-8-making-handover-to-cops-moot/

And one explaining iOS's encryption and security
http://searchmobilecomputing.techtarget.com/tip/How-iOS-encryption-and-data-protection-work
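
The gist of that second article, as a toy Python sketch (illustrative only - this is not Apple's actual derivation, and the iteration count is invented): the encryption key is derived from both your passcode and a device-unique ID fused into the silicon.

import hashlib, os

device_uid = os.urandom(32)   # stand-in for the UID fused into the chip;
                              # the real one can't be read off the device

def derive_key(passcode: bytes) -> bytes:
    # The key depends on BOTH the passcode and the UID, and the work factor
    # is tuned so each guess costs real time on the device itself.
    return hashlib.pbkdf2_hmac("sha256", passcode, device_uid, 100_000)

Because the UID never leaves the chip, you can't copy the encrypted storage to a rack of GPUs and guess passcodes offline - every guess has to run on the phone, behind the delays and the wipe counter.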

Kchrpm
February 27th, 2016, 11:44 AM
There's plenty of info in articles out there, with more (and more accurate) explanations than ours; if you really want to understand what's going on, I suggest you check those. Last I checked, none of us on here are IT security or hacking experts.

I have found enough to convince me of what the FBI is trying to do and therefore why Apple should try to prevent them from doing it, but if you need more information, it's readily available.

Rare White Ape
February 27th, 2016, 01:54 PM
The analogy for a physical room I keep hearing would only be relevant if the FBI could force lockmakers to unlock doors that they can't find their own way into. Can they?

The difference here is, if the cops want access to one physical room, they get a warrant and it only applies to that one space, but if Apple is forced to open one iPhone, it weakens the integrity of every iPhone.

Freude am Fahren
February 27th, 2016, 04:15 PM
I keep hearing that, and actually put it in my original analogy, but I don't get why that is.

Rare White Ape
February 27th, 2016, 05:02 PM
Unfortunately I'm not enough of an expert on the matter, or with tech security in general, to be able to explain why.

But I'll put it this way: the FBI wants access to one phone, which is OK, but the method they require will likely affect every iPhone, and possibly every Android or Windows (and other) phone on the planet.

To elaborate: the vast majority of these phones don't exist within the jurisdiction of the FBI, so MY phone will be affected as well, and I don't want that to happen. It leaves me and millions of other users open to [tin-foil-hat mode] extra risk of being unfairly targeted by law enforcement* or [less-tin-foil-hatty] opens the risk of criminals accessing our bank accounts, spreading viruses, or causing other havoc that makes us liable for a crime.



*Cops are NOT your friends, no matter what anyone tells you. We've seen people having their lives ruined by lazy, overzealous, careless policing and it's happening more and more with the extra powers that they're being given on a regular basis. I think it's best we don't allow them to have extra access into our lives.

thesameguy
February 27th, 2016, 07:27 PM
The FBI is not asking Apple to break the encryption - they are asking Apple to disable the phone's self-destruct. The FBI is confident that given enough time/attempts, they can crack the encryption... which is true - given enough time ANYONE can crack the encryption, which is why Apple built in the self-destruct in the first place. What grander implications breaking the self-destruct might have is unclear, because nobody is discussing how exactly that would be done. Whether that modification could be done on other phones without Apple's help is unknown. It's impossible to know how this precedent would affect iPhone users at large.

I think Apple resents being asked to help law enforcement in the same way companies like AT&T came to resent all the John Doe Napster etc. IP requests... eventually they just stopped complying. Apple has helped law enforcement in the past, but something about the here and now has caused them to re-evaluate their position. Maybe this mod is difficult to execute, or maybe once done the trick of it would seem obvious and they don't want that getting out, or maybe it's so time-consuming, or so helpful, that Apple is concerned they will be "taken advantage of" like Microsoft or Google have been, fielding thousands of requests from law enforcement per month.

When it appeared that the FBI was asking Apple to break iPhone encryption, I was staunchly against it. That hack has obvious repercussions. Now that the truth of the request has come out, and Apple's primary response has been "we don't want to," I think they are just being dicks. If they would quantify why disabling the self-destruct would have larger implications that affect everyone, I'm totally willing to shift my opinion. Right now, though, it sure seems like they're just saying no because it's a pain for them to say yes. Given Apple's really shitty history with security ("security through obscurity"), I wouldn't doubt for a second that the self-destruct feature is or masks some goofy flaw in the seemingly bulletproof security, and once that veil is lifted everyone sees the man behind the curtain. I don't know, but until the truth of why Apple is choosing this moment to draw their line in the sand comes out, I sadly have to side with the FBI.

Kchrpm
February 27th, 2016, 07:40 PM
I think it's more likely that they're worried about the FBI setting a precedent of asking them to provide tools to get into any locked, encrypted phone, than that they're trying to hide something behind the self-destruct. Especially since they're changing the way security works anyway.

So far Apple (and Google, Facebook and others) have been giving up information stored on their servers, because that was a legal interpretation of another pre-internet law that says if you put your information in the hands of a third party, you give up the right to protect it from a request from law enforcement (I forget if it requires a warrant). Forcing Apple (or whoever else) to assist in breaking the security/encryption of a locked device is being done based on a different law (from the 1700s IIRC), and would set a new precedent. I think that's why they're drawing the line here: it's gone from personal data on their servers to personal data on their customers' devices that has never left them. Many people, like LHutton, feel that anything you put on your phone is accessible by someone somewhere, but as Apple (and Samsung and everyone else) introduce more encryption and secure elements so that they can sell to businesses and governments, they want to break that idea.

overpowered
February 27th, 2016, 10:07 PM
That guy is insane. Of course it's breakable, but the time involved makes doing so almost useless.
This.

overpowered
February 27th, 2016, 10:11 PM
McAfee has been out of his company for years, and Intel now owns it anyway. It just bears his name... and back when he was involved, it was the best.
The last time I tried to use McAfee, I wanted to track him down and beat his head into liquid with a baseball bat because it sucked so bad. That shit blocked Windows Update. I don't know if that was before or after he gave up control though.

MR2 Fan
February 27th, 2016, 11:29 PM
I've never had a good experience with McAfee, even back when he did run it

LHutton
February 28th, 2016, 03:16 AM
I keep hearing that, and actually put it in my original analogy, but I don't get why that is.
That's where I'm at on this too. As far as I can see, it's for one phone, Apple do it secretly and then hand the data to the FBI, so no one who didn't know already knows.

LHutton
February 28th, 2016, 03:20 AM
I think Apple resents being asked to help law enforcement in the same way companies like AT&T came to resent all the John Doe Napster etc. IP requests... eventually they just stopped complying. Apple has helped law enforcement in the past, but something about the here and now has caused them to re-evaluate their position. Maybe this mod is difficult to execute, or maybe once done the trick of it would seem obvious and they don't want that getting out, or maybe it's so time-consuming, or so helpful, that Apple is concerned they will be "taken advantage of" like Microsoft or Google have been, fielding thousands of requests from law enforcement per month.

That's a valid point, and I honestly think the FBI should pay a sum in proportion to the services rendered.


I think it's more likely that they're worried about the FBI setting a precedent of asking them to provide tools to get into any locked, encrypted phone, than that they're trying to hide something behind the self-destruct. Especially since they're changing the way security works anyway.

So far Apple (and Google, Facebook and others) have been giving up information stored on their servers, because that was a legal interpretation of another pre-internet law that says if you put your information in the hands of a third party, you give up the right to protect it from a request from law enforcement (I forget if it requires a warrant). Forcing Apple (or whoever else) to assist in breaking the security/encryption of a locked device is being done based on a different law (from the 1700s IIRC), and would set a new precedent. I think that's why they're drawing the line here: it's gone from personal data on their servers to personal data on their customers' devices that has never left them. Many people, like LHutton, feel that anything you put on your phone is accessible by someone somewhere, but as Apple (and Samsung and everyone else) introduce more encryption and secure elements so that they can sell to businesses and governments, they want to break that idea.
I think the server rule was covered by the same law that covers dustbin searches - my CSI/Law & Order knowledge at work. I'm of the opinion that private property linked to a terrorist act becomes public property after the act. E.g. if someone shoots the President from a private apartment, for the next few months that apartment will not be private, regardless of who owned it.

Kchrpm
February 28th, 2016, 03:46 AM
That is exactly why the FBI is doing this with a known terrorist's phone, and not a suspected one, and not someone known/suspected to have committed a lesser crime.

And Apple isn't going to hand the data to the FBI. They are going to write and install the code that will let the phone be brute forced, and then give the phone to the FBI. As I've said numerous times, the issue is less with this one phone and more with the fact that there will then be a line of people with the same request (the FBI already has 12 more queued up), all on different phones. The solution won't be so simple on other phones, and at some point a vulnerability created by Apple for their own devices will make it out. The FBI will be trying to figure it out so that they can unlock phones in their possession without having to go to Apple with a warrant, then they will share it with other government agencies, both federal and local, and at some point it will leak out and get into the hands of black hat hackers.

Let's say the FBI goes to a safe maker and says "we need you to make a tool that will let us easily guess the code for this one safe, and then for these twelve other ones." Even if that tool is designed for each individual safe, don't you think the FBI will figure out the processes the tool uses to unlock several similar safes from the same company? Wouldn't the FBI then start trying to build their own safe-cracking tool so they wouldn't have to go to the safe company every time? Once they got someone to build that safe-cracking tool, and started building multiples of them and sending them to other law enforcement departments around the country, don't you think one of them would fall into the wrong hands at least long enough for it to be duplicated?

LHutton
February 28th, 2016, 06:18 AM
Not if it's done in a well regulated manner. There's lots of stuff we transfer around that we don't want people getting their hands on. I think the bottom line is that we need firm laws around this issue, about when it's okay and who pays for it etc.

The FBI isn't asking for a tool, they're asking them to allow the phone to be brute forced. The only thing going to the FBI is a phone that can be brute forced. For a safe it may very well be the same if the lock is a password based system.

At the rate people upgrade their phone, I think the slim possibility of it getting into the wrong hands eventually is by-the-by. And TBH I don't think it's an energy efficient use of a criminal's resources to brute force a vulnerable iPhone with a supercomputer anyway, not when there are far more efficient card scams and ID frauds on the go.

thesameguy
February 28th, 2016, 08:45 AM
The solution won't be so simple on other phones, and at some point a vulnerability created by Apple for their own devices will make it out. The FBI will be trying to figure it out so that they can unlock phones in their possession without having to go to Apple with a warrant, then they will share it with other government agencies, both federal and local, and at some point it will leak out and get into the hands of black hat hackers.

This is where you're losing me - not because it isn't plausible, but because Apple hasn't said any of this. Why do you think other phones won't be as simple? Why do you think a phone with hacked, self-destructless firmware will let the FBI clone the tool, creating a "thing" that could be leaked or sold? Up front, I agree that's definitely a possibility, but *if* that tool is easy to reverse engineer, or if it's a tool that could be cloned, then IMHO Apple is working from a crappy place to start and someone will come up with it with or without Apple's help. It also paints Apple as executing a bad security CYA and not a "we love our customers" defensive move. To me, that's a very important distinction.

Kchrpm
February 28th, 2016, 01:11 PM
This is where you're losing me - not because it isn't plausible, but because Apple hasn't said any of this. Why do you think other phones won't be as simple?

Because I have listened to podcasts where an encryption/security expert said that versions of iOS and iPhones after the 5C are much more difficult to hack, and that the solution the FBI is asking for in this case would not work on current devices.

Kchrpm
February 28th, 2016, 01:12 PM
http://www.theverge.com/2016/2/18/11048014/encryption-apple-fbi-iphone-explainer

Freude am Fahren
February 28th, 2016, 04:02 PM
I'm trying to listen to that, but I can't stand the music and sound effects.

edit: Okay, once they get into it, it's just them talking.

Around 18 minutes in, they start talking about how the government could require a backdoor or else the encryption company (Apple in this case) could be in trouble. On this, I am absolutely on Apple's side. I'm fine with the FBI asking Apple to get into a single phone if they can, but I am absolutely against the FBI saying they have to always be able to get into any future phone, just in case. If the government can separate the two ideas, then I say give them access.

Basically, it's fine if the gov. says "If you can get access to this single phone, give it to us, but if you can't, fine."

thesameguy
February 28th, 2016, 04:50 PM
That's what this keeps coming down to - the FBI's request is very simple, very straightforward. In fact, remarkably specific. I don't understand how anyone can extrapolate this specific request into a precedent for anything in a way different than Apple's past help has been a precedent. If Apple breaks the self-destruct on the old phone - call it "Method A" - and the FBI comes knocking with a later phone and Method A doesn't apply, the FBI is back to the beginning. If this request was "do whatever is necessary to get us access to the phone" it would be one conversation about vagaries and precedents, but it's not. The FBI's request is specific and has specific application. It would be like GM tracking down a 2016 Impala specifically using OnStar for the FBI, and the FBI using that as precedent to get GM to track down a 1966 Impala. Well, you see, it doesn't work that way. The technology is different; what you asked for before doesn't apply to what you're asking for now. Historically the law works on concepts, but the intricacies of technology require specifics, and that is demonstrated over and over and over again in everything from criminal to civil suits.

I dunno. Until Apple truly quantifies their stand and reasoning, I think this whole thing is somewhere between being pissy and smoke and mirrors. Based on the information at hand, Apple's position sucks IMHO.

Kchrpm
February 28th, 2016, 05:38 PM
Do you really think a lawyer and judge in a future circumstance is going to make the distinction between unlocking a 5C and a 6S or 7 or whatever? The FBI at first said this wasn't going to set a precedent, then recanted that and said yes, actually, it will.

http://www.theguardian.com/technology/2016/feb/25/fbi-director-james-comey-apple-encryption-case-legal-precedent

Apple *has* quantified their stand and reasoning. They put out a letter detailing it, and Tim Cook was interviewed by ABC News.

https://www.washingtonpost.com/news/on-leadership/wp/2016/02/26/tim-cooks-interview-about-apples-fight-with-the-fbi-may-be-the-most-important-of-his-career/

Kchrpm
February 28th, 2016, 05:41 PM
FFS they're using an "All Writs Act" from 1789 to make this happen. They'll reference something that is 200+ years old as precedent, but not a legal ruling on an iPhone that's 2 or 3 years different in age?

thesameguy
February 28th, 2016, 09:00 PM
A law is not precedent. A precedent is a ruling on a case which has the effect of clarifying existing law.

They are using a law from 1789 because there is no more recent law or precedent which applies. If there was, either the FBI would invoke it or Apple would.

The reason why this ruling is potentially so important is because it is far easier for Apple if a ruling establishes that Apple is never required to help the FBI. That would be *great* for Apple and I understand that.

However, as a guy who works in technology and a guy who works in law, I can assure you what Apple is doing is fear mongering and handwringing. They want this to seem as grandiose as possible, with massive implications everywhere, so that they can nip this in the bud and avoid any future court battles. It's a lot easier for them if they do. Precedents are not trivial to invoke; they are not free-for-alls. Unless "future" iPhones have self-destructs that can be defeated in the same way the 5C's can, the future implications of this precedent are weak at best. To date, in Apple's letter and Cook's interview, I have seen no evidence to the contrary. If you know of some *specific* way this hack would affect Apple's customer base at large, I'd love to see it. There is no technical detail in either of the aforementioned sources - just posturing and handwringing.

Drachen596
February 28th, 2016, 10:01 PM
I think what's going on is Apple has an easy way into their products and doesn't want anyone to know. I'd wager lots of manufacturers do. (Maybe too many technological spy thriller movies/books have been read by me..)

Remember the huge backlash against Microsoft for the always on internet connection and the Kinect always listening? Samsung is having the same thing with their smart tvs right now as well, you can opt out... but it still listens for specific words.

People somehow still value privacy while spreading all sorts of stuff about their lives on social media.

Kchrpm
February 29th, 2016, 04:29 AM
tsg-
Fair enough. I don't have enough information, and Apple isn't going to release enough (or doesn't have enough) to convince you. Neither of us have any say in what's happening, so it doesn't matter much either way.

Drach-
The way into this iPhone isn't "easy" for anyone outside of Apple, because it requires an OS update that is signed by Apple and installed using their tools. Most Android devices, their major competitor, would be easier to do this with AFAIK.

Also, there's a big difference between sharing what we choose to with who we choose to or even semi-publicly, and opening up anything and everything we do with or near a certain device without choice to a company or government.

thesameguy
February 29th, 2016, 08:35 AM
The way into this iPhone isn't "easy" for anyone outside of Apple, because it requires an OS update that is signed by Apple and installed using their tools. Most Android devices, their major competitor, would be easier to do this with AFAIK.

That - and the derivative facts - is why I think Apple is crying wolf. Three reasons:

1. The 5C does not have the "secure enclave" hardware the later phones do, so subverting the passcode entry delay requires only updating iOS. That would not be sufficient on A7+ phones, which would also require a firmware update to SE. It would have to be tested in court whether "bypass delays imposed by iOS" also extends to "bypass delays imposed by SE." I think it likely would not based on the language of the FBI request.
2. Executing the bypass requires custom firmware and a signed cert to load it. Apple does not have to relinquish either to the FBI, especially the cert. That substantially limits the possibility of any of this leaking for misuse - unless Apple was at risk of leaking it in the first place!
3. The custom firmware - both iOS and SE if that ever comes up - can be signed to the hardware, making it impossible to port to another device (toy sketch below). If "signed to the device" was not a sufficient roadblock, then the FBI wouldn't be where it is right now anyway, suggesting these tools are safe.
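
Here's a toy sketch of that "signed to the hardware" idea from point 3. The real chain uses public-key signatures and per-device personalization, so take this HMAC stand-in as illustrating only the binding:

import hashlib, hmac, os

signing_key = os.urandom(32)    # stand-in for Apple's signing secret; never ships

def sign_firmware(image: bytes, target_uid: bytes) -> bytes:
    # The signature covers the image AND the target device's unique ID...
    return hmac.new(signing_key, image + target_uid, hashlib.sha256).digest()

def device_accepts(image: bytes, sig: bytes, my_uid: bytes) -> bool:
    # ...so a device only boots an image signed for its own UID. The same
    # image and signature presented to any other device fails this check.
    expected = hmac.new(signing_key, image + my_uid, hashlib.sha256).digest()
    return hmac.compare_digest(sig, expected)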

IMHO, Apple and its customers are well-protected. Handing hacked phones to the FBI carries no implicit risk of that tech leaking. You'd have to let the imagination run pretty wild to see how the court order has any effects on Apple's user base at large. Really, IMHO, they just don't want to be arsed to help the FBI in the same way that Jobs didn't want to be arsed to help charities. Or pay his parking tickets. ZING.

Kchrpm
February 29th, 2016, 08:43 AM
Actually Jobs is owed money by San Francisco because he overpaid on his parking tickets: http://gizmodo.com/san-francisco-still-owes-steve-jobs-150-for-overpaid-p-1761886955

And my point has always been that, though whatever tool they write for this phone may not be applicable to any other device, the legal precedent will be set and used in the future to force them to unlock other phones in whatever ways it takes. You don't believe that will happen; I do. I have accepted that I cannot change your mind on that, but my mind has not changed on it as a result.

Kchrpm
February 29th, 2016, 08:45 AM
Listening to a discussion on The Verge, they just said the wiretapping law Apple refers to in some arguments specifically says it does not authorize the government to force companies to make their products so that they can be tapped. It doesn't, however, say that it *can't* be done, just that the wiretapping law doesn't give the authorization. #funwithlegalese

thesameguy
February 29th, 2016, 09:04 AM
Yeah........... http://www.edibleapple.com/2011/10/27/the-story-behind-steve-jobs-mercedez-benz-and-its-missing-license-plate/

;)

Kchrpm
February 29th, 2016, 09:14 AM
I'm not here to argue that Steve Jobs is a good person, I just happened to see that article the other day :)

thesameguy
February 29th, 2016, 09:36 AM
Listening to a discussion on The Verge, they just said the wiretapping law Apple refers to in some arguments specifically says it does not authorize the government to force companies to make their products so that they can be tapped. It doesn't, however, say that it *can't* be done, just that the wiretapping law doesn't give the authorization. #funwithlegalese

The wiretapping law is probably one of the worst-written laws as it applies to technology today. I am sure when it was written it seemed pretty progressive, but in light of what we do these days it needs a major overhaul. Personally, I think it's funny that Apple invoked that law, because historically it hasn't done anybody any good... It has at multiple points screwed every major telco and internet carrier in the same way Apple is trying to say it helps them. Bad lawyering, IMHO. Or, maybe, Apple doesn't have a legal leg to stand on. :shrug:

Kchrpm
February 29th, 2016, 09:46 AM
Or, maybe, Apple doesn't have a legal leg to stand on. :shrug:
Yeah, I don't actually think Apple is going to win this argument, the FBI did a good job of framing their request the way they did.

Kchrpm
February 29th, 2016, 09:56 AM
http://www.engadget.com/2016/02/29/apple-fbi-congress-statement/

"Read Apple's opening statement to Congress over its FBI fight"

If you're so inclined.

thesameguy
February 29th, 2016, 10:32 AM
Nice.

That reads about how I would expect it to read. I think Sewell did a nice job of framing his argument and incorporating the pros and cons of the current situation. I am very interested to see how this bit is dissected:


As we have told them — and as we have told the American public — building that software tool would not affect just one iPhone. It would weaken the security for all of them. In fact, just last week Director Comey agreed that the FBI would likely use this precedent in other cases involving other phones. District Attorney Vance has also said he would absolutely plan to use this on over 175 phones. We can all agree this is not about access to just one iPhone.

I absolutely agree this is not about one iPhone - clearly. And I trust law enforcement just enough to know they have been waiting to tag "terrorist" onto an investigation so they can really compel Apple to hack the 175 phones they've been storing. ;) Totally. But every expert opinion I've read indicates nothing about this hack would inherently be universal, and Apple could roll the hacks out on a per-phone basis, retaining the keys to the kingdom as they always have. Obviously, Apple isn't going to volunteer that information, as it completely undermines their case. They have to come in with a sky-is-falling approach... That's how this system works.

The crappy thing from my perspective is that:

a) I believe Apple can execute this request without any material compromise to iPhone security
b) That the FBI's current request is reasonable
c) Our Congress is so fucking inept that no matter how these hearings go I will have zero confidence any truth or fact will be admitted, and that whatever decision is reached will be based on voodoo and witchcraft and not actual understanding of the matter at hand
d) Apple is as bad as any American corporate entity, and they are basing their case on voodoo and witchcraft - I mean the bottom line - and not pragmatism or good engineering
e) The FBI would leverage a positive ruling into additional demands but that those additional demands would have to be tested in the same way as this demand

WRT the Writs and Wiretapping Acts, it's positively clear there is no one-size-fits-all law that can be written. Technology and society wrapped up together are way too complex for that. But, especially as it applies to c) above, our Congress is too idiotic to understand that, much less churn out some good decision making... so we're reduced to producing a one-size-fits-all precedent, which does nobody on either side any favors.

I definitely have The Fear that Apple has regarding the final ruling, but I can't in good faith support their position right this second.

Drachen596
February 29th, 2016, 11:04 AM
FWIW I think Apple is actually worse than most of the Big Bad Corporations, but people love them so it's cool, apparently.

Side thought on this particular phone though. Does one retain the right to privacy after death?

thesameguy
February 29th, 2016, 11:40 AM
Depends on the circumstances and what you're trying to keep private. Things like medical records are legally protected for 50 years after you die. Things that the government collects about you typically cannot be disseminated without your agreement, so if you're dead they are typically protected indefinitely. Sometimes Freedom of Information Act records can override government right to privacy assurances - but not always and maybe not even usually. Things collected about you by private entities - like your Facebook page - are typically governed by Terms of Use or other licenses, and those agreements made while you're alive typically remain in force even after you're dead. Depending on the specific agreement, sometimes those agreements can be overridden by your will or a pre-death signed power of attorney. Some things like contracts or NDAs are governed by the agreement or state law.

In the case of a suspected terrorist's property the rights that would normally be in question are protected, however a valid warrant can override them just as if they were alive. So, if the FBI seizes a phone of a suspected terrorist, a warrant is what's needed for them to legally attempt access. Just because you are dead does not give the government carte blanche access to your stuff. Typically speaking, alive or dead, the ToS or (E)ULA of most products and services require you to agree that whatever company is providing the product or service will comply with legal government or law enforcement information requests. Apple/iCloud/iTunes certainly does.

There was something specific I was going to mention as something I'm not sure about... but now I can't remember what I don't know.

Microsoft, Google, and every other major information gathering or housing company has been through what Apple is going through. I'm sure plenty of folks around here remember the NSA/RSA kerfuffle back in the '90s. That was just the beginning. Actually, that's a really good thing to mention, because that decision would be an example of something I am opposed to - the intentional weakening of security for the satisfaction of the government. That is not okay; that is what the Wiretap Act vaguely (hey, it's old) seeks to prevent. But in order to tell the government to piss off, I'd need it proven that the request does in fact weaken the whole in a practical way. If the request does not, then it should be complied with. Here is a good summary of the RSA debacle:

http://www.farces.com/nsa-paid-rsa-us10-million-to-include-encryption-back-door/

thesameguy
February 29th, 2016, 11:51 AM
Mentioning RSA made me wonder what Bruce Schneier had to say.

https://www.schneier.com/blog/archives/2016/02/judge_demands_t.html

Not much, but the commentary is really worthwhile.

Also, just to keep us on track, still :lol:@McAfee.

LHutton
March 1st, 2016, 01:12 AM
Side thought on this particular phone though. Does one retain the right to privacy after death?
Not if you're a terrorist would be my thoughts.

LHutton
March 12th, 2016, 01:37 AM
http://www.digitaltrends.com/mobile/apple-fbi-justice-department-threat/


Department of Justice threatens to take Apple’s iOS source code in San Bernardino case

Freude am Fahren
March 14th, 2016, 07:37 AM
https://www.youtube.com/watch?v=zsjZ2r9Ygzw

overpowered
March 14th, 2016, 05:33 PM
I've generally not been a fan of Apple but I always liked Woz. He was always a bit different than the rest of Apple.


https://www.youtube.com/watch?v=GsK9_jaM-Ig

thesameguy
March 28th, 2016, 02:18 PM
Maybe McAfee isn't lying...

http://www.usatoday.com/story/news/nation/2016/03/28/apple-justice-department-farook/82354040/


The Justice Department is expected to withdraw from its legal action against Apple, as soon as today, as an outside method to bypass the locking function of a San Bernardino terrorist’s phone has proved successful, a federal law enforcement official said Monday.

The official, who is not authorized to comment publicly, said the method brought to the FBI earlier this month by an unidentified entity allows investigators to crack the security function without erasing contents of the iPhone used by Syed Farook, who with his wife, Tashfeen Malik, carried out the December mass shooting that left 14 dead.

This is definitely not a shock to me - I still think it's a strong possibility that Apple made a big deal out of this in order to cover up a vulnerability they were afraid might come out. Plus, you know, the gubment gets what the gubment wants.

LHutton
April 1st, 2016, 04:15 PM
McAfee's shoe breathes a sigh of relief.

Kchrpm
April 8th, 2016, 10:23 AM
Anyone surprised?

http://www.theverge.com/2016/4/8/11203928/feinstein-burr-encryption-bill-required-to-unlock-data


Today, legislators from the House and Senate introduced a new bill that would require smartphone manufacturers to decrypt data in response to law enforcement demands. Introduced by Senators Dianne Feinstein (D-CA) and Richard Burr (R-NC), the bill does not establish any new civil or criminal penalties for companies unable to comply, simply stating companies "must provide in a timely manner responsive, intelligible information or data, or appropriate technical assistance to obtain such information."

So, like I predicted before, the government is introducing a bill that will require every smartphone company to build in a backdoor to every device, no matter what, so that they can be compliant with this requirement. But there's no way at all that criminal hackers will find these backdoors!

thesameguy
April 8th, 2016, 11:59 AM
I am not sure where you are getting that from the legislation. It specifies no penalties for an inability to comply, only that if they can, they must. It does not require designing in any back doors, only facilitating access where they exist. There are plenty of ways to build hack-proof devices; Apple just didn't do that - they put the key to the encryption on the device and allow signed firmware to preempt the OS. If either or both of those things were different, there would be no approach to get the key. In the case of the iPhone, the back doors are already built in, and the government is demanding access to them.

As virtually always, the flaw here is that Apple ass-ended into security and then tried to abstract areas from the user to keep it simple. There are no shortcuts to good security.

Knowing what we know now - that the FBI didn't even actually need Apple because the security is somehow flawed in a way we don't know about - Apple should have just complied rather than turning this into a spectacle that forced the government to push. :down:

Kchrpm
April 8th, 2016, 12:10 PM
But we're not talking about just Apple, or just about a device where they could do it. I don't see this stopping at "hey, if you can help, you should, but if you don't, that's cool." It'd be great if that is where it stopped, but I don't see this as anything other than a slow build to a mandatory back door. #conspiracytheory

thesameguy
April 8th, 2016, 02:12 PM
I am sure the gubment would like that, but it seems unlikely. Review the RSA/PGP stuff I posted a while back. The PTBs have a bad track record with this stuff. The Apple Debacle - as I will henceforth refer to it - was pointless posturing on both sides and the consumer is the one that now suffers from people making mountains out of molehills.