Apple challenges 'chilling' demand to decrypt San Bernardino shooter's iPhone

serpentdove

Banned
[Apple challenges 'chilling' demand to decrypt San Bernardino shooter's iPhone: Tim Cook publicly attacks the US government for asking Apple to take an ‘unprecedented step which threatens the security of our customers’ by Stuart Dredge and Danny Yadron] "Apple has hit back after a US federal magistrate ordered the company to help the FBI unlock the iPhone of one of the San Bernardino shooters, with chief executive Tim Cook describing the demand as “chilling”.

The court order focuses on Apple’s security feature that slows down anyone trying to use “brute force” to gain access to an iPhone by guessing its passcode. In a letter published on the company’s website, Cook responded saying Apple would oppose the order and calling for public debate.

“The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand,” he wrote.

While Cook took pains to stress that Apple was “shocked and outraged” by the San Bernardino shooting last December – “we have no sympathy for terrorists” – he said the company is determined to push back against the court order.

Cook wrote that opposing the order “is not something we take lightly”.

“We feel we must speak up in the face of what we see as an overreach by the US government,” he added..." Full text: Apple challenges 'chilling' demand to decrypt San Bernardino shooter's iPhone Jn 8:36, Re 13:17
 

chrysostom

Well-known member
Hall of Fame
someone please tell neil cavuto
he is an idiot
he wants apple to just get into one phone
not all the phones
listen up you idiots
apple cannot break into their phone
but
they can figure out how
if
anyone can
but
once they figure out how to get into one
they can get into any of them
how are you going to protect the info?
not everyone is an idiot
 

serpentdove

Banned
They will give it up and obey the FBI.

Ask China how to do it. :idunno:

This is why the FBI can't hack into iPhones
 

chrysostom

Well-known member
Hall of Fame
how much time have we wasted on the possibility of someone hacking into hillary's server?
how much time have we spent on china hacking into government files?
and
now we risk more of all of the above to get into one phone which most likely will produce zero useful information
 

Ask Mr. Religion

☞☞☞☞Presbyterian (PCA)
Gold Subscriber
LIFETIME MEMBER
Hall of Fame
So the FBI can't find one hacker to hack into the iPhone? I find that hard to believe.
That would be quite a hack. Apple used to store the encryption keys that were used making recovery much easier via a hack. With the iOS8, that feature was removed and made a selling point by Apple for users wanting extreme privacy.

At the risk of someone running the Echelon services contacting TOL for removal of this post, given all the likely trigger words used here, some thoughts...

Short of using a means of electronic submission of encryption key guesses, which is what the FBI wants Apple to provide as a "feature" for its iOS8+ phones, there is no feasible means of manually doing so, unless one can live for thousands of years to mount a brute-force attack on determining the encryption key the user used. ;)

Using a phone connected to a high-performance network of computers that just starts guessing key numbers electronically submitted to the phone's hardware would bring the effort within the realm of possibility. Depending upon the power of the computers used (they are often customized devices with thousands of parallel FPGAs made for these specific efforts), it may take weeks to accomplish, but it is feasible.
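To put rough numbers on the difference between attacking the key itself and attacking the passcode that unlocks it, here is a back-of-the-envelope sketch. The guess rates are assumptions chosen only for illustration; the ~80 ms per on-device attempt is the figure Apple's own security documentation has cited for its hardware-entangled key derivation.

```python
# Back-of-the-envelope comparison: brute-forcing a full 256-bit key versus a
# short numeric passcode. All rates below are illustrative assumptions.

SECONDS_PER_YEAR = 60 * 60 * 24 * 365

# Full 256-bit key space, attacked off-device by a hypothetical parallel FPGA
# rig testing a trillion keys per second.
key_space = 2 ** 256
fpga_rate = 1e12                                  # guesses per second (assumed)
years_for_key = key_space / fpga_rate / SECONDS_PER_YEAR
print(f"256-bit key, off-device: ~{years_for_key:.1e} years")      # ~4e57 years

# A 6-digit numeric passcode, tried on the device itself at ~80 ms per attempt,
# since the key derivation is entangled with the phone's hardware.
passcode_space = 10 ** 6
on_device_rate = 1 / 0.080                        # ~12.5 guesses per second
hours_for_passcode = passcode_space / on_device_rate / 3600
print(f"6-digit passcode, on-device: ~{hours_for_passcode:.0f} hours")  # ~22 hours
```

In other words, the full key is out of reach for any realistic rig; the practical target is the short passcode that unlocks it, provided the retry limits on the device can be bypassed.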

I do wonder if the FBI's forensic experts are trying all methods, however. For example a "man in the middle" attack seems something worth trying, to wit:

1. Assume the encryption algorithm is known, e.g., DES, etc.
2. Enter a pangram string, "The quick brown fox jumps over the lazy dog"
3. Transmit the pangram
4. Intercept the transmission ("man in the middle") using Sting-Ray hardware
5. Deconstruct the transmission to its constituent bytes
6. Compare with an instrumented, stand-alone device, specifically intended as the recipient of the transmission in Step 3 containing the known encryption algorithm. The instrumented device would essentially be an Apple phone tear-down, laid out on a large printed circuit with connection probes available for logic analyzers, etc.
7. From the comparison determine more reasoned guesses of the used encryption key...proceed accordingly, going back to step 2, replace the pangram with high frequency alphabetics, and using more knowledge gained about the user's key.
8. Repeat until successful.

Admittedly, step 7 is where the "magic" happens. It includes knowledge of what probability density functions of the encryption algorithm can be exploited. In other words, finding truly random processes within a Von Neumann computer architecture (processing, storage, I/O) is highly improbable, hence statistical regularities can be exploited where they are found. Consumer devices would need something like quantum computing features, yet most rely simply upon transistor junction electronic noise as a random number generator "seed". Even then, the manufacturing process used to create the semiconductor transistor has known statistical properties that can be exploited in "guesses".
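As a concrete illustration of what "exploiting statistical regularities" means, here is a minimal sketch that applies a chi-squared test to the output of a deliberately skewed byte source. The weak_source below is purely hypothetical; it stands in for any hardware random number generator whose physical noise process is not perfectly uniform, and says nothing about Apple's actual implementation.

```python
# Minimal sketch: detecting bias in a supposedly uniform random byte source.
# weak_source() is hypothetical -- a stand-in for a hardware RNG whose
# physical noise source (e.g., transistor junction noise) is slightly skewed.
import random
from collections import Counter

def weak_source(n, bias=0.02):
    """Emit n bytes, occasionally clamping to a low value to simulate skew."""
    out = []
    for _ in range(n):
        b = random.randrange(256)
        if random.random() < bias:
            b &= 0x0F                      # contrived, illustrative bias
        out.append(b)
    return out

def chi_squared(samples):
    """Chi-squared statistic of the samples against a uniform byte distribution."""
    counts = Counter(samples)
    expected = len(samples) / 256
    return sum((counts.get(v, 0) - expected) ** 2 / expected for v in range(256))

samples = weak_source(100_000)
print(f"chi-squared statistic: {chi_squared(samples):.1f}")
# With 255 degrees of freedom, a statistic far above ~300 means the source is
# detectably non-uniform -- exactly the kind of regularity an attacker can model.
```

Detecting the bias is the easy half; turning it into better key guesses is the expensive half.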

Naturally, this is the stuff of NSA code-breaking and the associated huge financial budgets, so I doubt the FBI forensic team has the budget or the folks of the caliber needed. These persons are usually ones who can pass the equivalent of Harvard's Math 25 or Math 55 courses, often used to identify possible NSA code-breakers for recruitment.

Full disclosure: I owned a company in the late nineties that indirectly helped Harris Corp. in their development of the first Sting-Ray device, introduced in 2001.

AMR
 

bybee

New member
That would be quite a hack. Apple used to store the encryption keys that were used making recovery much easier via a hack. With the iOS8, that feature was removed and made a selling point by Apple for users wanting extreme privacy.

Short of using a means of electronic submission of encryption key guesses, which is what the FBI wants Apple to provide as a "feature" for its iOS8+ phones, there is no feasible means of manually doing so, unless one can live for thousands of years to mount a brute-force attack on determining the encryption key the user used. ;)

Using a phone connected to a high-performance network of computers that just starts guessing key numbers electronically submitted to the phone's hardware would bring the effort within the realm of possibility. Depending upon the power of the computers used (they are often customized devices with thousands of parallel FPGAs made for these specific efforts), it may take weeks to accomplish, but it is feasible.

I do wonder if the FBI's forensic experts are trying all methods, however. For example a "man in the middle" attack seems something worth trying, to wit:

1. Assume the encryption algorithm is known, e.g., DES, etc.
2. Enter a pangram string, "The quick brown fox jumps over the lazy dog"
3. Transmit the pangram
4. Intercept the transmission ("man in the middle") using Sting-Ray hardware
5. Deconstruct the transmission to its constituent bytes
6. Compare with an instrumented, stand-alone device containing the known encryption algorithm
7. From the comparison determine more reasoned guesses of the used encryption key...proceed accordingly, going back to step 2 and using more knowledge gained about the user's key.

This is the stuff of NSA code-breaking, so I doubt the FBI forensic team has folks of the calibre needed, as these persons are usually ones who can pass the equivalent of Harvard's Math 25 or Math 55 courses, often used to identify possible NSA code-breakers for recruitment.

Full disclosure: I owned a company in the late nineties that helped Harris Corp. in their development of the first Sting-Ray device, introduced in 2001.

AMR

Um... well... I was getting to that!:D
 

rexlunae

New member
That would be quite a hack. Apple used to store the encryption keys that were used making recovery much easier via a hack. With the iOS8, that feature was removed and made a selling point by Apple for users wanting extreme privacy.

At the risk of someone running from the Echelon services contacting TOL for removal of this post, given all the likely trigger words used here, some thoughts...

Short of using a means of electronic submission of encryption key guesses, which is what the FBI wants Apple to provide as a "feature" for its iOS8+ phones, there is no feasible means of manually doing so, unless one can live for thousands of years to mount a brute-force attack on determining the encryption key the user used. ;)

Using a phone connected to a high-performance network of computers that just starts guessing key numbers electronically submitted to the phone's hardware would bring the effort within the realm of possibility. Depending upon the power of the computers used (they are often customized devices with thousands of parallel FPGAs made for these specific efforts), it may take weeks to accomplish, but it is feasible.

I do wonder if the FBI's forensic experts are trying all methods, however. For example a "man in the middle" attack seems something worth trying, to wit:

1. Assume the encryption algorithm is known, e.g., DES, etc.
2. Enter a pangram string, "The quick brown fox jumps over the lazy dog"
3. Transmit the pangram
4. Intercept the transmission ("man in the middle") using Sting-Ray hardware
5. Deconstruct the transmission to its constituent bytes
6. Compare with an instrumented, stand-alone device, specifically intended as the recipient of the transmission in Step 3 containing the known encryption algorithm. The instrumented device would essentially be an Apple phone tear-down, laid out on a large printed circuit with connection probes available for logic analyzers, etc.
7. From the comparison determine more reasoned guesses of the used encryption key...proceed accordingly, going back to step 2, replace the pangram with high frequency alphabetics, and using more knowledge gained about the user's key.
8. Repeat until successful.

Admittedly, step 7 is where the "magic" happens. It includes knowledge of what probability density functions of the encryption algorithm can be exploited. In other words, finding truly random processes within a Von Neumann computer architecture (processing, storage, I/O) is highly improbable, hence statistical regularities can be exploited where they are found. Consumer devices would need something like quantum computing features, yet most rely simply upon transistor junction electronic noise as a random number generator "seed". Even then, the manufacturing process used to create the semiconductor transistor has known statistical properties that can be exploited in "guesses".

Naturally, this is the stuff of NSA code-breaking and the associated huge financial budgets, so I doubt the FBI forensic team has the budget or the folks of the caliber needed. These persons are usually ones who can pass the equivalent of Harvard's Math 25 or Math 55 courses, often used to identify possible NSA code-breakers for recruitment.

Full disclosure: I owned a company in the late nineties that indirectly helped Harris Corp. in their development of the first Sting-Ray device, introduced in 2001.

AMR

From what I've read of the request, what the FBI is asking is actually a lot simpler than that. They're not asking Apple for the keys. They want Apple to make a custom version of iOS that won't honor the setting to delete all user data after some number of wrong password attempts. The code for that fundamentally must be somewhere unencrypted, because it has to run without the correct password having been entered. So the implications for the phone's strong crypto features are limited. But they are asking Apple to make a tool that could, in principle, be used to attack any iPhone, which implicates the security of billions of devices around the world.

It sounds like what the FBI wants to do is fairly unsophisticated. They want to hold the phone in their hands and try passwords in the hopes of guessing the right one, which might prove prohibitive even if Apple does give them what they want.
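For what it's worth, here is a minimal sketch of what that guessing looks like once the auto-erase and retry delays are out of the way. The try_passcode function is hypothetical; nothing like it exists in stock iOS, which is a large part of why the FBI is asking Apple rather than doing it themselves.

```python
# Sketch of "electronic submission of passcode guesses" against a device whose
# auto-erase and retry delays have been disabled. try_passcode() is hypothetical:
# it stands in for whatever interface the requested custom firmware would expose.
from itertools import product

def try_passcode(guess: str) -> bool:
    """Hypothetical stand-in: submit one guess to the device, report success."""
    raise NotImplementedError("illustrative only")

def brute_force(length: int = 4):
    # With stock firmware this loop is pointless: ten wrong guesses can trigger
    # erasure, and escalating delays throttle each attempt. With the requested
    # modification it becomes a plain enumeration of 10**length candidate codes.
    for digits in product("0123456789", repeat=length):
        guess = "".join(digits)
        if try_passcode(guess):
            return guess
    return None
```

The point is that the obstacle isn't the cryptography itself; it's the device's policy of punishing wrong guesses, which is what the order, as described above, asks Apple to switch off.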
 

Ask Mr. Religion

☞☞☞☞Presbyterian (PCA)
Gold Subscriber
LIFETIME MEMBER
Hall of Fame
From what I've read of the request, what the FBI is asking is actually a lot simpler than that. They're not asking Apple for the keys. They want Apple to make a custom version of iOS that won't honor the setting to delete all user data after some number of wrong password attempts. The code for that fundamentally must be somewhere unencrypted, because it has to run without the correct password having been entered. So the implications for the phone's strong crypto features are limited. But they are asking Apple to make a tool that could, in principle, be used to attack any iPhone, which implicates the security of billions of devices around the world.

It sounds like what the FBI wants to do is fairly unsophisticated. They want to hold the phone in their hands and try passwords in the hopes of guessing the right one, which might prove prohibitive even if Apple does give them what they want.
You may very well be correct. I took my assumption of what the FBI was seeking from a discussion I read this morning on slashdot. In the case of just accessing the phone via a user's PIN or whatnot, it would seem the problem is well within reach using known profiling methods of a user's usual proclivities, available password apps, etc., for setting up phone access PINs. Some may even use fingerprints, so if that were the case, compelling the criminal to provide a fingerprint or using the booking fingerprints seems doable. That all said, even if they gain access to the phone, some folders therein may have been encrypted separately by the user.

AMR
 

chrysostom

Well-known member
Hall of Fame
stop listening to the fbi
and
start listening to what apple is saying

they are asking for something that doesn't exist
apple might be able to create it
but
the security of all their other phones would be put at risk
you can't protect the software once it is created
that is the selling point of their phone
 

rexlunae

New member
they are asking for something that doesn't exist
apple might be able to create it
but
the security of all their other phones would be put at risk
you can't protect the software once it is created
that is the selling point of their phone

That's certainly true. But it's also true that if the protection mechanism in place is simply that the software hasn't been written, that in itself can be a very strong mechanism. Still, it might well be doable by a competent hacker without Apple's help.
 

chrysostom

Well-known member
Hall of Fame
That's certainly true. But it's also true that if the protection mechanism in place is simply that the software hasn't been written, that in itself can be a very strong mechanism. Still, it might well be doable by a competent hacker without Apple's help.

I don't think that is true
it is much easier for apple to do it
if
it is at all possible
it is my understanding that it may not be
 

rexlunae

New member
I don't think that is true
it is much easier for apple to do it
if
it is at all possible
it is my understanding that it may not be

It's definitely easier for Apple to do. And if there is any real crypto involved, it likely isn't possible for anyone in tractable time using only widely available technology. But the code that controls deletion of personal data after some number of failed attempts to log in must be somewhere unencrypted, or it couldn't be run.
 

chrysostom

Well-known member
Hall of Fame
It's definitely easier for Apple to do. And if there is any real crypto involved, it likely isn't possible for anyone in tractable time using only widely available technology. But the code that controls deletion of personal data after some number of failed attempts to log in must be somewhere unencrypted, or it couldn't be run.

encryption is a selling point for both apple and google
and
that is a biggie
I have been warned many times by gmail that
if
I forget my password
I will have to start a new account
 