Security of data
Posted by: Chris G on 20 February 2016
So Apple is refusing to develop software to allow the US authorities to hack into a dead terrorist's iPhone. This probably hampers the fight against terrorism and is potentially a risk to security. Is Apple right to take this position? Apple prides itself on its stance for the rights of the individual, but itself doubtless gets much information from its users' activities.
This is simple. Just get U2 to break into the iPhone; after all, they managed to get themselves onto all of them recently.
Don Atkinson posted:
"I prefer the option of strong, but not absolute encryption. With the usual safeguards for "Authorised" access and punishment for "unauthorised" access".
I think I have to agree with you, even though I am sure that whatever safeguards are introduced will regularly be compromised.
It's not just Apple and iPhones; the UK Government was pushing for all ISPs to store all internet usage indefinitely for all customers, which was ruled out as not commercially viable because of the cost of storage.
Yes, I must admit that although I do not support Apple's stance in respect of the proven terrorist's phone, I really don't trust the UK Government (especially the current lot) to implement safeguards were they to force through such a policy. I suspect that we would end up with a free-for-all, with all Government departments having pretty much unrestricted 'fishing' access to any information they desire (bad enough in itself), but with the added risk that the data is not secured adequately to prevent completely unauthorised access or leaks.
http://www.smh.com.au/national...20160222-gn0qip.html
This sort of idiocy informs public opinion. It leads to us all thinking that the so-called "threat" is significant, and that we must give in to fear, paranoia, racism and general hatred of people "not like us" because the consequences of anything else are unthinkable. Won't somebody please think of the children. It is idiotic that this oxygen thief is given any press time at all. Ridiculous.
Breaking into an iPhone is just like breaking into a private home: we just need a warrant from the court to get Apple to break into this particular iPhone. We just do not want Apple to create a backdoor to every iPhone.
The dead terrorist actually forfeited his rights to privacy.
Peter Dinh posted:Breaking into an iPhone is just like breaking into a private home: we just need a warrant from the court to get Apple to break into this particular iPhone. We just do not want Apple to create a backdoor to every iPhone.
The dead terrorist actually forfeited his rights to privacy.
Yes, they do want Apple to create a "back door".
It's like living in a world where the standard 5-lever lock is routinely picked, so a company creates a 10-lever lock. Only the FBI want to fit a 1-lever lock to your kitchen door, just in case they get a warrant to search your home.
I agree that the FBI would like Apple to create a "back door", but my understanding is that in this case (and a number of others), the FBI have asked Apple to provide access to the contents of this particular phone.
As Peter has suggested above, this is no different from the police (or FBI) requesting and obtaining a search warrant to search physical premises.
If Apple is able to comply, then in my opinion it should in this particular case. If it cannot comply because it has no means of doing so, then it should simply say so. In the event that the FBI is specifically asking for a "back door" to be introduced in all new versions of iOS, then that is entirely different, and something that Apple should pursue in the courts.
Hmack posted:I agree that the FBI would like Apple to create a "back door", but my understanding is that in this case (and a number of others), the FBI have asked Apple to provide access to the contents of this particular phone.
Actually I'm going to slightly change what I wrote just two posts above...
The FBI don't want Apple to create a backdoor, but they do want Apple to weaken the security on the front door for this (and maybe other) phones by removing the feature that stops you simply trying every combination of four-digit passcode: 0000, 0001, 0002, 0003, etc. As standard, the iPhone is usually set to delete the data (well, actually the encryption key) after 10 wrong guesses, and there is also a time-out: after 4 wrong guesses you have to wait a minute, after the 5th you wait 5 minutes, after the 6th 15 minutes, and after the 7th and each subsequent guess an hour.
The FBI want Apple to provide a special version of the OS (and a method to install it) to remove the 10 guesses and the time out delay.
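The delay schedule described above can be sketched roughly as follows. The numbers come from the post itself, not from Apple's documentation, so treat them as assumptions:

```python
def retry_delay(wrong_guesses: int) -> int:
    """Forced wait (seconds) after this many consecutive wrong passcodes."""
    if wrong_guesses < 4:
        return 0            # the first few guesses carry no penalty
    if wrong_guesses == 4:
        return 60           # 1 minute
    if wrong_guesses == 5:
        return 300          # 5 minutes
    if wrong_guesses == 6:
        return 900          # 15 minutes
    return 3600             # 1 hour from the 7th guess onwards

WIPE_AFTER = 10  # the encryption key is discarded on the 10th wrong guess

# Total forced waiting before the wipe (delays after guesses 1 through 9):
total_wait = sum(retry_delay(n) for n in range(1, WIPE_AFTER))
print(total_wait)  # 12060 seconds, i.e. about 3 hours 21 minutes
```

Even before the wipe kicks in, an attacker gets at most nine guesses spread over more than three hours, which is why the FBI's request targets exactly these two features.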
The main worry is that although the FBI headlines are about this one phone, it's the inevitable creep. Once Apple has done it for a terrorist's phone, what about a kidnapper's phone, a rapist's phone, a burglar's phone? And once the FBI can do it, why not the CIA, the NSA, GCHQ, the DGSE/DGSI, Mossad? And then perhaps China and India will ban imports of iPhones unless their agencies have access to the same unlocking facility; could Tim Cook resist shareholders' demands that they continue selling in those markets?
Eloise posted:
The main worry is that although the FBI headlines are about this one phone, it's the inevitable creep. Once Apple has done it for a terrorist's phone, what about a kidnapper's phone, a rapist's phone, a burglar's phone? And once the FBI can do it, why not the CIA, the NSA, GCHQ, the DGSE/DGSI, Mossad? And then perhaps China and India will ban imports of iPhones unless their agencies have access to the same unlocking facility; could Tim Cook resist shareholders' demands that they continue selling in those markets?
If the commercial delights are big enough for Tim Cook and his shareholders to sell fool-proof encrypting communications devices to potential terrorists, why stop at fool-proof encrypting iPhones?
Why not let people sell guns and bombs and tanks and... provided, of course, that the commercial delights are big enough!
Don Atkinson posted:
If the commercial delights are big enough for Tim Cook and his shareholders to sell fool-proof encrypting communications devices to potential terrorists, why stop at fool-proof encrypting iPhones?
Why not let people sell guns and bombs and tanks and... provided, of course, that the commercial delights are big enough!
You're comparing a phone to a tank? Really?
Don Atkinson posted:Eloise posted:The main worry is that although the FBI headlines are about this one phone, it's the inevitable creep. Once Apple has done it for a terrorist's phone, what about a kidnapper's phone, a rapist's phone, a burglar's phone? And once the FBI can do it, why not the CIA, the NSA, GCHQ, the DGSE/DGSI, Mossad? And then perhaps China and India will ban imports of iPhones unless their agencies have access to the same unlocking facility; could Tim Cook resist shareholders' demands that they continue selling in those markets?
If the commercial delights are big enough for Tim Cook and his shareholders to sell fool-proof encrypting communications devices to potential terrorists, why stop at fool-proof encrypting iPhones?
Why not let people sell guns and bombs and tanks and... provided, of course, that the commercial delights are big enough!
Erm ... you missed what I meant.
I was saying that if the "back door" (for want of a better term) was created and made available to the FBI, do you not think the Chinese and Indian security services would want it too? And their governments are not beyond banning imports, or making it illegal to own a "secure" phone, if their demands for access to that back door weren't met.
As India and China represent the largest world markets for mobile phones, Apple would either have to give up those markets or give in to their demands.
As for comparing iPhones to guns, bombs and tanks: people DO sell guns, bombs and tanks all the time. Especially guns! And bombs (well, at least explosives). Perhaps not tanks, but you get my gist. These are sold legitimately and find their way into the hands of terrorists and killers all the time!
Apple will fight this order all the way to the Supreme Court. It isn't a question of what's right or wrong, or of security versus privacy. It is instead based on a question of legal authority.
The Justice Department is asking a judge to order Apple to unlock a criminal defendant's passcode-protected iPhone. The government seized the phone and clearly has the legal authority to search it pursuant to a search warrant. But the Justice Department is also trying to compel Apple to do this work for them, and they are citing a 227-year-old "catch-all" law as the legal basis. Even the judge in this case has expressed doubt that the All Writs Act of 1789 (AWA) does, in fact, authorize him to issue such an order.
Two things come to mind: first, and rather obviously, the law lags far behind modern technology. We see that here on the forum with digital music ownership rights, and it is equally true of mobile phone data privacy rights. Second, even if Apple were to create an iOS with no limits on guessing the unlock passcode, it would not address the widespread use of data encryption by criminals and terrorists worldwide. They often use application-embedded encryption that, even if one could easily unlock their iPhones, would still remain inaccessible. Also, these applications are often created in countries where criminal prosecution is nearly impossible.
I am generally sympathetic to any nation's need for security. I rely on my country's intelligence services to try to thwart as many crimes and terrorist plots as possible. I recognize that there are limits to my privacy in our digitally connected, mobile and social world. But I also think we need, as individual countries, to establish laws which clearly set limits on how intrusive our governments can be in pursuit of security goals. IMO, the biggest mistake in this Apple versus FBI case is that it is being conducted in the public eye. I want to believe that Apple could have hacked this one iPhone while also protecting the intellectual property that allowed them to do so. Once this case went public, it became a matter of principle. And when the court order was issued under the AWA, Apple clearly felt a moral obligation to challenge it in the courts. IMO, it is far past time for the do-nothing US Congress to begin bringing our 18th-century laws into the modern age. But in a year of Presidential politics, that is simply not going to happen.
ATB.
Hook
Hook,
I pretty much agree with your post. Apple should simply have dealt with the problem without going public. If possible, Apple should have offered to take this particular phone, bypass the security/encryption, and then hand it back to the FBI.
Of course, I have no knowledge of the security features involved, so this might not even be possible. However, if it is possible, then Apple could (whenever asked to do so) comply with individual court orders without compromising their generic security features, and without inviting the glare of high-profile publicity.
Hmack posted:Hook,
I pretty much agree with your post. Apple should simply have dealt with the problem without going public. If possible, Apple should have offered to take this particular phone, bypass the security/encryption, and then hand it back to the FBI.
Of course, I have no knowledge of the security features involved, so this might not even be possible. However, if it is possible, then Apple could (whenever asked to do so) comply with individual court orders without compromising their generic security features, and without inviting the glare of high-profile publicity.
If they'd dealt with it privately and it had subsequently become public (almost inevitable), they would be pilloried for the secrecy. They can't win either way.
Hmack posted:I pretty much agree with your post. Apple should simply have dealt with the problem without going public. If possible, Apple should have offered to take this particular phone, by-passed the security/encryption and then handed it back to the FBI.
But the point is that (currently) there is no way Apple CAN bypass the security.
And as Winky pointed out ... it would inevitably have been made public (any future trial referencing the data on that phone for example).
At the end of the day, either you have devices such as phones which are secure, or you have to open a backdoor. If you are going to open a backdoor, then that is not a decision which should be made by a court and half a dozen lawyers twisting 200-year-old legislation to fit a modern situation; it should be a decision made in the open by politicians as representatives of the people (however much I distrust them).
Hook posted:Second, even if Apple were to create an iOS with no limits on guessing the unlock password, it would not address the issue of widespread use of data encryption by criminals and terrorists worldwide.
Even if Apple created the no-limits guessing, as soon as you increase the number of digits in the passcode it becomes impractical to crack.
Apparently the iPhone can (currently) check around 12.5 codes a second. So with 4 digits that's about 800 seconds to check the 10,000 possible combinations (or just over 13 minutes). And they would likely start with the more common combinations: 1-30 in the first two digits and 1-12 in the second two, and vice versa.
If you increase that to 6 digits (standard on newer phones) it would take around 80,000 seconds (22 hours). You also start to eliminate guessing based on where the screen is dirtiest.
If you increase again to 9 digits that will take nearly 80 MILLION seconds (nearly 1000 days).
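The arithmetic above is just exhaustive search at a fixed guess rate. A minimal sketch, assuming the 12.5 guesses/second figure quoted in the post (not an official number):

```python
def worst_case_seconds(digits: int, guesses_per_second: float = 12.5) -> float:
    """Worst-case time to try every possible n-digit numeric passcode."""
    return 10 ** digits / guesses_per_second

# Reproduce the figures from the post:
for d in (4, 6, 9):
    secs = worst_case_seconds(d)
    print(f"{d} digits: {secs:,.0f} s ({secs / 86400:,.1f} days)")
# prints 800 s for 4 digits (just over 13 minutes),
# 80,000 s for 6 digits (about 22 hours),
# and 80,000,000 s for 9 digits (about 926 days)
```

Each extra digit multiplies the worst case by ten, which is why a longer passcode makes simple brute force impractical even without the retry limits.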
So either you're going to start banning encryption, or it's just a temporary solution.
I'm not sure at the end of the day where I stand on the rights or wrongs. But I don't think this court order is really about one particular phone!
Eloise posted:
"But the point is that (currently) there is no way Apple CAN bypass the security."
OK, in which case Apple should simply have responded to the request by stating that they have no way of complying, rather than stating (as I believe they did) that they would not do so as a matter of principle.
Hmack posted:Eloise posted:
"But the point is that (currently) there is no way Apple CAN bypass the security."
OK, in which case Apple should simply have responded to the request by stating that they have no way of complying, rather than stating (as I believe they did) that they would not do so as a matter of principle.
http://www.apple.com/customer-letter/ and http://www.apple.com/customer-letter/answers/ is Apple's response (I suspect there is a more legalese version available).
Eloise,
Thanks for the link. I hadn't seen this, and it does change my view to an extent.
However, are these not two entirely different issues? Obviously a new version of iOS would have no impact on the FBI's ability to gain access to the proven terrorist's phone. Am I wrong in my belief that the FBI has asked Apple to provide them with a way of accessing data on this specific phone?
If I am not wrong, then I do not support Apple in their stance with respect to this particular phone, but I do support Apple in declining to release a generic version of iOS to all phones.
Hook posted:The Justice Department is asking a judge to order Apple to unlock a criminal defendant's passcode-protected iPhone. The government seized the phone and clearly has the legal authority to search it pursuant to a search warrant. But the Justice Department is also trying to compel Apple to do this work for them, and they are citing a 227-year-old "catch-all" law as the legal basis. Even the judge in this case has expressed doubt that the All Writs Act of 1789 (AWA) does, in fact, authorize him to issue such an order.
That's interesting, Hook. The idea of trying to rely on legislation that is over 200 years old seems a long shot to me. In the UK, when courts are asked to rely on legislation that doesn't explicitly fit the circumstances of the case, they sometimes explore what was in the legislators' minds when the law was passed. That would include explanatory memoranda published alongside the draft legislation and the written account of any debate in Parliament which might shed further light on the intent behind the piece of legislation. I suspect the lawyers representing the Justice Department will have something of a struggle to persuade a court that the legislators behind this particular piece of legislation intended that it be applied to smartphone technology. That might be entertaining to watch, though.
Mike
Hmack posted:
However, are these not two entirely different issues? Obviously a new version of iOS would have no impact on the FBI's ability to gain access to the proven terrorist's phone. Am I wrong in my belief that the FBI has asked Apple to provide them with a way of accessing data on this specific phone?
If I am not wrong, then I do not support Apple in their stance with respect to this particular phone, but I do support Apple in declining to release a generic version of iOS to all phones.
I think the argument is that if they are forced to release a "hacked" (to all intents and purposes) version of iOS for this one phone, the next step will be being forced to release the same hacked version for somewhere between 12 and hundreds of other iPhones that law enforcement agencies want to gain access to.
It's not just opening one iPhone. It's creating the ability to do the same to any iPhone.
Yes, I get your general argument, but I don't understand how a hacked version of iOS could get the FBI into this particular phone. The phone is locked, and so it won't be possible to install the new iOS without first hacking into it, which is the FBI's problem to begin with.