The encryption debate



U-Ute
03-16-2016, 09:41 AM
I'm sure a lot of you have heard about the government trying to force Apple to decrypt the cell phone of San Bernardino shooter Syed Rizwan Farook.

Apple responded yesterday. Here's an interesting dive into their response.

https://www.techdirt.com/articles/20160315/15505433916/apples-response-to-doj-your-filing-is-full-blatantly-misleading-claims-outright-falsehoods.shtml

There are many reasons why this is a futile reach for the government. First, this wouldn't be just a one-off request. Anyone who has kids knows that if you give in to one kid, you'll get 20 more requests. Second, this sort of thing is like whack-a-mole: as soon as you force Apple to do it, people will just go to another provider of the technology. You can't shut them all down, because eventually people will end up going with some Russian technology the government can't control.

Third, they seem to forget that if encryption is weakened for everyone, that also includes the phones of government employees themselves. If you want to keep your secrets safe, then you'll need to deal with other people keeping their secrets safe.

Rocker Ute
03-16-2016, 10:22 AM
I'm sure a lot of you have heard about the government trying to force Apple to decrypt the cell phone of San Bernardino shooter Syed Rizwan Farook.

Apple responded yesterday. Here's an interesting dive into their response.

https://www.techdirt.com/articles/20160315/15505433916/apples-response-to-doj-your-filing-is-full-blatantly-misleading-claims-outright-falsehoods.shtml

There are many reasons why this is a futile reach for the government. First, this wouldn't be just a one-off request. Anyone who has kids knows that if you give in to one kid, you'll get 20 more requests. Second, this sort of thing is like whack-a-mole: as soon as you force Apple to do it, people will just go to another provider of the technology. You can't shut them all down, because eventually people will end up going with some Russian technology the government can't control.

Third, they seem to forget that if encryption is weakened for everyone, that also includes the phones of government employees themselves. If you want to keep your secrets safe, then you'll need to deal with other people keeping their secrets safe.

I've been pretty fascinated by this as well. If you follow Edward Snowden's assertion that the government really could crack that phone if they wanted to (and McAfee says he could do it with his team as well), you wonder: if the government really can crack it, why are they arguing this? After going through the mental gymnastics of all that, though, I think the simplest answer resonates: they really think they can't, or they are at least seeking a legal precedent to be able to order others to do it.

So I was thinking the other day of possible ways a company could create a backdoor while also maintaining the same level of security. The simplest and most profitable way I could think of is to create a single hardware key and keep it in the company's sole possession. Hardware keys are nearly impossible to emulate. Then they could go to the gov't and say, "Okay, we'll decrypt your phone, but it is going to cost $1M per device, and it can only be done with a court order and on our campus."

Then it would be up to the government how badly they want to decrypt a device. In the case of the San Bernardino terrorists, that is maybe small potatoes. For ordinary people like us it would be too high a barrier in most instances. Further, if devices somehow did get cracked through an emulated hardware key, it wouldn't be difficult for Apple to update the software to use a new key.

That's just one thought.

One side note: I was entertained by the license-vs.-own argument that Apple mentioned. I've long been perplexed by this notion when it comes to all things digital. It is the one thing you can own but don't own. For example, did you know that if you own a movie on DVD and show it to a church group, you are technically in violation of the terms of use for that DVD? Same thing if you do like we do and have movie night in our backyard for all of our neighbors in the summer. You also aren't allowed to modify your phone or its software when it comes to Apple, etc.

Anyway, it'll be interesting to see how this plays out. Also fascinating is that they are afraid the phone might have been set to brick itself after entering the wrong password 10 times, but they don't know whether that is the case. If I were a gambling man, I would say it was not, because you have to go turn that setting on.

U-Ute
03-16-2016, 11:02 AM
I've been pretty fascinated by this as well. If you follow Edward Snowden's assertion that the government really could crack that phone if they wanted to (and McAfee says he could do it with his team as well), you wonder: if the government really can crack it, why are they arguing this? After going through the mental gymnastics of all that, though, I think the simplest answer resonates: they really think they can't, or they are at least seeking a legal precedent to be able to order others to do it.

If I'm understanding this correctly, these assertions are incorrect. They may have been true in the past, but Apple made a change in their latest OS that changed the game somewhat.

In the past, the biggest hurdle was the passcode. Once you got past that, you had everything. If you could backdoor your way around the passcode or quickly hack it, you were in and could see everything. Those are the tools that the government and John McAfee (who has since recanted his claim) were using to get into phones.

The issue with this particular OS update and that phone is that all of the data on the phone is encrypted using a key derived from a hardware device key combined with that passcode. So even if you could get around the passcode and download the contents of the phone, it is all encrypted. Even if Apple provided its device key, the data is still encrypted with the combination of that key and the passcode, so having the key from Apple by itself is useless. The only way to get at the data is by knowing the passcode. Furthermore, the OS limits how quickly you can enter passcode attempts, and adds an option so that a number of failed attempts in a certain amount of time causes the OS to erase the data on the phone.

What the FBI has really been asking for is a modified OS that lets them brute-force the passcode as quickly as possible without any repercussions.
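A rough sketch of why Apple's key alone is useless, as described above. This is illustrative only, not Apple's actual key-derivation scheme; the function name, iteration count, and device key here are all assumptions:

```python
import hashlib
import os

def derive_data_key(device_key: bytes, passcode: str) -> bytes:
    """Derive the data-encryption key by combining a per-device
    hardware secret with the user's passcode. PBKDF2 makes each
    guess deliberately slow."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_key, 100_000)

# A per-device secret burned into the hardware (made up here).
device_key = os.urandom(32)

# Only the correct passcode reproduces the data key; the device
# key by itself (i.e., what Apple could hand over) is not enough.
right = derive_data_key(device_key, "1234")
wrong = derive_data_key(device_key, "0000")
assert right != wrong
assert right == derive_data_key(device_key, "1234")
```

Because every guess has to run through a slow derivation tied to the device, the only practical attack left is brute-forcing the passcode on the phone itself, which is exactly why the FBI wanted the attempt limits removed.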



So I was thinking the other day of possible ways a company could create a backdoor while also maintaining the same level of security. The simplest and most profitable way I could think of is to create a single hardware key and keep it in the company's sole possession. Hardware keys are nearly impossible to emulate. Then they could go to the gov't and say, "Okay, we'll decrypt your phone, but it is going to cost $1M per device, and it can only be done with a court order and on our campus."

People find these keys whether you want them to or not. From the mundane (lazy employees who keep the key on a memory stick) to state-funded hacker penetration, keys can never be guaranteed to be safe. Plus, as I said, these keys are not used by themselves; they are combined with a user-provided key.


Further, if devices somehow did get cracked through an emulated hardware key, it wouldn't be difficult for Apple to update the software to use a new key.

But then it becomes a cost for them to bear in terms of engineering and user education, not to mention the public perception that comes with constant updates of "oh hey, that new phone you got may be hackable now, you need to get an update." That's just something they don't want to have to deal with.



One side note: I was entertained by the license-vs.-own argument that Apple mentioned. I've long been perplexed by this notion when it comes to all things digital. It is the one thing you can own but don't own. For example, did you know that if you own a movie on DVD and show it to a church group, you are technically in violation of the terms of use for that DVD? Same thing if you do like we do and have movie night in our backyard for all of our neighbors in the summer. You also aren't allowed to modify your phone or its software when it comes to Apple, etc.

Anyway, it'll be interesting to see how this plays out. Also fascinating is that they are afraid the phone might have been set to brick itself after entering the wrong password 10 times, but they don't know whether that is the case. If I were a gambling man, I would say it was not, because you have to go turn that setting on.

Yeah, this all comes down to business models and durable goods. Back in the day, if you played a cassette tape or record, the simple act of listening to it would degrade it, causing it to lose value. Reselling used would intrinsically be at a lower value, and constant use would require repurchasing. The exact duplication digital provides has changed the game and is threatening the revenue models of old businesses, so they are trying to find ways to strong-arm their customers. The licensing model actually started in software because one copy could be put on multiple machines. You don't buy software, you buy a license to use software (go check those agreements, everyone). But advancements in hardware and software have hampered even that: companies are finding it difficult to get people to buy the latest and greatest every year. So now software is moving to a subscription model (Office 365, anyone?) similar to what we're seeing with Netflix and Amazon Prime (for movies) and Pandora or Spotify (for music).

But that's for another thread.

Diehard Ute
03-16-2016, 01:12 PM
Yeah. Apple has been going to more and more encryption for a while.

iMessages are encrypted by the two phones not their servers.

What's funny is most of us in law enforcement are on Apple's side of this, contrary to what the media would like you to believe.

Rocker Ute
03-16-2016, 02:22 PM
If I'm understanding this correctly, these assertions are incorrect. They may have been true in the past, but Apple made a change in their latest OS that changed the game somewhat.

In the past, the biggest hurdle was the passcode. Once you got past that, you had everything. If you could backdoor your way around the passcode or quickly hack it, you were in and could see everything. Those are the tools that the government and John McAfee (who has since recanted his claim) were using to get into phones.

The issue with this particular OS update and that phone is that all of the data on the phone is encrypted using a key derived from a hardware device key combined with that passcode. So even if you could get around the passcode and download the contents of the phone, it is all encrypted. Even if Apple provided its device key, the data is still encrypted with the combination of that key and the passcode, so having the key from Apple by itself is useless. The only way to get at the data is by knowing the passcode. Furthermore, the OS limits how quickly you can enter passcode attempts, and adds an option so that a number of failed attempts in a certain amount of time causes the OS to erase the data on the phone.

What the FBI has really been asking for is a modified OS that lets them brute-force the passcode as quickly as possible without any repercussions.


You are correct that encrypting all of the data with the combined key prevents it from being decrypted. Apple has gone to appropriate lengths on security, which is also why, on the new phones, they used to brick the phone if you had the Touch ID sensor replaced by a third party (now they'll let you keep using the phone, but Touch ID won't work any more). What I am proposing is essentially what someone would try to do to hack the device now: override the precautions against a brute-force attack (like the limited number of passcode attempts) using a hardware key. You are probably thinking of a portable USB device that could be passed around, and yes, those do easily walk off. Make one, bolt it down, and only let Tim Cook access it, or something. When connected, it would bypass the passcode lockout and allow a brute-force attack on the passcode.
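One way the bolted-down key idea could work is a challenge-response scheme: the escrow key signs an unlock token tied to one specific device, and the OS disables the attempt lockout only for a verified token. This is purely a sketch of the idea above; the key, serial numbers, and function names are all made up:

```python
import hashlib
import hmac
import os

# The single escrow key, kept in the company's sole possession (hypothetical).
ESCROW_KEY = os.urandom(32)

def authorize_unlock(escrow_key: bytes, device_serial: str) -> bytes:
    """On the company's campus: sign a token tied to one specific device."""
    return hmac.new(escrow_key, device_serial.encode(), hashlib.sha256).digest()

def os_accepts_bypass(escrow_key: bytes, device_serial: str, token: bytes) -> bool:
    """On the phone: disable the passcode-attempt lockout only if the
    token verifies for this exact device's serial number."""
    expected = hmac.new(escrow_key, device_serial.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(expected, token)

# A court-ordered unlock works only on the named device.
token = authorize_unlock(ESCROW_KEY, "SERIAL-123")
assert os_accepts_bypass(ESCROW_KEY, "SERIAL-123", token)
assert not os_accepts_bypass(ESCROW_KEY, "OTHER-PHONE", token)
```

Note the token only lifts the lockout; the data would still have to be brute-forced through the passcode, so the escrow key alone decrypts nothing.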

Anyway, that is just one idea; there are likely many others that would allow for a backdoor without compromising everyone else's security. I personally think this is more about Apple wiping the egg off their face, because there have been numerous instances of them helping the government crack devices in the past.

On a more practical note, though, everyone should encrypt everything they have; it is built into almost everything now, and I'm talking laptops, desktops, etc. The main reason is that it is really easy to lose a laptop, or for someone to take it in a break-in and get at your data. Encrypting the entire drive helps keep you safe (unless your password is skutah1, like it is for half of Park City).
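Some back-of-the-envelope math on why a password like that is weak. The guesses-per-second figure is an assumption for an offline attack against a fast hash; real disk-encryption schemes slow this down considerably, but the gap between short and long passwords holds either way:

```python
def keyspace(alphabet_size: int, length: int) -> int:
    """Number of possible passwords for a given alphabet and length."""
    return alphabet_size ** length

GUESSES_PER_SEC = 1_000_000_000  # assumed offline attack rate

short = keyspace(36, 7)    # lowercase letters + digits, 7 chars, like "skutah1"
longer = keyspace(36, 14)  # same alphabet, twice the length

print(short / GUESSES_PER_SEC)                       # ~78 seconds to exhaust
print(longer / GUESSES_PER_SEC / (3600 * 24 * 365))  # ~190,000 years
```

Doubling the length multiplies the work by the entire original keyspace, which is why a passphrase beats a "clever" short password every time.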

Rocker Ute
03-16-2016, 02:25 PM
One other thing: if you want to set up your iPhone so that it erases all data after 10 failed passcode attempts (which is NOT on by default), go to Settings > Touch ID & Passcode, scroll all the way to the bottom, and toggle on 'Erase Data'.
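The behavior that toggle enables can be sketched roughly like this. This is a toy model, not iOS code; the delay schedule is approximate and the class is invented for illustration:

```python
class PasscodeGate:
    """Toy model of failed-attempt handling: escalating delays,
    then a wipe after 10 consecutive failures (when the user opts in)."""

    DELAYS = {5: 60, 6: 300, 7: 900, 8: 3600, 9: 3600}  # seconds, approximate

    def __init__(self, correct_passcode: str, erase_data: bool = False):
        self._correct = correct_passcode
        self.erase_data = erase_data  # the 'Erase Data' toggle
        self.failures = 0
        self.wiped = False

    def attempt(self, guess: str) -> bool:
        if self.wiped:
            return False
        if guess == self._correct:
            self.failures = 0
            return True
        self.failures += 1
        delay = self.DELAYS.get(self.failures, 0)  # a real device would sleep here
        if self.erase_data and self.failures >= 10:
            self.wiped = True  # encryption keys destroyed; data unrecoverable
        return False

gate = PasscodeGate("1234", erase_data=True)
for _ in range(10):
    gate.attempt("0000")
assert gate.wiped  # 10 wrong guesses and the data is gone
```

With the toggle off, the delays still apply but the phone never wipes, which is why the FBI couldn't be sure what would happen on that tenth guess.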

More practical for most people is to set up iCloud and turn on 'Find My iPhone'. If a device ever gets stolen, you can track it or remotely erase everything on it as soon as anyone turns it on.

Rocker Ute
03-28-2016, 04:31 PM
So, as predicted, the DOJ has cracked the iPhone in question 'using a third party'. Kind of hilarious if you ask me. Now I bet Apple will want to sue to figure out what they did to crack the phone. Short story: don't say or do anything electronically that you don't want public; nothing is secure.

Diehard Ute
03-28-2016, 05:45 PM
So, as predicted, the DOJ has cracked the iPhone in question 'using a third party'. Kind of hilarious if you ask me. Now I bet Apple will want to sue to figure out what they did to crack the phone. Short story: don't say or do anything electronically that you don't want public; nothing is secure.

Apple knew there were ways in. They've been working round the clock to update security to make it all but impossible.

The only reason the FBI was struggling was the erase feature if they failed too many times. Much of the data itself wasn't encrypted; it likely will be in the future.

NorthwestUteFan
03-28-2016, 06:24 PM
Apple already turns over the data in China (by housing the data on government servers). They do what they must to maintain access to their market. They will sell a billion phones in China.

But at the same time they have to maintain the sense of security here in America.

Now that the cat is out of the bag Apple needs to step up their game to protect it in the future.

Still, it lends itself to funny jokes.

FBI: "Apple, we want you to give up the back door."

Apple: "Hey, FBI, we really like you a LOT but I don't think we are quite to that point in our relationship..."

Rocker Ute
03-28-2016, 06:30 PM
Apple knew there were ways in. They've been working round the clock to update security to make it all but impossible.

The only reason the FBI was struggling was the erase feature if they failed too many times. Much of the data itself wasn't encrypted; it likely will be in the future.

There always are and there will always be ways in. In fact you might argue that Apple is giving people a false sense of security.

As for the data encryption, the old OS left much of the data unencrypted, but the new phones encrypt everything. It just means you need to find other ways to bypass the security.

Diehard Ute
03-28-2016, 06:38 PM
There always are and there will always be ways in. In fact you might argue that Apple is giving people a false sense of security.

As for the data encryption, the old OS left much of the data unencrypted, but the new phones encrypt everything. It just means you need to find other ways to bypass the security.

I don't think there's much of a sense of security; I just think lots of people don't understand any of it.

U-Ute
04-27-2016, 11:12 AM
This new bill (https://apsis.io/blog/2016/04/24/burr-feinsten-and-encryption) would make anyone who makes software that does any sort of encryption responsible for being able to decrypt the data.

As the article points out, it also opens people up to security vulnerabilities, because one-way hashes would effectively be illegal, meaning data all over the internet that is currently protected by them could be exposed to hackers.
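For reference, this is what a one-way hash looks like in practice: easy to compute, infeasible to reverse, which is exactly the property such a bill would make legally problematic. A minimal sketch using SHA-256 (real systems add salts and slow hashes like bcrypt or PBKDF2, but the one-way property is the same):

```python
import hashlib

password = "hunter2"
digest = hashlib.sha256(password.encode()).hexdigest()

# A site can verify a login by re-hashing the attempt and comparing;
# it never needs to store (or be able to recover) the password itself.
assert hashlib.sha256("hunter2".encode()).hexdigest() == digest
assert hashlib.sha256("wrong".encode()).hexdigest() != digest
```

A mandate to be able to decrypt is impossible to satisfy here, because there is no key and nothing is recoverable; the only "compliance" option would be to stop using one-way hashes and store something reversible instead.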