Free drive encryption
-
@BBigford said:
@Dashrender said:
@BBigford said:
@Dashrender said:
Personally, for the standard things we need to be concerned about - a stolen laptop that had HIPAA data or some such, Bitlocker is totally acceptable. If you're worried about the NSA, then no, you can't use it.
It's more of a search for the "gotcha" in Bitlocker. There's so much controversy behind it, I'm just curious if there is a "see... it's crackable with <method x> or <software x>." I haven't found any solid evidence throughout the years aside from the cold boot vulnerabilities, so that's why I turned to the community.
Exactly - and I'm betting you won't find any either. It's like this FBI thing and the bomber's phone. I'm not sure I believe that anyone actually cracked the encryption on the phone. Personally I think that's a lie so they could drop a case they felt they were losing and didn't want to have a precedent set against them.
So you're speculating that Bitlocker is ultra secure,
ultra? Who's to say - but I do consider it secure enough for the common man to use. Again, for a healthcare worker trying to keep PHI (protected health information) away from prying eyes on a stolen laptop, it's more than likely fine; the average thug on the street will just format the drive if able, reinstall Windows, and move on. But if you're talking about a targeted attack - say the FBI is trying - then I have no idea how good it is against them trying to crack into it.
or any material on it is just being smothered?
No idea what you mean here.
-
I'll say this another more direct way.
If you are not a criminal, and are just trying to keep your data away from someone who steals your stuff, like a laptop or phone, then Bitlocker is most likely plenty to keep you safe.
But if you are a criminal looking to secure your crimes away from the NSA, FBI, Interpol, etc., then no, you should probably go with something other than Bitlocker.
-
@BRRABill said:
@Dashrender said:
Exactly - and I'm betting you won't find any either. It's like this FBI thing and the bomber's phone. I'm not sure I believe that anyone actually cracked the encryption on the phone. Personally I think that's a lie so they could drop a case they felt they were losing and didn't want to have a precedent set against them.
I actually believe it.
It was for an older model, and only applicable under certain circumstances.
It's like we always say: if they have the device, they'll eventually have the data.
yeah - this is what I'm not really sure of. The iPhone 5 (or was it the 5s?) didn't have a secure enclave. If the security company found a way to hack into whatever chip held the security key - then who knows.. it's possible they found a way to get the key out.
-
@Dashrender said:
yeah - this is what I'm not really sure of. The iPhone 5 (or was it the 5s?) didn't have a secure enclave. If the security company found a way to hack into whatever chip held the security key - then who knows.. it's possible they found a way to get the key out.
I mean at some level, someONE knows someTHING, right?
Ultimately security rests in keeping the secret service dedicated, and people like Snowden from saying anything. If someone with the "keys to the kingdom" decides to go rogue ... who knows what could happen?
-
@BRRABill said:
@Dashrender said:
There are reports that the drive manufacturers can unlock those self-encrypting drives.
I've had Wave (who makes the software that locks the Samsung SSDs I use) enable an unlocked drive.
I forget exactly what we did, and the data was basically wiped clean, but if they have access to do that, who knows what they can really do...
There are multiple options here. If the system works as most do, Wave would simply be resetting the SSD's version of a secure enclave: once it's no longer locked, a new key can be enabled in the enclave (known only to the enclave), the system considers the old data on the drive gone - blank - and with the new key in the enclave, you basically start over.
Another option would be that they actually were able to get the enclave to give up the key (horrible design) and then used that key to wipe the drive - but that seems unnecessary.
Steve Gibson has a good podcast that talks about how self-encrypting drives typically work - might be worth a listen.
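To make the first option concrete, here's a minimal toy model of a self-encrypting drive in Python. All names here (`SelfEncryptingDrive`, `vendor_reset`) are hypothetical, and a SHA-256 keystream stands in for the real AES hardware - the point is just that a "vendor reset" discards the internal data-encryption key, which makes the old ciphertext unrecoverable without ever reading it back:

```python
import os
from hashlib import sha256

class SelfEncryptingDrive:
    """Toy model of a self-encrypting drive (SED). Hypothetical names;
    real SEDs do this in controller firmware with AES hardware.
    The data-encryption key (DEK) never leaves the drive; a vendor
    reset simply discards the DEK and generates a fresh one."""

    BLOCK = 32  # toy fixed block size in bytes

    def __init__(self):
        self._dek = os.urandom(32)   # internal data-encryption key
        self._blocks = {}            # lba -> ciphertext

    def _keystream(self, lba: int) -> bytes:
        # Stand-in for the drive's AES engine: per-block pad from the DEK.
        return sha256(self._dek + lba.to_bytes(8, "big")).digest()

    def write(self, lba: int, data: bytes):
        padded = data.ljust(self.BLOCK, b"\x00")
        pad = self._keystream(lba)
        self._blocks[lba] = bytes(a ^ b for a, b in zip(padded, pad))

    def read(self, lba: int) -> bytes:
        pad = self._keystream(lba)
        plain = bytes(a ^ b for a, b in zip(self._blocks[lba], pad))
        return plain.rstrip(b"\x00")

    def vendor_reset(self):
        # The "Wave reset" scenario: drop the old DEK, create a new one.
        # Old ciphertext is still physically present but undecryptable.
        self._dek = os.urandom(32)

drive = SelfEncryptingDrive()
drive.write(0, b"patient records")
assert drive.read(0) == b"patient records"
drive.vendor_reset()
assert drive.read(0) != b"patient records"   # old data effectively gone
```

This is why the reset "wipes" the drive instantly: nothing is overwritten, the key that made the bits meaningful is simply thrown away.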
-
@BRRABill said:
@Dashrender said:
yeah - this is what I'm not really sure of. The iPhone 5 (or was it the 5s?) didn't have a secure enclave. If the security company found a way to hack into whatever chip held the security key - then who knows.. it's possible they found a way to get the key out.
I mean at some level, someONE knows someTHING, right?
Ultimately security rests in keeping the secret service dedicated, and people like Snowden from saying anything. If someone with the "keys to the kingdom" decides to go rogue ... who knows what could happen?
Why do you assume someone has to know something? I don't agree with that at all.
Let's look at modern iPhones with the secure enclave.
The first time the phone boots, the secure enclave activates and generates a random key. That key is used to encrypt everything written to the phone's storage.
Right now there is no security, because there is no lock on the secure enclave. One of the first things the iPhone has you do (I assume, as I've never actually set one up) is set up a password or PIN. That PIN is encoded and saved in the secure enclave alongside the previously generated random key.
From this point forward, the secure enclave won't allow the system to pass encryption/decryption requests through the secure enclave if the password/pin hasn't been properly presented.
So now someone steals your phone. Assuming you have the ten try thing enabled, after they try ten bad codes, the secure enclave wipes itself - wiping out both the password/pin provided by the owner AND the random key used to encrypt/decrypt the drive.
Now that the key is gone, the data on the drive is more or less useless. When the phone resets, it starts over from the top: it reinitializes itself with a new randomly generated key, starts encrypting everything written to the phone with the new key, asks for a new password or PIN, and the whole process starts over.
In this situation, no one but the owner of the device ever has to know anything about what is on the phone.
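The flow described above can be sketched in a few lines of Python. This is a hedged toy model, not Apple's actual implementation - the class and method names (`SecureEnclaveModel`, `unlock`, `MAX_TRIES`) are invented for illustration - but it captures the key point: ten bad guesses destroy the random device key, and with it any path back to the data:

```python
import os
import secrets
from hashlib import sha256

class SecureEnclaveModel:
    """Toy model of the enclave flow described above (hypothetical names,
    not Apple's real design): a random device key is generated at first
    boot, a passcode gates access to it, and ten bad attempts wipe it."""

    MAX_TRIES = 10

    def __init__(self):
        self._reinitialize()

    def _reinitialize(self):
        self._device_key = os.urandom(32)  # never leaves the enclave
        self._passcode_hash = None         # no lock set yet
        self._bad_tries = 0

    def set_passcode(self, passcode: str):
        self._passcode_hash = sha256(passcode.encode()).digest()

    def unlock(self, passcode: str):
        """Return the device key on success; wipe after ten failures."""
        attempt = sha256(passcode.encode()).digest()
        if (self._passcode_hash is not None
                and secrets.compare_digest(attempt, self._passcode_hash)):
            self._bad_tries = 0
            return self._device_key
        self._bad_tries += 1
        if self._bad_tries >= self.MAX_TRIES:
            self._reinitialize()  # key gone -> old data unrecoverable
        return None

enclave = SecureEnclaveModel()
enclave.set_passcode("1234")
assert enclave.unlock("1234") is not None   # owner gets the key

for _ in range(10):
    enclave.unlock("0000")                  # thief burns ten guesses

assert enclave.unlock("1234") is None       # enclave wiped; key is gone
```

Note that nobody outside the enclave ever sees the device key, which is the point of the original argument: there is no secret for an insider to know.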
-
Didn't Apple say they COULD do it, but wouldn't?
My point is that it is doable.
If someone really wanted it, and the people who could do it defected from Apple for the right amount of money...
Is that really so outlandish to even consider?
-
What I understood Apple to be able to do was write a newer version of iOS that could be installed on the phone to remove the 10-try disable feature.
That is what they were refusing to do.
-
Also, even if disgruntled employees did decide to do this, they wouldn't get very far, because they would need Apple's signing certificate to sign the code, along with a nonce from the phone itself.
The way iPhone updates work is as follows:
the iPhone checks the update server and finds an update,
the iPhone sends a nonce to Apple,
Apple cryptographically signs the nonce and the update package,
the phone verifies that the signature matches a known-good Apple cert from a local, on-phone archive,
and assuming it's good, installs the update.
If the ex-employees don't have Apple's certificate, they can't get anywhere.
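The steps above can be sketched as a short Python example. One caveat: real iOS updates use asymmetric signatures (Apple's private key signs, the phone verifies against a pinned certificate), but to keep this runnable with only the standard library, an HMAC shared secret stands in for the signing key. All function names here are invented for illustration:

```python
import hmac
import os
from hashlib import sha256

# Stand-in for Apple's private signing key (the thing ex-employees
# would not have). In reality this is an asymmetric key pair.
APPLE_SIGNING_KEY = os.urandom(32)

def apple_sign(nonce: bytes, update: bytes) -> bytes:
    """Server side: sign this phone's nonce together with the update."""
    msg = nonce + sha256(update).digest()
    return hmac.new(APPLE_SIGNING_KEY, msg, sha256).digest()

def phone_verify(nonce: bytes, update: bytes, signature: bytes) -> bool:
    """Phone side: accept only if the signature covers our freshly
    generated nonce - this is what blocks replaying an old signed
    update onto a different phone or a later session."""
    msg = nonce + sha256(update).digest()
    expected = hmac.new(APPLE_SIGNING_KEY, msg, sha256).digest()
    return hmac.compare_digest(expected, signature)

nonce = os.urandom(16)                    # phone generates a one-time nonce
update = b"iOS update payload"
sig = apple_sign(nonce, update)

assert phone_verify(nonce, update, sig)                # genuine update
assert not phone_verify(os.urandom(16), update, sig)   # stale nonce fails
assert not phone_verify(nonce, b"tampered", sig)       # modified code fails
```

The nonce is what makes each update offer single-use: even a perfectly valid signed package from last week won't verify against the fresh nonce the phone just generated.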
-
@BRRABill said:
@scottalanmiller said:
What would a fake one be?
A program that says it is the "next generation" of TrueCrypt, but isn't the logical continuation of the original code.
Ah. No one has done that.
Yes, this is the TC code.
-
@Dashrender said:
Personally, for the standard things we need to be concerned about - a stolen laptop that had HIPAA data or some such, Bitlocker is totally acceptable. If you're worried about the NSA, then no, you can't use it.
TL;DR = Bitlocker for audits, VC for security.
-
@BRRABill said:
@Dashrender said:
Exactly - and I'm betting you won't find any either. It's like this FBI thing and the bomber's phone. I'm not sure I believe that anyone actually cracked the encryption on the phone. Personally I think that's a lie so they could drop a case they felt they were losing and didn't want to have a precedent set against them.
I actually believe it.
It was for an older model, and only applicable under certain circumstances.
It's like we always say: if they have the device, they'll eventually have the data.
We say that about smart people, not the FBI
-
@BRRABill said:
Ultimately security rests in keeping the secret service dedicated, and people like Snowden from saying anything. If someone with the "keys to the kingdom" decides to go rogue ... who knows what could happen?
Snowden isn't some kind of ultra hacker. He's just a guy who had access and ethics. Anyone in the security department with his strong moral character could have done the same thing. It's just that everyone else was happy working against the public and hiding what they were doing from Americans - or were scared and lived with regret.
I don't consider Snowden rogue, he did his patriotic and civil duty. It's the rest that were rogue and acting against the people they were supposed to protect.
-
@BRRABill said:
Didn't Apple say they COULD do it, but wouldn't?
Yes, and they recognized that this was a security vulnerability and are now changing that.
-
@BRRABill said:
Is that really so outlandish to even consider?
But that's using a specific vulnerability in the Apple ecosystem (that is being plugged) to determine that such a thing can't be avoided. Apple says that it can be avoided and that they are going to avoid it in the future.
-
I use VeraCrypt; it's fully compatible with TrueCrypt.
It's free and very similar to TrueCrypt, and I use it with Dropbox and Google Drive.
-
My brother and I discuss encryption sometimes and have picked on/at TC and VC... This discussion has been a good read.
-
@scottalanmiller said:
We say that about smart people, not the FBI
Yes but the FBI has money at its disposal to buy smart people if it wishes.
Or coerce them, of course.
-
@scottalanmiller said:
@BRRABill said:
Ultimately security rests in keeping the secret service dedicated, and people like Snowden from saying anything. If someone with the "keys to the kingdom" decides to go rogue ... who knows what could happen?
Snowden isn't some kind of ultra hacker. He's just a guy who had access and ethics. Anyone in the security department with his strong moral character could have done the same thing. It's just that everyone else was happy working against the public and hiding what they were doing from Americans - or were scared and lived with regret.
I don't consider Snowden rogue, he did his patriotic and civil duty. It's the rest that were rogue and acting against the people they were supposed to protect.
My point is that there are people with access and knowledge. If they decide to make a break for it, regardless of why, the system is compromised.
-
@BRRABill said:
My point is that there are people with access and knowledge. If they decide to make a break for it, regardless of why, the system is compromised.
Ah, I thought that you were thinking that he broke in somehow.