Security Fails Hard
-
@travisdh1 said:
I'm not forgetting some basic principle of file servers here, am I? That means they had write access to other folders that were not their own!
I haven't had a chance to read the article, but wouldn't setting the sticky bit have solved this problem (not the write access itself, but other users editing each other's files)?
Not that they should have write access to other folders at all, but for as little work as chmod +t is, I would think that would have been done.
Assuming the file server was on Linux, that is.
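For anyone following along, here's a minimal sketch of what that looks like on a Linux box (the /srv/share path is just an example). The sticky bit means users can still create files in a shared directory, but only a file's owner (or root) can delete or rename entries there; the shell equivalent is simply chmod +t /srv/share.

```python
import os
import stat

share = "/srv/share"  # example shared directory; substitute the real export path

# Keep the existing permission bits and add the sticky bit, so users can
# still create files here but only a file's owner (or root) can delete or
# rename entries they don't own.
current = stat.S_IMODE(os.stat(share).st_mode)
os.chmod(share, current | stat.S_ISVTX)
```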
-
@Dashrender said:
When we move to O365, we'll have to find a solution to this as well. I don't want to use things like ODfB, which just opens those files up to Cryptolocker, etc.
Why is that?
-
@johnhooks said:
@travisdh1 said:
I'm not forgetting some basic principle of file servers here, am I? That means they had write access to other folders that were not their own!
I haven't had a chance to read the article, but wouldn't setting the sticky bit have solved this problem (not the write access itself, but other users editing each other's files)?
Not that they should have write access to other folders at all, but for as little work as chmod +t is, I would think that would have been done.
Assuming the file server was on Linux, that is.
Right. We have no information yet on which hosting company or what platform(s) they use. Might be a month or two, but we'll find out eventually.
-
@BRRABill said:
@Dashrender said:
When we move to O365, we'll have to find a solution to this as well. I don't want to use things like ODfB, which just opens those files up to Cryptolocker, etc.
Why is that?
Because it stores a copy locally. It is better than a mapped drive, but isn't fully decoupled from the local filesystem. Assuming that they install the client, that is.
-
@scottalanmiller said:
Because it stores a copy locally. It is better than a mapped drive, but isn't fully decoupled from the local filesystem. Assuming that they install the client, that is.
I mean, is there any way around that if you want to give users a convenient way to access files?
As these viruses have progressed, all the ideas we had to prevent contamination have been overcome. (We're not at risk because we use UNC names. NEXT VERSION. Oh crud.)
Call me a cynic, but they'll figure out a way to go after cloud files some day, too.
-
@BRRABill said:
@scottalanmiller said:
Because it stores a copy locally. It is better than a mapped drive, but isn't fully decoupled from the local filesystem. Assuming that they install the client, that is.
I mean, is there any way around that if you want to give users a convenient way to access files?
As these viruses have progressed, all the ideas we had to prevent contamination have been overcome. (We're not at risk because we use UNC names. NEXT VERSION. Oh crud.)
Call me a cynic, but they'll figure out a way to go after cloud files some day, too.
Kinda hard to call you a cynic when you've just 100% accurately predicted the future!
-
@BRRABill said:
I mean, is there any way around that if you want to give users a convenient way to access files?
Yes, have the applications talk to them directly, which I've been promoting for a while. That's the future of storage. Why would users want to manipulate files? It's a basic misunderstanding of goals. IT thinks about files. Users just want to get their work done.
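As a rough sketch of the idea, assuming a purely hypothetical document service (the docs.example.com URL, endpoints, and token below are all invented for illustration): the application reads and writes documents through an API, and the user never sees a file path, a mapped drive, or a locally synced copy.

```python
import requests

API = "https://docs.example.com/api/v1"        # hypothetical document service
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder credentials

def save_report(doc_id, body):
    """Write a document through the service API; no path, share, or sync client."""
    requests.put(f"{API}/documents/{doc_id}", data=body,
                 headers=HEADERS).raise_for_status()

def load_report(doc_id):
    """Read a document back the same way."""
    resp = requests.get(f"{API}/documents/{doc_id}", headers=HEADERS)
    resp.raise_for_status()
    return resp.content
```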
-
@BRRABill said:
Call me a cynic, but they'll figure out a way to go after cloud files some day, too.
Define cloud files. Hard to do, right? That's what makes it hard for malware to attack. It isn't a thing. Mapped drives are cloud files too, right? So "cloud files" were compromised before they started.
What modern access systems do is:
- Decouple the OS from the files, so a compromise of one is not a compromise of the other.
- Decentralize storage so a storage compromise is not a total compromise.
- Provide a point for version tracking which, thus far, has been 100% effective in stopping ransomware (see the sketch after this list).
- Destandardize access controls so that every system is a unique challenge, not simply a new point of attack. A compromise of one does not lead to the compromise of another.
- Increase the cost of attack while decreasing the value of success.
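To illustrate the version-tracking point, here's a minimal sketch against the same kind of hypothetical versioned store (again, the endpoints and payloads are invented): an encrypted upload is just one more version on the server, so recovery is a bulk rollback rather than a restore from backup.

```python
import requests

API = "https://docs.example.com/api/v1"        # hypothetical versioned document store
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder credentials

def roll_back_encrypted_files(file_ids):
    """Restore the version just before the encrypted upload for each file.

    Assumes the store lists versions oldest-to-newest and that the encrypted
    upload is the newest entry.
    """
    for file_id in file_ids:
        versions = requests.get(f"{API}/files/{file_id}/versions",
                                headers=HEADERS).json()
        last_good = versions[-2]  # the entry just before the encrypted one
        requests.post(f"{API}/files/{file_id}/restore",
                      json={"version": last_good["id"]},
                      headers=HEADERS).raise_for_status()
```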
-
@scottalanmiller said:
@BRRABill said:
@Dashrender said:
When we move to O365, we'll have to find a solution to this as well. I don't want to use things like ODfB, which just opens those files up to Cryptolocker, etc.
Why is that?
Because it stores a copy locally. It is better than a mapped drive, but isn't fully decoupled from the local filesystem. Assuming that they install the client, that is.
Exactly - if you're syncing SharePoint files locally via ODfB, then you've lost the protection that SharePoint provides, namely that the malware doesn't have easy access to the files to encrypt.
Assuming you had pass-through authentication for SharePoint enabled, I'm trying to think how malware on your machine would log into SharePoint, pull a file down locally, encrypt it, then put it back into SharePoint.
Though I'm sure there's a way that I'm just not thinking of.
Of course, by not using ODfB, you lose offline access, and in those cases you'll have to find another solution.
-
@Dashrender said:
Assuming you had pass-through authentication for SharePoint enabled, I'm trying to think how malware on your machine would log into SharePoint, pull a file down locally, encrypt it, then put it back into SharePoint.
It would need to automate the browser, find the URL to use, list the files, check one out, download it, do the encryption, upload it, check it in, and move on to the next. That would have a lot of effects on the process, one being that it would slow down considerably and become much less reliable. And the big one: with versioning it would do no good, as people could just roll back each file.
-
@scottalanmiller said:
@Dashrender said:
Assuming you had pass-through authentication for SharePoint enabled, I'm trying to think how malware on your machine would log into SharePoint, pull a file down locally, encrypt it, then put it back into SharePoint.
It would need to automate the browser, find the URL to use, list the files, check one out, download it, do the encryption, upload it, check it in, and move on to the next. That would have a lot of effects on the process, one being that it would slow down considerably and become much less reliable. And the big one: with versioning it would do no good, as people could just roll back each file.
Thanks - I'm not that familiar with how this works. Is there a default number of versions kept? Again, it would slow things down considerably, but let's assume there was a default of 10 past versions; they could simply do that 11 times per file.
-
I'm not sure what the default is, but it is definitely easy to modify.
Yes, making many copies quickly could cause an issue. But that could be circumvented with an approval workflow that makes a human verify new submissions. Although that could be very cumbersome.
-
@scottalanmiller said:
I'm not sure what the default is, but it is definitely easy to modify.
Yes, making many copies quickly could cause an issue. But that could be circumvented with an approval workflow that makes a human verify new submissions. Although that could be very cumbersome.
Yeah, or something like: you can't check in more than X-1 versions in some stated period of time. Assuming you keep 10 versions, it's pretty unlikely you'd be checking in more than 10 versions over, say, an 8-hour day. And if you are a company that does that, then you simply increase the number of versions to compensate, but the masses would be covered.
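Something like this, as a sketch of that check-in throttle (the numbers just mirror the 10-versions / 8-hour example above; a real implementation would live server-side in the document platform):

```python
from collections import deque
from time import time

MAX_CHECKINS = 9            # one fewer than the 10 retained versions
WINDOW_SECONDS = 8 * 3600   # the "8-hour day" from the example above

_history = {}  # per-file deque of recent check-in timestamps

def allow_checkin(file_id, now=None):
    """Reject a check-in once a file has hit MAX_CHECKINS within the window."""
    now = time() if now is None else now
    stamps = _history.setdefault(file_id, deque())
    while stamps and now - stamps[0] > WINDOW_SECONDS:
        stamps.popleft()          # drop timestamps outside the rolling window
    if len(stamps) >= MAX_CHECKINS:
        return False              # too many recent check-ins; block this one
    stamps.append(now)
    return True
```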
-
Yes, that would work well. Also, checking in the same version should not trigger a new version to be created. So the encryption malware would need to actually alter the file's content on every pass, not just re-submit the same encrypted copy, or else the attack would fail even with all those repeated check-ins.
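As a sketch of that same-content check (assuming the platform can compare an incoming upload against the latest stored version):

```python
import hashlib

def should_create_version(latest_stored: bytes, submitted: bytes) -> bool:
    """Create a new version only when the submitted content actually differs
    from the latest stored version; identical re-uploads are ignored."""
    return hashlib.sha256(latest_stored).digest() != hashlib.sha256(submitted).digest()
```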