Anyone backing up a file server with 13 million plus files?
-
What hypervisor? 7TB and 13 Million files doesn't seem that excessive for pretty much any solution today.
-
@dustinb3403 said in Anyone backing up a file server with 13 million plus files?:
What hypervisor? 7TB and 13 Million files doesn't seem that excessive for pretty much any solution today.
7TB isn't bad. 13M files is a bit tough to handle. This is where hypervisor/platform agentless backups have an advantage: if they skip the file list, it's that much easier, but if they read the file list anyway, it's back to square one.
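To illustrate why the file list itself is the bottleneck, here's a rough sketch (not how any particular product is implemented; the `\\.\D:` device path and chunk size are just placeholder assumptions): a file-by-file job pays a metadata round trip for every one of the 13M files before it copies a single byte, while a block-level job can stream the raw volume sequentially and never touch the namespace at all.

```python
# Rough illustration only -- not Veeam's or anyone else's actual engine.
# File-level backup: one metadata round trip per file (13M of them).
# Block-level backup: one sequential read of the raw volume, ignoring the namespace.
import os

def walk_every_file(root):
    """What a file-by-file backup has to do before it copies anything."""
    count = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            os.stat(os.path.join(dirpath, name))  # per-file metadata hit
            count += 1
    return count

def stream_raw_volume(device=r"\\.\D:", chunk=4 * 1024 * 1024):
    """Roughly what a block-level backup does: read the volume end to end."""
    total = 0
    with open(device, "rb", buffering=0) as vol:  # raw volume access needs admin rights
        while True:
            block = vol.read(chunk)
            if not block:
                break
            total += len(block)
    return total
```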
-
Veeam doesn't care about files; it's a block-level backup, isn't it?
-
Assuming the VM is getting backed up and not files/folders or drives.
-
@dustinb3403 said in Anyone backing up a file server with 13 million plus files?:
What hypervisor? 7TB and 13 Million files doesn't seem that excessive for pretty much any solution today.
It's a physical server, with an attached MD1000 for on-prem backup.
-
Currently, backup consists of bi-weekly full tape backups and daily incremental tape backups.
This is how it's been since before I started.
We want to do on-prem backups now, and then move those on-prem backups to tape, instead of going straight to tape like we do now.
But as it stands, the tape backup software is doing a much better job than the on-prem backup software (Windows Server Backup, because it's included in the OS and free).
WSB isn't working for this many files. I need to use something else. I can't do weekly full on-prem backups if it takes two weeks to back up lol.
I'll just have to start going through them to test. I'll start with Veeam I suppose, unless someone has experienced this using something else.
-
@tim_g said in Anyone backing up a file server with 13 million plus files?:
@dustinb3403 said in Anyone backing up a file server with 13 million plus files?:
What hypervisor? 7TB and 13 Million files doesn't seem that excessive for pretty much any solution today.
It's a physical server, with an attached MD1000 for on-prem backup.
Gotcha... anything that can do block-level backup while skipping reading every file is what you're looking for then, while also offering the ability to perform per-file recovery (I'm assuming).
-
@dustinb3403 said in Anyone backing up a file server with 13 million plus files?:
@tim_g said in Anyone backing up a file server with 13 million plus files?:
@dustinb3403 said in Anyone backing up a file server with 13 million plus files?:
What hypervisor? 7TB and 13 Million files doesn't seem that excessive for pretty much any solution today.
It's a physical server, with an attached MD1000 for on-prem backup.
Gotcha... anything that can do block-level backup while skipping reading every file is what you're looking for then, while also offering the ability to perform per-file recovery (I'm assuming).
Good point.
I haven't paid much attention to that fact, but which backup software does it like that?
-
Veeam Windows Agent does block-level backup, but if it is being linked to an existing Veeam B&R repository, there might be search indexes that are built during the backup. I remember reading some best-practice documentation about their catalog, but only related to VM backups. Not sure if you're pointing the Veeam Windows Agent at a B&R repository or just at storage.
-
@tim_g said in Anyone backing up a file server with 13 million plus files?:
@dustinb3403 said in Anyone backing up a file server with 13 million plus files?:
@tim_g said in Anyone backing up a file server with 13 million plus files?:
@dustinb3403 said in Anyone backing up a file server with 13 million plus files?:
What hypervisor? 7TB and 13 Million files doesn't seem that excessive for pretty much any solution today.
It's a physical server, with an attached MD1000 for on-prem backup.
Gotcha... anything that can do block-level backup while skipping reading every file is what you're looking for then, while also offering the ability to perform per-file recovery (I'm assuming).
Good point.
I haven't paid much attention to that fact, but which backup software does it like that?
Scale HC3's built-in backup will do that without looking at the files.
-
@scottalanmiller said in Anyone backing up a file server with 13 million plus files?:
@tim_g said in Anyone backing up a file server with 13 million plus files?:
@dustinb3403 said in Anyone backing up a file server with 13 million plus files?:
@tim_g said in Anyone backing up a file server with 13 million plus files?:
@dustinb3403 said in Anyone backing up a file server with 13 million plus files?:
What hypervisor? 7TB and 13 Million files doesn't seem that excessive for pretty much any solution today.
It's a physical server, with an attached MD1000 for on-prem backup.
Gotcha... anything that can do block-level backup while skipping reading every file is what you're looking for then, while also offering the ability to perform per-file recovery (I'm assuming).
Good point.
I haven't paid much attention to that fact, but which backup software does it like that?
Scale HC3's built-in backup will do that without looking at the files.
Can it do individual file restores?
-
@scottalanmiller said in Anyone backing up a file server with 13 million plus files?:
@tim_g said in Anyone backing up a file server with 13 million plus files?:
@dustinb3403 said in Anyone backing up a file server with 13 million plus files?:
@tim_g said in Anyone backing up a file server with 13 million plus files?:
@dustinb3403 said in Anyone backing up a file server with 13 million plus files?:
What hypervisor? 7TB and 13 Million files doesn't seem that excessive for pretty much any solution today.
It's a physical server, with an attached MD1000 for on-prem backup.
Gotcha... anything that can do block-level backup while skipping reading every file is what you're looking for then, while also offering the ability to perform per-file recovery (I'm assuming).
Good point.
I haven't paid much attention to that fact, but which backup software does it like that?
Scale HC3's built-in backup will do that without looking at the files.
Sure, but a $200 Veeam Agent for Windows Server is much more likely than $30k + a migration LOL
-
@dashrender said in Anyone backing up a file server with 13 million plus files?:
@scottalanmiller said in Anyone backing up a file server with 13 million plus files?:
@tim_g said in Anyone backing up a file server with 13 million plus files?:
@dustinb3403 said in Anyone backing up a file server with 13 million plus files?:
@tim_g said in Anyone backing up a file server with 13 million plus files?:
@dustinb3403 said in Anyone backing up a file server with 13 million plus files?:
What hypervisor? 7TB and 13 Million files doesn't seem that excessive for pretty much any solution today.
It's a physical server, with an attached MD1000 for on-prem backup.
Gotcha... anything that can do block-level backup while skipping reading every file is what you're looking for then, while also offering the ability to perform per-file recovery (I'm assuming).
Good point.
I haven't paid much attention to that fact, but which backup software does it like that?
Scale HC3's built-in backup will do that without looking at the files.
Can it do individual file restores?
https://mangolassi.it/topic/16110/restoring-files-and-folders-out-of-scale-hc3-vm-snapshots
-
Veeam would likely work well here, but I'm not sure if it actually skips reading and creating a file index though...
I honestly don't know of any solution that would do this... agent-wise.
-
I'm going to test it with the free version just to see how it handles it.
If it works out, we'll buy it. $200 is nothing at this point... and if it proves itself, it may help with going all Veeam later (hopefully).
-
Veeam test is ongoing.
I started it over an hour ago, and so far, it's over a week ahead of where the other backup software would be...
It's going to back it all up into a .vbk file, so after it's complete, I'll see what can be done with it tape-wise, file-restore-wise, etc.
I'm not exactly sure "how" it's doing the backup... but I did see in Resmon that it read the Master File Table. Much more efficient than scanning every damn file...
-
@tim_g if you are testing the freeware and not a trial, mind that it doesn't do CBT (changed block tracking); only the paid server edition does.
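For anyone following along, CBT isn't anything exotic conceptually: a driver keeps track of which blocks were written since the last backup, so an incremental only has to read those blocks instead of re-reading (or re-hashing) the whole 7TB. Here's a toy sketch of the idea; it assumes nothing about how Veeam's actual filter driver works, and the 1 MiB granularity is made up for illustration.

```python
# Toy model of changed block tracking (CBT) -- illustrative only; real CBT
# lives in a kernel filter driver, not in Python.

BLOCK_SIZE = 1024 * 1024  # track changes at 1 MiB granularity (an assumption)

class ChangedBlockTracker:
    def __init__(self, volume_size_bytes):
        self.num_blocks = (volume_size_bytes + BLOCK_SIZE - 1) // BLOCK_SIZE
        self.dirty = set()  # block indexes written since the last backup

    def on_write(self, offset, length):
        """Record which blocks a write touched (in reality, done per-I/O by a driver)."""
        first = offset // BLOCK_SIZE
        last = (offset + length - 1) // BLOCK_SIZE
        self.dirty.update(range(first, last + 1))

    def blocks_for_incremental(self):
        """An incremental backup only needs to read these blocks."""
        changed = sorted(self.dirty)
        self.dirty.clear()  # reset after the backup point is taken
        return changed

# Example: after a day of writes, only the dirty blocks get read and copied.
tracker = ChangedBlockTracker(volume_size_bytes=7 * 1024**4)  # ~7 TB volume
tracker.on_write(offset=10 * 1024**3, length=64 * 1024)       # 64 KB write at the 10 GB mark
print(tracker.blocks_for_incremental())                        # -> [10240]
```

Without CBT, the agent still works at the block level, but each incremental generally has to re-read the volume to figure out what changed, so it's slower even though it still avoids the per-file walk.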
-
@matteo-nunziati said in Anyone backing up a file server with 13 million plus files?:
@tim_g if you are testing the freeware and not a trial, mind that it doesn't do CBT (changed block tracking); only the paid server edition does.
Good to know.
It is the free version, which I chose just to do a quick test. I didn't feel like being bothered by sales (yet).
-
On a slightly relevant note... I noticed they updated their Fedora agent to Fedora 26. Perhaps we'll see 27 soon!
-
@tim_g said in Anyone backing up a file server with 13 million plus files?:
@matteo-nunziati said in Anyone backing up a file server with 13 million plus files?:
@tim_g if you are testing the freeware and not a trial, mind that it doesn't do CBT (changed block tracking); only the paid server edition does.
Good to know.
It is the free version, which I chose just to do a quick test. I didn't feel like being bothered by sales (yet).
On this note (and partly due to this topic and ongoing projects), I discovered that UrBackup has CBT for only $17/system.
I'm not sure how well it would scale with this many files though. Something I'm still digging into.