Migrating Logs
-
Far more often, what you will be doing is not using cat or tar at all, but instead using a tool like gzip or bzip2 to compress the individual log files to a fraction of their original size and leaving them as individual files. This is mostly better because you compress incrementally as you go, reducing their size hour by hour, day by day, or at whatever granularity you need. Then you can back them up or ship them wherever you need, whenever you need, rather than waiting to bundle them all up before doing something with them. And retrieving them means grabbing one small file rather than wading through one giant one.
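To illustrate the incremental approach described above, here is a minimal runnable sketch. The directory and the date-stamped filename are hypothetical examples, not anything prescribed in the thread:

```shell
# Compress a rotated log file individually rather than bundling it.
# /tmp/logdemo and the date-stamped name are hypothetical examples.
mkdir -p /tmp/logdemo && cd /tmp/logdemo

# Simulate a rotated log file.
echo "sample log line" > app-2024-01-15.log

# gzip replaces the file in place with a compressed .gz copy.
gzip app-2024-01-15.log

# Only the small compressed file remains, and zcat reads it back
# without decompressing it to disk first.
zcat app-2024-01-15.log.gz
```

In practice, logrotate typically runs the compression step for you on whatever schedule you configure.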
-
Of course you could tar up a bunch of gzipped files, but why bother? Note that you can't cat gzipped files to read them directly, though; you'd use zcat for that.
-
@wirestyle22 said in Migrating Logs:
@scottalanmiller said in Migrating Logs:
So the quick answer is "no", cat would not be used in this way.
Am I correct in thinking that the naming convention is used with a wildcard to handle the migration of logs (as an example) that are migrated regularly?
You would rarely use anything to grab stuff in that way with logs. A more common approach would be either a script that does something complex, or a really simple command that does something like this...
find all files over 24 hours old in the log directory whose names end in .gz and ship them to backupserver:/logarchive/
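The one-liner described above can be sketched with find. This is a runnable demo, so "shipping" is a local cp into an archive directory; in real use the cp would be an scp or rsync to the backupserver:/logarchive/ destination mentioned in the thread. All paths here are hypothetical:

```shell
# Set up a demo log directory with one old and one fresh compressed log.
mkdir -p /tmp/shipdemo/logs /tmp/shipdemo/archive
echo old | gzip > /tmp/shipdemo/logs/old.log.gz
echo new | gzip > /tmp/shipdemo/logs/new.log.gz

# Backdate one file so it counts as more than 24 hours old (GNU touch).
touch -d '2 days ago' /tmp/shipdemo/logs/old.log.gz

# -name '*.gz' matches the compressed logs; -mtime +0 matches files
# modified more than 24 hours ago. Ship each match to the archive.
find /tmp/shipdemo/logs -name '*.gz' -mtime +0 \
    -exec cp {} /tmp/shipdemo/archive/ \;
```

Swapping the cp for `scp {} backupserver:/logarchive/` gives the remote version; a cron entry would run it on schedule.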
-
@scottalanmiller said in Migrating Logs:
retrieving them is just grabbing one small file rather than wading through one giant one.
The assumption here would be that if it's done regularly, the files wouldn't be that big, but I guess that differs depending on the situation.
-
@scottalanmiller said in Migrating Logs:
@wirestyle22 said in Migrating Logs:
@scottalanmiller said in Migrating Logs:
So the quick answer is "no", cat would not be used in this way.
Am I correct in thinking that the naming convention is used with a wildcard to handle the migration of logs (as an example) that are migrated regularly?
You would rarely use anything to grab stuff in that way with logs. A more common approach would be either a script that does something complex, or a really simple command that does something like this...
find all files over 24 hours old in the log directory whose names end in .gz and ship them to backupserver:/logarchive/
Understood. Thanks!
-
@wirestyle22 said in Migrating Logs:
@scottalanmiller said in Migrating Logs:
retrieving them is just grabbing one small file rather than wading through one giant one.
The assumption here would be that if it's done regularly, the files wouldn't be that big, but I guess that differs depending on the situation.
If it isn't really big, what was the benefit of combining just a few?
-
@scottalanmiller said in Migrating Logs:
@wirestyle22 said in Migrating Logs:
@scottalanmiller said in Migrating Logs:
retrieving them is just grabbing one small file rather than wading through one giant one.
The assumption here would be that if it's done regularly, the files wouldn't be that big, but I guess that differs depending on the situation.
If it isn't really big, what was the benefit of combining just a few?
Yeah, I guess whichever way you go, the positive there is also the negative.
-
@wirestyle22 said in Migrating Logs:
@scottalanmiller said in Migrating Logs:
@wirestyle22 said in Migrating Logs:
@scottalanmiller said in Migrating Logs:
retrieving them is just grabbing one small file rather than wading through one giant one.
The assumption here would be that if it's done regularly, the files wouldn't be that big, but I guess that differs depending on the situation.
If it isn't really big, what was the benefit of combining just a few?
Yeah, I guess whichever way you go, the positive there is also the negative.
Yes, merging just a few small files would be extra effort without benefit. If you merged enough to be beneficial in any way, you'd introduce loads of problems.
-
Really, log shipping with local storage is a thing of the past as well. Not what you are looking for with your use case, but long ago people did this. Today, if you want to store logs beyond what fits on the local system, you look at remote log servers like syslog, rsyslog, Kiwi, Graylog, ELK, Loggly, Splunk, and so forth. They have more useful platforms for dealing with centralized logs, archiving, and backups.
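As a small sketch of what that looks like with one of the tools named above, this is a hypothetical rsyslog forwarding fragment; the filename and hostname are placeholders, not anything from the thread:

```
# Hypothetical /etc/rsyslog.d/50-forward.conf fragment: forward all
# local syslog messages to a central rsyslog server over TCP port 514.
# "logserver.example.com" is a placeholder hostname.
*.* @@logserver.example.com:514
```

A single `@` would send over UDP instead of TCP; the central server then handles retention, archiving, and search in one place.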
-
@scottalanmiller said in Migrating Logs:
Really, log shipping with local storage is a thing of the past as well. Not what you are looking for with your use case, but long ago people did this. Today if you want to store logs beyond what fits on the local system you look at remote log servers like syslog, rsyslog, Kiwi, Graylog, ELK, loggly, Splunk and so forth. They have more useful platforms for dealing with centralized logs, archiving and backups.
Then we get into why you would use each: what product benefits which situation.