Copying log to file share in realtime (or close)

  • Hey All,

    I've been asked to set up a script (or something) that will copy the log file of one of our applications (running on CentOS 6 at the moment) over to a Windows file server. Apparently SSHing into a Linux box and grepping through logs is intimidating to some...who would've thought? Heh...

    At any rate, my initial thought was to write a script that basically does a "tail -f" on the log and writes to a file on the file share, but I think this may be problematic even with using autofs to mount the Windows share.

    I could just write a script that copies the entire log over every x minutes, overwriting the existing file on the share. But that's not very efficient.

    Or perhaps I could do it in two stages. Stage 1 would be a tail -f that writes to a local file; then a second process (stage 2) appends the "tail" results to the end of the existing file on the share every x interval. Stage 2 can check that the file share is available, and if it's not, exit and try again on the next interval.
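
    The stage-2 half of that idea can be sketched as a small function that ships only the bytes it hasn't sent yet, so each run appends just the new lines instead of re-copying the whole file. All of the paths and the offset-tracking scheme here are assumptions, not anything from the thread:

    ```shell
    #!/bin/bash
    # Stage-2 sketch: append only the not-yet-shipped bytes of the local
    # tail output to the copy on the share.
    # Usage: ship_log LOCAL_COPY REMOTE_COPY STATE_FILE
    ship_log() {
        local src=$1 dest=$2 state=$3
        local offset size
        offset=$(cat "$state" 2>/dev/null || echo 0)
        size=$(stat -c%s "$src")
        # If the local copy was rotated or truncated, start over from byte 0.
        [ "$size" -lt "$offset" ] && offset=0
        # Ship only the new bytes, then remember how far we got.
        tail -c +"$((offset + 1))" "$src" >> "$dest"
        echo "$size" > "$state"
    }

    # Example cron wrapper (assumed autofs mount point /mnt/winshare):
    # mountpoint -q /mnt/winshare || exit 0   # share gone? retry next run
    # ship_log /var/tmp/app.log.tail /mnt/winshare/app.log /var/tmp/app.log.offset
    ```

    The mountpoint check covers the "exit and try again on the next interval" part: cron keeps firing, and runs while the share is down simply do nothing.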

    Or maybe there's a completely better way to approach this altogether?

    What are y'all's thoughts?

  • Use Logstash or something similar to push the logs to a remote server. Here's a discussion from a few years back --

  • @Danp Wow, I can't believe I didn't think of doing a simple search for "centralized log management". Take my IT card...

  • For simplicity's sake, copying the file every x minutes via a cron job would be the easier option.

    Logstash or what-not would be a good long term goal.

  • @dafyre I am going to go down the rabbit hole of Logstash first, actually. I can think of many other logs I'd throw at something like that. It would be a very nice add to the arsenal.

    If I fail, or if folks are getting impatient, a cron job to do a simple copy will be plan B.

  • Even though you already found an answer, there are a couple of other things you can do. Since you're on CentOS 6, you can use inotify to keep the files synchronized. If you ever get to CentOS 7, you can do a similar task with a systemd .path unit.
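
    On CentOS 6 the inotify route usually means inotifywait from the inotify-tools package: watch the log, and re-push a copy whenever the application writes to it. A rough sketch, with all paths assumed:

    ```shell
    #!/bin/bash
    # inotify sketch: push a fresh copy to the share each time the app
    # writes to the log. Requires inotifywait (inotify-tools package).
    push_copy() {
        # Copy to a temp name first, then rename, so readers on the
        # share never see a half-written file.
        cp -f "$1" "$2.tmp" && mv -f "$2.tmp" "$2"
    }

    watch_log() {
        local log=$1 dest=$2
        # -m keeps watching forever; each event triggers one push.
        inotifywait -m -e close_write -e modify "$log" |
        while read -r _; do
            push_copy "$log" "$dest"
        done
    }

    # watch_log /var/log/app/app.log /mnt/winshare/app.log
    ```

    For a chatty log this fires on every write, so in practice you might debounce it or fall back to the interval-based approach.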

  • @stacksofplates Thanks! I started down the path of Logstash/Graylog/others and realized that it's going to take more of my bandwidth than I can dedicate at the moment. So I ended up throwing together a Bash script that copies the current log files over to the share every minute. It works for now...
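
    The actual script isn't shown in the thread, but the every-minute copy approach could look something like this; the directories and the crontab line are assumptions:

    ```shell
    #!/bin/bash
    # Plan-B sketch: cron runs this every minute and it refreshes the
    # copies of the app logs on the Windows share.
    copy_logs() {
        local src=$1 dest=$2
        # Share not mounted yet? Bail out and let cron retry next minute.
        [ -d "$dest" ] || return 1
        local f
        for f in "$src"/*.log; do
            [ -e "$f" ] || continue       # no .log files at all
            cp -f "$f" "$dest/${f##*/}"
        done
    }

    # crontab entry (assumed):
    # * * * * * /usr/local/bin/copy_logs.sh
    # copy_logs /var/log/app /mnt/winshare/applogs
    ```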

Log in to reply