    Copying log to file share in realtime (or close)

    IT Discussion

    • anthonyh

      Hey All,

      I've been asked to set up a script (or something) that will copy the log file of one of our applications (running on CentOS 6 at the moment) over to a Windows file server. Apparently SSHing into a Linux box and grepping through logs is intimidating to some...who would've thought? Heh...

      At any rate, my initial thought was to write a script that basically does a "tail -f" on the log and writes to a file on the file share, but I think this may be problematic even when using autofs to mount the Windows share.

      I could just write a script that copies the entire log over every x minutes and overwrites the existing file on the share, but that's not very efficient.

      Or perhaps I could do it in 2 stages. Stage 1 would be a "tail -f" that writes to a local file, then a second process (stage 2) takes the "tail" results and tacks them onto the end of the existing file on the share every x interval. Stage 2 can check that the file share is available, and if it's not, exit and try again on the next interval.
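
      Something along these lines is what I'm picturing for the 2-stage idea (paths and the interval are made up, and none of this is tested):

      # Stage 1: keep appending new lines to a local buffer file
      # (run under something that restarts it, e.g. rc.local or a supervisor):
      tail -F /var/log/myapp/app.log >> /var/spool/applog/app.log.buffer

      # Stage 2 (run from cron every x minutes): flush the buffer to the share.
      SHARE=/mnt/winlogs                        # autofs mount point (placeholder)
      BUFFER=/var/spool/applog/app.log.buffer
      mountpoint -q "$SHARE" || exit 0          # share not available, try again next run
      cat "$BUFFER" >> "$SHARE/app.log" && : > "$BUFFER"
      # (a line written between the cat and the truncate could be lost)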

      Or, maybe there is a completely better way to approach this altogether?

      What are y'all's thoughts?

      • Danp

        Use Logstash or something similar to push the logs to a remote server. Here's a discussion from a few years back --

        https://www.mangolassi.it/topic/5365/setting-up-logstash-for-elk
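
        The basic shape of a Logstash pipeline for this is just a file input plus an output pointing at wherever the logs should land, roughly like the below (path and host are placeholders):

        input {
          file {
            path => "/var/log/myapp/*.log"
            start_position => "beginning"
          }
        }
        output {
          elasticsearch {
            hosts => ["http://elk.example.local:9200"]
          }
        }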

        • anthonyh @Danp

          @Danp Wow, I can't believe I didn't think of doing a simple search for "centralized log management". Take my IT card...

          • dafyre

            For simplicity's sake, copying the file every x minutes via a cron job would be the easiest option.

            Logstash or whatnot would be a good long-term goal.
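
            Something like this in a cron file would cover the simple version (path, interval, and mount point are just examples):

            # /etc/cron.d/applog-copy -- copy the log to the mounted share every 5 minutes
            */5 * * * * root cp /var/log/myapp/app.log /mnt/winlogs/app.log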

            • anthonyh @dafyre

              @dafyre I am going to go down the rabbit hole of Logstash first, actually. I can think of many other logs I'd throw at something like that. It would be a very nice addition to the arsenal.

              If I fail, or if folks are getting impatient, a cron job to do a simple copy will be plan B.

              • stacksofplates

                Even though you already found an answer, there are a couple of other things you can do. Since you are on CentOS 6, you can use inotify to keep the files synchronized. If you ever get to CentOS 7, you can do a similar task with a systemd.path unit.
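
                Rough idea of the inotify route, using inotifywait from the inotify-tools package (paths are made up and this exact script is untested):

                #!/bin/bash
                # Re-copy the log to the share every time it changes.
                # (Watching the file directly won't survive log rotation; watching the
                # directory instead would handle that.)
                LOG=/var/log/myapp/app.log
                DEST=/mnt/winlogs/app.log

                inotifywait -m -e modify "$LOG" | while read -r _event; do
                    cp "$LOG" "$DEST"
                done

                On CentOS 7 the equivalent would be a .path unit with PathModified= pointing at the log, triggering a oneshot service that does the copy.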

                • anthonyh @stacksofplates

                  @stacksofplates Thanks! I started down the path of Logstash/Graylog/others and realized that it's going to take a bit more of my bandwidth than I can dedicate at the moment. So I ended up throwing together a Bash script that'll copy the current log files over to the share every minute. It works for now...
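
                  It's nothing fancy, roughly along these lines (paths are placeholders; there's a check so it bails quietly if the share isn't mounted):

                  #!/bin/bash
                  # Copy the app logs over to the Windows share, but only if the share is mounted.
                  SHARE=/mnt/winlogs
                  mountpoint -q "$SHARE" || exit 0
                  cp /var/log/myapp/*.log "$SHARE/"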
