Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Hi Perl Gurus,

I am in urgent need of a script that alerts when a specific operation (MOD/DEL/ADD) on a user is performed many times. The idea is to set a threshold, say 100, so that the script can alert us when the number of MOD/DEL/ADD operations exceeds it.

I have only a very basic knowledge of Perl and shell scripting. I would like to set up a script that monitors an access log every 10 minutes. Here is my log format:

-- log -- [20/Dec/2010:09:41:15 -0500] conn=867190 op=17 msgId=21 - MOD dn="uid=acc107,ou=internal,ou=People,dc=eis,dc=xxx,dc=com" -- log --

I would like the script to pull out each user, count that user's occurrences in the log, and alert us if the count exceeds the threshold. Please help me in this regard.

- Pamela Honneycut

Replies are listed 'Best First'.
Re: Help Required
by MishaMoose (Scribe) on Dec 20, 2010 at 15:26 UTC

    Greetings Pamela

    Before we can give you much assistance there are some additional pieces of information that would be helpful.

    What is the time period over which the threshold applies (i.e. 100 MOD/DEL/ADD operations in an hour, 8 hours, a day ...)? Are your log files circular? Are they aged out, and how often? How large are the log files likely to become? Should the code re-read the whole log file each time it is invoked, or does it need to preserve some knowledge from previous runs so that it can process efficiently?

    Have you written any code to attack this yet? If so, share a working sample so we can better assist you.

    Misha/Michael - Russian student, grognard, bemused observer of humanity and self professed programmer with delusions of relevance

      Hi,

      No, I've not written any code, as I have very limited knowledge of Perl. Basically, I'm a DBA, but I would like to learn and start writing some scripts that would help in my environment.

      About your query: the access log gets rotated by itself, so there is no need to set a time period.

      For a day the access log is usually about 20 MB; however, I would like to record something about the previous run, so that the script would be less resource intensive and quicker.

      Thanks.

        Given your desire to acquire Perl knowledge, I would recommend you use this as your learning project. It is relatively straightforward and touches on some of the important basics.

        Approach it iteratively. Write a program that simply reads in the file line by line and prints it back out. Examples can be found in any good Perl book, such as 'Learning Perl', Chapter 6.1.

        Then see if you can determine which lines contain things that interest you. Read up on the match operator, basic regular expressions, and hashes, and try out what you learn in your code. If you encounter problems, come back, post your code, and ask questions about it. It will be much easier for the Monks to answer those questions than to write the code for you, which is the only way to answer your question at this point.
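        To show what that stage might look like, here is a minimal sketch combining the match operator with a hash counter. The sample lines and the `uid=` pattern are assumptions based on the log format posted above; in a real script you would open the access log instead of the in-memory demo string.

```perl
use strict;
use warnings;

# Demo input held in a string; in your script this would instead be:
#   open my $fh, '<', 'access.log' or die "open: $!";
my $sample = qq{op=17 - MOD dn="uid=acc107,ou=People"\n}
           . qq{op=18 - MOD dn="uid=acc107,ou=People"\n};
open my $fh, '<', \$sample or die $!;

my %seen;                                   # uid => number of matching lines
while ( my $line = <$fh> ) {
    if ( $line =~ /dn="uid=([^,"]+)/ ) {    # match operator with a capture
        $seen{$1}++;                        # hashes make per-key counting trivial
    }
}
close $fh;

print "$_: $seen{$_}\n" for sort keys %seen;   # prints: acc107: 2
```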

        Get that far and I expect you will find a number of very competent folks who will be happy to assist.

        Every journey starts with the first step 8^)

        I wish you a good journey and hope to provide some assistance further down the road.

        Misha/Michael - Russian student, grognard, bemused observer of humanity and self professed programmer with delusions of relevance
Re: Help Required
by chrestomanci (Priest) on Dec 20, 2010 at 16:20 UTC

    It looks to me like you want an alert that raises an alarm if one user makes a large number of edits in a short period of time. (It looks like this is an LDAP server you are monitoring, but I could be wrong.)

    You also said that your monitoring script needs to preserve state, in order to minimise resource usage, which implies some sort of database or persistent data.

    My approach would be to first use something like File::Tail to monitor your log file for new entries, so that every time a new log line is written, your script springs into action and checks whether the user responsible for the latest transaction has made too many other transactions recently. That way you get alerted as soon as a user commits their 101st transaction, and you don't waste resources checking users who hardly ever make any transactions.
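    A rough sketch of that idea, with the per-line logic split into a subroutine so it can be exercised separately. The log path, the one-hour window, and the threshold are all assumptions; File::Tail is a CPAN module you would need to install.

```perl
use strict;
use warnings;

my $threshold = 100;   # assumed alert threshold
my %recent;            # uid => list of epoch timestamps of recent operations

# Process one log line at time $now; returns the uid if it tripped the alarm.
sub handle_line {
    my ( $line, $now ) = @_;
    return undef unless $line =~ /\b(?:MOD|DEL|ADD)\s+dn="uid=([^,"]+)/;
    my $uid = $1;
    push @{ $recent{$uid} }, $now;
    # Keep only timestamps from the last hour, then compare against threshold.
    @{ $recent{$uid} } = grep { $_ > $now - 3600 } @{ $recent{$uid} };
    return @{ $recent{$uid} } > $threshold ? $uid : undef;
}

# The tailing loop itself (guarded so the sketch loads without the module;
# '/var/log/dirsrv/access' is a made-up path):
if ( $ENV{RUN_TAIL} ) {
    require File::Tail;
    my $tail = File::Tail->new( name => '/var/log/dirsrv/access', interval => 2 );
    while ( defined( my $line = $tail->read ) ) {
        my $uid = handle_line( $line, time );
        warn "ALERT: $uid exceeded $threshold ops in the last hour\n" if defined $uid;
    }
}
```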

    Once you have extracted the user ID and type of operation from the log line, you have three ways you can count how many other transactions the user has done recently.

    • You can make the perl script long running, and keep the data internally in large hashes.
    • You can stash each log entry in a relational DB, and run queries against that DB to get a count of the transactions since a certain time.
    • You can configure your LDAP server to log directly into a DB as well as a file, so your script only needs to query the DB.
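    The second option could be sketched like this, using an in-memory SQLite database via DBI so the example is self-contained; the table and column names are my own invention, and in practice you would connect to your real database server instead.

```perl
use strict;
use warnings;
use DBI;

# ':memory:' keeps the sketch self-contained; point DBI at your real DB instead.
my $dbh = DBI->connect( 'dbi:SQLite:dbname=:memory:', '', '',
    { RaiseError => 1 } );

$dbh->do(q{
    CREATE TABLE ops ( uid TEXT, op TEXT, at INTEGER )  -- at = epoch seconds
});

my $insert = $dbh->prepare('INSERT INTO ops (uid, op, at) VALUES (?, ?, ?)');
my $count  = $dbh->prepare('SELECT COUNT(*) FROM ops WHERE uid = ? AND at > ?');

# Stash a parsed log entry, then ask how many this user has done lately.
$insert->execute( 'acc107', 'MOD', time );
$count->execute( 'acc107', time - 3600 );
my ($n) = $count->fetchrow_array;
print "acc107 has $n operation(s) in the last hour\n";
```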

    The latter two options mean that the script does not have to preserve state, so you can restart it any time you like, or even start it as an event-based log processor on your LDAP server.

    In my view the third option is best, as it will keep your Perl script very simple and transfer all the heavy lifting to your database server. It does rely on being able to configure your LDAP server to log directly to a database.

    Alternatively, considering you are a DBA, and presumably know how to write SQL trigger scripts, you could consider bypassing Perl entirely and doing the whole thing as a trigger on the logging database. I have no idea if this is easy, or even possible, as I am not a DBA and can't write SQL beyond fairly basic SELECT and UPDATE calls.
