in reply to How can I maximize performance for my router netflow analyzer? (was: Perl Performance Question)

One option would be to write a collector script which, for each packet received, forks an analyzer process that takes the packet, does its processing, and saves the results to a MySQL DB. At that point, you can have whatever daemons you want look at that DB, regardless of how that DB is populated.

I'm not fully conversant with how fork works, but I'm sure a number of people here are. Plus, you could just play with it. :)
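For what it's worth, here's a minimal sketch of the fork-per-packet idea. The UDP port, DSN, credentials, and the raw_packets table are all made up for illustration; a real collector would decode the NetFlow records before inserting anything.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use IO::Socket::INET;
    use DBI;

    # Hypothetical DSN, credentials, and table -- adjust to your setup.
    my $dsn = 'DBI:mysql:database=netflow;host=localhost';

    # Reap children automatically so we don't accumulate zombies.
    $SIG{CHLD} = 'IGNORE';

    # Listen for NetFlow export packets on UDP port 2055 (a common default).
    my $sock = IO::Socket::INET->new(
        LocalPort => 2055,
        Proto     => 'udp',
    ) or die "socket: $!";

    while (1) {
        my $packet;
        $sock->recv($packet, 8192) or next;

        my $pid = fork();
        die "fork failed: $!" unless defined $pid;
        next if $pid;                # parent: go straight back to collecting

        # Child: do the slow work (parsing, DB insert), then exit.
        my $dbh = DBI->connect($dsn, 'flowuser', 'flowpass',
                               { RaiseError => 1, AutoCommit => 1 });
        my $sth = $dbh->prepare('INSERT INTO raw_packets (data) VALUES (?)');
        $sth->execute($packet);      # real code would insert decoded flow fields
        $dbh->disconnect;
        exit 0;
    }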

Re: Re: Perl Performance Question
by IkomaAndy (Acolyte) on Jun 13, 2001 at 20:12 UTC
    I've tried this, but I seemed to take a noticeable performance hit with each fork. That's why, in my current approach, I collect 1,000 packets before forking, losing a few packets out of each thousand.
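    To put that batching variant in code (again just a sketch, reusing the $sock and $dsn and the made-up raw_packets table from the example above): buffer packets in the parent and fork one child per batch, so the fork and DBI connect costs are paid once per thousand packets instead of once per packet.

        # Same collector loop, but buffer packets and fork once per batch.
        my @batch;
        my $BATCH_SIZE = 1_000;

        while (1) {
            my $packet;
            $sock->recv($packet, 8192) or next;
            push @batch, $packet;
            next if @batch < $BATCH_SIZE;

            my $pid = fork();
            die "fork failed: $!" unless defined $pid;
            if ($pid) {              # parent: clear the buffer, keep collecting
                @batch = ();
                next;
            }

            # Child: one DB connection and one transaction for the whole batch.
            my $dbh = DBI->connect($dsn, 'flowuser', 'flowpass',
                                   { RaiseError => 1, AutoCommit => 0 });
            my $sth = $dbh->prepare('INSERT INTO raw_packets (data) VALUES (?)');
            $sth->execute($_) for @batch;
            $dbh->commit;
            $dbh->disconnect;
            exit 0;
        }

    The packets lost each thousand would be the ones that overflow the kernel's socket buffer while the parent is busy forking; enlarging the receive buffer (SO_RCVBUF) may reduce that, though that's a guess without measuring.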