in reply to How can I maximize performance for my router netflow analyzer? (was: Perl Performance Question)
You need a large FIFO buffer filled by a capture process that is niced to a high enough priority to keep up with your I/O requirements. I should think Perl could handle it if you are not asking for anything crazy. You could probably even just dump whole blocks of packets into MySQL records; they can get very large. I also glanced at PDL::IO::FastRaw, which suggests mmapping a file for faster access, but that seems like overkill.
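A minimal sketch of the capture side, assuming a hypothetical `netflow` database with a `flows` table holding `(captured_at, data)` rows; `read_packet()` is a stand-in for whatever your capture code does. The point is to accumulate packets into large blocks and commit once per block, not once per packet:

```perl
#!/usr/bin/perl
# Sketch only: batch captured packets into large MySQL rows.
# Table name, columns, and read_packet() are assumptions, not your schema.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:mysql:netflow', 'user', 'pass',
                       { RaiseError => 1, AutoCommit => 0 });
my $sth = $dbh->prepare('INSERT INTO flows (captured_at, data) VALUES (?, ?)');

my $buffer = '';
while (defined(my $packet = read_packet())) {   # your capture routine here
    $buffer .= $packet;
    if (length($buffer) >= 1_000_000) {         # flush in ~1 MB blocks
        $sth->execute(time(), $buffer);
        $dbh->commit;
        $buffer = '';
    }
}
```

One prepared statement reused across inserts and a single commit per block is where most of the per-packet overhead goes away.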
Then another process would come in periodically (or at a friendlier nice level) to service that buffer, doing the data reduction and analysis you need; this assumes a long capture session makes batching worthwhile. It sounds like right now you are getting caught in per-record overhead. One thing I can say is that you might save a lot of time if you can get MySQL to do the reduction on a lot of records at once and store the results in a separate table. Another thing you could look at is calling study() on a string before running a batch of regex matches against it. You could also look for a module which does the batch processing in C.
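The MySQL-side reduction could look something like this sketch, assuming hypothetical `raw_flows` and `flow_summary` tables with the columns shown; the idea is one `INSERT ... SELECT ... GROUP BY` so the aggregation happens in the server rather than looping over rows in Perl:

```perl
#!/usr/bin/perl
# Sketch only: server-side reduction into a summary table.
# Table and column names are assumptions about your schema.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:mysql:netflow', 'user', 'pass',
                       { RaiseError => 1 });

# Aggregate all unprocessed rows in one statement, then mark them done.
$dbh->do(q{
    INSERT INTO flow_summary (src_ip, dst_ip, total_bytes, packet_count)
    SELECT src_ip, dst_ip, SUM(bytes), COUNT(*)
    FROM raw_flows
    WHERE processed = 0
    GROUP BY src_ip, dst_ip
});
$dbh->do('UPDATE raw_flows SET processed = 1 WHERE processed = 0');
```

Doing the GROUP BY in MySQL avoids shipping every row back to Perl just to add numbers together.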