tradez has asked for the wisdom of the Perl Monks concerning the following question:
Basically I am going back to tcpdump to dump data for a certain "sector" of users on a broadband network. With shell, I am using sleep and doing these all at once. I am planning on updating to Perl with system calls, forking off multiple tcpdump sessions at once, but I wanted to ask if anyone has done something like this before. Will tcpdump run multiple sessions? What kind of load are we talking about if somewhere around 15 sessions are running? What degradation of integrity, if any, can I expect in the data received through tcpdump? Thanks ahead of time for any help you can spare.

```shell
#!/bin/sh
#
# Sector au-135
cd /tmp/tcptrace
FILENAME=au135-`date +%Y%m%d%H%M%S`.tcpdump
/usr/sbin/tcpdump -n -c 30000 -w /tmp/tcptrace/$FILENAME -i eth1 host +`/usr/local/bin/which-hosts au-135 50|/usr/local/bin/host-it.pl` &
MYJOB=$!
sleep 60
sync
kill $MYJOB
/usr/local/bin/tcptrace-munch $FILENAME
rm /tmp/tcptrace/$FILENAME
```
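For reference, here is a minimal sketch of what running several sectors in parallel could look like while still in shell, before any move to Perl: each sector's capture-and-munch cycle runs in its own backgrounded subshell, and `wait` blocks until they all finish. The sector list and the helper paths (`which-hosts`, `host-it.pl`, `tcptrace-munch`) are assumptions carried over from the single-sector au-135 script above, not tested values.

```shell
#!/bin/sh
# Sketch only: parallel version of the per-sector capture script.
# Sector names and helper paths are assumed from the original au-135
# script; extend SECTORS toward ~15 entries as needed.
SECTORS="au-135 au-136 au-137"

cd /tmp/tcptrace || exit 1

for SECTOR in $SECTORS; do
    (
        # One capture file per sector, timestamped like the original.
        FILENAME=$SECTOR-`date +%Y%m%d%H%M%S`.tcpdump
        /usr/sbin/tcpdump -n -c 30000 -w /tmp/tcptrace/$FILENAME -i eth1 \
            host `/usr/local/bin/which-hosts $SECTOR 50 | /usr/local/bin/host-it.pl` &
        MYJOB=$!
        sleep 60
        sync
        kill $MYJOB 2>/dev/null
        /usr/local/bin/tcptrace-munch $FILENAME
        rm /tmp/tcptrace/$FILENAME
    ) &    # background the whole subshell so sectors run concurrently
done
wait   # block until every sector's subshell has finished
```

One thing worth logging in any multi-session setup: when tcpdump exits it reports how many packets the kernel dropped, which is the most direct measure of whether the concurrent sessions are degrading capture fidelity.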
Replies are listed 'Best First'.
Re: To Shell or To Perl, that is the question
by C-Keen (Monk) on Mar 05, 2002 at 20:16 UTC