thanos1983 has asked for the wisdom of the Perl Monks concerning the following question:
Dear all,
First of all, thank you for the time and effort spent assisting me with my problem.
I have written a short script that produces some data (at a 1-second period) and writes it to a text file.
Main script (producing data)
#!/usr/bin/perl
use strict;
use warnings;

use Data::Dumper;
use Fcntl qw(:flock);                       # import LOCK_* and SEEK_END constants
use Fcntl qw(SEEK_SET SEEK_CUR SEEK_END);   # SEEK_SET=0, SEEK_CUR=1, ...

my @doc_write;
my @array;
my $file = "test.txt";

sub add {
    open (DATA, "+<", $file)
        or die ("Could not open file: " . $file . ". $!\n");
    flock(DATA, LOCK_EX)
        or die "Could not lock '" . $file . "' - $!\n";

    # Seed the file with a single line if it is empty
    if (-z $file) {
        print DATA "0\n";
    }

    while (@doc_write = <DATA>) {
        chomp @doc_write;

        # Rewind and empty the file before rewriting its contents
        seek(DATA, 0, SEEK_SET) or die "Cannot seek - $!\n";
        truncate(DATA, 0);

        # Build a new "timestamp value" packet and append it
        my $range         = 50;
        my $minimum       = 100;
        my $random_number = int(rand($range)) + $minimum;
        my $time          = time();
        my $packet        = join(' ', $time, $random_number);

        push(@doc_write, $packet);
        print Dumper(\@doc_write);

        foreach $_ (@doc_write) {
            print DATA $_ . "\n";
        }

        close (DATA) or die "Could not close '" . $file . "' - $!\n";
        return $packet;
    } # End of while (<DATA>)
} # End of sub add

while (sleep 1) {
    my $losses = &add();
    print "Added:" . $losses . "\n";
}
Three other scripts read data from the same text file simultaneously and process it.
Each secondary script reads the data, processes it, and writes the result to a comparison file. The same process occurs in all three secondary scripts. In outline:
while (sleep 1) {
    # read all data into an array...
    # extract the last element...
    # process it...
    # write the extracted data to a secondary file for comparison purposes...
}
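For illustration, here is a minimal sketch of what one such secondary script might look like. It assumes the data file is the test.txt written by the main script; the comparison file name (compare1.txt) and the "processing" step (splitting the packet into timestamp and value) are only placeholders, not the actual code.

#!/usr/bin/perl
use strict;
use warnings;

use Fcntl qw(:flock);

my $file         = "test.txt";      # data file written by the main script
my $compare_file = "compare1.txt";  # placeholder name for the comparison file

while (sleep 1) {
    # Read all data into an array under a shared lock
    open(my $in, "<", $file)  or die "Could not open '$file' - $!\n";
    flock($in, LOCK_SH)       or die "Could not lock '$file' - $!\n";
    my @lines = <$in>;
    close($in)                or die "Could not close '$file' - $!\n";
    chomp @lines;

    # Extract the last element and "process" it (here: split into fields)
    my $last = $lines[-1];
    my ($time, $value) = split ' ', $last;

    # Append the extracted data to the comparison file, again under a lock
    open(my $out, ">>", $compare_file)
        or die "Could not open '$compare_file' - $!\n";
    flock($out, LOCK_EX) or die "Could not lock '$compare_file' - $!\n";
    print $out "$time $value\n";
    close($out)          or die "Could not close '$compare_file' - $!\n";
}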
I am using flock in all scripts, as I am trying to avoid the possibility of two processes colliding (one reading while another writes) on the text file.
By comparing the text files I noticed that, when running two scripts together, I am missing one instance approximately every 9-10 seconds. There are more data losses when I run all 4 scripts together.
As far as I can tell, the sleep 1 period causes most of the problems. I thought one second would be sufficient time for all scripts to read and write the file, but unfortunately I also need to test the scripts at time intervals of 1 second.
So I am wondering: is there a way to verify that all scripts have processed the text file before I push the new data? Or, if someone has a better idea for completing this task in a different way, I would be more than happy to hear about it.
Maybe my description is not well defined, so please do not hesitate to ask for further details about any parts that are not clear enough.
Again thank you all for your time and effort.