right. locking doesn't help in this case. ;-)
however, isn't there a possibility to
a) generate something like "a socket for each local connection attempt"
(ResourcePool::Factory?) within deamon.pl, which collects the data from
the different application.pl instances, and
b) pump the data from the different socket connections into a
"general/shared queue" in deamon.pl?
like this pseudo-code
deamon.pl:
    $queue = generate_global_resource();
    create_socket_pool(max_sockets => 200);
    while (1) {
        if ($input = connection_attempt()) {
            if ($input ne 'finish') {
                push(@$queue, $fetched_data_from_socket);
            } else {
                close_socket();
                create_new_socket_in_socket_pool();
            }
        }
    }
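For what it's worth, here is a minimal runnable sketch of what that loop could look like with plain core modules (IO::Socket::INET and IO::Select) instead of a socket pool. The port number, the `finish` marker, and the `@queue` name are assumptions for illustration; the key point is that one listening socket on a single port is enough, because accept() hands back a fresh per-connection socket each time:

```perl
#!/usr/bin/perl
# Sketch of deamon.pl: one listening socket, many accepted connections,
# all feeding a single shared in-process queue. Hypothetical names/port.
use strict;
use warnings;
use IO::Socket::INET;
use IO::Select;

my @queue;                                   # the "general/shared queue"

my $listener = IO::Socket::INET->new(
    LocalPort => 9000,                       # assumed port
    Listen    => 5,
    Reuse     => 1,
) or die "cannot listen: $!";

my $select = IO::Select->new($listener);

while (1) {
    for my $sock ($select->can_read) {
        if ($sock == $listener) {
            # accept() returns a brand-new socket for this client
            $select->add($listener->accept);
        }
        else {
            my $line = <$sock>;
            if (!defined $line or $line =~ /^finish\b/) {
                $select->remove($sock);      # client is done
                close $sock;
            }
            else {
                chomp $line;
                push @queue, $line;          # dump the SQL into the queue
            }
        }
    }
    # elsewhere: drain @queue and run the INSERTs via DBI
    # (e.g. through ResourcePool::Command::DBI::Execute)
}
```

Note there is no need to "create a new socket in the pool" after a close: the listener keeps producing per-connection sockets on demand.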
(where generate_global_resource() is something like ResourcePool::Command::DBI::Execute)
because all i need that daemon for are INSERT statements, the applications don't need to wait for return values. they should only "dump" the sql statements to a socket without caring what happens. (also there is a quite generous error tolerance here: i don't care about lost data, as long as it doesn't exceed 5-10%...)
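That fire-and-forget behaviour on the application.pl side could look like this (port and table name are made up for the example; the client simply exits if the daemon is unreachable, which matches the stated tolerance for lost data):

```perl
#!/usr/bin/perl
# Sketch of an application.pl client: connect, dump SQL, disconnect.
# No reply is read; failures are silently tolerated.
use strict;
use warnings;
use IO::Socket::INET;

my $sock = IO::Socket::INET->new(
    PeerAddr => 'localhost',
    PeerPort => 9000,          # assumed port, must match the daemon
) or exit;                     # lost data is acceptable, so just give up

print $sock "INSERT INTO log (msg) VALUES ('hello')\n";
print $sock "finish\n";        # tell the daemon this connection is done
close $sock;
```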
i still don't get:
a) how to keep a number of open sockets in stock for the different
application.pl instances, which all use THE SAME port; and
b) how to use deamon.pl's global $queue AS a shared queue for the
several connection instances...
[jdporter: fixed code formatting and CPAN links]