Bod has asked for the wisdom of the Perl Monks concerning the following question:
I am trying to create an email sending system that can be driven directly from our CRM. The current system which this will replace works fine but involves copying and pasting blocks of text with addresses and HTML code for emails. So I want something easier to use and less error-prone.
The script we currently use calls itself with a query string parameter set so that it can track the sending of the emails. The problem is that it can behave strangely if it is refreshed or called again whilst an email is still sending. So, for the replacement, I am trying to fork a process: one process will send the emails and the other will load a webpage that periodically checks the progress through AJAX calls. The progress will be stored in a DB table.
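Roughly what I have in mind for the progress side is the sketch below - the email_progress table, its columns, and the connection details are placeholders I have made up, not anything that exists yet:

```perl
#!/usr/bin/perl
# Sketch only - the email_progress table, its columns, and the
# connection details are placeholders, not real code from the CRM.
use strict;
use warnings;
use CGI;
use DBI;
use JSON::PP;

# In the sending process, after each email goes out, something like:
#   $dbh->do('UPDATE email_progress SET sent = ? WHERE etc = ?',
#            undef, $sent_so_far, $etc);

# And a small endpoint like this for the page's AJAX call to poll:
my $q   = CGI->new;
my $etc = $q->param('etc') // '';

my $dbh = DBI->connect('DBI:mysql:database=crm', 'user', 'password',
                       { RaiseError => 1 });

my ($sent, $total) = $dbh->selectrow_array(
    'SELECT sent, total FROM email_progress WHERE etc = ?',
    undef, $etc);

print $q->header('application/json');
print encode_json({ sent => $sent // 0, total => $total // 0 });
```

The page the parent process serves would then call that endpoint every few seconds and update the display.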
I am getting strange behaviour which I think is because sometimes one process grabs STDIN and other times the other process gets it.
```perl
#!/usr/bin/perl
use CGI::Carp qw(fatalsToBrowser);
use strict;
use warnings;

my ($etc, $pid);
if ($ENV{'QUERY_STRING'} =~ /etc=(\d{6})/) {
    $etc = $1;
    if ( !defined($pid = fork()) ) {
        die "Unable to fork!";
    }
}
else {
    die "Missing Email Tracking Code";
}

our (%data, %file);
our $dbh;
our $user_number;

use Template;
use MIME::Lite;

require "incl/common.pl";
require "incl/html.pl";

if ($pid != 0) {
    # Parent process - show the progress page
    my $template = Template->new({INCLUDE_PATH => "$ENV{'DOCUMENT_ROOT'}/template"});

    &html_head;

    my $vars = {
        'command' => $data{'command'},
        'mail'    => $data{'mail'},
    };
    $template->process('sendmail.tt', $vars);

    &html_foot;
    exit;
}

# Child process - send the emails
open my $fh, '>', "testfile.txt";
print $fh "MAIL - $data{'mail'}\n\n";

#....
# spend a long time sending email
#....
```
Within the incl/common.pl file that is pulled in by require, the %data hash is populated from the $ENV{'QUERY_STRING'} variable and from STDIN. This file also connects to the database. Sometimes $data{'mail'} is passed to the template and sometimes it is written to the test file.
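To show why I think it is STDIN, here is roughly what that part of incl/common.pl does (a simplified sketch from memory, not the real code) - the important bit being that a POST body on STDIN can only be read once, so whichever process runs this first gets the form data and the other sees nothing:

```perl
# Simplified sketch of the sort of thing incl/common.pl does - not the real code.
use strict;
use warnings;

our %data;

# Query string parameters
foreach my $pair (split /&/, $ENV{'QUERY_STRING'} // '') {
    my ($key, $value) = split /=/, $pair, 2;
    $data{_decode($key)} = _decode($value // '');
}

# POST body arrives on STDIN - it can only be read ONCE.
# Whichever process (parent or child) gets here first consumes it;
# the other process finds STDIN empty.
if (($ENV{'REQUEST_METHOD'} // '') eq 'POST') {
    read(STDIN, my $body, $ENV{'CONTENT_LENGTH'} // 0);
    foreach my $pair (split /&/, $body // '') {
        my ($key, $value) = split /=/, $pair, 2;
        $data{_decode($key)} = _decode($value // '');
    }
}

sub _decode {
    my $str = shift;
    $str =~ tr/+/ /;
    $str =~ s/%([0-9A-Fa-f]{2})/chr hex $1/ge;
    return $str;
}
```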
I have had problems with fork before where I had made the database connection before the fork; then one process ended and closed the database connection that the other process was still using. For that reason, here I am connecting after the fork. I don't really want to play with the require (it is nasty but it works) as a lot of other code relies on it.
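If I understand the DBI docs correctly, the usual way around that earlier problem would be to connect before the fork and set AutoInactiveDestroy, so that one process exiting does not tear down the other's connection. A sketch, with placeholder connection details, in case it is relevant:

```perl
use strict;
use warnings;
use DBI;

# Connect ONCE, before the fork (connection details are placeholders).
# With AutoInactiveDestroy set, a handle destroyed in a process other
# than the one that created it is dropped quietly instead of sending a
# disconnect that would kill the other process's connection.
my $dbh = DBI->connect('DBI:mysql:database=crm', 'user', 'password',
                       { RaiseError => 1, AutoInactiveDestroy => 1 });

my $pid = fork();
die "Unable to fork!" unless defined $pid;

if ($pid == 0) {
    # Child: do the long-running work, then exit. Because of
    # AutoInactiveDestroy, exiting here does not close the parent's
    # connection. (If the child needs the database itself, it should
    # open its own fresh connection rather than reuse $dbh.)
    exit;
}

# Parent: carries on using $dbh as normal.
```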
I have rarely used fork, so... am I along the right lines in how I am doing this, or is there a better way?
Would the solution to the problem be to do all the hash population and database connection before the fork, and then make a copy of the hash and database handle in each process?
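In other words, something along these lines is what I am wondering about: read and parse STDIN once before the fork, let fork copy %data into both processes, and then have each process open its own database connection afterwards. Only a sketch, and it assumes the parsing in incl/common.pl could be run separately from its database connection:

```perl
use strict;
use warnings;
use DBI;

our %data;

# 1. Parse the query string and read STDIN here, ONCE, before forking
#    (the same parsing as in the sketch above - omitted for brevity).
#    fork() then copies %data into both processes.

# 2. Fork.
my $pid = fork();
die "Unable to fork!" unless defined $pid;

# 3. Each process opens its OWN connection after the fork, so neither
#    can close the other's handle when it exits.
#    (Connection details are placeholders.)
my $dbh = DBI->connect('DBI:mysql:database=crm', 'user', 'password',
                       { RaiseError => 1 });

if ($pid == 0) {
    # Child: send the emails, writing progress via its own $dbh
    exit;
}

# Parent: render the progress page from %data using its own $dbh
```

As far as I can tell from the DBI docs, sharing one handle across a fork is not safe anyway, so separate connections per process looks like the cleaner option.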
Replies are listed 'Best First'.
Re: Sharing STDIN after fork
  by Corion (Patriarch) on Feb 13, 2022 at 07:48 UTC
    by Bod (Parson) on Feb 13, 2022 at 22:27 UTC
      by Corion (Patriarch) on Feb 14, 2022 at 06:41 UTC
Re: Sharing STDIN after fork
  by talexb (Chancellor) on Feb 13, 2022 at 17:46 UTC
    by Bod (Parson) on Feb 13, 2022 at 22:34 UTC
      by talexb (Chancellor) on Feb 14, 2022 at 04:43 UTC
        by Bod (Parson) on Feb 14, 2022 at 16:11 UTC
Re: Sharing STDIN after fork
  by etj (Priest) on Feb 14, 2022 at 02:26 UTC
    by Bod (Parson) on Feb 14, 2022 at 16:06 UTC