darshan_atha has asked for the wisdom of the Perl Monks concerning the following question:

Hi fellow monks, I have a script which reads from a database and stores the results in a file, but it takes a long time because the record set is very large, and my browser times out before it finishes. Is there any way to solve this problem, and can you give some sample code for that? Thanks in advance.

Re: browser time out
by robartes (Priest) on Nov 07, 2002 at 12:38 UTC
    You probably want to split off the file writes into a child process:
    use strict;
    use warnings;

    # Do lots of stuff
    write_file();

    sub write_file {
        if (defined(my $pid = fork)) {
            if ($pid) {
                # Parent: just continue and respond to the browser.
                return;
            }
            else {
                # Child: do the slow file writing, then exit.
                do_write_file();
                exit 0;
            }
        }
        else {
            die "Blerch: could not fork: $!\n";
        }
    }
    Note that this code is quite unpolished (e.g. the if/else structure could be saner, and you definitely want to set a SIGCHLD handler in your parent to check the return value of the child), but it should get you on your way.
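    A minimal sketch of such a SIGCHLD handler, using the standard POSIX module (the reaping loop and warning message are illustrative, not from the original post):

    use POSIX ':sys_wait_h';

    # Reap finished children so they don't linger as zombies,
    # and warn if any child exited with a non-zero status.
    $SIG{CHLD} = sub {
        while ((my $pid = waitpid(-1, WNOHANG)) > 0) {
            my $status = $? >> 8;
            warn "child $pid exited with status $status\n" if $status;
        }
    };

    Note also that in a CGI context the child may need to close its inherited STDOUT and STDERR handles, otherwise the web server can keep the connection open and the browser will hang anyway.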

    CU
    Robartes-

Re: browser time out
by tachyon (Chancellor) on Nov 07, 2002 at 15:52 UTC

    For an in-depth analysis with code, see this article from merlyn's Web Techniques column.

    cheers

    tachyon

    s&&rsenoyhcatreve&&&s&n.+t&"$'$`$\"$\&"&ee&&y&srve&&d&&print

Re: browser time out
by Callum (Chaplain) on Nov 07, 2002 at 13:32 UTC
    The method I've previously used for jobs which take a long time (tens of seconds) to run is to generate a holding page ("Please wait while we foo...") which refreshes every five seconds or so, with the script checking each time it refreshes to see if the job has completed.
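    A minimal sketch of that approach, assuming the forked background job creates a hypothetical flag file when it finishes (the file name and the 'job' parameter are illustrative):

    #!/usr/bin/perl
    use strict;
    use CGI;

    my $q   = CGI->new;
    my $job = $q->param('job');                # hypothetical job identifier
    my $done_file = "/tmp/report_$job.done";   # flag file the worker creates

    print $q->header;
    if (-e $done_file) {
        print '<html><body>Your report is ready.</body></html>';
    }
    else {
        # Tell the browser to re-request this page every five seconds.
        print '<html><head><meta http-equiv="refresh" content="5"></head>',
              '<body>Please wait while we generate your report...</body></html>';
    }

    The refresh interval trades responsiveness against server load; the flag file could equally be a status row in the database.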