Did I just mess up a file processing run?
I have a 1,000,000 line text file to process. The important part of the code is:
    open(FILE, $data_file) or die("cannot open file: $data_file $!");
    print "<p>opened file $data_file ok\n";
    while ($line = <FILE>) {
        # DO A BUNCH OF STUFF WITH THIS LINE
    }
    close(FILE);
I start the program via telnet and use nohup so it runs to completion (nohup perl myscript.pl). The "print" statement gets written to a nohup.out file, and in my while loop there's a print statement that outputs the line number it's working on every 1,000 lines, so I know where it is.
Because there are about 1,000 operations to do on each line, it takes a looooooong time to run. It's running on a UNIX server.
In the middle of a run, someone deleted the data file from the server via FTP.
I thought that this form of while ($line = <FILE>) read in one line at a time from disk, and that deleting the file mid-run would kill the running process, since it could no longer read from it. BUT - I see that my nohup.out file continues to be updated with lines being processed. AND "ps U myusername" via telnet says myscript.pl is still running.
Is it possible that it really is still running? Have I misunderstood how this works? Did it really read a copy of the file into cache or something that allows it to continue with the file deleted?
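For anyone wanting to reproduce the behavior in question, here is a minimal sketch (using a hypothetical temp file, not the real data file) of the relevant POSIX semantics: unlink() only removes the directory entry, and the data stays readable through any filehandle that was already open, until that handle is closed.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical demo file; $$ is the current PID.
my $tmp = "/tmp/unlink_demo.$$";

open(my $out, '>', $tmp) or die "cannot create $tmp: $!";
print $out "line $_\n" for 1 .. 3;
close($out);

open(my $in, '<', $tmp) or die "cannot open $tmp: $!";
unlink($tmp) or die "cannot unlink $tmp: $!";  # "delete" the file

# The directory entry is gone, but the inode still exists because
# $in holds it open, so all three lines are still readable.
print "still readable: $_" while defined(my $_line = <$in>) && ($_ = $_line);
close($in);  # only now is the disk space actually freed
```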
Is there hope for me (or am I hopeless?)
Thanks.
In reply to What if FILE deleted during WHILE read-in loop? by punch_card_don