Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

I'm writing a script on a FreeBSD OS, and apparently the read/write open mode (e.g. open(FH,"+<./file")) doesn't seem to work... I'm assuming it's the OS (like fork() on Win32 systems), but I need a workaround.

See, I have a script, run from the web, that writes to a data file. I need to read the data from the file, modify it, and write it back, but I don't want a different instance of the script to append new data to the file before I write back to it in the current instance, or I might lose that new information. So, since I can't open it for reading and writing on the same filehandle, I need some sort of workaround, but I can't seem to work one out.

Replies are listed 'Best First'.
Re: FreeBSD reading/writing workaround?
by chipmunk (Parson) on Apr 25, 2001 at 08:08 UTC
    I very much doubt that FreeBSD is unable to open a file in read/write mode. Unfortunately, you haven't provided enough information to diagnose the actual problem.

    What do you mean when you say that it "doesn't seem to work"? Are you checking the return value from open? Have you tested the script from the command line? Are you sure the current working directory is what you think it is when the script is run from the web?
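    For example, something like this (just a sketch; ./file stands in for whatever path you're really using) will tell you both whether open failed and where the script thinks it is running:

    use Cwd;
    warn "Current directory is ", getcwd(), "\n";          # where the web server actually put us
    open(FH, "+<./file") or die "Can't open ./file for read/write: $!";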

    Anyway, it sounds like you need to use file locking. Opening for read/write won't prevent two instances of the script from doing conflicting writes. See flock and Fcntl.
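    For what it's worth, the usual read-modify-write-with-flock pattern looks something like this (an untested sketch; ./file is just a placeholder name):

    use Fcntl qw(:flock);

    open(FH, "+<./file") or die "open: $!";
    flock(FH, LOCK_EX)   or die "flock: $!";   # block until we hold the exclusive lock
    my @lines = <FH>;                          # slurp the current contents
    # ... modify @lines here ...
    seek(FH, 0, 0);                            # rewind to the start before writing
    print FH @lines;
    truncate(FH, tell(FH));                    # discard any leftover old data
    close(FH);                                 # closing also releases the lock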

      I AM locking the files. But when I open the file, it doesn't return an error value, yet it doesn't open a file stream either. So I don't know what the heck is wrong...
Re: FreeBSD reading/writing workaround?
by hdp (Beadle) on Apr 25, 2001 at 08:16 UTC
    I just tested open using "+<file" syntax, and it worked for me. This is FreeBSD 4.2-STABLE on i386, perl 5.005_3. I suggest giving similar details about your configuration -- maybe someone's heard of similar problems (though to be honest I suspect the error is in your code, not in the OS or perl). Showing some code that doesn't do what you'd expect would help a lot as well.

    As an aside, I'm assuming that you're using the "different instances" you refer to because you are having problems with open. If not (i.e. if you're going to be using multiple instances regardless of whether open behaves or not), have you looked at flock? (If you use flock, keep in mind that its locks are only advisory, so a completely different process can just clobber it.)
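    If you do end up with multiple instances, one common arrangement (only a sketch; the lockfile name is made up) is to flock a separate sentinel file before touching the data file, so every instance serializes on the same lock:

    use Fcntl qw(:flock);

    open(LOCK, ">./file.lock") or die "Can't open lockfile: $!";  # sentinel file, not the data file
    flock(LOCK, LOCK_EX)       or die "Can't get lock: $!";
    # ... open, read, modify, and write the real data file here ...
    close(LOCK);                                                  # releases the lock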

    hdp.

      I'm on perl version 5.005_03 built for i386-freebsd... the code is as follows:
      open(FH,"+<./file.dat") || die("Can't open ./file.dat for reading/writ +ing: $!"); flock(FH,LOCK_EX); seek(FH,0,0); @dat = <FH>; foreach (@dat) {chomp} # this is where I process @dat, and when I feed text # manually into it, it works. the problem doesn't # lie in this segment of the code seek(FH,0,0); print FH join("\n",@dat); flock(FH,LOCK_UN); close(FH);
      Now, I can open the file for reading, and I can open it for writing, and both work fine. But when I try reading AND writing, it doesn't spring an error; it simply doesn't read anything from or write anything to the file. Should I try +> instead of +<, or what?
        I've taken your code and run it unchanged on my machine except for the addition of this line (push @dat, $dat[rand(@dat)]) just before the second seek. It does what I'd expect it to, reading in the file and writing it out again with a random line from the file appended.

        I suggest you show the rest of your code, even though you don't think the problem lies there.

        hdp.

check permissions, use flock
by gregw (Beadle) on Apr 25, 2001 at 08:58 UTC
    Do make sure that you have file permissions set appropriately; you can test whether that is the issue by doing a chmod 777 filename. Default Apache configs run scripts as the user nobody, which often can't write to or append to files that don't have the proper permissions.
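    A quick way to check from inside the script (just a sketch; ./file.dat stands in for whatever path you actually use) is with the file test operators:

    my $file = "./file.dat";                 # substitute whatever path the script really uses
    warn "running as uid $<\n";              # the uid the web server gave us
    warn "$file is readable\n" if -r $file;
    warn "$file is writable\n" if -w $file;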

    I'd concur with the other poster who encouraged you to look at the flock node and its sample code for doing file locking in Perl. That's a pretty standard approach for ensuring that multiple running scripts can't write to the same file at the same time. (At least if those scripts are all using flock, you won't have a problem. Read the fine print about advisory locking for more details.)