in reply to Re^2: open ldif
in thread Net::LDAP::LDIF->new Chokes on large LDIF because of comments every 5k lines

I think my original question was really about all these files I'm opening and closing and how best to manage them. In my script I currently have one system call which uses dxsearch (an enhanced ldapsearch) to produce a very large LDIF with over 100,000 search results. On the next line I open the same file that dxsearch produced, even though I might already have it in memory because I captured it with my $input = qx(dxsearch blahblah); I never use $input anywhere else in the script. Then I open an output file and convert the LDIF to compliant LDIF, since there are some uncommented bad lines in it. Then I open an output file.csv and process the LDIF generated by dxsearch by reading the massaged LDIF I cleaned up in the previous step back in with Net::LDAP::LDIF->new(.

So in one script I have three opens, two closes, a file being generated by a system call, Net::LDAP::LDIF reading a file I closed right before calling it, and an $input variable probably holding the whole un-massaged LDIF in memory, never used because I wasn't sure that would work with Net::LDAP::LDIF->new. Everything seems to be working fine; I was just curious whether there's any best practice here, or whether it's better to open and close files differently to conserve memory. I guess that is kind of the question I was asking, but once I solved the issue with processing the LDIF I decided to change it since I didn't want to waste anyone's time. A rough sketch of what I mean is below.
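For reference, here's roughly the streaming alternative I've been wondering about: pipe dxsearch straight into the cleanup pass instead of capturing it with qx//, so the raw LDIF never sits in memory. The dxsearch arguments and the cleanup pattern below are placeholders, not my real ones:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Net::LDAP::LDIF;

    # Hypothetical dxsearch invocation -- substitute the real arguments.
    my @cmd = ('dxsearch', '-h', 'ldap.example.com', '-b', 'o=example',
               '(objectClass=*)');

    # Stream dxsearch output through a pipe instead of slurping it with
    # qx//, so the raw LDIF is processed line by line.
    open my $raw, '-|', @cmd
        or die "Cannot run dxsearch: $!";

    # Write the cleaned-up (compliant) LDIF to an intermediate file.
    open my $clean, '>', 'clean.ldif'
        or die "Cannot open clean.ldif: $!";

    while ( my $line = <$raw> ) {
        next if $line =~ /^some-bad-pattern/;   # hypothetical cleanup rule
        print {$clean} $line;
    }

    close $raw   or warn "dxsearch exited non-zero: $?";
    close $clean or die  "Cannot close clean.ldif: $!";

    # Hand the cleaned file to Net::LDAP::LDIF as before.
    my $ldif = Net::LDAP::LDIF->new( 'clean.ldif', 'r', onerror => 'warn' );
    while ( not $ldif->eof ) {
        my $entry = $ldif->read_entry or next;
        # ... extract attributes here and print CSV fields ...
    }
    $ldif->done;

That drops the unused $input variable entirely and leaves only the one intermediate file between the cleanup pass and the Net::LDAP::LDIF read.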

I also have a bad habit of compulsively editing my posts after submitting them (for instance, I've probably edited this one 4 or 5 times since I originally posted it, ugh). Not sure if anyone else has fallen victim to that trap too, just saying ;-)
- 3dbc