Sounds like you need to keep a configuration file of some sort that contains a blacklist of files to ignore. There are many modules you could use for this purpose, but my current favourite is YAML.
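A minimal sketch of that idea, assuming the CPAN YAML module and a hypothetical config file called skiplist.yml with an "ignore" key (names are illustrative, not prescribed):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use YAML qw(LoadFile DumpFile);   # CPAN's YAML module

my $config_file = 'skiplist.yml';   # hypothetical blacklist config

# Seed a sample config if one doesn't exist yet.
DumpFile( $config_file, { ignore => [ 'corrupt.dat', 'binary.bin' ] } )
    unless -e $config_file;

my $config = LoadFile($config_file);
my %ignore = map { $_ => 1 } @{ $config->{ignore} };

for my $file ( 'good.txt', 'corrupt.dat' ) {
    next if $ignore{$file};          # skip blacklisted files
    print "processing $file\n";
}
```

Because the blacklist lives in a plain YAML file, you can edit it by hand between runs.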
Perl's payment curve coincides with its learning curve.
You have to be careful before changing the characteristics (binary, unreadable, ...) of a file, because it will affect you when you access the file for some other purpose. It is better to log the names of the files you have read in a separate log file, and check for a file's name in that log before reading it.
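One way to sketch that log-file approach, assuming a hypothetical read_files.log holding one filename per line:

```perl
use strict;
use warnings;

my $log = 'read_files.log';   # hypothetical log of already-read files

# Load the names we've already seen on previous runs.
my %seen;
if ( open my $in, '<', $log ) {
    chomp( my @names = <$in> );
    @seen{@names} = ();
    close $in;
}

open my $out, '>>', $log or die "Cannot append to $log: $!";
for my $file (@ARGV) {
    next if exists $seen{$file};   # already read on a previous run
    # ... read/process $file here ...
    print {$out} "$file\n";        # remember it for next time
}
close $out;
```

This leaves the files themselves untouched; only the log records what has been handled.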
There are many solutions. On Windows you could tag each bad file with an Alternate Data Stream; that's just the main filename with :something appended to the end.
Simpler, and more portable, have a directory (called badfiles?), which is otherwise empty, and just create an empty file within it named the same as the bad file. The test to see if the bad file is there is a simple -e.
To avoid processing a binary file use the -B test; it is not 100% accurate, but works most of the time. If the file is genuinely corrupt, then you might want to chmod 000 it so no one can use it.
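The marker-directory idea above might be sketched like this (the badfiles name and the helper subs are illustrative; chmod 000 is shown as the optional step it is):

```perl
use strict;
use warnings;
use File::Basename qw(basename);
use File::Spec;

my $baddir = 'badfiles';                 # otherwise-empty marker directory
mkdir $baddir unless -d $baddir;

# A bad file is flagged by an empty marker of the same name.
sub is_bad {
    my ($file) = @_;
    return -e File::Spec->catfile( $baddir, basename($file) );
}

sub mark_bad {
    my ($file) = @_;
    my $marker = File::Spec->catfile( $baddir, basename($file) );
    open my $fh, '>', $marker or die "Cannot create $marker: $!";
    close $fh;
    chmod 0000, $file;                   # optional: lock the corrupt file
}

for my $file (@ARGV) {
    next if is_bad($file);               # previously flagged, skip it
    if ( -B $file ) {                    # heuristic binary check
        mark_bad($file);
        next;
    }
    # ... process the text file ...
}
```

The markers survive between runs for free, and a simple -e is all the lookup you need.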
My first thought was "this sounds like a job for a hash".
next if exists( $filesToSkip{$filename} );
The retention-between-runs requirement just means that you need to save the hash to a file and load it back on startup: try Storable for that.
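A sketch of the hash-plus-Storable combination, using a hypothetical files_to_skip.sto as the saved hash:

```perl
use strict;
use warnings;
use Storable qw(store retrieve);   # core module

my $skipfile = 'files_to_skip.sto';   # hypothetical persisted hash

# Load the skip hash from the previous run, if any.
my %filesToSkip = -e $skipfile ? %{ retrieve($skipfile) } : ();

for my $filename (@ARGV) {
    next if exists( $filesToSkip{$filename} );
    # ... attempt to process; on failure, remember the file:
    # $filesToSkip{$filename} = 1;
}

store \%filesToSkip, $skipfile;       # persist for the next run
```

Storable's store/retrieve round-trips the whole hash in one call each, so the skip list survives restarts with almost no extra code.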
Depending on your approach, the possibilities vary substantially with which operating system you're using.
On Windows, I'd say you could try using the Archive bit.
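A Windows-only sketch of the Archive-bit idea, assuming the CPAN Win32::File module (this will not run elsewhere, and the filename is illustrative):

```perl
# Windows-only: requires the CPAN Win32::File module.
use strict;
use warnings;
use Win32::File qw(GetAttributes SetAttributes ARCHIVE);

my $file = 'data.txt';               # hypothetical file to check
my $attr;
GetAttributes( $file, $attr ) or die "GetAttributes failed: $^E";

if ( $attr & ARCHIVE ) {
    # The Archive bit is set, so the file is new or changed since we
    # last cleared it -- process it, then clear the bit.
    # ... process $file here ...
    SetAttributes( $file, $attr & ~ARCHIVE )
        or die "SetAttributes failed: $^E";
}
```

Note that backup software also clears the Archive bit, so this marker can be disturbed by other programs; the marker-directory or saved-hash approaches are more robust if that matters.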
Between the mind which plans and the hands which build, there must be a mediator... and this mediator must be the heart.