We have a framework consisting of 40+ modules. It sits as a tar'ed archive on a remote Linux host in '~/.folder'. After executing one or two methods of one or two classes, it must be purged completely.

The problem is this: when you're dealing with a single file, you can remove it right after execution. But with this Perl framework we have many files that are dynamically 'use'd, so we can't guarantee that all of them get removed if the connection is lost. Extracting the files to a temporary directory isn't a panacea either, and loading all the modules up front for each short-lived execution is a sure way to kill performance.

Is there any way to keep all the modules in one file without a performance penalty? Any ideas?
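To illustrate the kind of thing I'm imagining (package and sub names below are made up for the example): several packages concatenated into a single file, with %INC populated up front so any later dynamic 'use'/'require' of those modules becomes a no-op instead of a disk lookup, and cleanup reduced to deleting one file.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Two hypothetical framework modules bundled into this one file.
    package My::Framework::Runner;
    sub run { print "Runner::run called\n" }

    package My::Framework::Helper;
    sub assist { print "Helper::assist called\n" }

    package main;

    # Tell perl these modules are already loaded, so a later dynamic
    # "require My::Framework::Helper" is a cheap no-op rather than
    # a search of @INC on disk.
    BEGIN {
        $INC{'My/Framework/Runner.pm'} = __FILE__;
        $INC{'My/Framework/Helper.pm'} = __FILE__;
    }

    My::Framework::Runner::run();
    My::Framework::Helper::assist();

    # The whole framework is now a single file, so purging is atomic:
    unlink __FILE__ or warn "could not remove myself: $!";

What I can't judge is the cost: the concatenated file still has to be compiled in full on every run, even when only one or two classes are actually used, so I'm not sure this really avoids the startup penalty. Hence the question.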