Re: pp --clean does not seem to work
by Lotus1 (Vicar) on Dec 11, 2015 at 17:33 UTC
Is the environment variable PAR_GLOBAL_TEMP set to /tmp on the target machine? If so, it will override the --clean option. What is the value of the clean attribute in META.yml inside the packed file?
If PAR_GLOBAL_TEMP is specified, use it as the cache directory for extracted libraries, and do not clean it up after execution.
Update: I think the Anonymous Monk below made the correct observation about this.
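As far as I know, the file pp produces is a zip archive with a loader prepended, so a zip reader should be able to pull META.yml straight out of it. A minimal sketch, assuming Archive::Zip is installed and using my_app as a stand-in name for the packed executable:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Archive::Zip qw(:ERROR_CODES);

    # Read the zip archive embedded in the pp-packed executable and
    # print META.yml, so the clean attribute can be inspected.
    my $packed = shift // 'my_app';    # stand-in name for the packed file

    my $zip = Archive::Zip->new;
    $zip->read($packed) == AZ_OK
        or die "Cannot read '$packed' as a zip archive\n";

    my $meta = $zip->contents('META.yml');
    defined $meta or die "No META.yml found in '$packed'\n";

    print $meta;    # look for the 'clean' attribute in the output

From the shell, unzip -p my_app META.yml should do the same job, since unzip locates the zip directory at the end of the file regardless of what is prepended.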
[ps@emer:~]$ env | grep PAR
Seems like it is not set, and PS: no idea on how to check the META.yml.
Re: pp --clean does not seem to work (it does)
by Anonymous Monk on Dec 11, 2015 at 23:50 UTC
pp --clean does not seem to work ... no proof ... You're confused; --clean works.
When pp is creating an executable, it puts stuff into temp; this is what you're mistaking for --clean not doing its job.
Running the executable doesn't increase the number of files in temp, which means --clean is working.
Emptying temp and running the executable shows no files in temp.
Without the --clean option, the number of files doubles after running the executable.
--clean worked as designed; it's working.
Could it be that it works on Windows and is broken on Linux? Not likely -- now you know how to check.
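A rough sketch of that check, assuming the extracted files land under /tmp/par-* (the exact directory name varies with the PAR version and the user) and using ./my_app as a stand-in for the packed executable:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::Find;

    # Count everything under the PAR cache area before and after a run.
    sub count_files {
        my @dirs = glob '/tmp/par-*';    # assumption: adjust to your temp location
        return 0 unless @dirs;
        my $n = 0;
        find( sub { $n++ }, @dirs );
        return $n;
    }

    my $before = count_files();
    system('./my_app') == 0 or warn "my_app exited non-zero: $?\n";
    my $after = count_files();

    print "before: $before, after: $after\n";
    print $after > $before
        ? "files were left behind\n"
        : "no growth, --clean is doing its job\n";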
Thanks Monks,
I just ran a few tests and realised that --clean works under 'normal' conditions: the temp-xxx directory is cleaned up after a normal exit.
However, if the script crashes or is killed for whatever reason, the temp-xxx contents are retained and not cleaned up. So I probably have to deal with that separately, e.g. clean up explicitly in an END block or so...
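Something along these lines might do, assuming the PAR runtime exposes the per-run extraction directory in $ENV{PAR_TEMP} (worth double-checking on your build) -- a minimal sketch, not production code:

    use strict;
    use warnings;
    use File::Path qw(remove_tree);

    # Remember the extraction directory the PAR runtime set up for this run.
    my $par_tmp = $ENV{PAR_TEMP};

    # END blocks run on normal exit and on die(), but not when the process
    # is killed by a signal, so turn the common signals into a normal exit.
    $SIG{$_} = sub { exit 1 } for qw(INT TERM HUP);

    END {
        if ( defined $par_tmp && -d $par_tmp ) {
            remove_tree( $par_tmp, { error => \my $err } );
            warn "cleanup of $par_tmp had problems\n" if $err && @$err;
        }
    }

    # ... rest of the script ...

A kill -9 still cannot be caught, so leftovers can never be ruled out completely. On Linux, removing files the running process still holds open is normally harmless; on Windows, deleting loaded DLLs would fail.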
Re: pp --clean does not seem to work
by james28909 (Deacon) on Dec 11, 2015 at 18:53 UTC
Scan the directory at the start of running the script; then, after the script is done, scan the directory again and delete the new dir if the dir name =~ /temp/par-xxxxx/.
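Read as a wrapper around the packed executable, that idea might look like the sketch below (with /tmp, the par-* pattern, and ./my_app all being assumptions; the reply below explains why this is racy):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::Path qw(remove_tree);

    my $tmp = '/tmp';    # assumption: where the packed app extracts itself

    # Snapshot the par-* directories that exist before the run.
    my %before = map { $_ => 1 } glob "$tmp/par-*";

    system('./my_app') == 0 or warn "my_app exited non-zero: $?\n";

    # Remove whatever par-* directory appeared during the run.
    for my $dir ( glob "$tmp/par-*" ) {
        remove_tree($dir) unless $before{$dir};
    }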
scan the directory at start of running the script, then after the script is done, scan the directory again and delete the new dir if the dir name =~ /temp/par-xxxxx/
Race condition. Don't do that. While the pp "compiled" perl script runs, someone can change the contents of the directory, or even remove the directory and replace it with something else.
Almost all Linux distributions come with some Perl version and a package manager. To distribute a Perl script, either use the system perl as the script interpreter and create a package (*.rpm, *.deb, ...) that depends on it (and perhaps on some pre-packaged modules from the distribution), or package a custom perl, perhaps some custom modules, and make the script depend on the custom perl and the custom modules. There is no need to bundle a huge amount of binaries (perl interpreter, libraries, modules) into a single executable that pollutes the file system every time it is started, just to distribute a Perl script.
To make things worse, the bundled binaries are most likely already present on the target system, wasting space. Worse still, security updates distributed to the target system won't update the pp-bundled script; it will still have security problems, even on a patched target system.
Now imagine not one pp-packed script, but hundreds of them, all with their own copy of a vulnerable library, where patching the system library just won't help. Welcome to the latest nightmare of Java.
Plus, using pp limits the resulting executable to a single operating system on a single processor platform. While the Perl script could run, for example, on MIPS, ARM, x86, and x64 on Linux, *BSD, MacOS, and Windows, running pp on Linux x64 limits the resulting executable to Linux x64.
Alexander
--
Today I will gladly share my knowledge and experience, for there are no sweeter words than "I told you so". ;-)
I completely disagree. I have never had a problem scanning directories that my script created, whether created on the fly or created statically. The average user of said compiled program isn't going to even know the directory was created, much less change its contents. Not to mention, you could lock the dir or every file in it.
Also, I think you are way off in left field. If you have 100 pp programs running, then you will have 100 "temp" dirs; then when you run your program it will scan the dirs, then create its own, which can easily be singled out and filtered. I honestly don't know where the other stuff you mention would even come into play while /just/ scanning directories, tbh.
EDIT: The only thing that could be a problem is if the script/exe created the dir BEFORE the directory could be scanned.