Re^3: Setting up a web-based perl interpreter
by Your Mother (Archbishop) on May 27, 2011 at 16:42 UTC
Unless any part of the served pages ever, at any time, calls an external URL, or a visitor follows a link from one of them. Plus, IP filtering is a form of authentication and, without reverse look-ups, not a "very secure" one.
True, those situations can arise. But remember, you still have two forms of verification even if the address becomes known to web servers. IP validation in itself isn't very secure, but unless they know what IP you are using, it won't do them any good. Or throw the reverse look-up on there, too. That won't require the OP to do anything extra when loading the script.
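A minimal CGI sketch of the IP validation described above, assuming a hypothetical allowed address (the real list would be the OP's own); note that REMOTE_ADDR is only as trustworthy as the network path that delivered the request:

```perl
#!/usr/bin/perl
# Sketch only: the allowed address below is a placeholder from the
# TEST-NET-3 documentation range, not a real configuration.
use strict;
use warnings;

my %allowed = map { $_ => 1 } ('203.0.113.7');

# Refuse the request unless it came from an allow-listed address.
unless ( $allowed{ $ENV{REMOTE_ADDR} // '' } ) {
    print "Status: 403 Forbidden\nContent-type: text/plain\n\nDenied.\n";
    exit;
}

# ... the rest of the script runs only for allowed addresses ...
```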
When visiting other pages, it never shows the URL params. It won't pass on that you had to request script.pl?this=that; it'll just show script.pl.
I'm not saying the script is Fort Knox-worthy, but this is more or less secure.
There are many ways to secure a script without a required sign-on verification process. For instance, make it a requirement that another script on the server was run within the past hour, or the script won't run. I.e., have a hidden script on the server that timestamps a file which the perl interpreter then reads. If that file was last touched over an hour ago, the script doesn't execute. It's simple to do and keeps the script live only for that time frame. To go further, add a script.pl?time=stop function to instantly kill access when you're done with it.
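The timestamp gate above can be sketched like this; the token file path, one-hour window, and the time=stop kill switch are all assumptions standing in for whatever the OP would actually use:

```perl
#!/usr/bin/perl
# Hypothetical sketch of the "armed for one hour" gate described above.
use strict;
use warnings;
use CGI qw(param);

my $token_file = '/tmp/interp-token';   # touched by the hidden "arming" script
my $max_age    = 3600;                  # seconds the interpreter stays live

# script.pl?time=stop kills access immediately by removing the token.
if ( ( param('time') // '' ) eq 'stop' ) {
    unlink $token_file;
    print "Content-type: text/plain\n\nAccess closed.\n";
    exit;
}

# Refuse to run unless the token file exists and is fresh enough.
# (stat ...)[9] is the file's last-modification time.
unless ( -e $token_file && ( time - ( stat $token_file )[9] ) < $max_age ) {
    print "Content-type: text/plain\n\nNot armed.\n";
    exit;
}

# ... interpreter logic would go here ...
```

The hidden arming script only needs to touch the token file, so it can be a one-liner run just before a session.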
It may seem like a lot of small things, but it would absolutely work for what the OP is trying to accomplish.
When visiting other pages it never shows the URL params. It won't pass on that you had to do script.pl?this=that. It'll just show script.pl.
My access_log records show the parameters passed in a GET request, and the error_log shows the referer (sic) including the parameters if it was a GET request.
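For example, a GET request to the script shows up in a standard Apache combined-format access_log entry like this (hypothetical addresses and timestamp, but the query string is recorded verbatim):

```
203.0.113.7 - - [27/May/2011:16:42:00 +0000] "GET /cgi-bin/script.pl?this=that HTTP/1.1" 200 512 "-" "Mozilla/5.0"
```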
You are using obscurity to secure your script. Once the obscurity is gone, the security goes right with it. Obscurity can be much more difficult to maintain (perhaps approaching "impossible" for anything more complex than a crossover cable) than other methods once $badguy has access to the request path (see previous post in this thread). Maintaining this secure request path is expensive, error prone, and difficult. There are other, more economical solutions available.
Re^3: Setting up a web-based perl interpreter
by flexvault (Monsignor) on May 27, 2011 at 18:29 UTC
That's not true. Not publishing a URL can be quite secure....
If you're using a web server, people will be knocking on your server port within hours. In the 90s you could put a server up and no one tried for weeks; after 2001 it was less than 8 hours, and now it's about 2-4 hours. If you use https with your own certificates, you may have a chance. But that's a lot of work!
Further, on the "...script run unless a certain param is passed...", that param had better change every few minutes, or you'll find someone harvesting your information. A recent study of victims of on-line theft stated that 95% of them thought they didn't have anything to steal on their PC. Now add a web server!
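A param that "changes every few minutes," as suggested above, can be sketched as an HMAC of the current time window under a shared secret; the secret and the 5-minute window here are placeholders, and Digest::SHA's hmac_sha256_hex takes the data first and the key last:

```perl
#!/usr/bin/perl
# Hypothetical rotating-token sketch, not the OP's actual scheme.
use strict;
use warnings;
use Digest::SHA qw(hmac_sha256_hex);

my $secret = 'change-me';   # shared between the client and the server
my $window = 300;           # token rotates every 5 minutes

# Client side: compute the token for the current time slot.
sub current_token {
    my $slot = int( time / $window );
    return hmac_sha256_hex( $slot, $secret );
}

# Server side: accept the current or previous slot to tolerate clock skew.
sub token_ok {
    my ($got) = @_;
    for my $slot ( int( time / $window ), int( time / $window ) - 1 ) {
        return 1 if $got eq hmac_sha256_hex( $slot, $secret );
    }
    return 0;
}
```

A harvested URL then goes stale within minutes instead of working indefinitely.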
Go with security first!
"Well done is better than well said." - Benjamin Franklin
Re^3: Setting up a web-based perl interpreter
by MidLifeXis (Monsignor) on May 27, 2011 at 20:17 UTC
Does your traffic pass through something I have control over? Think network, cache, diverted network route, wireless leak, ...
Re^3: Setting up a web-based perl interpreter
by Anonymous Monk on May 27, 2011 at 16:30 UTC
I did forget to add that the OP would have to have an index file in the above-mentioned directories so the tree isn't visible.