
Secure passwords again

by Anonymous Monk
on Apr 07, 2001 at 04:14 UTC ( [id://70658]=perlquestion: print w/replies, xml ) Need Help??

Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

First, an answer to the following ppl:
to fpi (and everyone else): sorry - this may not be strictly Perl related. It didn't occur to me on the first post, as CGI nearly always equals Perl in my mind.
to athomason: thank you! That was exactly what I was looking for. Until I thought about it a little more.

Now for everyone else:

I am attempting to write a webmail program to check POP mail from any server. The question is security and (now) how to implement sessions. I have everything working as planned: I store a cookie with the session number and write a file for each session on the server containing the user's details. It occurred to me much later that I cannot see a way to delete these files automatically, assuming the user does not click 'logout'.
I would prefer not to use SQL or anything complicated, so that it can be installed on (almost) any server. I have been racking my brains about it, and I think there is probably no solution but to use a database. And even then I will have to find out whether a database can do it. What do fellow monks think?

Re: Secure passwords again
by bobtfish (Scribe) on Apr 07, 2001 at 05:32 UTC
    Simple solution: assuming that you have no root access, no cron access, and no shell access, you can write session files to a temporary directory (or preferably a directory within your temporary directory). Every time the script runs, have it list the directory, stat each file, and delete the ones whose timestamp is older than your chosen session lifetime. This is a brute-force approach, as you may have guessed. That is not a problem for a small number of users/hits, but it will be if the system is popular.
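A minimal sketch of that sweep; the session directory path and the one-hour lifetime are placeholder assumptions, not values from the original post:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Brute-force sweep: stat every session file and delete the stale ones.
sub sweep_sessions {
    my ($dir, $max_age) = @_;        # $max_age in seconds
    my $now = time;
    opendir my $dh, $dir or die "Cannot open $dir: $!";
    for my $name (readdir $dh) {
        my $path = "$dir/$name";
        next unless -f $path;            # skips '.', '..', and subdirs
        my $mtime = (stat $path)[9];     # last-modified timestamp
        unlink $path if $now - $mtime > $max_age;
    }
    closedir $dh;
}

# Call this near the top of the CGI script on every request
# (directory name is hypothetical):
sweep_sessions('/tmp/webmail-sessions', 60 * 60)
    if -d '/tmp/webmail-sessions';
```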

    You could write a small daemon that sits in the background and does the deletions every 5 minutes. The main Perl script could check for its existence and restart it if necessary.

    Or, write a process that sits on the end of a Unix domain socket and holds all the session information internally. This eliminates files altogether but gives major problems with concurrent access and serialization. Either you handle one connection at a time and pay for it in performance, or you do shared memory and semaphores and pay for it in code size.
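A sketch of the simple, one-connection-at-a-time variant of that session server. The `GET`/`SET` line protocol is invented for illustration, and a real daemon would loop forever rather than serving a fixed number of requests:

```perl
use strict;
use warnings;
use IO::Socket::UNIX;
use Socket qw(SOCK_STREAM);

# Hold all session data in memory behind a Unix domain socket,
# serving one client at a time.
sub run_session_server {
    my ($path, $max_requests) = @_;
    unlink $path;                    # clear any stale socket file
    my %sessions;                    # all session data lives here
    my $server = IO::Socket::UNIX->new(
        Type   => SOCK_STREAM,
        Local  => $path,
        Listen => 5,
    ) or die "listen on $path: $!";
    for (1 .. $max_requests) {       # a real daemon would loop forever
        my $client = $server->accept or next;
        my $line = <$client> // '';
        if ($line =~ /^SET (\S+) (\S+)/) {
            $sessions{$1} = $2;
            print $client "OK\n";
        }
        elsif ($line =~ /^GET (\S+)/) {
            my $reply = defined $sessions{$1} ? $sessions{$1} : 'NONE';
            print $client "$reply\n";
        }
        close $client;
    }
}
```

The CGI script would connect with `IO::Socket::UNIX->new(Peer => $path)` and speak the same one-line protocol; sessions expire with the daemon, so no files are ever left behind.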

    These are all implementable solutions. You know how popular the system is going to be, what performance you need, and how many concurrent users you will have. These determine how professional the solution needs to be and, ergo, the amount of effort you should put into writing it.

      The alternative to stating a directory full of session files is to give those session files (partly) date-related names - i.e., they could start with YYYYMMDDHHMM. Then you get a list of the files, sort it, start at the beginning, and delete until you find a file that is sufficiently new. Because of the ordering of the list, you know that the rest of the files in the list are new enough not to be deleted.
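Hypothetical helpers for that scheme. The key property is that a lexical sort on YYYYMMDDHHMM prefixes is also a chronological sort, so the scan can stop at the first sufficiently new name:

```perl
use strict;
use warnings;
use POSIX qw(strftime);

# Build a session filename whose YYYYMMDDHHMM prefix makes lexical
# order equal chronological order.
sub session_filename {
    my ($session_id, $time) = @_;
    return strftime('%Y%m%d%H%M', localtime $time) . "-$session_id";
}

# Split a list of such names into (expired, live) around a cutoff time;
# everything after the first live name is known to be live too.
sub split_expired {
    my ($names, $cutoff_time) = @_;
    my $cutoff = strftime('%Y%m%d%H%M', localtime $cutoff_time);
    my @sorted = sort @$names;
    my $i = 0;
    $i++ while $i < @sorted && $sorted[$i] lt $cutoff;
    return ([@sorted[0 .. $i - 1]], [@sorted[$i .. $#sorted]]);
}
```

A culling pass would `unlink` everything in the first list and never stat the second.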

      It all depends on how expensive a load of stat() system calls is compared to a sort(), I guess. Yet another method would be to have a directory tree that holds the session files - with a directory for each time period, possibly called YYYYMMDDHH. You implement the same culling as above, except now the actual number of session files has no effect on the culling algorithm - we've grouped all the files created in a one-hour period in their own directory, and we're just considering the directories now.

      That last method should mean that you only ever have three directories at most - you can't delete the second most recent, because its sessions haven't necessarily expired. And if a session from one of the older directories is used again, it gets re-created in the newest directory. Hopefully that makes sense...
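The hour-bucketed culling might look like this; the YYYYMMDDHH naming follows the description above, while the root path and the two-hour grace window (which keeps the newest three directories) are assumptions:

```perl
use strict;
use warnings;
use POSIX qw(strftime);
use File::Path qw(remove_tree);

# Name of the bucket directory for a given epoch time.
sub hour_dir { strftime('%Y%m%d%H', localtime $_[0]) }

# Delete every hour-bucket older than two hours; the number of session
# files inside each bucket never matters, only the bucket names do.
sub cull_hour_dirs {
    my ($root, $now) = @_;
    my $oldest_kept = hour_dir($now - 2 * 3600);
    opendir my $dh, $root or die "Cannot open $root: $!";
    for my $name (readdir $dh) {
        next unless $name =~ /^\d{10}$/;   # only our hour directories
        remove_tree("$root/$name") if $name lt $oldest_kept;
    }
    closedir $dh;
}
```

New sessions are written into `hour_dir(time)` under the root, so the current, previous, and second-previous buckets are all that ever survive a cull.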

Re: Secure passwords again
by Masem (Monsignor) on Apr 07, 2001 at 04:26 UTC
    You should have access to a cron daemon. Write a separate script that, for each file in the directory where you store the sessions, compares the current time with the time the session was created, and deletes the file if its age exceeds however long you'd expect a user to stay at your site. Have this run every 15/30/60 minutes, or whatever interval you feel works well.
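A sketch of such a standalone cleanup script; the directory path, the 30-minute lifetime, and the crontab path are all placeholders:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Delete session files older than $max_age_minutes; returns the count.
sub expire_old {
    my ($dir, $max_age_minutes) = @_;
    my $deleted = 0;
    opendir my $dh, $dir or die "Cannot open $dir: $!";
    for my $name (readdir $dh) {
        my $path = "$dir/$name";
        next unless -f $path;
        # -M reports the file's age in days relative to script start
        if (-M $path > $max_age_minutes / (24 * 60)) {
            unlink $path and $deleted++;
        }
    }
    closedir $dh;
    return $deleted;
}

# Example crontab entry, every 30 minutes (script path is hypothetical):
#   */30 * * * * /home/user/bin/expire-sessions.pl
expire_old('/tmp/webmail-sessions', 30) if -d '/tmp/webmail-sessions';
```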
    Dr. Michael K. Neylon - || "You've left the lens cap of your mind on again, Pinky" - The Brain
