PerlMonks  

Re: Secure passwords again

by bobtfish (Scribe)
on Apr 07, 2001 at 05:32 UTC ( #70672=note )


in reply to Secure passwords again

Simple solution: assuming you have no root access, no cron access, and no shell access, you can write session files to a temporary directory (or, preferably, a subdirectory of your temporary directory). Every time the script runs, have it read the directory, stat() every file, and delete any file whose timestamp is older than the chosen session lifetime (some fraction of an hour) before the current time. This is a brute-force approach, as you may have guessed. That's not a problem for a small number of users/hits, but it will be if the system is popular.
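A minimal sketch of that brute-force sweep; the directory name and the one-hour lifetime here are assumptions, not anything from the original post:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Brute-force sweep: readdir the session directory, stat every file,
# and unlink anything older than the session lifetime.
sub sweep_sessions {
    my ($dir, $max_age) = @_;
    opendir my $dh, $dir or die "Can't open $dir: $!";
    for my $file (readdir $dh) {
        my $path = "$dir/$file";
        next unless -f $path;            # skips '.', '..' and subdirectories
        my $mtime = (stat $path)[9];     # field 9 of stat() is the mtime
        unlink $path if time() - $mtime > $max_age;
    }
    closedir $dh;
}

# Run once per request, e.g. at the top of the CGI script:
sweep_sessions('/tmp/my_sessions', 60 * 60) if -d '/tmp/my_sessions';
```

Note the cost: one stat() system call per session file on every single request, which is exactly why this stops scaling once the site gets busy.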

You could write a small daemon that sits in the background and does the deletions every 5 minutes. The main Perl script could check for its existence and restart it if necessary.
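A hedged sketch of that check-and-restart idea: a pidfile records the reaper's pid, and the main script probes it with signal 0. The paths, names, and intervals below are all made up for illustration.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(setsid);

# Is the reaper daemon still running? kill 0 delivers no signal; it only
# tests whether the process exists (and is signalable by us).
sub reaper_alive {
    my ($pidfile) = @_;
    open my $fh, '<', $pidfile or return 0;   # no pidfile: not running
    chomp(my $pid = <$fh>);
    close $fh;
    return $pid && kill(0, $pid) ? 1 : 0;
}

# Fork a background reaper that sweeps the session directory forever.
sub start_reaper {
    my ($pidfile, $dir, $interval, $max_age) = @_;
    defined(my $pid = fork) or die "fork failed: $!";
    return if $pid;            # parent: go back to serving the request
    setsid();                  # child: detach from the parent's session
    open my $fh, '>', $pidfile or die "Can't write $pidfile: $!";
    print $fh $$;
    close $fh;
    while (1) {                # the 5-minute sweep loop
        for my $f (glob "$dir/*") {
            unlink $f if time() - (stat $f)[9] > $max_age;
        }
        sleep $interval;
    }
}

# In the main script:
# start_reaper('/tmp/sessions/reaper.pid', '/tmp/sessions', 300, 3600)
#     unless reaper_alive('/tmp/sessions/reaper.pid');
```

One caveat worth knowing: a stale pidfile whose pid has been recycled by another process will make `reaper_alive` report a false positive, so real daemons usually combine this with a lock on the pidfile.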

Or, write a process that sits on the end of a Unix domain socket and holds all the session information internally. This eliminates files altogether but raises major problems with concurrent access and serialization: either you accept one connection at a time and pay for it in performance, or you use shared memory and semaphores and pay for it in code size.
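A sketch of the one-connection-at-a-time variant: a single process owns a %sessions hash, so there are no files and no locking, at the price of serialising all access. The socket path and the tiny "get KEY" / "set KEY VALUE" line protocol are invented for illustration.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use IO::Socket::UNIX;

# Single-threaded session daemon: all state lives in one in-memory hash,
# and clients are served strictly one at a time.
sub run_session_server {
    my ($path) = @_;
    unlink $path;                      # clear a stale socket from a crash
    my $server = IO::Socket::UNIX->new(
        Type   => SOCK_STREAM,
        Local  => $path,
        Listen => 5,
    ) or die "Can't listen on $path: $!";

    my %sessions;                      # all session state lives here
    while (my $client = $server->accept) {   # one client at a time
        while (my $line = <$client>) {
            if ($line =~ /^get (\S+)/) {
                my $val = defined $sessions{$1} ? $sessions{$1} : '';
                print $client "$val\n";
            }
            elsif ($line =~ /^set (\S+) (.*)/) {
                $sessions{$1} = $2;
                print $client "ok\n";
            }
            elsif ($line =~ /^quit/) {
                return;    # for the sketch; a real daemon would loop forever
            }
        }
    }
}
```

Because only one client is inside the inner loop at any moment, there is no concurrent access to %sessions at all; that is the "pay for it in performance" trade-off the post describes.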

These are all implementable solutions. You know how popular the system is going to be, what performance you need, and how many concurrent users you will have; those factors determine how professional the solution needs to be and, ergo, the amount of effort you should put into writing it.

Replies are listed 'Best First'.
Re: Secure passwords again
by kal (Hermit) on Apr 07, 2001 at 12:10 UTC

    The alternative to stat()ing a directory full of session files is to give those session files (partly) date-related names, i.e., names starting with YYYYMMDDHHMM. Then you get a list of the files, sort it, start at the beginning, and delete until you find a file which is sufficiently new. Because the list is ordered, you know that the rest of the files in it are new enough not to be deleted.
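That sorted-by-name cull might look like the following sketch; the naming scheme and the one-hour cutoff are assumptions. Because the names are fixed-width digit strings, a plain lexical sort() is also a chronological sort, and no stat() calls are needed at all.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(strftime);

# Delete session files whose YYYYMMDDHHMM name prefix is older than the
# cutoff; stop at the first file that is new enough, since the sorted
# list guarantees everything after it is newer still.
sub cull_by_name {
    my ($dir, $max_age) = @_;
    my $cutoff = strftime('%Y%m%d%H%M', localtime(time - $max_age));
    opendir my $dh, $dir or die "Can't open $dir: $!";
    my @files = sort grep { /^\d{12}/ } readdir $dh;
    closedir $dh;
    for my $file (@files) {
        last if $file ge $cutoff;   # the rest of the list is new enough
        unlink "$dir/$file";
    }
}

# A new session file would get a sortable name such as (hypothetical
# $session_id): strftime('%Y%m%d%H%M', localtime) . ".$session_id"
```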

    It all depends on how much more expensive a load of system calls is compared to a sort(), I guess. Yet another method would be to have a directory tree that holds the session files, with a directory for each time period, possibly named YYYYMMDDHH. You implement the same culling as above, except that now the actual number of session files has no effect on the culling algorithm: all the files created in a one-hour period are grouped in their own directory, and we consider only the directories.

    That last method should mean that you only ever have three directories at most: you can't delete the second most recent, because its sessions haven't necessarily expired. And if a session from one of the older directories is used again, it gets recreated in the newest directory. Hopefully that makes sense...
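A sketch of that directory-per-hour cull, assuming a one-hour session lifetime (the layout and names are illustrative). Any YYYYMMDDHH directory older than the previous hour can only contain expired sessions, so it is removed wholesale, and the cost of culling depends on the number of directories, not the number of sessions.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(strftime);

# Keep the current hour's directory and the one before it (its sessions
# may still be live); delete everything older, whole hours at a time.
sub cull_by_dir {
    my ($root) = @_;
    my $cutoff = strftime('%Y%m%d%H', localtime(time - 3600));
    opendir my $dh, $root or die "Can't open $root: $!";
    for my $subdir (grep { /^\d{10}$/ } readdir $dh) {
        next if $subdir ge $cutoff;       # second most recent or newer
        unlink glob "$root/$subdir/*";    # every session here has expired
        rmdir "$root/$subdir" or warn "rmdir $subdir: $!";
    }
    closedir $dh;
}
```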
