You should note that I have used memcached here. My reasoning is that you can allocate a small, localized chunk of memory for this very temporary, very dynamic data, insert entries into memcached, and then forget about them. Old and unused entries are automatically evicted as new entries use up the available memcached space. For the use you have described, a memcached daemon with only 1MB of allocation should be sufficient.

use Cache::Memcached;
use Digest::MD5 qw(md5_hex);

sub get_random {
    my ($pick_list, $unique_key) = @_;

    # add uniqueness to our key
    $unique_key .= $ENV{'REMOTE_USER'}
                || $ENV{'REMOTE_ADDR'}    # less reliable
                || $ENV{'HTTP_REFERER'}   # least reliable
                || '';                    # nothing else to add
    $unique_key = md5_hex($unique_key);

    # this solution requires a memcached server
    # or some other cache that handles volatility
    my $mem = Cache::Memcached->new({servers => ['localhost:11211']});

    # see what has already been used
    my $used = $mem->get($unique_key) || [];
    $used = [] if @$used >= @$pick_list;   # reset when full
    my %used  = map {$_ => 1} @$used;
    my @avail = grep {! $used{$_}} 0 .. $#$pick_list;

    # pick a random unused item and add it to the list of used items
    my $index = $avail[rand @avail];
    push @$used, $index;
    $mem->set($unique_key, $used);

    return $pick_list->[$index];
}

# use it like this
my @items = ("http://foo", "http://bar", "http://baz");
my $page  = 'pagefoo';
print get_random(\@items, $page), "\n";
print get_random(\@items, $page), "\n";
print get_random(\@items, $page), "\n";

# it will automatically reset
print "Reset\n";
print get_random(\@items, $page), "\n";
print get_random(\@items, $page), "\n";
print get_random(\@items, $page), "\n";

__END__
Prints something like:
http://bar
http://baz
http://foo
Reset
http://baz
http://foo
http://bar
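As a practical aside (this is standard memcached usage, not something from the thread above): to get the 1MB daemon mentioned earlier, you could start memcached with

memcached -d -m 1 -p 11211

where -d daemonizes, -m sets the memory cap in megabytes, and -p sets the port the code above connects to. And if you would rather have stale per-page lists age out on their own instead of waiting for LRU eviction, Cache::Memcached's set() takes an optional expiry time in seconds as a third argument, e.g.

$mem->set($unique_key, $used, 60 * 60);   # expire after an hour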
In reply to Re: Duplicate Randoms with SSI by Rhandom
in thread Duplicate Randoms with SSI by gwhite