in reply to Re: Re: Encoding/compress CGI GET parameters
in thread Encoding/compress CGI GET parameters

In a general sense, I suppose it would be possible to set up a persistent hash where the key is a unique ID and the value is the parameters. When generating the email, the parameters would be stripped off the URL and stored in the hash against a new unique key. The key would then be added to the URL. When the CGI is run, it would look up the values against the key.
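
(For illustration only, a minimal sketch of that single-hash scheme using a tied DBM file; the filename, the 'id' URL parameter and the '__last_id__' counter key are assumptions for the example, not part of the suggestion above.)

    use strict;
    use warnings;
    use DB_File;

    # Persistent hash mapping a unique id to a stored query string.
    tie my %params_by_id, 'DB_File', '/tmp/params.db'
        or die "Can't tie params.db: $!";

    # When generating the email: strip the query string, file it
    # under a new unique id, and put only the id on the URL.
    sub shorten_url {
        my ($url) = @_;
        my ($base, $query) = split /\?/, $url, 2;
        return $url unless defined $query;

        my $id = ++$params_by_id{__last_id__};   # simple incrementing id
        $params_by_id{$id} = $query;
        return "$base?id=$id";
    }

    # When the CGI runs: look the original parameters up by id.
    sub expand_id {
        my ($id) = @_;
        return $params_by_id{$id};               # undef for an unknown id
    }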

That's pretty much what I envisaged, but why not have two hashes going both ways? The keys of the other hash are the params and the value is the unique id. Then, when you generate the email, look to see if an id for this combination of params already exists; if it doesn't, generate one and insert a record in both hashes.
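
(Again purely as a sketch, with assumed file names and counter key: keep a reverse hash keyed on the query string, so the same combination of parameters always maps back to the same id.)

    use strict;
    use warnings;
    use DB_File;

    # Forward hash: id => query string.  Reverse hash: query string => id.
    tie my %params_by_id, 'DB_File', '/tmp/params_by_id.db'
        or die "Can't tie params_by_id.db: $!";
    tie my %id_by_params, 'DB_File', '/tmp/id_by_params.db'
        or die "Can't tie id_by_params.db: $!";

    sub id_for_params {
        my ($query) = @_;

        # Reuse the existing id if this combination has been seen before.
        return $id_by_params{$query} if exists $id_by_params{$query};

        # Otherwise generate a new id and record it in both hashes.
        my $id = ++$params_by_id{__last_id__};
        $params_by_id{$id}    = $query;
        $id_by_params{$query} = $id;
        return $id;
    }

One wrinkle not shown above: the query string would need to be put into a canonical parameter order before being used as a key, otherwise the same logical combination of params could end up with several ids.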

My objection to this is it's not possible to tell when a given hash entry can be safely deleted.

I think that under my scheme, as you're reusing the same id for the same combination of parameters, you won't need to delete entries from the hashes.

I may be misunderstanding the problem tho'.

--
<http://www.dave.org.uk>

"Perl makes the fun jobs fun
and the boring jobs bearable" - me


Re: Re: Re: Re: Encoding/compress CGI GET parameters
by snellm (Monk) on Jan 17, 2001 at 19:33 UTC

    Reusing the hash keys is a good idea, but where one of the parameters is, for instance, a key referring to an entry in a constantly growing database table, new hash entries will be created on an ongoing basis.

    -- Michael Snell
    -- michael@snell.com