So here's the bigger gist. I am improving a special-purpose HTTP proxy server that rewrites URLs in the pages it fetches so that they all point back to itself (e.g. the URL "http://www.yahoo.com/" gets rewritten as "http://my_proxy?url=http://www.yahoo.com/"). So although I have a large collection of URLs (from my logs), I need to "implode" URLs on a one-by-one basis. GZip and the like don't do very much on a single URL.
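To make the rewrite concrete, here's a minimal sketch of the kind of substitution the proxy does. The "my_proxy" host and the url= parameter are placeholders for my actual setup, and a real version would also have to handle relative links and other schemes:

```perl
use strict;
use warnings;
use URI::Escape qw(uri_escape);

# Rewrite every absolute http:// URL in a page so it routes back
# through the proxy ("my_proxy" is a placeholder host).
sub rewrite_urls {
    my ($html) = @_;
    $html =~ s{(http://[^\s"'<>]+)}
              {"http://my_proxy?url=" . uri_escape($1)}ge;
    return $html;
}
```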
Finding the collection of "top" substrings has already reduced my downloads by 20% on a given page, but that was done by hand for a single test page with only 30 or so URLs in it.
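For the curious, the approach looks roughly like this. The substring table below is a made-up example; my real one was built by hand from the URLs on that test page:

```perl
use strict;
use warnings;

# Hypothetical table of "top" substrings -> short one-byte codes.
my %dict = (
    'http://www.' => "\x01",
    '.com/'       => "\x02",
    '.html'       => "\x03",
);

sub implode_url {
    my ($url) = @_;
    # Replace longer substrings first so they aren't shadowed
    # by shorter entries.
    for my $s (sort { length($b) <=> length($a) } keys %dict) {
        $url =~ s/\Q$s\E/$dict{$s}/g;
    }
    return $url;
}

sub explode_url {
    my ($url) = @_;
    my %rev = reverse %dict;    # code -> substring
    for my $code (keys %rev) {
        $url =~ s/\Q$code\E/$rev{$code}/g;
    }
    return $url;
}
```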
So the problem as stated stands...I wish it were as simple as GZip/Compress. In fact, I tried those, and in many cases the result is actually larger than the original (for short URLs)...especially once the data is encrypted and base64'ed...
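For anyone who wants to reproduce the comparison, this is roughly the test I ran (using IO::Compress::Gzip and MIME::Base64; the encryption step is left out here):

```perl
use strict;
use warnings;
use IO::Compress::Gzip qw(gzip $GzipError);
use MIME::Base64 qw(encode_base64);

my $url = 'http://www.yahoo.com/';

gzip \$url => \my $gz
    or die "gzip failed: $GzipError\n";
my $b64 = encode_base64($gz, '');    # '' => no line breaks

# For a string this short, the gzip header/trailer overhead alone
# outweighs any savings, and base64 inflates it by another third.
printf "original:      %d bytes\n", length $url;
printf "gzipped:       %d bytes\n", length $gz;
printf "gzip + base64: %d bytes\n", length $b64;
```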
mG.