Is there a way to tell IO::Socket::SSL to indicate to the server that it should use compression?
I don't know. I would think that, to begin with, both the OpenSSL library against which IO::Socket::SSL (the client) was built and the OpenSSL library on the server would need to have been built with zlib support. Even then you'll probably have to rely on IO::Socket::SSL having been written with the capability of passing on the request for compression - which, one would hope, is documented (if such a capability exists).
I gather that zlib is built into OpenSSL by default now.
Best to check on that. When I built 0.9.8g a week or so ago, zlib support was certainly *not* the default for me - I had to explicitly declare zlib and -lz as ./config arguments.
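For reference, the build I did looked roughly like this (a sketch, not a recipe - the source directory name and install prefix will vary, and '-lz' assumes the system zlib headers and library are already installed):

```shell
# Build OpenSSL 0.9.8g with zlib compression enabled.
# 'zlib' switches on compression support; '-lz' links the system zlib.
cd openssl-0.9.8g
./config zlib -lz
make
make test     # watch for "Available compression methods: zlib"
```

Without the explicit 'zlib' argument to ./config, my build came out with compression disabled.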
When building openssl-0.9.8g you'll see lots of occurrences of -DZLIB during 'make' if, and only if, it's being built with zlib support. And during 'make test' you'll see numerous lines stating "Available compression methods:"; if zlib support was included, zlib will be listed following each of those lines.
There are probably other ways to verify whether zlib compression was included, but I don't know what they are.
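One other check that may work on an already-installed binary (I haven't verified this across versions, so treat it as a suggestion): 'openssl version -a' prints the compiler flags the binary was built with, so a zlib-enabled build should show -DZLIB there:

```shell
# Look for the -DZLIB flag in the installed binary's build options.
# No output (grep exit status 1) suggests zlib support was not compiled in.
openssl version -a | grep -i zlib
```

You could also connect to a server with 's_client' and look at the "Compression:" line in its output, which reports what was actually negotiated for that connection.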
The reason I was thinking that zlib is in there now is references on the mailing lists like this. I'm using pre-compiled .lib and .dll files though, so it's likely not enabled there. And if (as you pointed out) the capability were in the Perl module, it would be documented, which it doesn't appear to be.