kuratkull has asked for the wisdom of the Perl Monks concerning the following question:

I think the code is pretty self-explanatory. Anyway, it
downloads the same image over and over again from the
site and then prints the running total of downloaded data,
using the image size predefined in a scalar.

But why does the script pause for up to 10 seconds between the connections? I would like to tune it
to fetch the file as fast as possible and as many times
as possible.
Modules not included in the default installation should not be used :)
Thanks for any help.
#!/usr/bin/perl
use IO::Socket::INET;

for ($count = 0; $count <= 100; $count += 1) {
    # Open a fresh TCP connection for every request
    $getexp = IO::Socket::INET->new(
        PeerAddr => 'www.test.com',
        PeerPort => '80',
        Proto    => 'tcp',
        Timeout  => '5',
    ) || print "Error: Connection\n";

    # Plain HTTP/1.0 request; CRLF line endings per the spec
    print $getexp "GET /image.jpg HTTP/1.0\r\n";
    print $getexp "Host: www.test.com\r\n\r\n";

    # Read and discard the whole response
    while ($exp = <$getexp>) {}

    # Add the predefined image size and print the integer part of the total
    $sizesum += 55.43;
    print "$1\n" if $sizesum =~ /(.*?)\./;
}

Replies are listed 'Best First'.
Re: INET slow reconnects.
by merlyn (Sage) on Apr 30, 2007 at 16:51 UTC
    It's probably repeatedly doing a DNS lookup. You should cache the IP address of your host, and use that in your socket connect.
      Thx, but the server I am trying it on is a virtual server,
      otherwise I could have just entered the static IP :/
      But I need it to work on any computer this will be run on,
      so system-level DNS caching does not fit very well :/
      Can't I just force it to "memorize" or cache the DNS result automatically after the first query?

        If by "force it" you mean "write some code", that's what merlyn is saying.