I'm hoping you can give me some suggestions on how to speed up this script and avoid the remote-URL lag I'm hitting every 60 seconds.
I am using:
my $pricing = 0;
my $time    = 0;

while (1) {
    if ( time() >= $time + 60 ) {   # updates the pricing on initial run and every 60 seconds
        # Net::Curl request to the remote URL here...
        $pricing = [from net::curl];
        $time    = time();
    }
    # ... continue on with my perl code (needs the $pricing variable to work)
}
The issue is that every 60 seconds my script stalls on the remote URL call. I'm wondering whether I could turn the Net::Curl call into a server-side localhost script that updates the pricing constantly, listens for the client (my main script), and responds immediately with no lag. Or perhaps I could fork off the Net::Curl call inside my main script and update $pricing on a future loop iteration once the response comes back from the remote URL, with a max of one child process at a time. Rough sketches of both ideas follow.
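For the localhost server idea, here is a rough sketch of what I'm picturing. The port number 5000 and the fetch_price() routine are placeholders I made up; the real Net::Curl request would go inside fetch_price(). The server polls the remote URL on its own schedule, so only it ever feels the lag, and it answers clients instantly from memory:

    #!/usr/bin/perl
    # price_server.pl - refreshes the price every 60s, serves it with no lag
    use strict;
    use warnings;
    use IO::Socket::INET;
    use IO::Select;

    # Placeholder for the Net::Curl request to the remote URL.
    sub fetch_price { return 42.00 }

    my $server = IO::Socket::INET->new(
        LocalAddr => '127.0.0.1',
        LocalPort => 5000,        # assumed free local port
        Listen    => 5,
        ReuseAddr => 1,
    ) or die "listen: $!";

    my $sel     = IO::Select->new($server);
    my $pricing = 0;
    my $last    = 0;

    while (1) {
        # Refresh the price every 60s; only this process blocks on the URL.
        if ( time() >= $last + 60 ) {
            $pricing = fetch_price();
            $last    = time();
        }
        # Wait up to 1s for a client, then answer from memory.
        next unless $sel->can_read(1);
        my $client = $server->accept or next;
        print {$client} "$pricing\n";
        close $client;
    }

The main script would then replace its Net::Curl call with a quick local read, something like:

    use IO::Socket::INET;

    sub current_price {
        my $sock = IO::Socket::INET->new(
            PeerAddr => '127.0.0.1',
            PeerPort => 5000,
            Timeout  => 1,
        ) or return undef;    # keep the last known price on failure
        chomp( my $line = <$sock> // '' );
        return length $line ? $line : undef;
    }

And for the fork idea, a sketch of how I imagine keeping at most one child in flight, with the parent picking up the new price on a later pass through the loop (again, fetch_price() stands in for the real Net::Curl request):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use POSIX ":sys_wait_h";    # for WNOHANG

    # Placeholder for the Net::Curl request to the remote URL.
    sub fetch_price { return 42.00 }

    my $pricing   = 0;
    my $next_poll = 0;
    my $child     = 0;    # PID of the in-flight fetch, 0 if none
    my $reader;           # read end of the pipe from the child

    while (1) {
        # Start a background fetch at most every 60s, max one child.
        if ( !$child && time() >= $next_poll ) {
            pipe( $reader, my $writer ) or die "pipe: $!";
            $child = fork();
            die "fork: $!" unless defined $child;
            if ( $child == 0 ) {    # child: do the slow remote call
                close $reader;
                print {$writer} fetch_price(), "\n";
                close $writer;
                exit 0;
            }
            close $writer;          # parent keeps only the read end
            $next_poll = time() + 60;
        }

        # Reap the child without blocking; grab the price when it's done.
        if ( $child && waitpid( $child, WNOHANG ) > 0 ) {
            chomp( my $line = <$reader> // '' );
            $pricing = $line if length $line;
            close $reader;
            $child = 0;
        }

        # ... continue on with my perl code, using $pricing, never blocked ...
        sleep 1;    # placeholder for the real work
    }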
If you have any thoughts on how best to accomplish this, or any code examples or pointers, please let me know. Your help is appreciated.
Thank you