It's not unbearably slow, but it seems to be slower than lynx, which is weird to me. Anyway, I've been watching the traffic with Wireshark. I can't find anything that stands out, but then again I'm not an expert.
I'm starting to think the best solution (since I'm actually trying to fetch about 6 pages and then snatch stuff out of them with regexes) would be to perform the requests simultaneously rather than one after another. If that can be done, it would be plenty fast enough for me.
I've been looking all over for a Perl equivalent to putting "&" after a command in bash, but I can't find anything other than the fork function.
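From what I've read, fork really does seem to be the closest thing to bash's trailing "&". Here's a minimal sketch of how I understand it (untested, just to show what I mean): the child does the work in the background while the parent carries on.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Roughly the Perl equivalent of "command &" in bash
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;

    if ($pid == 0) {
        # child process: this is the "backgrounded" work
        print "child: doing the slow thing\n";
        exit 0;
    }

    # parent continues immediately, then reaps the child when it's done
    print "parent: still going\n";
    waitpid($pid, 0);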
Basically I've got a function, getPage($url), in a module I wrote, and it fetches the page. Is there any way to background each getPage() call so they all run at once?
I would just do "lynx -dump $url &", but I'm trying to make this a portable script that doesn't require a bunch of Linux programs to function properly. I'm pretty new to Perl still, so I'm sure there must be some way I just don't know of.
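To show what I'm picturing, here's a rough, untested sketch of forking one child per URL and piping each child's output back to the parent, so all the getPage() calls run at the same time. MyFetcher is just a stand-in name for my real module, and I gather fork is emulated on Windows, so this should stay fairly portable, though I haven't verified that.

    #!/usr/bin/perl
    use strict;
    use warnings;
    # use MyFetcher;   # stand-in name for the module that provides getPage($url)

    my @urls = ('http://example.com/one', 'http://example.com/two');
    my %reader;        # url => filehandle connected to that child's STDOUT

    for my $url (@urls) {
        my $pid = open(my $fh, '-|');      # fork; the child's STDOUT is piped to $fh
        die "can't fork: $!" unless defined $pid;
        if ($pid == 0) {                   # child: fetch one page, print it, exit
            print MyFetcher::getPage($url);
            exit 0;
        }
        $reader{$url} = $fh;               # parent: keep the handle, start the next child
    }

    # every child is fetching at once; now collect the results
    my %page;
    for my $url (@urls) {
        my $fh = $reader{$url};
        local $/;                          # slurp the whole page in one read
        $page{$url} = <$fh>;
        close $fh;
    }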
In reply to Re^2: My first socket program is SLOW? by ttlgreen, in thread My first socket program is SLOW? by ttlgreen