Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:
I want to submit data to 9,000 forms (all on one domain), and it's not spamming, just so you know.
Question 1: How do I LOG IN and STORE THE COOKIES so I can access these 9,000 pages? I know some of you will post links to a FAQ or something, but I've tried this a number of times in the past with separate scripts, and I've never successfully managed to scrape pages OR submit data to pages that required a login first.
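For reference, here's the sort of thing I'm after: a minimal sketch using WWW::Mechanize from CPAN, which keeps its own cookie jar across requests. The login URL and the form field names below are placeholders for whatever the real site uses.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize;

# Mechanize carries an HTTP::Cookies jar internally, so the session
# cookie set at login is sent with every later request automatically.
my $mech = WWW::Mechanize->new( autocheck => 1 );

# Placeholder URL and field names; substitute the site's real ones.
$mech->get('http://example.com/login');
$mech->submit_form(
    form_number => 1,
    fields      => {
        username => 'my_user',
        password => 'my_pass',
    },
);

# $mech is now "logged in" and can fetch pages behind the login.
$mech->get('http://example.com/members/page1');
print $mech->content;
```

(If plain LWP::UserAgent is preferred instead, the equivalent step is attaching a jar with $ua->cookie_jar( HTTP::Cookies->new ) before making the login request.)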
Question 2: Since this is 9,000 pages and I don't know fork(), there's no way this could run as a CGI script without timing out. I don't think the script would be THAT fast!

So for this part, I think breaking it into three scripts would be best: one login form that sends its data to the other two scripts, one script that submits data to the first half of the 9,000 pages, and one that submits to the other half. The only problem with this is that with two scripts I wouldn't be able to see live updates of which page each one is up to.
So, any ideas on how this would work? It doesn't necessarily have to print a message for every submission; maybe every 10 or 100 would do, just so I know it's still running.
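Here's roughly how I picture the submission loop if it ran from the command line instead of as CGI (no server timeout that way), printing progress every 100 pages. The @urls list and the form field names are placeholders:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize;

$| = 1;    # unbuffer STDOUT so the progress lines show up immediately

my $mech = WWW::Mechanize->new( autocheck => 1 );
# ... log in first, as in the sketch above, so $mech holds the cookies ...

# Placeholder: however the 9,000 target URLs really get built.
my @urls = map { "http://example.com/form$_" } 1 .. 9000;

my $count = 0;
for my $url (@urls) {
    $mech->get($url);
    $mech->submit_form(
        form_number => 1,
        fields      => { comment => 'my data' },    # hypothetical field
    );
    printf "submitted %d of %d\n", $count, scalar @urls
        if ++$count % 100 == 0;
}
print "done: $count submissions\n";
```

Splitting the work across two scripts would work too, but a single command-line script like this sidesteps the CGI timeout entirely and keeps all the progress output in one place.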
Any help with the cookies, and any ideas on question #2, would be very helpful!
Replies are listed 'Best First'.

Re: automated form processing with cookies
by merlyn (Sage) on Jun 08, 2005 at 19:18 UTC
    by Anonymous Monk on Jun 08, 2005 at 19:38 UTC
    by Fletch (Bishop) on Jun 08, 2005 at 20:33 UTC
    by djohnston (Monk) on Jun 08, 2005 at 21:46 UTC

Re: automated form processing with cookies
by cmeyer (Pilgrim) on Jun 08, 2005 at 19:18 UTC

Re: automated form processing with cookies
by Anonymous Monk on Jun 08, 2005 at 21:46 UTC