Hello Monks!
I am trying to get the title, summary, original URL, and thumbnail URL for a set number of Google Images results via a CGI script. For Yahoo it works just fine, but Google is giving me headaches. I suspect it has something to do with proxy settings (for Yahoo I use LWP::UserAgent; for Google I don't know how to do that). I saw the nifty Scrape Google's Image Search program, but this time I don't actually want to download the pics, and I need more context, so I decided against using it. I went with this code:
my $json   = new JSON;
my $google = WebService::Simple->new(
    base_url => "http://ajax.googleapis.com/ajax/services/search/images",
    param    => { api_key => $key },
);
for ( my $i = 0; $i < $number; $i += 4 ) {
    my $hashref1 = $google->get(
        { v => "0.1", q => "$query", rsz => "small", hl => "ja", start => "$i" }
    ) or die "Could not get google images: $!\n";
}
This works in a regular Perl script, but in my CGI it fails. Moving the code into a separate script that the CGI calls, which writes its results to a file for the CGI to read, also fails. Is this likely to be the fault of the firewall? How do I get around it?
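For reference, this is roughly how I enable the proxy for the Yahoo requests with LWP::UserAgent. Since WebService::Simple is built on top of LWP::UserAgent, I assume the same methods should work on the $google object directly — this is just a sketch, and the proxy address is a placeholder:

    use strict;
    use warnings;
    use LWP::UserAgent;

    my $ua = LWP::UserAgent->new;

    # Pick up proxy settings from the environment
    # (http_proxy / HTTP_PROXY etc.), as I do for Yahoo:
    $ua->env_proxy;

    # ...or set the proxy explicitly (host here is made up):
    # $ua->proxy( 'http', 'http://proxy.example.com:8080/' );

    # If WebService::Simple really does inherit from LWP::UserAgent,
    # then presumably this would work too:
    # $google->env_proxy;

Note that env_proxy only helps if the CGI process actually has the proxy environment variables set — the web server may launch CGIs with a stripped-down environment, which could explain why the same code works from the command line but not from the CGI.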
Many thanks for any help!
- Dime
In reply to Google Images via CGI by dime