in reply to MediaWiki::API problem

I tried it against the main site with an anonymous login and found that it doesn't need lgname or lgpassword. I had no decoding errors, but I did add binmode STDOUT, ':encoding(utf8)'; just in case.
#!/usr/bin/perl
use strict;
use warnings;
use MediaWiki::API;

my $url = 'http://en.wikipedia.org/w/api.php';
my $mw  = MediaWiki::API->new( { api_url => $url } );

binmode STDOUT, ':encoding(utf8)';

if ( my $ref = $mw->api( { action => 'query', meta => 'siteinfo' } ) ) {
    print $ref->{query}->{general}->{sitename}, "\n";
}

my $page = $mw->get_page( { title => 'Main Page' } );
print $page->{'*'}, "\n";
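
For a wiki that does require credentials (e.g. a private wiki), MediaWiki::API has a login call that takes lgname and lgpassword. A minimal sketch; the URL, username and password below are placeholders:

#!/usr/bin/perl
use strict;
use warnings;
use MediaWiki::API;

# Placeholder URL and credentials; substitute your own wiki's values.
my $mw = MediaWiki::API->new( { api_url => 'http://example.com/w/api.php' } );

$mw->login( { lgname => 'BotUser', lgpassword => 'secret' } )
    || die "Login failed: $mw->{error}->{code}: $mw->{error}->{details}\n";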

Re^2: MediaWiki::API problem
by danj35 (Sexton) on May 25, 2011 at 09:46 UTC

    Thanks. That makes sense, but I'm still not getting anything (for the main wiki or my own). I'm beginning to think it's more to do with my machine than anything else. This time I got a "use of uninitialized value" warning after trying to print the page. It's strange, because if I dump $mw I get a hash with all of the necessary info. This is quite frustrating...
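
    A "use of uninitialized value" warning there usually means $page->{'*'} came back undef. One way to see what the API actually returned, assuming only what MediaWiki::API documents ($mw->{error} on failure and a missing key for nonexistent pages), is a check along these lines:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use MediaWiki::API;
    use Data::Dumper;

    binmode STDOUT, ':encoding(utf8)';

    my $mw = MediaWiki::API->new( { api_url => 'http://en.wikipedia.org/w/api.php' } );

    # get_page() returns undef and sets $mw->{error} if the request itself failed.
    my $page = $mw->get_page( { title => 'Main Page' } )
        || die "$mw->{error}->{code}: $mw->{error}->{details}\n";

    if ( exists $page->{missing} ) {
        print "Page does not exist on this wiki\n";
    }
    elsif ( defined $page->{'*'} ) {
        print $page->{'*'}, "\n";
    }
    else {
        # No wikitext came back; dump the structure to see what did.
        print Dumper($page);
    }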

      Backtrack and check that all of the requirements are met. Then run this script to make sure that all of the dependencies are up to date.
      #!/usr/bin/perl
      use strict;
      use warnings;
      use CPAN;

      CPAN::Shell->install(
          "JSON",               "WWW::RobotRules",   "HTTP::Cookies",
          "Net::FTP",           "IO::Socket::INET",  "Net::HTTP",
          "HTTP::Daemon",       "Digest::base",      "Digest::MD5",
          "HTTP::Negotiate",    "File::Listing",     "HTML::Tagset",
          "XSLoader",           "HTML::Parser",      "Time::Local",
          "HTTP::Date",         "Compress::Raw::Zlib",
          "Compress::Raw::Bzip2",                    "Scalar::Util",
          "IO::Uncompress::Inflate",                 "HTTP::Status",
          "Encode::Alias",      "Encode::Locale",    "LWP::MediaTypes",
          "MIME::Base64",       "URI::Escape",       "LWP::UserAgent",
          "Test::Harness",      "Test::More",        "MediaWiki::API",
      );
      I ran into the same thing. However, it seems to be OS-related: the same code works with no problem on Windows and Ubuntu. I didn't spend time narrowing down which system setting on Mac OS X causes the garbled HTTP response. Hope this helps!
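
      One quick way to see whether the response itself is coming back garbled (for example, still gzip-compressed because a Compress module is missing) is to bypass MediaWiki::API and fetch the endpoint with plain LWP. This is only a diagnostic sketch and uses only documented LWP::UserAgent / HTTP::Response methods:

      #!/usr/bin/perl
      use strict;
      use warnings;
      use LWP::UserAgent;

      # Fetch the API endpoint directly and inspect the raw response.
      my $ua  = LWP::UserAgent->new;
      my $res = $ua->get('http://en.wikipedia.org/w/api.php?action=query&meta=siteinfo&format=json');

      print 'Status:           ', $res->status_line, "\n";
      print 'Content-Type:     ', $res->header('Content-Type')     || '', "\n";
      print 'Content-Encoding: ', $res->header('Content-Encoding') || '', "\n";

      # decoded_content() should yield readable JSON; if it still looks like
      # binary junk, the compressed response is not being decoded on this machine.
      print substr( $res->decoded_content || '', 0, 200 ), "\n";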