PerlMonks
Net_SSLeay help?

by booter (Novice)
on Sep 18, 2003 at 15:05 UTC ( id://292419 )

booter has asked for the wisdom of the Perl Monks concerning the following question:

I need some help using Net::SSLeay, and I really hope someone out there can lend a hand. Here's my problem.

I want to connect to an HTTPS website, submit my account information (connectID and password), and then obtain further details from my account. I've installed all the necessary binaries (OpenSSL, etc.) and I am able to connect to the https site using Net::SSLeay::get_https, but I can't figure out what to do, or how to post the necessary data to the login page and then verify that I've connected. I'm a newbie, and would really appreciate someone helping me out.

The site I'm trying to connect to is https://webbroker.tdwaterhouse.ca/

I've been able to determine (at least I think) that the form they are using is looking for two values: (1) connectID.value and (2) password.value. Maybe I'm wrong here; that's why I gave the website, so that someone can view the source on the site and validate my approach.

Here is the sample code I've tried, using the post_https function call:

use Net::SSLeay qw(post_https make_form);

my $site = 'webbroker.tdwaterhouse.ca';
my $port = 443;
my $path = '';
my ($page) = post_https($site, $port, $path);

Now that I have the page, I try to submit the connectID and password values using the post_https method.
And this is where I THINK the problem is:

my ($page, $response, %reply_headers) = post_https($site, $port, $path, '',
    make_form('connectID.value' => 'wxxxxxxx', 'password.value' => 'xxxxxxxx'));

I don't get any errors, but I also don't get any data :(.

I'm really stumped!

Any help, including code samples, that you can give me would be very much appreciated!

Thanks

Re: Net_SSLeay help?
by liz (Monsignor) on Sep 18, 2003 at 15:56 UTC
    If it's just a single page that you need to get through https and you have lynx available, I usually take the easy way out:

    open( IN, "/usr/local/bin/lynx -auth=$name:$password -source https://secure.website.com?foo=bar |" )
        or die "Could not open pipe: $!\n";
    local $/;
    my $html = <IN>;
    close IN;

    At least you don't have to load a lot of modules for something that only needs to be done now and then.

    Not sure this will help you in this particular case. But lynx can handle cookies alright, so maybe it's just a matter of setting up the right cookie jar.
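    To sketch that cookie-jar idea: lynx has -cookie_file and -cookie_save_file options that point it at a persistent cookie jar. The jar path and URL below are placeholders; only the command-line construction is shown, with the actual pipe left commented out.

```perl
use strict;
use warnings;

# Placeholder credentials and jar path; lynx reads cookies from
# -cookie_file and writes new ones back via -cookie_save_file.
my ($name, $password) = ('myuser', 'mypass');
my $jar = '/tmp/lynx.cookies';

my $cmd = join ' ',
    'lynx',
    "-auth=$name:$password",
    "-cookie_file=$jar",        # read cookies from here...
    "-cookie_save_file=$jar",   # ...and save new ones back
    '-source', 'https://secure.website.com?foo=bar';

print "$cmd\n";

# The actual fetch would then be the same pipe as above:
# open( IN, "$cmd |" ) or die "Could not open pipe: $!\n";
```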

    Liz

Re: Net_SSLeay help?
by asarih (Hermit) on Sep 18, 2003 at 15:19 UTC
    I am guessing that your user agent needs cookies. If it's a standard LWP::UserAgent, you can set $browser->cookie_jar({}); and go from there.
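      A minimal sketch of that suggestion, assuming LWP::UserAgent is installed: passing an empty hashref to cookie_jar() makes LWP build an in-memory HTTP::Cookies jar, so Set-Cookie headers from the login response are sent back automatically on later requests. Only the jar setup is shown; no network request is made here.

```perl
use strict;
use warnings;
use LWP::UserAgent;

# An empty hashref tells LWP to create an in-memory HTTP::Cookies jar.
my $browser = LWP::UserAgent->new;
$browser->cookie_jar({});

# Show what LWP built for us:
print ref($browser->cookie_jar), "\n";
```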
      Clearly I don't understand how Net_SSLeay works. I know that LWP::UserAgent requires cookies to be set, and I would have used the LWP library had it worked properly with https, but it doesn't. So I resorted to using Net_SSLeay (Crypt also gave me some problems).

      So, with that in the open, are you saying that I can still use the LWP cookies with Net_SSLeay?

        The following is an example of some code that uses LWP to post data to an HTTPS login page and collect the cookies in a temporary HTTP::Cookies jar. You can persist the cookies for future use by giving the cookie jar a file to use.

        You will need to change the $loginFormUrl and form data in order to get this to work with your login page.

        Inman

        #! /usr/bin/perl -w
        
        use strict;
        use warnings;
        
        use LWP::UserAgent;
        use HTTP::Cookies;
        use HTTP::Request;
        use IO::Socket::SSL;
        
        my $loginFormUrl = "https://webbroker33w.tdwaterhouse.ca/LogOn";
        
        # Join the list of name=value pairs that you get from your login page
        my $postContent = join "&", qw (
        userid=myUser
        passwd=myPassword
        );
        
        # Create User Agent
        my $ua = LWP::UserAgent->new();
        
        # Set up an in-memory cookie jar (pass a file to persist cookies)
        $ua->cookie_jar (HTTP::Cookies->new ());
        push @{ $ua->requests_redirectable }, 'POST';
        
        my $req = HTTP::Request->new("POST", $loginFormUrl );
        $req->content_type('application/x-www-form-urlencoded');
        $req->content($postContent);
        
        # Send the request to the server
        my $resp = $ua->request( $req );
        
        if ( $resp->is_error() ) {
            print "Warning: error requesting URL -> " . $resp->message() . "\n";
        }
        else {
            print $resp->headers_as_string . "\n";
        }
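        A small sketch of the file-backed jar mentioned above: giving HTTP::Cookies a file (with autosave) means cookies written during one run are reloaded on the next. The cookie values here are made up purely for demonstration.

```perl
use strict;
use warnings;
use HTTP::Cookies;

# File-backed jar: autosave writes cookies to disk when the jar is destroyed.
my $jar = HTTP::Cookies->new(
    file     => 'cookies.txt',
    autosave => 1,
);

# A hand-set demonstration cookie, just to show the round trip:
# args are (version, key, val, path, domain, port, path_spec, secure, maxage, discard)
$jar->set_cookie(0, 'session', 'abc123', '/', 'example.com', undef, 0, 0, 3600, 0);
$jar->save;

print -e 'cookies.txt' ? "saved\n" : "not saved\n";
```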
        
        
Re: Net_SSLeay help? [ JUNOS https download ]
by zengargoyle (Deacon) on Sep 18, 2003 at 19:37 UTC

    You should also take a look at WWW::Mechanize, WWW::Mechanize::FormFiller, and WWW::Mechanize::Shell, just because they make it relatively easy to whip up a script. Here's a little script to download router software from Juniper over https.

    #!perl
    use strict;
    use warnings;
    $|++;

    my $user = '*username-here*';
    my $pass = '*password-here*';
    my $ver  = '6.0';   # lazy version matching
    my $ext  = 'tgz';   # pdf|tgz documentation

    use WWW::Mechanize;
    use WWW::Mechanize::FormFiller;
    use URI::URL;

    my $agent      = WWW::Mechanize->new();
    my $formfiller = WWW::Mechanize::FormFiller->new();
    $agent->env_proxy();
    $agent->agent('Mozilla/4.0 (compatible; MSIE 4.01; Windows 98)');

    $agent->get('https://www.juniper.net/support/');
    $agent->form(2);
    $formfiller->add_filler( 'USER'     => Fixed => $user );
    $formfiller->add_filler( 'PASSWORD' => Fixed => $pass );
    $formfiller->fill_form($agent->current_form);
    $agent->submit();

    $agent->follow(qr(Download software));
    $agent->follow(qr(Encrypted));

    my @links;
    my @all_links = $agent->links();
    my @targets = grep { m!\Q$ver! and m!\Q//download.juniper.net/! }
                  map  { $_->[0] } @all_links;

    $agent->follow(qr(Software Documentation: Release \Q$ver\E));
    @all_links = $agent->links();
    push @targets, grep {
        m!\Q-comprehensive-index.pdf!
          or ( m!/download/! and m!$ext$! )
          or m!/download/rn-sw-\d+\.pdf!
    } map { $_->[0] } @all_links;

    # uncomment line below for testing
    # print join $/, @targets, ''; exit;

    my $base = $agent->uri;
    for my $target (@targets) {
        my $url = URI::URL->new( $target, $base );
        $target = $url->path;
        $target =~ s!^(.*/)?([^/]+)$!$2!;
        $url    = $url->abs;
        print "mirroring $target$/";
        $agent->mirror( $url, $target );
    }
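    Applied to the original question, a Mechanize-based login might be sketched as follows. The field names connectID and password are guesses taken from the original post, and the network calls are left commented out; only the form-field setup actually runs here.

```perl
use strict;
use warnings;

# Hypothetical field names: connectID/password are guesses from the
# original post and must be checked against the real form's HTML.
my %fields = (
    'connectID' => 'wxxxxxxx',
    'password'  => 'xxxxxxxx',
);
print join(',', sort keys %fields), "\n";

# The Mechanize steps themselves would look roughly like this:
# use WWW::Mechanize;
# my $mech = WWW::Mechanize->new();
# $mech->get('https://webbroker.tdwaterhouse.ca/');
# $mech->submit_form( with_fields => \%fields );
# print $mech->content;
```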
