in reply to perl hooks for AI

While Python has become the de facto language for AI and machine learning, let's not forget the versatility of Perl.

There is a great deal that can be done with Perl and publicly available APIs. Not everything AI requires Python!

Earlier this year, I created AI::Embedding and I'm working on other pure Perl modules that will help bring AI capabilities to a script near you... As far as I can tell, there is no inherent benefit that Python has when it comes to AI.

Replies are listed 'Best First'.
Re^2: perl hooks for AI
by cavac (Prior) on Nov 20, 2023 at 15:52 UTC

    I completely agree. More and more of that stuff runs "in the cloud" anyway, because running a couple of racks of high-end hardware under your office desk just isn't feasible.

    Cloud almost always means "HTTP", and Perl is very good at interfacing with stuff like that directly, without having to remote-control a browser. (Yes, sometimes you have to pay extra to use those "professional" APIs.)

    In my experience, it's often easier to interface with these newer services because they provide modern APIs based on modern standards. It's much more work to interface with old services that pre-date the modern web and simply converted their paper-based data exchange to text-file-based formats.

    My bet is, you can get an interface to ChatGPT going much faster than you can ingest NOAA space weather prediction data. ChatGPT has some well-defined web APIs, with a documented workflow, documented return codes, etc...

    NOAA gives you a text file. When do you pull new data? What are the exact parsing rules? Are there exceptions where the format could slightly change? How do you convert the values to a nice 1-5 scale? Those are the things you have to spend a day painstakingly researching in obscure documents...
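    To make that contrast concrete, here is a minimal sketch of the kind of hand-rolled parsing and scale conversion such a feed requires. The three-column sample format below is hypothetical, not NOAA's actual layout; the Kp-to-G mapping follows NOAA's published geomagnetic storm scale (Kp 5 => G1 up through Kp 9 => G5):

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;

    # Convert a predicted Kp index to NOAA's 1-5 G-scale (0 = below storm level).
    sub kp_to_g_scale {
        my ($kp) = @_;
        return 0 if $kp < 5;
        my $g = int($kp) - 4;    # Kp 5 => G1, ..., Kp 9 => G5
        return $g > 5 ? 5 : $g;
    }

    # Hypothetical text feed, "date hour Kp" per line (NOT NOAA's real format).
    my $sample = <<'END';
    2023-11-20 00 3.7
    2023-11-20 03 5.3
    2023-11-20 06 8.0
    END

    for my $line (split /\n/, $sample) {
        my ($date, $hour, $kp) = split ' ', $line;
        printf "%s %sh Kp=%.1f => G%d\n", $date, $hour, $kp, kp_to_g_scale($kp);
    }
    ```

    Even this toy version hard-codes assumptions about column order and line endings, which is exactly the fragility being described.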

      NOAA gives you a text file. When do you pull new data? What are the exact parsing rules? Are there exceptions where the format could slightly change? How do you convert the values to a nice 1-5 scale? Those are the things you have to spend a day painstakingly researching in obscure documents...

      That. First, you have to create a known data set from the data, then you have to do a whole bunch of very complex math to normalize it all, which is way above my head (super thanks go out to no_slogan, who did all of the math for me in this thread).

      After that, you have to remember to update the data (typically every five years, but there are known to be incremental changes as well). If the format changes at all, you have to rewrite the parser code that translates it into a format your code understands.

      An API would be much more helpful :)

        An API would be much more helpful :)

        Hey stevie, how's tricks? Did you get stomped by winter storms? (They went north of me.)

        I thought I was closing in on this:

        #!/usr/bin/perl
        use v5.030;
        use utf8;
        use LWP::UserAgent;
        use JSON;

        ## get credentials
        my $ini_path = qw( /Users/mymac/Documents/1.тайный.txt );

        # get key
        my $ref_config = get_тайный($ini_path);
        $DB::single = 1;
        my %h = %$ref_config;

        ## dial up server
        # Your OpenAI API endpoint and token
        my $api_url = 'https://api.openai.com/v1/engines/davinci-codex/completions'; # Replace with the correct API endpoint if needed
        my $auth_token = $h{key};

        # The prompt you want to send
        my $prompt = "How many units are there in a mole?";

        my $ua = LWP::UserAgent->new;
        $ua->ssl_opts(
            verify_hostname => 1,
            SSL_ca_file     => '/Users/mymac/Documents/cacert-2023-12-12.pem',
        );

        # Set up Request
        my $req = HTTP::Request->new(POST => $api_url);
        $req->header('Content-Type'  => 'application/json');
        $req->header('Authorization' => "Bearer $auth_token");

        # Add the JSON-encoded data to the request
        my $json_data = encode_json({ prompt => $prompt, max_tokens => 150 }); # Adjust the number of tokens as needed
        $req->content($json_data);

        # Perform the request
        my $res = $ua->request($req);

        # Check the outcome
        if ($res->is_success) {
            print $res->decoded_content;
        }
        else {
            print "Error: " . $res->status_line . "\n";
        }

        ## don't change anything about this subroutine
        sub get_тайный {
            use Config::Tiny;
            use Data::Dump;
            my %h;    # creating here and exporting reference to caller
            my $ini_path = shift;    # caller provides inipath
            my $sub_hash1 = "openai";
            my $Config = Config::Tiny->new;
            $Config = Config::Tiny->read( $ini_path, 'utf8' );
            # -> is optional between brackets
            $h{email} = $Config->{$sub_hash1}{'email'};
            $h{key}   = $Config->{$sub_hash1}{'key'};
            my $ref_config = \%h;
            dd $ref_config;
            $DB::single = 1;
            return ($ref_config);
        }
        __END__

        It comes back as a 404, and looks like this in the debugger:

        main::(3.openai.pl:26):  my $ua = LWP::UserAgent->new;
          >> n
        main::(3.openai.pl:27):  $ua->ssl_opts(
        main::(3.openai.pl:28):      verify_hostname => 1,
        main::(3.openai.pl:29):      SSL_ca_file => '/Users/mymac/Documents/cacert-2023-12-12.pem',
          >> s
        LWP::UserAgent::ssl_opts(/System/Library/Perl/Extras/5.30/LWP/UserAgent.pm:713):
        713:        my $self = shift;
          >> r
        void context return from LWP::UserAgent::ssl_opts
        ...
        main::(3.openai.pl:46):  if ($res->is_success) {
          >> y
        $api_url = 'https://api.openai.com/v1/engines/davinci-codex/completions'
        $auth_token = 'redacted but correct'
        $json_data = '{"max_tokens":150,"prompt":"How many units are there in a mole?"}'
        $prompt = 'How many units are there in a mole?'
        ...
        $req = HTTP::Request=HASH(0x14103e258)
           '_content' => '{"max_tokens":150,"prompt":"How many units are there in a mole?"}'
           '_headers' => HTTP::Headers=HASH(0x14105a110)
              '::std_case' => HASH(0x131958040)
                 'if-ssl-cert-subject' => 'If-SSL-Cert-Subject'
              'authorization' => 'Bearer sk-...'
              'content-type' => 'application/json'
              'user-agent' => 'libwww-perl/6.44'
           '_method' => 'POST'
           '_uri' => URI::https=SCALAR(0x140444dc8)
              -> 'https://api.openai.com/v1/engines/davinci-codex/completions'
           '_uri_canonical' => URI::https=SCALAR(0x140444dc8)
              -> REUSED_ADDRESS
        $res = HTTP::Response=HASH(0x140698518)
           '_content' => '{
          "error": {
            "message": "The model `davinci-codex` does not exist or you do not have access to it.",
            "type": "invalid_request_error",
            "param": null,
            "code": "model_not_found"
          }
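        For what it's worth, the `model_not_found` body of that response is the real clue: OpenAI retired the `engines/davinci-codex` endpoint, so a 404 comes back no matter how correct the key is. A sketch against the current chat completions endpoint (model name and endpoint are my assumptions about OpenAI's current API, and the key loading is elided) might look like this; here the request is only built and printed, not sent:

        ```perl
        #!/usr/bin/perl
        use v5.030;
        use HTTP::Request;
        use JSON;

        # Current chat completions endpoint; the engines/codex endpoints are retired.
        my $api_url    = 'https://api.openai.com/v1/chat/completions';
        my $auth_token = 'sk-...';   # load from your config file as before

        # Chat models take a list of messages rather than a bare prompt.
        my $json_data = encode_json({
            model      => 'gpt-3.5-turbo',
            messages   => [ { role => 'user', content => 'How many units are there in a mole?' } ],
            max_tokens => 150,
        });

        my $req = HTTP::Request->new(POST => $api_url);
        $req->header('Content-Type'  => 'application/json');
        $req->header('Authorization' => "Bearer $auth_token");
        $req->content($json_data);

        say $req->method, ' ', $req->uri;
        # Sending it is the same as in the original script:
        #   my $res = LWP::UserAgent->new->request($req);
        ```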

        Fishing for tips,

Re^2: perl hooks for AI
by Aldebaran (Curate) on Dec 18, 2023 at 06:54 UTC
    Earlier this year, I created AI::Embedding and I'm working on other pure Perl modules that will help bring AI capabilities to a script near you...

    Thx for your response, Bod. I took a look at these embeddings, feeling out the topic with the debugger and ChatGPT. I took a peek at what one of these things looks like, and it seems to be a giant vector of floats:

    -0.818496979663585,-0.572010804875021,-0.409478105446063,-0.937661798043237

    I'm always curious about compression, so I asked how expensive it is to represent the phrase "And it came to pass...":

    1. **Word2Vec Embedding**:
       - Word2Vec typically creates embeddings in spaces ranging from 100 to 300 dimensions. Let's assume we are using a 300-dimensional model.
       - Each word in the phrase "And it came to pass..." would be converted into a 300-dimensional vector.

    2. **Representation of Each Word**:
       - The phrase has 5 words, so we would have 5 vectors.
       - Each dimension in the vector is usually a 32-bit floating-point number.

    3. **Memory Calculation**:
       - Each 32-bit float requires 4 bytes of memory.
       - A 300-dimensional vector would thus require \( 300 \times 4 \) bytes = 1200 bytes.
       - For 5 words, the total memory would be \( 5 \times 1200 \) bytes = 6000 bytes (or 6 kilobytes).

    So, in this hypothetical scenario, representing the phrase "And it came to pass..." using a 300-dimensional Word2Vec model would require approximately 6 kilobytes of memory.
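    The arithmetic in that answer is easy to check directly (the 300-dimension, 32-bit-float figures are the hypothetical ones from the quoted reply, not what any particular API actually returns):

    ```perl
    #!/usr/bin/perl
    use v5.030;

    my $dims          = 300;  # dimensions per word vector (hypothetical)
    my $bytes_per_dim = 4;    # one 32-bit float
    my $words         = 5;    # "And it came to pass..."

    my $bytes_per_word = $dims * $bytes_per_dim;     # 1200 bytes
    my $total_bytes    = $words * $bytes_per_word;   # 6000 bytes

    say "$bytes_per_word bytes per word, $total_bytes bytes total";
    ```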

    The representation seems expensive to me, but maybe I'm old-fashioned. The new Q* model that made recent news is another variety of the same, so I'm going to withhold pronouncements of AGI, despite the boasts on YouTube. (Are there alternatives to YouTube?)

    I did try out some source:

    #!/usr/bin/perl
    use v5.030;
    use utf8;
    use AI::Embedding;

    my $ini_path = qw( /Users/mymac/Documents/1.тайный.txt );

    # get key
    my $ref_config = get_тайный($ini_path);
    $DB::single = 1;
    my %h = %$ref_config;
    ## keep ^^^ this the same

    my $embedding = AI::Embedding->new(
        api => 'OpenAI',
        key => $h{key},
    );
    ## ^^^this works now
    ## this doesn't:

    my $csv_embedding  = $embedding->embedding('I demand a shrubbery');
    my $test_embedding = $embedding->test_embedding('We are the knights who say nyet');
    my @raw_embedding  = $embedding->raw_embedding('great eddie murphy show');

    my $cmp        = $embedding->comparator($csv_embedding);
    my $similarity = $cmp->($test_embedding);
    my $similarity_with_other_embedding = $embedding->compare($csv_embedding, $test_embedding);

    say $cmp;
    say $similarity;
    say $similarity_with_other_embedding;

    ## don't change anything about the subroutine
    sub get_тайный {
        use Config::Tiny;
        use Data::Dump;
        my %h;    # creating here and exporting reference to caller
        my $ini_path = shift;    # caller provides inipath
        my $sub_hash1 = "openai";
        my $Config = Config::Tiny->new;
        $Config = Config::Tiny->read( $ini_path, 'utf8' );
        # -> is optional between brackets
        $h{email} = $Config->{$sub_hash1}{'email'};
        $h{key}   = $Config->{$sub_hash1}{'key'};
        my $ref_config = \%h;
        dd $ref_config;
        $DB::single = 1;
        return ($ref_config);
    }
    __END__

    This compiles but gets lost in runtime:

    (base) Merrills-Mac-mini:Documents mymac$ ./1.openai.pl
    Use of uninitialized value $embed_string in split at /Library/Perl/5.30/AI/Embedding.pm line 141.
    features must contain terms at /Library/Perl/5.30/Data/CosineSimilarity.pm line 68.
    (base) Merrills-Mac-mini:Documents mymac$

    Not sure what this means. My only success so far has been getting a proper API key. I'm suuuper rusty with all this. Can't find any perl install I recognize from before...yikes....

    Anyways, assume mistakes are mine so far. (I usually have a couple dozen to make before I get anywhere.)

    My question might be: how do I dial up this API properly?

    Cheers from the 'Ho

      Check your API key!

      I've modified your code for obtaining the API key and hardcoded it into the subroutine...with a valid API key, your code works for me. With an invalid key, I get the same error you are seeing.

      #!/usr/bin/perl
      use v5.030;
      use utf8;
      use AI::Embedding;

      my $ini_path = qw( /Users/mymac/Documents/1.тайный.txt );

      # get key
      my $ref_config = get_api_key();
      $DB::single = 1;
      my %h = %$ref_config;
      ## keep ^^^ this the same

      my $embedding = AI::Embedding->new(
          api => 'OpenAI',
          key => $h{key},
      );
      ## ^^^this works now
      ## this doesn't:

      my $csv_embedding  = $embedding->embedding('I demand a shrubbery');
      my $test_embedding = $embedding->test_embedding('We are the knights who say nyet');
      my @raw_embedding  = $embedding->raw_embedding('great eddie murphy show');

      my $cmp        = $embedding->comparator($csv_embedding);
      my $similarity = $cmp->($test_embedding);
      my $similarity_with_other_embedding = $embedding->compare($csv_embedding, $test_embedding);

      say $cmp;
      say $similarity;
      say $similarity_with_other_embedding;

      ## don't change anything about the subroutine
      sub get_api_key {
          return {
              'key' => 'sk-abc123',
          };
      }
      __END__

      Output:

      CODE(0x1cf363a5c48)
      -0.00931772883675172
      -0.00931772883675168

      In the next release, I will put in a more helpful error message that directs the user to check their API key...thanks for discovering this issue :)

      Version 1.1 of AI::Embedding is now live on CPAN - update the module and you'll get a more helpful error if the API key is wrong.

      The representation seems expensive to me...

      If we were to store each float as a separate DB field, it would be rather expensive. This is why the AI::Embedding documentation suggests storing the vector as a string in a TEXT field - see the embedding method.

      Because the purpose of an embedding is to compare it with another embedding, the floats that make up the vector are dealt with as a single unit. Generally, we have no reason to access the individual components of the vector.

      The vector comparison is carried out internally with Data::CosineSimilarity.
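      As a rough sketch of what happens under the hood (my own illustration, not AI::Embedding's actual internals), a comma-separated vector string can be split back into floats and compared with a plain cosine similarity:

      ```perl
      #!/usr/bin/perl
      use v5.030;
      use List::Util qw(sum);

      # Cosine similarity between two comma-separated vector strings,
      # the storage format suggested for a TEXT field.
      sub cosine {
          my ($a_str, $b_str) = @_;
          my @a = split /,/, $a_str;
          my @b = split /,/, $b_str;
          die "vectors must have the same length" unless @a == @b;
          my $dot   = sum(map { $a[$_] * $b[$_] } 0 .. $#a);
          my $mag_a = sqrt(sum(map { $_ ** 2 } @a));
          my $mag_b = sqrt(sum(map { $_ ** 2 } @b));
          return $dot / ($mag_a * $mag_b);
      }

      say cosine('1,0,0', '1,0,0');   # identical vectors => 1
      say cosine('1,0,0', '0,1,0');   # orthogonal vectors => 0
      ```

      The whole string is the unit of comparison, which is why there is never a need to index the individual floats in the database.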

      This compiles but gets lost in runtime:

      From a cursory look, I'm not sure what to make of that error!

      The problem is in a private method but I cannot recall where it's called from. I shall look into this further over the next few days when I've got a little more time...I'm on domestic duties, making the house look festive right now!

      AI::Embedding is working in a production environment, so I am hopeful there isn't a fundamental problem with the module. But it shouldn't really throw the error you are experiencing.