in reply to Re: perl hooks for AI
in thread perl hooks for AI
Thx for your response, Bod. I took a look at these embeddings, feeling out the topic with the debugger and ChatGPT. I took a peek at what one of these things looks like, and it's a giant vector of floats:
-0.818496979663585,-0.572010804875021,-0.409478105446063,-0.937661798043237

I'm always curious about the compression, so I asked how expensive it is to represent "And it came to pass":
1. **Word2Vec Embedding**: Word2Vec typically creates embeddings in spaces ranging from 100 to 300 dimensions. Let's assume we are using a 300-dimensional model. Each word in the phrase "And it came to pass..." would be converted into a 300-dimensional vector.

2. **Representation of Each Word**: The phrase has 5 words, so we would have 5 vectors. Each dimension in the vector is usually a 32-bit floating-point number.

3. **Memory Calculation**: Each 32-bit float requires 4 bytes of memory. A 300-dimensional vector would thus require 300 × 4 bytes = 1200 bytes. For 5 words, the total memory would be 5 × 1200 bytes = 6000 bytes (or 6 kilobytes).

So, in this hypothetical scenario, representing the phrase "And it came to pass..." using a 300-dimensional Word2Vec model would require approximately 6 kilobytes of memory.
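Just to sanity-check ChatGPT's arithmetic, here's a quick back-of-the-envelope in Perl (the 300 dimensions and 4-byte floats are its assumptions, not anything I measured):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Assumptions from ChatGPT's answer: 300 dimensions, 32-bit (4-byte) floats
my $dims          = 300;
my $bytes_per_dim = 4;
my @words         = split ' ', 'And it came to pass';

my $bytes_per_word = $dims * $bytes_per_dim;     # 300 * 4 = 1200 bytes
my $total_bytes    = $bytes_per_word * @words;   # 5 words in scalar context

printf "%d bytes per word, %d bytes total\n", $bytes_per_word, $total_bytes;
# 1200 bytes per word, 6000 bytes total
```

So the numbers at least check out, even if the representation still feels heavy to me.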
The representation seems expensive to me, but maybe I'm old-fashioned. The new Q* model that made recent news is another variety of the same, so I'm going to withhold any pronouncements of AGI, whatever the boasts on youtube say. (Are there alternatives to youtube?)
I did try out some source:
#!/usr/bin/perl
use v5.030;
use utf8;
use AI::Embedding;

my $ini_path = qw( /Users/mymac/Documents/1.тайный.txt );

# get key
my $ref_config = get_тайный($ini_path);
$DB::single = 1;
my %h = %$ref_config;    ## keep ^^^ this the same

my $embedding = AI::Embedding->new(
    api => 'OpenAI',
    key => $h{key},
);
## ^^^ this works now

## this doesn't:
my $csv_embedding  = $embedding->embedding('I demand a shrubbery');
my $test_embedding = $embedding->test_embedding('We are the knights who say nyet');
my @raw_embedding  = $embedding->raw_embedding('great eddie murphy show');

my $cmp        = $embedding->comparator($csv_embedding);
my $similarity = $cmp->($test_embedding);
my $similarity_with_other_embedding
    = $embedding->compare($csv_embedding, $test_embedding);

say $cmp;
say $similarity;
say $similarity_with_other_embedding;

## don't change anything about the subroutine
sub get_тайный {
    use Config::Tiny;
    use Data::Dump;
    my %h;                    # creating here and exporting reference to caller
    my $ini_path  = shift;    # caller provides inipath
    my $sub_hash1 = "openai";
    my $Config = Config::Tiny->new;
    $Config = Config::Tiny->read( $ini_path, 'utf8' );
    # -> is optional between brackets
    $h{email} = $Config->{$sub_hash1}{'email'};
    $h{key}   = $Config->{$sub_hash1}{'key'};
    my $ref_config = \%h;
    dd $ref_config;
    $DB::single = 1;
    return ($ref_config);
}
__END__
This compiles but falls over at runtime:
(base) Merrills-Mac-mini:Documents mymac$ ./1.openai.pl
Use of uninitialized value $embed_string in split at /Library/Perl/5.30/AI/Embedding.pm line 141.
features must contain terms at /Library/Perl/5.30/Data/CosineSimilarity.pm line 68.
(base) Merrills-Mac-mini:Documents mymac$
Not sure what this means. My only success so far is getting a proper API key. I'm suuuper rusty with all this, and I can't find any perl install I recognize from before...yikes....
Anyways, assume mistakes are mine so far. (I usually have a couple dozen to make before I get anywhere.)
My question might be: how do I dial up this api properly?
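The `uninitialized value $embed_string in split` smells to me like the API call came back empty, so the module had nothing to split. One way I might sanity-check the key and endpoint outside of AI::Embedding is a bare request with HTTP::Tiny and JSON::PP (both core modules); `text-embedding-ada-002` is just the embeddings model OpenAI documents, so adjust as needed, and this is only a sketch, not a fix:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use HTTP::Tiny;
use JSON::PP qw(encode_json decode_json);

# Assumes the key is in the environment rather than the ini file, for brevity
my $key = $ENV{OPENAI_API_KEY} // die "set OPENAI_API_KEY first\n";

# Hit the embeddings endpoint directly so any API error is visible
my $response = HTTP::Tiny->new->post(
    'https://api.openai.com/v1/embeddings',
    {
        headers => {
            'Content-Type'  => 'application/json',
            'Authorization' => "Bearer $key",
        },
        content => encode_json({
            model => 'text-embedding-ada-002',
            input => 'I demand a shrubbery',
        }),
    },
);

if ($response->{success}) {
    my $data = decode_json($response->{content});
    my $vec  = $data->{data}[0]{embedding};
    print "got ", scalar @$vec, " dimensions\n";
}
else {
    # A bad key or model name shows up here, instead of deep inside Embedding.pm
    print "HTTP $response->{status}: $response->{content}\n";
}
```

If this prints an HTTP error body, that error is presumably what AI::Embedding is swallowing before it gets to the split.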
Cheers from the 'Ho
Re^3: perl hooks for AI
by Bod (Parson) on Dec 18, 2023 at 21:09 UTC
Re^3: perl hooks for AI
by Bod (Parson) on Dec 18, 2023 at 20:32 UTC
Re^3: perl hooks for AI
by Bod (Parson) on Dec 19, 2023 at 13:58 UTC