in reply to Re^2: array confusion
in thread array confusion

Outta interest.

  1. Did you mean to imply that all the subarrays in the input have exactly 3 elements?
  2. Is modifying the input in-place okay? (If so, see the sketch below.)
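
If the answers turn out to be "yes" and "yes", the whole job could be little more than one splice per subarray. A minimal sketch, with made-up data rather than the real input:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical input: 3-element subarrays, last element to be split off.
    my @AoA = ( [ 1, 2, 9 ], [ 3, 4, 8 ] );

    # splice removes (and returns) the last element of each subarray,
    # shrinking the subarrays in place.
    my @tails = map { splice @$_, -1 } @AoA;

    print "@$_\n" for @AoA;    # "1 2" then "3 4"
    print "@tails\n";          # "9 8"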


Re^4: array confusion
by zerocred (Beadle) on May 11, 2012 at 09:08 UTC
    No, the subarrays can have any number of elements, but they are all the same size (in practice maybe up to 20).

    In fact the bit sliced off the subarrays could be more than one element (1, 2 or 3), but again the same in each case.

    What it's for: the necessary reformatting of a table for feeding into a neural network (NNFlex and others need similar treatment).
    The table starts off in a spreadsheet/txt file:
    x11,x12,x13,x14,...,y11,y12
    x21,x22,x23,x24,...,y21,y22
    (yn2 optional)
    That sort of thing.
    To answer the second question: yes, it can be modified in place; it will already have been read in from somewhere else.
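
    Since the number of trailing y-columns can vary (1, 2 or 3), a generalised in-place split might look like this sketch ($ny and the sample rows are invented for illustration, not taken from the real data):

    use strict;
    use warnings;

    my $ny = 2;    # number of trailing dependent (y) columns: 1, 2 or 3

    # Hypothetical rows: x-columns followed by $ny y-columns.
    my @AoA = (
        [ -1,  1, -1, 0.5, 0.3 ],
        [  1,  1, -1, 0.2, 0.1 ],
    );

    # splice the last $ny elements off each row, in place, into a
    # parallel array of target rows.
    my @targets = map { [ splice @$_, -$ny ] } @AoA;

    # @AoA now holds only the x-columns; @targets holds the y-columns.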

    I'll post it if anyone is interested.
    Thanks for all the help.
    Here it is - first cut, working! Without 'map' - yet!
    #!/usr/bin/perl
    use strict;
    use Data::Dumper;
    use AI::NNFlex::Backprop;
    use AI::NNFlex::Dataset;

    my @AoA;
    my $iv;
    my $dv;
    my $file = "ivdv.txt";
    # format - one dependent variable here:
    #x11,x12,x13,x14,...,y11
    #x21,x22,x23,x24,...,y21
    #example 7 cols - a b c d e f g
    # a b c d e are randomly 1 or -1
    # f is random value -0.5 <= f <= 0.5 - just to see if it can ignore the uncorrelated noise
    # g is the sum/5 of a-f (to keep in range -1 to 1)
    #actual example:
    #-1.00 -1.00 1.00 -1.00 -1.00 -0.18 -0.6
    #1.00 -1.00 1.00 -1.00 -1.00 0.00 -0.2
    #1.00 -1.00 1.00 -1.00 -1.00 0.48 -0.2
    #1.00 -1.00 1.00 -1.00 -1.00 0.12 -0.2
    #
    my $network = AI::NNFlex::Backprop->new(
        learningrate    => .0001,
        fahlmanconstant => 0,
        momentum        => 0.4,
        bias            => 1 );

    $network->add_layer( nodes => 6, activationfunction => "tanh", decay => 0 );
    $network->add_layer( nodes => 6, activationfunction => "tanh", decay => 0 );
    $network->add_layer( nodes => 1, activationfunction => "tanh", decay => 0 );
    $network->init();

    open( FH, "< $file" ) or die "cannot open $file for reading: $!\n";
    while (<FH>) {
        chomp;
        push @AoA, [ split /\t/ ];    # split on TAB
    }

    my $vv = \@AoA;
    my $row;
    my $col;
    print "\$#AoA = $#AoA\n";
    print "\$#AoA[0] = " . $#{ $AoA[0] } . "\n";

    # Interleave: even rows get the input columns, odd rows get the
    # single dependent variable from the last column.
    foreach $row ( 0 .. $#AoA ) {
        foreach $col ( 0 .. ( $#{ $AoA[0] } - 1 ) ) {
            $iv->[ $row * 2 ][$col] = $vv->[$row][$col];
        }
        $iv->[ ( $row * 2 ) + 1 ][0] = $vv->[$row][ $#{ $AoA[0] } ];
    }

    $network->dump_state( filename => 'nn1.wts' );
    #$network->load_state(filename=>'nn.wts');
    #print Dumper($vv);
    #print Dumper($iv);
    #print Dumper($dv);
    #print $AoA[0][0]."\n";

    my $dataset = AI::NNFlex::Dataset->new($iv);

    my $err = 10;
    # Stop after 10000 epochs -- don't want to wait more than that
    for ( my $i = 0; ( $err > 0.001 ) && ( $i < 10000 ); $i++ ) {
        $err = $dataset->learn($network);
        print "Epoch = $i error = $err\n";
    }
    $network->dump_state( filename => 'nn2.wts' );
    #close FH or die "cannot close $file: $!\n";

    ###### this bit is after the training - separate into different file
    $network->load_state( filename => 'nn2.wts' );
    $network->run( [ -1, -1, -1, -1, -1, 1 ] );
    my $output = $network->output();
    foreach (@$output) {
        print "output should be sum (a-f)/5 - " . $_ . "\n";
    }
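
    For the missing 'map' version: the nested loops that build $iv could be replaced by something like this (a sketch only, untested against NNFlex; it assumes one dependent variable per row, as in the code above):

    # Each input row is followed by a one-element row holding its target,
    # which is the interleaved layout the dataset is built from above.
    my @pairs = map {
        my @row = @$_;
        my $y   = pop @row;    # last column is the dependent variable
        ( [@row], [$y] );      # input row, then target row
    } @AoA;
    $iv = \@pairs;             # $iv as declared earlier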