No, the subarrays can have any number of elements, but they are all the same size (in practice maybe up to 20).

In fact the bit sliced off the subarrays could be more than one element (1, 2 or 3), but again the same in each case.
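A minimal sketch of that slicing, with made-up sample data: the tail length $k is assumed to be the same for every row, and the variable names here are mine, not from the real script.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $k = 2;    # number of elements sliced off each row (1, 2 or 3)
my @AoA = ( [ 1, 2, 3, 4, 10, 20 ],
            [ 5, 6, 7, 8, 30, 40 ] );

my ( @inputs, @outputs );
for my $row (@AoA) {
    my @copy = @$row;
    my @tail = splice @copy, -$k;    # remove the last $k elements
    push @inputs,  \@copy;
    push @outputs, \@tail;
}
# @inputs  is ([1,2,3,4],[5,6,7,8])
# @outputs is ([10,20],[30,40])
```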

What it's for - the necessary reformatting of a table for feeding into a neural network (AI::NNFlex and others need similar treatment).
The table starts off in a spreadsheet/txt file:
x11,x12,x13,x14,...,y11,y12
x21,x22,x23,x24,...,y21,y22
(yn2 optional)
That sort of thing.
To answer the second question - yes, it can be modified in place; it will have already been read from somewhere else.
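A sketch of doing it in place: splice shortens each row of @AoA itself and hands back the removed tail. As before, $k and the sample data are assumptions for illustration.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $k = 1;    # assumed tail length
my @AoA = ( [ -1, 1, -1, 0.2 ], [ 1, 1, 1, 0.6 ] );

my @tails;
for my $row (@AoA) {
    push @tails, [ splice @$row, -$k ];    # @$row loses its last $k elements
}
# @AoA is now ([-1,1,-1],[1,1,1]) and @tails is ([0.2],[0.6])
```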

I'll post it if anyone is interested.
Thanks for all the help.
Here it is - first cut, working! Without 'map' - yet!
#!/usr/bin/perl
use strict;
use warnings;
use Data::Dumper;
use AI::NNFlex::Backprop;
use AI::NNFlex::Dataset;

my @AoA;
my $iv;
my $dv;
my $file = "ivdv.txt";

# format - one dependent variable here:
# x11,x12,x13,x14,...,y11
# x21,x22,x23,x24,...,y21
# example 7 cols - a b c d e f g
#   a b c d e are randomly 1 or -1
#   f is a random value -0.5 <= f <= 0.5 - just to see if it can ignore the uncorrelated noise
#   g is the sum/5 of a-f (to keep it in range -1 to 1)
# actual example:
# -1.00 -1.00 1.00 -1.00 -1.00 -0.18 -0.6
#  1.00 -1.00 1.00 -1.00 -1.00  0.00 -0.2
#  1.00 -1.00 1.00 -1.00 -1.00  0.48 -0.2
#  1.00 -1.00 1.00 -1.00 -1.00  0.12 -0.2

my $network = AI::NNFlex::Backprop->new(
    learningrate    => .0001,
    fahlmanconstant => 0,
    momentum        => 0.4,
    bias            => 1,
);
$network->add_layer( nodes => 6, activationfunction => "tanh", decay => 0 );
$network->add_layer( nodes => 6, activationfunction => "tanh", decay => 0 );
$network->add_layer( nodes => 1, activationfunction => "tanh", decay => 0 );
$network->init();

open( FH, "< $file" ) or die "cannot open $file for reading: $!\n";
while (<FH>) {
    chomp;
    push @AoA, [ split /\t/ ];    # split on TAB
}
close FH or die "cannot close $file: $!\n";

my $vv = \@AoA;
my $row;
my $col;
print "\$#AoA = $#AoA\n";
print "\$#AoA[0] = " . $#{ $AoA[0] } . "\n";

# interleave inputs and outputs: even rows of $iv hold the input vectors,
# odd rows hold the (single-element) output vectors
foreach $row ( 0 .. $#AoA ) {
    foreach $col ( 0 .. $#{ $AoA[0] } - 1 ) {
        $iv->[ $row * 2 ][$col] = $vv->[$row][$col];
    }
    $iv->[ $row * 2 + 1 ][0] = $vv->[$row][ $#{ $AoA[0] } ];
}

$network->dump_state( filename => 'nn1.wts' );
#$network->load_state( filename => 'nn.wts' );
#print Dumper($vv);
#print Dumper($iv);
#print Dumper($dv);
#print $AoA[0][0] . "\n";

my $dataset = AI::NNFlex::Dataset->new($iv);

my $err = 10;
# Stop after 10000 epochs -- don't want to wait more than that
for ( my $i = 0 ; ( $err > 0.001 ) && ( $i < 10000 ) ; $i++ ) {
    $err = $dataset->learn($network);
    print "Epoch = $i error = $err\n";
}
$network->dump_state( filename => 'nn2.wts' );

###### this bit is after the training - separate it into a different file
$network->load_state( filename => 'nn2.wts' );
$network->run( [ -1, -1, -1, -1, -1, 1 ] );
my $output = $network->output();
foreach (@$output) {
    print "output should be sum(a-f)/5 - " . $_ . "\n";
}
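For what it's worth, a sketch of the interleaving loop rewritten with map (the 'map' version hinted at above). The sample @AoA here is made up; the last column of each row is the dependent variable.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my @AoA = ( [ 1, 2, 3, 0.5 ], [ 4, 5, 6, 0.9 ] );

# each row becomes two entries: its inputs, then its one-element output -
# the alternating input/output layout AI::NNFlex::Dataset->new() takes
my $iv = [ map { ( [ @{$_}[ 0 .. $#$_ - 1 ] ], [ $_->[-1] ] ) } @AoA ];
# $iv is [ [1,2,3], [0.5], [4,5,6], [0.9] ]
```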

In reply to Re^4: array confusion by zerocred
in thread array confusion by zerocred
