in reply to Kullback–Leibler divergence Module?

You could use a shell script that calls Perl (do the data preparation in Perl) and then calls R. R has an FNN package, whose KL.dist function you can use for the K-L distance.
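A minimal sketch of that pipeline, assuming Rscript is on your PATH, FNN is installed, and your samples sit in two whitespace-separated files (sampleP.txt and sampleQ.txt are made-up names here); the exact KL.dist arguments may differ in your FNN version, so treat this as a starting point:

    #!/usr/bin/perl -w
    use strict;

    # Write a small R script that uses FNN's KL.dist, then run it
    # with Rscript and capture whatever it prints.
    my $rscript = <<'END_R';
    library(FNN)
    X <- as.matrix(read.table("sampleP.txt"))
    Y <- as.matrix(read.table("sampleQ.txt"))
    # k-nearest-neighbour estimate of the K-L distance between the two samples
    print(KL.dist(X, Y, k = 5))
    END_R

    open(my $fh, '>', 'kl.R') or die "Cannot write kl.R: $!";
    print $fh $rscript;
    close($fh);

    my $out = `Rscript kl.R`;   # assumes Rscript is installed and on PATH
    print "R output:\n$out";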

Update: I have written a KL divergence calculation for discrete probability distributions. It is very naive, but it might be useful.
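For reference, the quantity computed below is the (asymmetric) KL divergence D(P||Q) = sum over i of P(i) * log( P(i) / Q(i) ), with the usual convention that terms where P(i) = 0 contribute nothing.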

    #!/usr/bin/perl -w
    use strict;

    ## for a discrete probability distribution
    ## assumes the probability values in each array sum to 1

    my $dist  = 0;
    my @terms = ('a', 'b', 'c', 'd');
    my @P     = (0,   0.3, 0.4, 0.3);
    my @Q     = (0.2, 0.2, 0.3, 0.3);

    # both distributions must cover the same set of terms
    if ( scalar(@P) != scalar(@terms) || scalar(@Q) != scalar(@terms) ) {
        print "The sizes should be the same\n";
        exit;
    }
    else {
        for ( my $i = 0; $i <= $#P; $i++ ) {
            my $temp = 0;
            # naive handling: skip terms where either probability is zero
            $temp = $P[$i] * log( $P[$i] / $Q[$i] )
                if ( $P[$i] != 0 && $Q[$i] != 0 );
            $dist += $temp;
        }
    }
    print "The Kullback-Leibler divergence D(P||Q) for the discrete distributions is: ", $dist, "\n";
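With the example values above, the only non-zero terms are 0.3*log(0.3/0.2) + 0.4*log(0.4/0.3), which comes to roughly 0.2367 (natural-log units), so that is the value the script should print. Note that this is the asymmetric divergence; if you need a symmetric distance, you would also compute D(Q||P) and combine the two.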