edwyr has asked for the wisdom of the Perl Monks concerning the following question:
Replies are listed 'Best First'.
Re: Transformative functions...?
by BrowserUk (Patriarch) on Feb 13, 2017 at 23:34 UTC
"I'm just squaring the calculated values before they are fed to the neural network."

Looking at your values (posted elsewhere), it seems to me that what you are doing has exactly the opposite effect from what I think would be desirable. I.e. your values are quite widely spread at the two extremes, but all clumped together in the middle. Squaring them will make the big gaps at each end even bigger (not to mention mapping the negative values to positive, thus conflating them), but will leave the gaps in the middle barely changed:
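To illustrate (with made-up sample values, not the OP's actual data), compare the gaps between neighbouring values before and after squaring; the large gaps at the extremes balloon while the middle gaps stay tiny, and the sign information is lost:

```perl
use strict;
use warnings;

# Hypothetical values: wide at the extremes, clumped in the middle
my @v  = ( -9, -4, -0.2, -0.1, 0.1, 0.3, 5, 10 );
my @sq = map { $_ * $_ } @v;    # squaring, as the OP described

# Gap between each value and its neighbour, before and after squaring
for my $i ( 1 .. $#v ) {
    printf "gap %d: before %6.2f  after %7.2f\n",
        $i, $v[$i] - $v[$i-1], $sq[$i] - $sq[$i-1];
}
```

Note also that -0.1 and 0.1 both square to 0.01, so the network can no longer tell them apart.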
My suggestion (assuming that you are trying to even out the gaps in the distribution to make discrimination easier), would be to take the square root of the (absolute) values and then multiply by the sign of the original value and a 'spreading constant'. Eg.
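A minimal sketch of that suggestion in Perl; the spreading constant and the sample values here are hypothetical stand-ins, since the original code block did not survive formatting:

```perl
use strict;
use warnings;

my $k      = 10;    # hypothetical 'spreading constant'
my @values = ( -25.0, -9.0, -0.5, -0.1, 0.2, 0.4, 16.0, 36.0 );

# Sign-preserving square root, scaled by the spreading constant
my @spread = map {
    ( $_ < 0 ? -1 : 1 ) * sqrt( abs $_ ) * $k
} @values;

printf "%.3f\n", $_ for @spread;
```

Taking sqrt of the absolute value and re-applying the sign avoids the domain error sqrt would raise on negative inputs, while keeping negatives and positives distinct.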
The result of that applied to your posted values is that the overall range of the distribution is reduced, but the values are more evenly distributed within that range:
Without knowing what your values represent or the algorithm(s) used by your neural net, I would expect this to make it easier for your NN to discriminate between the bulk of the values, which sit in the middle of your distribution. In the same vein, you might go further and use the cube root and multiply by 100:
Giving:
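The cube-root variant might look like this (again with hypothetical sample values). One wrinkle worth noting: Perl's ** operator returns NaN for a negative base with a fractional exponent, which is another reason to take the root of the absolute value and re-apply the sign:

```perl
use strict;
use warnings;

my @values = ( -25.0, -0.5, 0.2, 36.0 );    # hypothetical sample data

# Sign-preserving cube root, scaled by 100
my @spread = map {
    ( $_ < 0 ? -1 : 1 ) * ( abs($_) ** ( 1 / 3 ) ) * 100
} @values;

printf "%.2f\n", $_ for @spread;
```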
With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority". The enemy of (IT) success is complexity.
In the absence of evidence, opinion is indistinguishable from prejudice.
by edwyr (Sexton) on Feb 14, 2017 at 00:38 UTC
Re: Transformative functions...?
by BillKSmith (Monsignor) on Feb 13, 2017 at 22:38 UTC
Bill
Re: Transformative functions...?
by LanX (Saint) on Feb 13, 2017 at 20:17 UTC
I think discussing it would be even easier if you'd shown us the connection to Perl (probably a module you use, or some code) and provided some links pointing to definitions and the underlying theory of "neural networks" and "transformative functions" in this context. :)
Cheers Rolf
by edwyr (Sexton) on Feb 13, 2017 at 20:33 UTC
Mike
Re: Transformative functions...?
by Cow1337killr (Monk) on Feb 14, 2017 at 21:40 UTC
"If I have seen further it is by standing on the shoulders of giants." Welcome to 2017, Sir Isaac.

Neural network tutorials
- http://deeplearning.net/tutorial/deeplearning.pdf entitled "Deep Learning Tutorial" by LISA lab, University of Montreal (found via https://jmozah.github.io/links/ entitled "Deep learning Reading List")

Neural network code
- http://iamtrask.github.io/2015/07/12/basic-python-network/ entitled "A Neural Network in 11 lines of Python (Part 1)" (i am trask)
- https://www.codeproject.com/Articles/16419/AI-Neural-Network-for-beginners-Part-of entitled "AI : Neural Network for beginners (Part 1 of 3)" (CodeProject)

Neural networks
- http://neuralnetworksanddeeplearning.com/ entitled "Neural Networks and Deep Learning". Quote: "Neural Networks and Deep Learning is a free online book."

TensorFlow
- https://www.wired.com/2015/11/google-open-sources-its-artificial-intelligence-engine/ entitled "Google Just Open Sourced TensorFlow, Its Artificial Intelligence Engine" (WIRED)
- https://www.oreilly.com/learning/hello-tensorflow entitled "Hello, TensorFlow!" (O'Reilly Media)
- http://www.kdnuggets.com/2015/11/google-tensorflow-deep-learning-disappoints.html entitled "TensorFlow Disappoints – Google Deep Learning falls shallow"

TensorFlow examples
- https://github.com/tensorflow/models/tree/master/syntaxnet entitled "models/syntaxnet at master · tensorflow/models · GitHub". Quote: "We are excited to share the fruits of our research with the broader community by releasing SyntaxNet, an open-source neural network framework for TensorFlow that provides a foundation for Natural Language Understanding (NLU) systems."
- https://www.oreilly.com/ideas/four-short-links-13-may-2016 entitled "Four short links: 13 May 2016" by Nat Torkington. Quote: "SyntaxNet -- Google open sources a neural network framework for TensorFlow that provides a foundation for Natural Language Understanding (NLU) systems. Our release includes all the code needed to train new SyntaxNet models on your own data, as well as Parsey McParseface, an English parser that we have trained for you, and that you can use to analyze English text."

TensorFlow tools
- https://github.com/dementrock/tensorfuse entitled "GitHub - dementrock/tensorfuse: Common interface for Theano, CGT, and TensorFlow"

Neural network framework comparisons
- http://www.kdnuggets.com/2015/12/tensor-flow-terrific-deep-learning-library.html entitled "TensorFlow is Terrific – A Sober Take on Deep Learning Acceleration"
- https://en.wikipedia.org/wiki/Comparison_of_deep_learning_software
- https://deeplearning4j.org/compare-dl4j-torch7-pylearn entitled "Deep Learning Comp Sheet: Deeplearning4j vs. Torch vs. Theano vs. Caffe vs. TensorFlow vs. MxNet vs. CNTK" (Deeplearning4j)

Oh. What is the Perl tie-in? There is no Perl tie-in, but there should be.
by Anonymous Monk on Mar 03, 2017 at 21:36 UTC
by Cow1337killr (Monk) on Mar 04, 2017 at 18:50 UTC | |
Thank you for the heads-up, Anonymous Monk. AI::MXNet is on CPAN (just mere days ago). Here are some links for those that want to try it out. http://blogs.perl.org/users/sergey_kolychev/2017/02/machine-learning-in-perl.html http://mxnet.io/tutorials/python/mnist.html entitled Handwritten Digit Recognition — mxnet 0.9.3 documentation | [reply] |