PerlMonks  
Fellow monks,

Some days ago I posted a node asking for advice on Neural Networks with Perl. The feedback didn't give me new ideas, so I started to set up my own NN. Now I want to share my experience here in the monastery:

First I used the new module AI::NNFlex, which was the only one that worked fine on Win32 and handled a real job well. But after testing other modules like AI::NeuralNet::Simple, I saw that an easier interface and more speed were needed.

So I started to write AI::NNEasy (http://search.cpan.org/~gmpassos/AI-NNEasy-0.03/). To do that I took the sources of AI::NNFlex, which are pure Perl, and rewrote them with some optimizations, especially in the access of HASH keys, since it is an OO module with a lot of attribute accesses across many objects. I also fixed some node references that were using the object reference address as the ID to identify the node objects, which made it impossible to serialize them, since the object address changes on each process execution.

To write all of that quickly and add XS support I used Class::HPLOO, which enables this kind of syntax:

  class Foo {
    ## Object initializer:
    sub Foo (%args) {
      $this->{bar} = $args{baz} ;
    }
    ## a Perl function:
    sub bar($x , $y) {
      $this->add($x , $y) ;
    }
    ## a C function that will be turned into XS:
    sub[C] int add( int x , int y ) {
      int res = x + y ;
      return res ;
    }
  }
The code above shows how easy it is to set up a class in Perl, and as a plus we can write C functions directly in the class body, just as we write normal Perl subs.

After rewriting all the NN code I started to analyze the methods that use the most CPU, using Devel::DProf. With that I found the subs that needed to be turned into XS:

  AI::NNEasy::NN::tanh
  AI::NNEasy::NN::feedforward::run
  AI::NNEasy::NN::backprop::hiddenToOutput
  AI::NNEasy::NN::backprop::hiddenOrInputToHidden
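For reference, a typical Devel::DProf session looks roughly like this (the script name here is hypothetical; note that on modern perls Devel::DProf must be installed from CPAN, and Devel::NYTProf is the usual choice today):

```shell
# Run the training script under the profiler; this writes tmon.out
perl -d:DProf train_nn.pl

# Summarize the profile, sorting subs by user CPU time
dprofpp -u tmon.out
```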
For example, the tanh function is called more than 30000 times while the NN is learning a set of inputs, so I wrote 2 versions of the same function in the class:
  class AI::NNEasy::NN[0.01] {
    ...
    *tanh = \&tanh_c ;

    sub tanh_pl ($value) {
      if    ($value > 20)  { return 1 ;}
      elsif ($value < -20) { return -1 ;}
      else {
        my $x = exp($value) ;
        my $y = exp(-$value) ;
        return ($x-$y)/($x+$y) ;
      }
    }

    sub[C] double tanh_c ( SV* self , double value ) {
      if      ( value > 20 )  { return 1 ;}
      else if ( value < -20 ) { return -1 ;}
      else {
        double x = Perl_exp(value) ;
        double y = Perl_exp(-value) ;
        double ret = (x-y)/(x+y) ;
        return ret ;
      }
    }
    ...
  }
Finally, after making the NN work 10 times faster with these XS functions, I added a more intuitive interface to the module and a winner algorithm that gives us the right output, not just a decimal number near the real output value.
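The idea behind such a winner step can be sketched as a stand-alone Perl sub (a hypothetical illustration, not the module's actual code): snap the raw network output to whichever of the declared output types is closest.

```perl
use strict;
use warnings;

## Hypothetical sketch of a "winner" step: given a raw network
## output, return the declared output type nearest to it.
sub winner {
    my ( $raw, @types ) = @_;
    my ( $best, $best_dist );
    for my $t (@types) {
        my $dist = abs( $raw - $t );
        ( $best, $best_dist ) = ( $t, $dist )
            if !defined $best_dist || $dist < $best_dist;
    }
    return $best;
}

print winner( 0.87, 0, 1 ), "\n";    ## raw 0.87 snaps to 1
print winner( 0.12, 0, 1 ), "\n";    ## raw 0.12 snaps to 0
```

This is what lets the caller see a clean 0 or 1 instead of something like 0.87.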

So now we can set up a NN with a simple OO interface, without needing to write our own learning algorithms and output analyzer. We also don't need to care (but can) about the hidden layers, since NNEasy will calculate them for you; you just need to pass the number of inputs and outputs:

  use AI::NNEasy ;

  my $nn = AI::NNEasy->new(
    'xor.nne' , ## file to save the NN.
    [0,1] ,     ## Output types of the NN.
    0.1 ,       ## Maximal error for output.
    2 ,         ## Number of inputs.
    1 ,         ## Number of outputs.
  ) ;

  ## Our set of inputs and outputs to learn:
  my @set = (
    [0,0] => [0],
    [0,1] => [1],
    [1,0] => [1],
    [1,1] => [0],
  );

  ## Learn the inputs:
  $nn->learn_set( \@set ) ;

  ## Save the NN:
  $nn->save ;

  ## Use the NN:
  my $out = $nn->run_get_winner([0,0]) ;
  print "0 0 => @$out\n" ;  ## 0 0 => 0

  $out = $nn->run_get_winner([0,1]) ;
  print "0 1 => @$out\n" ;  ## 0 1 => 1

Now I can work on the real project that led me to research NNs. The project will be used to analyze text in complex photos and identify it automatically. Right now I have 60% accuracy, which is already a good result, since I started working on the NN part only this week.

So, thanks to CPAN, to Inline::C, to AI::NNFlex, and to all the resources that are there for free and open to be changed and improved; now AI::NNEasy is there too. ;-P

Graciliano M. P.
"Creativity is the expression of liberty".


In reply to AI::NNEasy to setup fast a Neural Network using just Perl and XS. by gmpassos
