Fellow monks,

Some days ago I posted a node asking for advice on Neural Networks with Perl. The feedback didn't give me new ideas, so I started to set up my own NN. Now I want to share my experience here in the monastery:

First I used the new module AI::NNFlex, which was the only one that worked fine on Win32 and handled a real job well. But after testing other modules like AI::NeuralNet::Simple, I saw that an easier interface and more speed were needed.

So I started to write AI::NNEasy (http://search.cpan.org/~gmpassos/AI-NNEasy-0.03/). To do that I took the sources of AI::NNFlex, which are pure Perl, and rewrote them with some optimizations, especially in the access to HASH keys, since it is an OO module with a lot of attribute access across many objects. I also fixed some node references that were using the object reference address as the ID to identify the node objects, which made them impossible to serialize, since the object address changes on each process execution.
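To illustrate the serialization problem (just a minimal standalone sketch with made-up names, not the actual AI::NNEasy internals):

use strict ;
use warnings ;
use Storable qw(freeze thaw) ;

## Problem: using the stringified reference as the node ID.
## The address part (e.g. "HASH(0x55d2c8f0a2b8)") changes on
## every execution, so a thawed NN can't match IDs to nodes:
my $node   = {} ;
my $bad_id = "$node" ;

## Fix: assign stable sequential IDs at creation time:
my $next_id = 0 ;
sub new_node { return { id => $next_id++ } }

my @nodes = map { new_node() } 1 .. 3 ;
my $copy  = thaw( freeze(\@nodes) ) ;   ## IDs survive serialization
print $copy->[2]{id} , "\n" ;           ## prints 2 on every run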

To write all of that quickly and add XS support I used Class::HPLOO, which enables this kind of syntax:

class Foo {

  ## Object initializer:
  sub Foo (%args) {
    $this->{bar} = $args{baz} ;
  }

  ## a Perl function:
  sub bar ($x , $y) {
    $this->add($x , $y) ;
  }

  ## a C function that will be turned into XS:
  sub[C] int add( int x , int y ) {
    int res = x + y ;
    return res ;
  }

}
The code above shows how easy it is to set up a class in Perl, and as a plus we can write C functions directly in the class body, just as we write normal Perl subs.
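A usage sketch (the constructor convention here is my assumption about how Class::HPLOO dispatches to the initializer; check its docs to confirm):

my $foo = Foo->new( baz => 10 ) ;   ## runs the Foo(%args) initializer
print $foo->bar( 3 , 4 ) , "\n" ;   ## the Perl sub, delegating to add(): 7
print $foo->add( 3 , 4 ) , "\n" ;   ## the C sub compiled to XS: 7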

After rewriting all the NN code I used Devel::DProf to analyze which methods use the most CPU (a short profiling recipe follows the list). With that I found the subs that needed to be turned into XS:

AI::NNEasy::NN::tanh
AI::NNEasy::NN::feedforward::run
AI::NNEasy::NN::backprop::hiddenToOutput
AI::NNEasy::NN::backprop::hiddenOrInputToHidden
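For reference, this is all the profiling machinery needed; the script name below is just a placeholder:

perl -d:DProf train_nn.pl   ## run the training under Devel::DProf; writes tmon.out
dprofpp                     ## report the subs sorted by time spent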
For example, the tanh function is called more than 30,000 times while the NN is learning a set of inputs, so I wrote 2 versions of the same function in the class:
class AI::NNEasy::NN[0.01] {

  ...

  *tanh = \&tanh_c ;

  sub tanh_pl ($value) {
    if    ($value >  20) { return  1 ;}
    elsif ($value < -20) { return -1 ;}
    else {
      my $x = exp($value) ;
      my $y = exp(-$value) ;
      return ($x-$y) / ($x+$y) ;
    }
  }

  sub[C] double tanh_c ( SV* self , double value ) {
    if      ( value >  20 ) { return  1 ;}
    else if ( value < -20 ) { return -1 ;}
    else {
      double x = Perl_exp(value) ;
      double y = Perl_exp(-value) ;
      double ret = (x-y) / (x+y) ;
      return ret ;
    }
  }

  ...

}
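As a quick sanity check of the pure-Perl version (a standalone sketch; in the class it is a method, so the calling convention here is simplified), it can be compared against POSIX's tanh:

use POSIX () ;

sub tanh_pl {
    my ($value) = @_ ;
    return  1 if $value >  20 ;    ## clamp where exp() would overflow
    return -1 if $value < -20 ;
    my $x = exp($value) ;
    my $y = exp(-$value) ;
    return ($x - $y) / ($x + $y) ;
}

printf "%.6f vs %.6f\n" , tanh_pl($_) , POSIX::tanh($_)
    for ( -25 , -0.5 , 0 , 0.5 , 25 ) ;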
Finally, after making the NN work 10 times faster with these XS functions, I added a more intuitive interface to the module and a winner algorithm that gives us the right output, and not just a decimal number near the real output value.

So, now we can set up a NN with a simple OO interface, without needing to write our own learning algorithms and output analyzer. Also we don't need to care (but can) about the hidden layers, since NNEasy will calculate them for you; you just need to pass the number of inputs and outputs:

use AI::NNEasy ;

my $nn = AI::NNEasy->new(
  'xor.nne' ,  ## file to save the NN.
  [0,1] ,      ## Output types of the NN.
  0.1 ,        ## Maximal error for output.
  2 ,          ## Number of inputs.
  1 ,          ## Number of outputs.
) ;

## Our set of inputs and outputs to learn:
my @set = (
  [0,0] => [0],
  [0,1] => [1],
  [1,0] => [1],
  [1,1] => [0],
);

## Learn the inputs:
$nn->learn_set( \@set ) ;

## Save the NN:
$nn->save ;

## Use the NN:
my $out = $nn->run_get_winner([0,0]) ;
print "0 0 => @$out\n" ;  ## 0 0 => 0

$out = $nn->run_get_winner([0,1]) ;
print "0 1 => @$out\n" ;  ## 0 1 => 1
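And as I read the interface (hedged: this is my understanding of the file argument's behavior; see the AI::NNEasy docs to confirm), a saved NN can be reused in a later process just by pointing new() at the same file:

use AI::NNEasy ;

## If xor.nne exists it should be loaded, skipping the learning step:
my $nn  = AI::NNEasy->new('xor.nne') ;
my $out = $nn->run_get_winner([1,0]) ;
print "1 0 => @$out\n" ;  ## 1 0 => 1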

Now I can work on the real project that got me researching NNs. The project will be used to analyze text in complex photos and identify it automatically. Right now I have 60% accuracy, which is already a good result, since I only started to work on the NN part this week.

So, thanks to CPAN, to Inline::C, to AI::NNFlex, and to all the resources that are there for free and open to be changed and improved; now AI::NNEasy is there too. ;-P

Graciliano M. P.
"Creativity is the expression of liberty".


In reply to AI::NNEasy to setup fast a Neural Network using just Perl and XS. by gmpassos
