Neural network with multiple layers: learning function

Here is my code to implement the learning of my neural network using backpropagation. The algorithm is stable, but it does not learn the output correctly. Do you see anything wrong in my learning process?

//### Parameter ###
#define Nb_entry 2
#define coeff_app 0.01 
#define par_momentum 0.8 
#define par_nb_test 0.2
#define para_tolerence 0.05
#define para_stop_learning 1000

Here are my learning functions:

void fonction_neuron(double * input, Neuron * neuron_info)
{
    int i=0; // loop variable
    double net=0;

    // Computation of the net value: net = w[0]*bias + sum(w[i]*input[i])
    net=(*neuron_info).weight[0]; //bias = 1
    for(i=1;i<(*neuron_info).nb_input;i++) net+=(*neuron_info).weight[i]*input[i];

    (*neuron_info).output=net;
    //print_stat_neuron(neuron_info);
}

void fonction_network(Neuron N_network[][10], int nb_layer, int *nb_neuron_per_layer, double *input)
{
    int i,j; //loop variable
    double previous_layer_output[10]={0};

    //Propagation of the signal into the neural network
    for(i=0;i<nb_layer;i++)
    {
        if(i!=0) for(j=0;j<nb_neuron_per_layer[i-1];j++)
        {
            previous_layer_output[j]=N_network[i-1][j].output;//save previous layer output
            //printf("previous_layer_output[%d]=%f\n",j,previous_layer_output[j]); 
        } 

        for(j=0;j<nb_neuron_per_layer[i];j++)
        {
            //printf(" i=%d j=%d \n", i,j);
            N_network[i][j].old_output=N_network[i][j].output; //save previous value
            if(i==0) fonction_neuron(input, &(N_network[i][j])); //first layer
            else fonction_neuron(previous_layer_output, &(N_network[i][j])); //other layers use the previous layer output
        }
    }
}


Here are some questions to help with the implementation of the algorithm:

1- You compute the weighted sum, but where/what is the activation function (sigmoid, hyperbolic tangent, Gaussian, etc.)? See the first sketch after this list.

2- Is the network multilayer? How do you forward-propagate? Do you feed each layer the activated outputs of the neurons in the previous layer?

3- Have you stored the derivatives of the activations for backpropagation?

4- Will you use the traditional backpropagation algorithm or a genetic algorithm to adjust the network weights?

5- If you choose the sigmoid and backpropagation, see the weight-update sketch after this list (with the sigmoid, the derivative is already computed as $O_{j}(1-O_{j})$).
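For point 1, here is a minimal sketch of how an activation function could be added to your fonction_neuron. The Neuron type and its fields are taken from your code; the sigmoid helper is an assumption on my part:

#include <math.h>

// Hypothetical helper: logistic sigmoid, squashes net into (0, 1)
static double sigmoid(double net)
{
    return 1.0 / (1.0 + exp(-net));
}

void fonction_neuron(double * input, Neuron * neuron_info)
{
    int i=0; // loop variable
    double net=0;

    // Computation of the net value: net = w[0]*bias + sum(w[i]*input[i])
    net=(*neuron_info).weight[0]; //bias = 1
    for(i=1;i<(*neuron_info).nb_input;i++) net+=(*neuron_info).weight[i]*input[i];

    // Store the activated output, not the raw net value
    (*neuron_info).output=sigmoid(net);
}

Without this non-linearity, every layer computes a linear function of its input, so the whole network collapses to a single linear map and cannot learn non-linear targets.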
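For points 3 and 5, here is a sketch of the classic delta-rule update for an output neuron with a sigmoid activation, reusing your coeff_app learning rate. The function name and the target parameter are assumptions, and the momentum term (par_momentum) is omitted because it would require storing the previous change of each weight:

// Sketch (assumed helper): update one output neuron toward 'target'.
// The error signal uses the precomputed sigmoid derivative O_j*(1-O_j).
void update_output_neuron(Neuron * n, double * input, double target)
{
    int i; // loop variable
    double delta = (target - (*n).output) * (*n).output * (1.0 - (*n).output);

    (*n).weight[0] += coeff_app * delta; // bias weight, bias input = 1

    // Same indexing convention as in fonction_neuron
    for(i=1;i<(*n).nb_input;i++) (*n).weight[i] += coeff_app * delta * input[i];
}

For a hidden neuron, (target - O_j) is replaced by the weighted sum of the deltas of the layer above, which is why the derivatives (point 3) must be kept around while you walk backwards through the layers.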
