Help me please.

This is a discussion on Help me please. within the C Programming forums, part of the General Programming Boards category.

  1. #1
    Registered User yann's Avatar
    Join Date
    Sep 2009
    Location
    Zagreb, Croatia
    Posts
    186

    Help me please.

    Hi, I encountered some serious problems with multilayer perceptrons. I don't understand anything, and everybody seems to talk about different things... Please edit my code and tell me what I did wrong...

    Code:
    #include <stdbool.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>
    #include <math.h>
    
    float error[100];
    int i;
    float weight[100];
    bool percept[100];
    int input[100];
    double sum[100];
    int successive_right = 0;
    int total_go_round = 1;
    const float learning_rate = 0.1f;
    const float learning_rate2 = 0.25f;
    const float learning_rate3 = 0.1f;
    
    double f(double x) {   //NONLINEAR ACTIVATION FUNCTION
        return sin(atan(x));
    }
    
    bool target(int y, int z) {
        if ((y && !z) || (z && !y)) return 1;
        return 0;
    }
    
    void educate(void) {
        input[0] = 1; //BIAS
        input[3] = 1; //BIAS
        input[4] = 1; //BIAS
        input[5] = 1; //BIAS
        input[6] = 1; //BIAS
        input[1] = rand() % 2; //INPUT1
        input[2] = rand() % 2; //INPUT2
        bool goal = target(input[1], input[2]);
        sum[0] = weight[0]*input[0] + weight[1]*input[1] + weight[2]*input[2];
        sum[1] = weight[3]*input[3] + weight[4]*input[1] + weight[5]*input[2];
        sum[2] = weight[6]*input[4] + weight[7]*percept[0] + weight[8]*percept[1];
        sum[3] = weight[9]*input[5] + weight[10]*percept[0] + weight[11]*percept[1];
        sum[4] = weight[12]*input[6] + weight[13]*percept[2] + weight[14]*percept[3];
        percept[0] = f(sum[0]); //ACTIVATION
        percept[1] = f(sum[1]); //ACTIVATION
        percept[2] = f(sum[2]); //ACTIVATION
        percept[3] = f(sum[3]); //ACTIVATION
        percept[4] = f(sum[4]); //ACTIVATION
    
        if (percept[4] == goal) {
            successive_right++;
        }
        else {
            successive_right = 0;
            error[0] = goal - percept[4];                   //ERRORS IN BACKPROPAGATION
            error[1] = error[0]*weight[14];
            error[2] = error[0]*weight[13];
            error[3] = (error[1]*weight[1]) + (error[2]*weight[2]);
            error[4] = (error[1]*weight[4]) + (error[2]*weight[5]);
    
            weight[0]  += learning_rate*error[4]*input[0];  //input layer
            weight[1]  += learning_rate*error[4]*input[1];
            weight[2]  += learning_rate*error[4]*input[2];
            weight[3]  += learning_rate*error[4]*input[3];
            weight[4]  += learning_rate*error[3]*percept[1];
            weight[5]  += learning_rate*error[3]*percept[0];
            weight[6]  += learning_rate2*error[3]*input[4]; //hidden layer
            weight[7]  += learning_rate2*error[2]*percept[0];
            weight[8]  += learning_rate2*error[2]*percept[1];
            weight[9]  += learning_rate2*error[2]*input[5];
            weight[10] += learning_rate2*error[1]*percept[0];
            weight[11] += learning_rate2*error[1]*percept[1];
            weight[12] += learning_rate3*error[1]*input[6]; //output layer
            weight[13] += learning_rate3*error[0]*percept[4];
            weight[14] += learning_rate3*error[0]*percept[3];
        }
    }
    
    int main(void) {
        srand((unsigned)time(NULL));
        for (i = 0; i < 15; i++) {  //initialise all 15 weights
            weight[i] = 0.5;
        }
        total_go_round = 1;
        while (total_go_round <= 5000) {
            educate();
            total_go_round++;
        }
        printf("inputs:\n");
        scanf("%d", &input[1]);
        scanf("%d", &input[2]);
        sum[0] = weight[0]*input[0] + weight[1]*input[1] + weight[2]*input[2];
        sum[1] = weight[3]*input[3] + weight[4]*input[1] + weight[5]*input[2];
        sum[2] = weight[6]*input[4] + weight[7]*percept[0] + weight[8]*percept[1];
        sum[3] = weight[9]*input[5] + weight[10]*percept[0] + weight[11]*percept[1];
        sum[4] = weight[12]*input[6] + weight[13]*percept[2] + weight[14]*percept[3];
        percept[0] = f(sum[0]); //ACTIVATION
        percept[1] = f(sum[1]); //ACTIVATION
        percept[2] = f(sum[2]); //ACTIVATION
        percept[3] = f(sum[3]); //ACTIVATION
        percept[4] = f(sum[4]); //ACTIVATION
        printf("%d\n", percept[4]);
        return 0;
    }
    Last edited by yann; 09-28-2009 at 03:50 PM.
    Arduino rocks!

  2. #2
    Fortran lover Epy's Avatar
    Join Date
    Sep 2009
    Location
    California, USA
    Posts
    994
    Well, let's start with... did it compile? If not, what was the error?

  3. #3
    Registered User rogster001's Avatar
    Join Date
    Aug 2006
    Location
    Liverpool UK
    Posts
    1,438
    If it does compile but does not produce the results you expect, where do you think it's going wrong? Which part of the code do you think is not working correctly?

  4. #4
    Registered User
    Join Date
    Sep 2006
    Posts
    8,868
    From his other thread, he's having a problem with the algorithm. This program started out as a single-layer perceptron, and now he's trying to extend it to a multi-layer perceptron; specifically, he wants it to be able to handle XOR learning. It did compile, but always gave the wrong answer: 1

    He's gotten some good advice, which, although he was a bit stubborn about it initially, he now sees as correct. The problem is that the advice was too general, or at too high a level, for him to use in modifying his program into this multi-layer version.

    Unfortunately, a lot of people who can give C code advice, have no idea what a perceptron is, let alone a multi-layer perceptron. << raises my hand >>

    I thought I'd learn something by following one of Abachler's links, but the video showed a guy with a big old abscess/boil on his back, and they're draining out *ALL THIS PUS* < you never saw so much pus in your life >. How bad was it? They're gagging from the smell alone - now that's just wrong, Abachler!

    Yann, be patient, and think positive.

    Oooh! Google paid off:

    More info on this program is here:
    http://lcn.epfl.ch/tutorial/english/...tml/index.html


    Code:
    /*
     * See bottom for address of author.
     *
     * title:       bpsim.c
     * author:      Josiah C. Hoskins
     * date:        June 1987
     *
     * purpose:     backpropagation learning rule neural net simulator
     *              for the tabula rasa Little Red Riding Hood example
     *
     * description: Bpsim provides an implementation of a neural network
     *              containing a single hidden layer which uses the
     *              generalized backpropagation delta rule for learning.
     *              A simple user interface is supplied for experimenting
     *              with a neural network solution to the Little Red Riding
     *              Hood example described in the text.
     *
     *              In addition, bpsim contains some useful building blocks
     *              for further experimentation with single layer neural
     *              networks. The data structure which describes the general
     *              processing unit allows one to easily investigate different
     *              activation (output) and/or error functions. The utility
     *              function create_link can be used to create links between
     *              any two units by supplying your own create_in_out_links
     *              function. The flexibility of creating units and links
     *              to your specifications allows one to modify the code
     *              to tune the network architecture to problems of interest.
     *
     *              There are some parameters that perhaps need some
     *              explanation. You will notice that the target values are
     *              either 0.1 or 0.9 (corresponding to the binary values
     *              0 or 1). With the sigmoidal function used in out_f the
     *              weights become very large if 0 and 1 are used as targets.
     *              The ON_TOLERANCE value is used as a criteria for an output
     *              value to be considered "on", i.e., close enough to the
     *              target of 0.9 to be considered 1. The learning_rate and
     *              momentum variables may be changed to vary the rate of
     *              learning, however, in general they each should be less
     *              than 1.0.
     *
     *              Bpsim has been compiled using CI-C86 version 2.30 on an
     *              IBM-PC and the Sun C compiler on a Sun 3/160.
     *
     *              Note to compile and link on U*IX machines use:
     *                      cc -o bpsim bpsim.c -lm
     *
     *              For other machines remember to link in the math library.
     *
     * status:      This program may be freely used, modified, and distributed
     *              except for commercial purposes.
     *
     * Copyright (c) 1987   Josiah C. Hoskins
     */
     /* Modified to function properly under Turbo C by replacing malloc(...)
        with calloc(...,1). Thanks to Pavel Rozalski who detected the error.
        He assumed that Turbo C's "malloc" doesn't automatically set pointers
        to NULL - and he was right!
        Thomas Muhr, Berlin April, 1988
     */
    
    #include <math.h>
    #include <stdio.h>
    #include <ctype.h>
    
    #define BUFSIZ          512
    
    #define FALSE           0
    #define TRUE            !FALSE
    #define NUM_IN          6       /* number of input units */
    #define NUM_HID         3       /* number of hidden units */
    #define NUM_OUT         7       /* number of output units */
    #define TOTAL           (NUM_IN + NUM_HID + NUM_OUT)
    #define BIAS_UID        (TOTAL) /* threshold unit */
    
    /* macros to provide indexes for processing units */
    #define IN_UID(X)       (X)
    #define HID_UID(X)      (NUM_IN + X)
    #define OUT_UID(X)      (NUM_IN + NUM_HID + X)
    #define TARGET_INDEX(X) (X - (NUM_IN + NUM_HID))
    
    #define WOLF_PATTERN    0
    #define GRANDMA_PATTERN 1
    #define WOODCUT_PATTERN 2
    #define PATTERNS        3       /* number of input patterns */
    #define ERROR_TOLERANCE 0.01
    #define ON_TOLERANCE    0.8     /* a unit's output is on if > ON_TOLERENCE */
    #define NOTIFY          10      /* iterations per dot notification */
    #define DEFAULT_ITER    250
    
    struct unit {                   /* general processing unit */
      int    uid;                   /* integer uniquely identifying each unit */
      char   *label;
      double output;                /* activation level */
      double (*unit_out_f)();       /* note output fcn == activation fcn*/
      double delta;                 /* delta for unit */
      double (*unit_delta_f)();     /* ptr to function to calc delta */
      struct link *inlinks;         /* for propagation */
      struct link *outlinks;        /* for back propagation */
    } *pu[TOTAL+1];                 /* one extra for the bias unit */
    
    struct link {                   /* link between two processing units */
      char   *label;
      double weight;                /* connection or link weight */
      double data;                  /* used to hold the change in weights */
      int    from_unit;             /* uid of from unit */
      int    to_unit;               /* uid of to unit */
      struct link *next_inlink;
      struct link *next_outlink;
    };
    
    int     iterations = DEFAULT_ITER;
    double  learning_rate = 0.2;
    double  momentum = 0.9;
    double  pattern_err[PATTERNS];
    
    /*
     * Input Patterns
     * {Big Ears, Big Eyes, Big Teeth, Kindly, Wrinkled, Handsome}
     *   unit 0    unit 1     unit 2   unit 3   unit 4    unit 5
     */
    double  input_pat[PATTERNS+1][NUM_IN] = {
      {1.0, 1.0, 1.0, 0.0, 0.0, 0.0},       /* Wolf */
      {0.0, 1.0, 0.0, 1.0, 1.0, 0.0},       /* Grandma */
      {1.0, 0.0, 0.0, 1.0, 0.0, 1.0},       /* Woodcutter */
      {0.0, 0.0, 0.0, 0.0, 0.0, 0.0},       /* Used for Recognize Mode */
    };
    
    /*
     * Target Patterns
     * {Scream, Run Away, Look for Woodcutter, Approach, Kiss on Cheek,
     *      Offer Food, Flirt with}
     */
    double  target_pat[PATTERNS][NUM_OUT] = {
      {0.9, 0.9, 0.9, 0.1, 0.1, 0.1, 0.1},  /* response to Wolf */
      {0.1, 0.1, 0.1, 0.9, 0.9, 0.9, 0.1},  /* response to Grandma */
      {0.1, 0.1, 0.1, 0.9, 0.1, 0.9, 0.9},  /* response to Woodcutter */
    };
    
    /*
     * function declarations
     */
    void    print_header();
    char    get_command();
    double  out_f(), delta_f_out(), delta_f_hid(), random(), pattern_error();
    
    
    main()
    {
      char   ch;
      extern struct unit *pu[];
    
      print_header();
      create_processing_units(pu);
      create_in_out_links(pu);
      for (;;) {
        ch = get_command("\nEnter Command (Learn, Recognize, Quit) => ");
        switch (ch) {
        case 'l':
        case 'L':
          printf("\n\tLEARN MODE\n\n");
          learn(pu);
          break;
        case 'r':
        case 'R':
          printf("\n\tRECOGNIZE MODE\n\n");
          recognize(pu);
          break;
        case 'q':
        case 'Q':
          exit(1);
          break;
        default:
          fprintf(stderr, "Invalid Command\n");
          break;
        }
      }
    }
    
    
    void
    print_header()
    {
      printf("%s%s%s",
             "\n\tBPSIM -- Back Propagation Learning Rule Neural Net Simulator\n",
             "\t\t for the tabula rasa Little Red Riding Hood example.\n\n",
             "\t\t Written by Josiah C. Hoskins\n");
    }
    
    
    /*
     * create input, hidden, output units (and threshold or bias unit)
     */
    create_processing_units(pu)
    struct  unit *pu[];
    {
      int   id;                     /* processing unit index */
      struct unit *create_unit();
    
      for (id = IN_UID(0); id < IN_UID(NUM_IN); id++)
        pu[id] = create_unit(id, "input", 0.0, NULL, 0.0, NULL);
      for (id = HID_UID(0); id < HID_UID(NUM_HID); id++)
        pu[id] = create_unit(id, "hidden", 0.0, out_f, 0.0, delta_f_hid);
      for (id = OUT_UID(0); id < OUT_UID(NUM_OUT); id++)
        pu[id] = create_unit(id, "output", 0.0, out_f, 0.0, delta_f_out);
      pu[BIAS_UID] = create_unit(BIAS_UID, "bias", 1.0, NULL, 0.0, NULL);
    }
    
    
    /*
     * create links - fully connected for each layer
     *                note: the bias unit has one link to ea hid and out unit
     */
    create_in_out_links(pu)
    struct  unit *pu[];
    {
      int   i, j;           /* i == to and j == from unit id's */
      struct link *create_link();
    
      /* fully connected units */
      for (i = HID_UID(0); i < HID_UID(NUM_HID); i++) { /* links to hidden */
        pu[BIAS_UID]->outlinks =
          pu[i]->inlinks = create_link(pu[i]->inlinks, i,
                                       pu[BIAS_UID]->outlinks, BIAS_UID,
                                       (char *)NULL,
                                       random(), 0.0);
        for (j = IN_UID(0); j < IN_UID(NUM_IN); j++) /* from input units */
          pu[j]->outlinks =
            pu[i]->inlinks = create_link(pu[i]->inlinks, i, pu[j]->outlinks, j,
                                         (char *)NULL, random(), 0.0);
      }
      for (i = OUT_UID(0); i < OUT_UID(NUM_OUT); i++) {     /* links to output */
        pu[BIAS_UID]->outlinks =
                pu[i]->inlinks = create_link(pu[i]->inlinks, i,
                                             pu[BIAS_UID]->outlinks, BIAS_UID,
                                             (char *)NULL, random(), 0.0);
        for (j = HID_UID(0); j < HID_UID(NUM_HID); j++) /* from hidden units */
          pu[j]->outlinks =
            pu[i]->inlinks = create_link(pu[i]->inlinks, i, pu[j]->outlinks, j,
                                         (char *)NULL, random(), 0.0);
      }
    }
    
    
    /*
     * return a random number bet 0.0 and 1.0
     */
    double
    random()
    {
      return((rand() % 32727) / 32737.0);
    }
    
    
    /*
     * the next two functions are general utility functions to create units
     * and create links
     */
    struct unit *
    create_unit(uid, label, output, out_f, delta, delta_f)
    int  uid;
    char *label;
    double   output, delta;
    double   (*out_f)(), (*delta_f)();
    {
      struct unit  *unitptr;
    
    /*
      if (!(unitptr = (struct unit *)malloc(sizeof(struct unit)))) {
    TURBO C doesnt automatically set pointers to NULL - so use calloc(...,1) */
      if (!(unitptr = (struct unit *)calloc(sizeof(struct unit),1))) {
        fprintf(stderr, "create_unit: not enough memory\n");
        exit(1);
      }
      /* initialize unit data */
      unitptr->uid = uid;
      unitptr->label = label;
      unitptr->output = output;
      unitptr->unit_out_f = out_f;  /* ptr to output fcn */
      unitptr->delta = delta;
      unitptr->unit_delta_f = delta_f;
      return (unitptr);
    }
    
    
    struct link *
    create_link(start_inlist, to_uid, start_outlist, from_uid, label, wt, data)
    struct  link *start_inlist, *start_outlist;
    int     to_uid, from_uid;
    char *  label;
    double  wt, data;
    {
      struct link  *linkptr;
    
    /*  if (!(linkptr = (struct link *)malloc(sizeof(struct link)))) { */
      if (!(linkptr = (struct link *)calloc(sizeof(struct link),1))) {
        fprintf(stderr, "create_link: not enough memory\n");
        exit(1);
      }
      /* initialize link data */
      linkptr->label = label;
      linkptr->from_unit = from_uid;
      linkptr->to_unit = to_uid;
      linkptr->weight = wt;
      linkptr->data = data;
      linkptr->next_inlink = start_inlist;
      linkptr->next_outlink = start_outlist;
      return(linkptr);
    }
    
    
    char
    get_command(s)
    char    *s;
    {
      char  command[BUFSIZ];
    
      fputs(s, stdout);
      fflush(stdin); fflush(stdout);
      (void)fgets(command, BUFSIZ, stdin);
      return((command[0]));         /* return 1st letter of command */
    }
    
    
    learn(pu)
    struct unit *pu[];
    {
      register i, temp;
      char   tempstr[BUFSIZ];
      extern int    iterations;
      extern double learning_rate, momentum;
      static char prompt[] = "Enter # iterations (default is 250) => ";
      static char quote1[] = "Perhaps, Little Red Riding Hood ";
      static char quote2[] = "should do more learning.\n";
    
      printf(prompt);
      fflush(stdin); fflush(stdout);
      gets(tempstr);
      if (temp = atoi(tempstr))
        iterations = temp;
    
      printf("\nLearning ");
      for (i = 0; i < iterations; i++) {
        if ((i % NOTIFY) == 0) {
          printf(".");
          fflush(stdout);
        }
        bp_learn(pu, (i == iterations-2 || i == iterations-1 || i == iterations));
      }
      printf(" Done\n\n");
      printf("Error for Wolf pattern = \t%lf\n", pattern_err[0]);
      printf("Error for Grandma pattern = \t%lf\n", pattern_err[1]);
      printf("Error for Woodcutter pattern = \t%lf\n", pattern_err[2]);
      if (pattern_err[WOLF_PATTERN] > ERROR_TOLERANCE) {
        printf("\nI don't know the Wolf very well.\n%s%s", quote1, quote2);
      } else if (pattern_err[GRANDMA_PATTERN] > ERROR_TOLERANCE) {
        printf("\nI don't know Grandma very well.\n%s%s", quote1, quote2);
      } else if (pattern_err[WOODCUT_PATTERN] > ERROR_TOLERANCE) {
        printf("\nI don't know Mr. Woodcutter very well.\n%s%s", quote1, quote2);
      } else {
        printf("\nI feel pretty smart, now.\n");
      }
    }
    
    
    /*
     * back propagation learning
     */
    bp_learn(pu, save_error)
    struct unit *pu[];
    int    save_error;
    {
      static int count = 0;
      static int pattern = 0;
      extern double pattern_err[PATTERNS];
    
      init_input_units(pu, pattern); /* initialize input pattern to learn */
      propagate(pu);                 /* calc outputs to check versus targets */
      if (save_error)
        pattern_err[pattern] = pattern_error(pattern, pu);
      bp_adjust_weights(pattern, pu);
      if (pattern < PATTERNS - 1)
        pattern++;
      else
          pattern = 0;
      count++;
    }
    
    
    /*
     * initialize the input units with a specific input pattern to learn
     */
    init_input_units(pu, pattern)
    struct unit *pu[];
    int    pattern;
    {
      int   id;
    
      for (id = IN_UID(0); id < IN_UID(NUM_IN); id++)
        pu[id]->output = input_pat[pattern][id];
    }
    
    
    /*
     * calculate the activation level of each unit
     */
    propagate(pu)
    struct unit *pu[];
    {
      int   id;
    
      for (id = HID_UID(0); id < HID_UID(NUM_HID); id++)
        (*(pu[id]->unit_out_f))(pu[id], pu);
      for (id = OUT_UID(0); id < OUT_UID(NUM_OUT); id++)
        (*(pu[id]->unit_out_f))(pu[id], pu);
    }
    
    
    /*
     * function to calculate the activation or output of units
     */
    double
    out_f(pu_ptr, pu)
    struct unit *pu_ptr, *pu[];
    {
      double sum = 0.0 , exp();
      struct link *tmp_ptr;
    
      tmp_ptr = pu_ptr->inlinks;
      while (tmp_ptr) {
        /* sum up (outputs from inlinks times weights on the inlinks) */
        sum += pu[tmp_ptr->from_unit]->output * tmp_ptr->weight;
        tmp_ptr = tmp_ptr->next_inlink;
      }
      pu_ptr->output = 1.0/(1.0 + exp(-sum));
    }
    
    
    /*
     * half of the sum of the squares of the errors of the
     * output versus target values
     */
    double
    pattern_error(pat_num, pu)
    int     pat_num;        /* pattern number */
    struct  unit *pu[];
    {
      int           i;
      double        temp, sum = 0.0;
    
      for (i = OUT_UID(0); i < OUT_UID(NUM_OUT); i++) {
        temp = target_pat[pat_num][TARGET_INDEX(i)] - pu[i]->output;
        sum += temp * temp;
      }
      return (sum/2.0);
    }
    
    
    bp_adjust_weights(pat_num, pu)
    int     pat_num;        /* pattern number */
    struct  unit *pu[];
    {
      int           i;              /* processing units id */
      double        temp1, temp2, delta, error_sum;
      struct link   *inlink_ptr, *outlink_ptr;
    
      /* calc deltas */
      for (i = OUT_UID(0); i < OUT_UID(NUM_OUT); i++) /* for each output unit */
        (*(pu[i]->unit_delta_f))(pu, i, pat_num); /* calc delta */
      for (i = HID_UID(0); i < HID_UID(NUM_HID); i++) /* for each hidden unit */
        (*(pu[i]->unit_delta_f))(pu, i);      /* calc delta */
      /* calculate weights */
      for (i = OUT_UID(0); i < OUT_UID(NUM_OUT); i++) {     /* for output units */
        inlink_ptr = pu[i]->inlinks;
        while (inlink_ptr) {        /* for each inlink to output unit */
          temp1 = learning_rate * pu[i]->delta *
            pu[inlink_ptr->from_unit]->output;
          temp2 = momentum * inlink_ptr->data;
          inlink_ptr->data = temp1 + temp2; /* new delta weight */
          inlink_ptr->weight += inlink_ptr->data;   /* new weight */
          inlink_ptr = inlink_ptr->next_inlink;
        }
      }
      for (i = HID_UID(0); i < HID_UID(NUM_HID); i++) { /* for ea hid unit */
        inlink_ptr = pu[i]->inlinks;
        while (inlink_ptr) {        /* for each inlink to output unit */
          temp1 = learning_rate * pu[i]->delta *
            pu[inlink_ptr->from_unit]->output;
          temp2 = momentum * inlink_ptr->data;
          inlink_ptr->data = temp1 + temp2; /* new delta weight */
          inlink_ptr->weight += inlink_ptr->data;   /* new weight */
            inlink_ptr = inlink_ptr->next_inlink;
        }
      }
    }
    
    
    /*
     * calculate the delta for an output unit
     */
    double
    delta_f_out(pu, uid, pat_num)
    struct unit *pu[];
    int    uid, pat_num;
    {
      double        temp1, temp2, delta;
    
      /* calc deltas */
      temp1 = (target_pat[pat_num][TARGET_INDEX(uid)] - pu[uid]->output);
      temp2 = (1.0 - pu[uid]->output);
      delta = temp1 * pu[uid]->output * temp2; /* calc delta */
      pu[uid]->delta = delta; /* store delta to pass on */
    }
    
    
    /*
     * calculate the delta for a hidden unit
     */
    double
    delta_f_hid(pu, uid)
    struct unit *pu[];
    int    uid;
    {
      double        temp1, temp2, delta, error_sum;
      struct link   *inlink_ptr, *outlink_ptr;
    
      outlink_ptr = pu[uid]->outlinks;
      error_sum = 0.0;
      while (outlink_ptr) {
        error_sum += pu[outlink_ptr->to_unit]->delta * outlink_ptr->weight;
        outlink_ptr = outlink_ptr->next_outlink;
      }
      delta = pu[uid]->output * (1.0 - pu[uid]->output) * error_sum;
      pu[uid]->delta = delta;
    }
    
    
    recognize(pu)
    struct unit *pu[];
    {
      int    i;
      char   tempstr[BUFSIZ];
      static char *p[] = {"Big Ears?", "Big Eyes?", "Big Teeth?",
                          "Kindly?\t", "Wrinkled?", "Handsome?"};
    
      for (i = 0; i < NUM_IN; i++) {
        printf("%s\t(y/n) ", p[i]);
        fflush(stdin); fflush(stdout);
        fgets(tempstr, BUFSIZ, stdin);
        if (tempstr[0] == 'Y' || tempstr[0] == 'y')
          input_pat[PATTERNS][i] = 1.0;
        else
          input_pat[PATTERNS][i] = 0.0;
      }
      init_input_units(pu, PATTERNS);
      propagate(pu);
      print_behaviour(pu);
    }
    
    
    print_behaviour(pu)
    struct unit *pu[];
    {
      int   id, count = 0;
      static char *behaviour[] = {
        "Screams", "Runs Away", "Looks for Woodcutter", "Approaches",
        "Kisses on Cheek", "Offers Food", "Flirts with Woodcutter" };
    
      printf("\nLittle Red Riding Hood: \n");
      for (id = OUT_UID(0); id < OUT_UID(NUM_OUT); id++){ /* links to out units */
        if (pu[id]->output > ON_TOLERANCE)
          printf("\t%s\n", behaviour[count]);
        count++;
      }
      printf("\n");
    }
    
    /*
    ! Thomas Muhr    Knowledge-Based Systems Dept. Technical University of Berlin !
    ! BITNET/EARN:   muhrth@db0tui11.bitnet                                       !
    ! UUCP:          morus@netmbx.UUCP (Please don't use from outside Germany)    !
    ! BTX:           030874162  Tel.: (Germany 0049) (Berlin 030) 87 41 62        !
    */
    If you google for "multilayer perceptrons in C", you'll also find a PDF on the subject on the second page of results.
    Last edited by Adak; 09-29-2009 at 04:13 AM.

  5. #5
    Malum in se abachler's Avatar
    Join Date
    Apr 2007
    Posts
    3,189
    Well, to be fair, that boil lancing video is in my sig, not anything I actually gave out as advice. It's been two weeks since I watched that video, and I had just about stopped gagging whenever I ate or drank anything white. Thanks for reminding me.

    The issue with yann's network is that he doesn't fully grasp the language he is trying to write a fairly high-level algorithm in, so 90% of the problems he is running into are issues with his as-yet-undeveloped programming style. Basically he is putting the cart before the horse, just like everyone here did when they were his age. It's no big deal; it just takes a little more patience to explain things than most people on this board have. Personally I cut him a lot of slack, because I remember back when I was 13, posting on CompuServe as user 74723,83, and the EEs would be less than diplomatic in their responses to my newbie questions.
    Until you can build a working general purpose reprogrammable computer out of basic components from radio shack, you are not fit to call yourself a programmer in my presence. This is cwhizard, signing off.

  6. #6
    Registered User yann's Avatar
    Join Date
    Sep 2009
    Location
    Zagreb, Croatia
    Posts
    186
    Well, it does compile, but for some reason it always gives an output of 1:
    111
    101
    011
    001
    like this... I would be very glad if someone would give me a good tutorial so I could fill the holes in my knowledge. And I would appreciate it a lot if someone would write a VERY simple example of a neural network that learns XOR so I could learn from it, please...

  7. #7
    Registered User
    Join Date
    Sep 2006
    Posts
    8,868
    You haven't read that PDF yet, have you Yann?

    tsk, tsk!

    I just had to throw that boil lancing in there, Abachler - it was too "precious" to go unremarked.

    Now - who wants some vanilla yoghurt or cottage cheese?

  8. #8
    Registered User yann's Avatar
    Join Date
    Sep 2009
    Location
    Zagreb, Croatia
    Posts
    186
    What is the actual name of the PDF? There are many...

  9. #9
    Malum in se abachler's Avatar
    Join Date
    Apr 2007
    Posts
    3,189
    Quote Originally Posted by Adak View Post
    You haven't read that PDF yet, have you Yann?

    tsk, tsk!

    I just had to throw that boil lancing in there, Abachler - it was too "precious" to go unremarked.

    Now - who wants some vanilla yoghurt or cottage cheese?
    Check out boblinks.limewebs.com. I keep the nasty and gross stuff there; just links, no pics or anything, so it's safe to go to the page, but the links themselves are, well, not kid- or work-safe, to say the least.

    I did a Google search for that phrase, and there's tons of stuff on the first page, including examples written as tutorials by some university students.

    http://www.emilstefanov.net/Projects...lNetworks.aspx
    Last edited by abachler; 09-29-2009 at 11:08 AM.
    Last edited by abachler; 09-29-2009 at 11:08 AM.

  10. #10
    Registered User yann's Avatar
    Join Date
    Sep 2009
    Location
    Zagreb, Croatia
    Posts
    186
    I tried to use tanh(), but it also always gives me one...

  11. #11
    Registered User yann's Avatar
    Join Date
    Sep 2009
    Location
    Zagreb, Croatia
    Posts
    186
    I think it is the learning function. I printed out all the weights and sums; some weights are "nan". What does that mean? And the sums are all nan, except the last one, which is 8707.250488.

  12. #12
    spurious conceit MK27's Avatar
    Join Date
    Jul 2008
    Location
    segmentation fault
    Posts
    8,300
    Quote Originally Posted by yann View Post
    i think it is the learning function, i printed out all the weights and sums, some weights are "nan", what does that mean, and sums are all nan, except the last one which is 8707.250488
    nan or NaN stands for "Not a Number". It's what a floating-point operation produces when the result is undefined (0.0/0.0, infinity minus infinity, and the like), and once one appears it propagates through every calculation that touches it, which is why all of your sums come out nan too.
    C programming resources:
    GNU C Function and Macro Index -- glibc reference manual
    The C Book -- nice online learner guide
    Current ISO draft standard
    CCAN -- new CPAN like open source library repository
    3 (different) GNU debugger tutorials: #1 -- #2 -- #3
    cpwiki -- our wiki on sourceforge

  13. #13
    Malum in se abachler's Avatar
    Join Date
    Apr 2007
    Posts
    3,189
    Here is a compilable example. It's not my work, so I'm not sure of the board rules, or whether the author cares if I post it here. It's from C++ Neural Networks & Fuzzy Logic by Rao and Rao. The book is at least 14 years old, so it may be a bit outdated.

    I noted the changes I made, which were only to get it to compile under VS 2008
    Attached Files Attached Files
    Last edited by abachler; 09-29-2009 at 01:27 PM.

  14. #14
    spurious conceit MK27's Avatar
    Join Date
    Jul 2008
    Location
    segmentation fault
    Posts
    8,300
    Quote Originally Posted by abachler View Post
    Here is a compilable example. Its not my work, so I'm not sure of the board rules, or if the author cares if I post it here. It's from C++ Neural Networks & Fuzzy Logic by Rao and Rao The book is at least 14 years old, so it may be a bit outdated.
    Considering yann is still not very interested in C syntax, I am not sure how much help some C++ is going to be...

  15. #15
    Registered User yann's Avatar
    Join Date
    Sep 2009
    Location
    Zagreb, Croatia
    Posts
    186
    Yay, I am officially stopping with NNs, but I am going to get back to them... Now I have a new simple project whose purpose is to teach me C syntax better. Any tutorials are welcome; as soon as I get started with it I will create a new thread. Anyway, is there a way of changing a thread's name? Thanks Abachler and MK27, I will follow the advice you gave me.
