Go back to Richel Bilderbeek's homepage.

Go back to Richel Bilderbeek's C++ page.


(C++) Shark example 1: neural net solving the XOR problem

 

Shark example 1 uses the Shark machine learning library to train a feed-forward neural network to solve the XOR problem.


Operating system: Ubuntu 10.04 LTS Lucid Lynx

IDE: Qt Creator 2.0.0

Project type: console application

Compiler: G++ 4.4.1

Libraries used:

Qt

Shark


Qt project file

 

#-------------------------------------------------
#
# Project created by QtCreator 2010-08-15T23:08:25
#
#-------------------------------------------------
QT += core
QT -= gui
TARGET = CppSharkExample1
CONFIG += console
CONFIG -= app_bundle
LIBS += -L/usr/local/lib -lshark
TEMPLATE = app
SOURCES += main.cpp
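Assuming Shark's headers and libshark are installed under /usr/local (as the LIBS line above suggests), the project can be built and run from a terminal; the exact qmake binary name and install paths may differ per system:

```shell
qmake CppSharkExample1.pro  # generate a Makefile from the Qt project file
make                        # compile main.cpp and link against -lshark
./CppSharkExample1          # run the example
```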


main.cpp

 

#include <iostream>
#include <iterator>
#include <Array/Array.h>
#include <ReClaM/FFNet.h>
#include <ReClaM/createConnectionMatrix.h>
#include <ReClaM/CrossEntropy.h>
#include <ReClaM/Rprop.h>
#include <ReClaM/ClassificationError.h>

//Modified from the Shark library tutorial
int main()
{
  //Construct XOR problem input and target output
  Array<double> trainInput( 4,2);
  Array<double> trainTarget(4,1);
  for(int k=0, i=0; i!=2; ++i)
  {
    for(int j=0; j!=2; ++j)
    {
      trainInput(k,0) = i;
      trainInput(k,1) = j;
      trainTarget(k, 0) = (i+j) % 2;
      ++k;
    }
  }

  //Define neural net topology
  const int n_inputs = 2;
  const int n_hidden = 2;
  const int n_outputs = 1;
  //Create neural net connection matrix
  Array<int> connection_matrix;
  createConnectionMatrix(connection_matrix,n_inputs, n_hidden, n_outputs);

  //Display the connection matrix
  std::cout << "Display the connection matrix:\n";
  std::copy(connection_matrix.begin(),connection_matrix.end(),
    std::ostream_iterator<int>(std::cout," "));
  std::cout << '\n';

  //Create the feed-forward neural network
  FFNet net(n_inputs, n_outputs, connection_matrix);
  std::cout << "Display the neural network (note that there are no weights set yet):\n";
  net.write(std::cout);
  std::cout << '\n';

  std::cout << "Initializing the weights (uniformly <-0.1,0.1>)...\n";
  net.initWeights(-0.1, 0.1);

  std::cout << "Display the neural net:\n";
  net.write(std::cout);
  std::cout << '\n';

  //Error function
  CrossEntropy error;
  ClassificationError accuracy(.5);

  //Optimizer
  IRpropPlus optimizer;
  optimizer.init(net);

  //Training loop
  const int n_learning_cycles = 100;
  std::cout << "Start training for " << n_learning_cycles << " learning cycles.\n";
  for (int i = 0; i!= n_learning_cycles; ++i)
  {
    //Train the network
    optimizer.optimize(net, error, trainInput, trainTarget);

    //Show results
    std::cout << i << "\t"
        << accuracy.error(net, trainInput, trainTarget) << "\t"
        << error.error(net, trainInput, trainTarget) << std::endl;
  }

  std::cout << "Display the neural network after training:\n";
  net.write(std::cout);
  std::cout << '\n';
}


Screen output

 

Display the connection matrix:
0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 1 1 1 0 0 0 1 1 1 1 1 0 1
Display the neural network (note that there are no weights set yet):
2 1
0 0 0 0 0 0
0 0 0 0 0 0
1 1 0 0 0 1
1 1 0 0 0 1
1 1 1 1 0 1

0 0 0 0 0 0
0 0 0 0 0 0
0 0 0 0 0 0
0 0 0 0 0 0
0 0 0 0 0 0


Initializing the weights (uniformly <-0.1,0.1>)...
Display the neural net:
2 1
0 0 0 0 0 0
0 0 0 0 0 0
1 1 0 0 0 1
1 1 0 0 0 1
1 1 1 1 0 1

0 0 0 0 0 0
0 0 0 0 0 0
0.0165561 0.0178207 0 0 0 0.0924421
0.0502306 -0.00540507 0 0 0 -0.0577283
-0.0534691 0.048459 0.0704912 -0.00753265 0 0.0944008


Start training for 100 learning cycles.
0 0.5 2.77777
1 0.5 2.77509
2 0.5 2.77374
3 0.5 2.77408
4 0.5 2.77364
5 0.5 2.77329
6 0.5 2.77323
7 0.5 2.77295
8 0.5 2.77285
9 0.5 2.77289
10 0.5 2.77279
11 0.5 2.7727
12 0.5 2.77263
13 0.5 2.7726
14 0.5 2.77261
15 0.25 2.77259
16 0.5 2.77263
17 0.5 2.77259
18 0.5 2.7726
19 0.5 2.77259
20 0.5 2.77259
21 0.5 2.77259
22 0.5 2.77259
23 0.5 2.77259
24 0.5 2.77259
25 0.5 2.77259
26 0.25 2.77259
27 0.5 2.77259
28 0.5 2.77259
29 0.5 2.77259
30 0.5 2.77259
31 0.75 2.77259
32 0.5 2.77259
33 0.5 2.77259
34 0.5 2.77259
35 0.5 2.77259
36 0.5 2.77259
37 0.75 2.77259
38 0.25 2.77259
39 0.25 2.77259
40 0.25 2.77259
41 0.25 2.77259
42 0.25 2.77259
43 0.25 2.77259
44 0.25 2.77259
45 0.25 2.77259
46 0.5 2.77259
47 0.5 2.77259
48 0.5 2.77259
49 0.5 2.77259
50 0.25 2.77259
51 0.25 2.77259
52 0.25 2.77258
53 0.25 2.77258
54 0.25 2.77257
55 0.25 2.77256
56 0.25 2.77254
57 0.25 2.77252
58 0.25 2.77248
59 0.25 2.77242
60 0.25 2.77234
61 0.25 2.77225
62 0.25 2.77212
63 0.25 2.77197
64 0.25 2.7718
65 0 2.77159
66 0 2.77133
67 0 2.77104
68 0 2.77069
69 0 2.77027
70 0 2.76976
71 0.25 2.76914
72 0.25 2.76837
73 0.25 2.76739
74 0.25 2.76614
75 0.25 2.76453
76 0.25 2.76249
77 0.25 2.75993
78 0.25 2.75677
79 0.25 2.75297
80 0.25 2.74849
81 0.25 2.74327
82 0.25 2.73724
83 0.25 2.73022
84 0.25 2.72202
85 0.25 2.71235
86 0.25 2.70091
87 0.25 2.68738
88 0.25 2.67141
89 0.25 2.65263
90 0.25 2.63065
91 0.25 2.60513
92 0.25 2.57575
93 0.25 2.54237
94 0.25 2.50502
95 0.25 2.46408
96 0.25 2.42048
97 0.25 2.37597
98 0.25 2.33356
99 0.25 2.27967
Display the neural network after training:
2 1
0 0 0 0 0 0
0 0 0 0 0 0
1 1 0 0 0 1
1 1 0 0 0 1
1 1 1 1 0 1

0 0 0 0 0 0
0 0 0 0 0 0
89.9339 83.6439 0 0 0 -43.2266
-1.54356 12.7449 0 0 0 1.17246
-0.185063 -0.185816 1.28051 -0.244617 0 -0.157372

