Neural Network Classifier in Matlab























I am trying to build a neural network classifier. I have created a network with one hidden layer (25 neurons) and one output layer (a single neuron, for binary classification).



The dataset I am using has the following dimensions:



size(X_Train): 125973 x 122
size(Y_Train): 125973 x 1
size(X_Test): 22543 x 122
size(Y_Test): 22543 x 1


My overall goal is to compare different training functions, but first I would like your feedback on my code and how I can improve it.



% Neural Network Binary-classification

clear ; close all; clc

%% =========== Part 1: Loading Data =============

%% Load Training Data
fprintf('Loading Data ...\n');

load('dataset.mat'); % loads X_training, Y_training, X_testing, Y_testing
X_training = X_training'; % transpose so each column is one sample
Y_training = Y_training';
X_testing = X_testing';
Y_testing = Y_testing';

%% Create the neural network
% 1, 2: ONE input, TWO layers (one hidden layer and one output layer)
% [1; 1]: both 1st and 2nd layer have a bias node
% [1; 0]: the input is a source for the 1st layer
% [0 0; 1 0]: the 1st layer is a source for the 2nd layer
% [0 1]: the 2nd layer is a source for your output
net = network(1, 2, [1; 1], [1; 0], [0 0; 1 0], [0 1]);
net.inputs{1}.size = 122; % input size
net.layers{1}.size = 25; % hidden layer size
net.layers{2}.size = 1; % output layer size

%% Transfer function in layers
net.layers{1}.transferFcn = 'logsig';
net.layers{2}.transferFcn = 'logsig';

net.layers{1}.initFcn = 'initnw';
net.layers{2}.initFcn = 'initnw';

net=init(net);

%% divide data into training, validation and test
net.divideFcn = 'dividerand';
net.divideParam.trainRatio = 60/100; % 60% training
net.divideParam.valRatio = 20/100; % 20% validation set
net.divideParam.testRatio = 20/100; % 20% test set

net.performFcn = 'crossentropy';

%% Training functions
net.trainFcn = 'trainscg'; %Scaled conjugate gradient backpropagation

%% Train the neural network
[net,tr] = train(net,X_training,Y_training); % return the network and training record

%% Test the Neural Network on the training set
outputs = net(X_training);
errors = gsubtract(Y_training,outputs);
performance = perform(net,Y_training,outputs);

%% Plots (%training)
figure, plotperform(tr)
figure, plottrainstate(tr)

%% Test the Neural Network on the test set
outputs1 = net(X_testing);
errors1 = gsubtract(Y_testing,outputs1);
performance1 = perform(net,Y_testing,outputs1);

figure, plotconfusion(Y_testing,outputs1)
figure, ploterrhist(errors1)
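Since the stated goal is comparing training functions, one possible extension (a sketch, assuming the variables defined above; trainscg, trainrp and traingdx are standard toolbox trainer names) is to loop over candidates and report test-set accuracy for each:

```matlab
% Compare several training functions on the same architecture.
% Thresholding the logsig output at 0.5 turns probabilities into 0/1 labels.
trainFcns = {'trainscg', 'trainrp', 'traingdx'};

for k = 1:numel(trainFcns)
    net.trainFcn = trainFcns{k};
    net = init(net);                  % fresh weights for a fair comparison
    [net, tr] = train(net, X_training, Y_training);

    outputs_k = net(X_testing);
    predictions = outputs_k >= 0.5;
    accuracy = mean(predictions == Y_testing);
    fprintf('%s: test accuracy %.2f%%\n', trainFcns{k}, accuracy * 100);
end
```

Note that dividerand splits the data randomly, so results will vary between runs unless the seed is fixed with rng beforehand.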


Below is the validation performance curve.






Confusion Matrix (Training set)






Confusion Matrix (Testing set)



Any remarks?



Edit:



I have applied feature scaling (normalization):



net.performParam.normalization = 'standard';


which improved the overall accuracy.
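For reference, the scaling can also be done explicitly by z-scoring each feature with the training-set statistics (a sketch; note this is plain z-scoring, not necessarily identical to what performParam.normalization does internally, and the implicit expansion below requires R2016b or later):

```matlab
% Z-score each feature using training statistics only,
% then apply the same transformation to the test set.
mu = mean(X_training, 2);         % per-feature mean (features are rows)
sigma = std(X_training, 0, 2);    % per-feature standard deviation
sigma(sigma == 0) = 1;            % guard against constant features

X_training = (X_training - mu) ./ sigma;
X_testing  = (X_testing  - mu) ./ sigma;
```

Computing the statistics on the training set alone avoids leaking test-set information into the model.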



For more information, I have added the error histogram:













Tags: machine-learning matlab neural-network






      edited 16 hours ago

























      asked yesterday









      U. User

      286



