Neural Network Classifier in Matlab
I am trying to build a neural network classifier. I have created a network with one hidden layer (25 neurons) and one output layer (a single neuron, for binary classification).
The dataset I am using has the following dimensions:
size(X_Train): 125973 x 122
size(Y_Train): 125973 x 1
size(X_Test): 22543 x 122
size(Y_Test): 22543 x 1
My overall goal is to compare different training functions, but first I would like your feedback on my code and how I can improve it.
% Neural Network Binary-classification
clear ; close all; clc
%% =========== Part 1: Loading Data =============
%% Load Training Data
fprintf('Loading Data ...\n');
load('dataset.mat'); % loads X_training, Y_training, X_testing, Y_testing
% Transpose so that each column is one sample, as the toolbox expects
X_training = X_training';
Y_training = Y_training';
X_testing = X_testing';
Y_testing = Y_testing';
%% Create the neural network
% 1, 2: ONE input, TWO layers (one hidden layer and one output layer)
% [1; 1]: both 1st and 2nd layer have a bias node
% [1; 0]: the input is a source for the 1st layer
% [0 0; 1 0]: the 1st layer is a source for the 2nd layer
% [0 1]: the 2nd layer is a source for your output
net = network(1, 2, [1; 1], [1; 0], [0 0; 1 0], [0 1]);
net.inputs{1}.size = 122; % input size
net.layers{1}.size = 25; % hidden layer size
net.layers{2}.size = 1; % output layer size
%% Transfer function in layers
net.layers{1}.transferFcn = 'logsig';
net.layers{2}.transferFcn = 'logsig';
net.layers{1}.initFcn = 'initnw';
net.layers{2}.initFcn = 'initnw';
net = init(net);
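% Note: a roughly equivalent network can be built in one call with
% patternnet (a sketch; patternnet defaults to a tansig hidden layer
% and a softmax output, so the logsig transfer functions set above
% would still need to be assigned explicitly):
% net = patternnet(25, 'trainscg', 'crossentropy');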
%% Divide the training data into training, validation, and test subsets
net.divideFcn = 'dividerand';
net.divideParam.trainRatio = 60/100; % 60% training
net.divideParam.valRatio = 20/100;   % 20% validation
net.divideParam.testRatio = 20/100;  % 20% test
net.performFcn = 'crossentropy';
%% Training functions
net.trainFcn = 'trainscg'; % scaled conjugate gradient backpropagation
%% Train the neural network
[net,tr] = train(net,X_training,Y_training); % return the network and training record
%% Test the Neural Network on the training set
outputs = net(X_training);
errors = gsubtract(Y_training,outputs);
performance = perform(net,Y_training,outputs);
%% Plots (training set)
figure, plotperform(tr)
figure, plottrainstate(tr)
%% Test the Neural Network on the testing set
outputs1 = net(X_testing);
errors1 = gsubtract(Y_testing,outputs1);
performance1 = perform(net,Y_testing,outputs1);
figure, plotconfusion(Y_testing,outputs1)
figure, ploterrhist(errors1)
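To report a single accuracy number in addition to the plots, I also threshold the sigmoid output (a minimal sketch; the 0.5 cut-off is my assumption):

%% Classification accuracy on the test set
predictions = outputs1 > 0.5;               % assumed 0.5 decision threshold
accuracy = mean(predictions == Y_testing);  % fraction of correctly classified samples
fprintf('Test accuracy: %.2f%%\n', 100*accuracy);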
Below is the validation curve.
Confusion Matrix (Training set)
Confusion Matrix (Testing set)
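Since my overall goal is to compare training functions, I plan to rerun the same network with several of them, along these lines (a rough sketch; the particular list of trainers is just an example, and trainlm is omitted since Jacobian-based trainers expect a mean-squared-error performance function rather than crossentropy):

trainFcns = {'trainscg', 'trainrp', 'traingdx', 'traincgb'}; % example gradient-based trainers
results = zeros(1, numel(trainFcns));
for k = 1:numel(trainFcns)
    net.trainFcn = trainFcns{k};
    net = init(net);                                % re-initialize weights for a fair comparison
    [net, tr] = train(net, X_training, Y_training);
    results(k) = tr.best_vperf;                     % best validation performance
    fprintf('%s: best validation cross-entropy = %g\n', trainFcns{k}, results(k));
end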
Any remarks?
Edit:
I have applied feature scaling (normalization):
net.performParam.normalization = 'standard';
which improved the overall accuracy.
For more information, I have also added the error histogram.
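An alternative (or complement) to the performance-function normalization is to standardize the inputs directly through the network's input processing functions; a sketch, assuming the Deep Learning Toolbox process functions:

% Standardize each input feature to zero mean and unit variance
net.inputs{1}.processFcns = {'removeconstantrows', 'mapstd'};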
Tags: machine-learning, matlab, neural-network