Does anybody ever use a neural network for prediction with regularization instead of early stopping?
Hi everyone, I am trying to train a neural network (NN) for prediction. To prevent overfitting, I chose regularization rather than early stopping, so I use 'trainbr' as the training function and 'msereg' as the performance function. The input and output data are preprocessed to lie within [-1, 1], and the data is divided randomly into two groups: one for training (70%) and one for testing (30%). Below is part of my code; can anyone help me check it? I am a new learner of NNs and am not sure whether it is correct. The network I am designing has one input layer (6 inputs), one hidden layer, and one output layer (one output). I loop the number of hidden neurons from 1 to 60 to find a good result, but so far the results are not good at all, so I suspect the code is not properly written. Thanks!
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
for k = 1:num
    % clear;
    % clc;
    % Reset the global random number stream to its initial settings; this causes
    % rand, randi, and randn to start over, as if in a new MATLAB session.
    RandStream.setGlobalStream(RandStream('mt19937ar', 'seed', 1));
Kshitij Singh answered.
2025-11-20
Great approach! Using Bayesian Regularization (`trainbr`) and Mean Squared Error with Regularization (`msereg`) is an excellent strategy to prevent overfitting when training neural networks. Here are the steps to ensure your process is on point:
1. Preprocess Data:
- Normalize your data to be within the range `[-1, 1]` as you mentioned. This helps in better convergence during training.
2. Create and Configure the Neural Network:
- Create a feedforward neural network and configure it to use the `trainbr` training function and `msereg` performance function.
3. Split Data:
- Divide your dataset into training (70%) and testing (30%) sets randomly.
4. Train the Network:
- Train the network with the training data using the `train` function.
5. Evaluate Performance:
- Evaluate the network's performance using the testing data to ensure that it generalizes well.
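For step 1, here is a minimal normalization sketch using the toolbox's `mapminmax` function. The variable names `inputData` (6 x N) and `targetData` (1 x N) are placeholders, not names from your code:

```matlab
% Scale each row of the data to [-1, 1]; the returned settings structs let
% you apply the same mapping to new data and reverse it on network outputs.
[inputData, inputSettings]   = mapminmax(inputData, -1, 1);
[targetData, targetSettings] = mapminmax(targetData, -1, 1);

% Later, map network predictions back to the original target units:
% predictionsRaw = mapminmax('reverse', predictions, targetSettings);
```

Reversing the mapping on the outputs is important if you want to report errors in the original units rather than in the normalized [-1, 1] scale.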
Here’s a concise code snippet to guide you through the process:
% Assuming 'inputData' and 'targetData' are your preprocessed datasets
% Split the data into training (70%) and testing (30%) sets
[trainInd, ~, testInd] = dividerand(size(inputData, 2), 0.7, 0, 0.3);
trainInput = inputData(:, trainInd);
trainTarget = targetData(:, trainInd);
testInput = inputData(:, testInd);
testTarget = targetData(:, testInd);
% Create and configure the neural network
hiddenLayerSize = 10; % Adjust based on your needs
net = feedforwardnet(hiddenLayerSize, 'trainbr');
net.performFcn = 'msereg';
net.performParam.ratio = 0.5; % weighting between errors and weights (msereg default)
% Note: trainbr already applies Bayesian regularization internally, so using
% msereg on top of it is optional; plain 'mse' with trainbr is also common.
% The data was split manually above, so use everything passed to train() for
% training; trainbr does not use a validation set anyway.
net.divideFcn = 'dividetrain';
% Train the network
[net, tr] = train(net, trainInput, trainTarget);
% Evaluate the network on test data
predictions = net(testInput);
% Calculate performance on test data
performance = perform(net, testTarget, predictions);
disp(['Performance on test data: ', num2str(performance)]);
Key Points:
- Regularization: By using `trainbr`, you're automatically applying Bayesian regularization, which helps control the complexity of the network and reduces the risk of overfitting.
- Performance Evaluation: Evaluate your network on unseen data (test set) to ensure it generalizes well and does not overfit the training data.
This should set you on the right path to training a robust neural network for your prediction task.
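Since you mentioned looping the hidden layer size from 1 to 60, here is a hedged sketch of how that search could wrap the snippet above, keeping the network with the best test-set performance. The variable names (`trainInput`, `testTarget`, `bestNet`, etc.) are assumptions matching the snippet, not toolbox names:

```matlab
% Search over hidden layer sizes 1..60, keeping the best net by test performance.
bestPerf = Inf;
for hiddenLayerSize = 1:60
    rng(1); % reset the RNG so each size starts from comparable initial weights
    net = feedforwardnet(hiddenLayerSize, 'trainbr');
    net.performFcn = 'msereg';
    net.divideFcn = 'dividetrain';      % data already split manually
    net.trainParam.showWindow = false;  % suppress 60 training GUI windows
    net = train(net, trainInput, trainTarget);
    perf = perform(net, testTarget, net(testInput));
    if perf < bestPerf
        bestPerf = perf;
        bestNet  = net;
    end
end
fprintf('Best test performance: %g\n', bestPerf);
```

Be aware that selecting the size by test-set performance makes the test set part of model selection; with enough data, a separate held-out set for the final evaluation would be cleaner.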