I'm new to MATLAB, so I'm sorry if this is a stupid question (and sorry for my English). I'm trying to train a feedforward network to solve the XOR function: 1 hidden layer with 2 neurons, all other settings at their defaults (TANSIG, backprop, TRAINLM, LEARNGDM, MSE). MATLAB version R2012b.

close all, clear all, clc, format compact
p = [0 1 0 1 ; 0 0 1 1];
t = [0 1 1 0];
net = feedforwardnet(2,'trainlm');
net = train(net,p,t);
a = net(p)

I've tried this code, and I've also tried 'nntool' and 'nnstart'. It always seems that the training algorithm splits the 'p' set: 2 samples for training, 1 for validation, 1 for testing. As a result, the network trains on partial data (2 input pairs instead of 4), the training process stops after very few iterations (1-10) with "Validation stop" or "Minimum gradient reached (1.00e-010)", and simulation shows the network is untrained. Is my guess right (about the splitting of the 'p' set)? How can I manually give validation data (input and output sets) to the training algorithm? Should I somehow expand the 'p' and 't' sets and then use divideblock? Any other ideas?
Prashant Kumar answered 2025-11-20
1. Dimensions of the data:

[ I N ] = size(x)       % [ 2 4 ]
[ O N ] = size(t)       % [ 1 4 ]
Neq = prod(size(t))     % 4 = No. of training equations
2. For such a small data set it doesn't make sense to use data division for validation stopping, so:
net.divideFcn = 'dividetrain'; % or equivalently, = '';
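If you would rather keep the default division function and just force every sample into the training set, an equivalent approach (to my understanding) is to set the division ratios directly — a sketch, assuming the default 'dividerand':

net = feedforwardnet(2,'trainlm');
net.divideParam.trainRatio = 1;   % use all 4 XOR samples for training
net.divideParam.valRatio   = 0;   % no validation set -> no validation stop
net.divideParam.testRatio  = 0;   % no test set

With valRatio = 0 there is no held-out data, so training can only stop on the goal, epoch limit, or minimum gradient.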
3. Since the No. of estimated weights for H hidden nodes is

% Nw = (I+1)*O            = 3  for H = 0
% Nw = (I+1)*H + (H+1)*O       for H > 0

the condition Neq >= Nw yields the following upper bound for H:

Hub = (Neq-O)/(I+O+1)   % 3/4
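Plugging in the XOR numbers as a quick sanity check (the values come from the comments above; H = 2 is the hidden-layer size used in the question):

I = 2; O = 1; N = 4;
Neq = O*N                   % 4 training equations
H = 2;
Nw = (I+1)*H + (H+1)*O      % 9 weights, already more than Neq = 4
Hub = (Neq-O)/(I+O+1)       % 0.75, so no H >= 1 satisfies Neq >= Nw

In other words, with only 4 training equations the network is overparameterized for any H >= 1, which is another reason not to hold out samples for validation.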
4. net = patternnet(2);                   % patternnet for classification
net.trainParam.goal = 0.01*var(t',1);     % MSE goal = 1% of target variance
[ net tr y e ] = train(net,x,t);
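Putting the pieces above together into one runnable script (the round() at the end is my addition, for reading off the 0/1 classifications):

close all, clear all, clc
x = [0 1 0 1 ; 0 0 1 1];
t = [0 1 1 0];
net = patternnet(2);                   % 2 hidden nodes, classification net
net.divideFcn = 'dividetrain';         % train on all 4 samples, no val/test split
net.trainParam.goal = 0.01*var(t',1);  % MSE goal = 1% of target variance
[ net tr y e ] = train(net,x,t);
round(y)                               % should match t when training succeeds

Because the weight initialization is random, an occasional run may still land in a local minimum; simply retrain if round(y) does not match t.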