I am working on a neural network and would like to use cross entropy as my error function. I noticed from a previous question that MATLAB added this functionality starting with R2013b. I decided to test the crossentropy function by running the simple example provided in the documentation. The code is reprinted below for convenience:

[x,t] = iris_dataset;
net = patternnet(10);
net = train(net,x,t);
y = net(x);
perf = crossentropy(net,t,y)

When I run this code, I get perf = 0.0367. To verify this result, I ran the code:

ce = -mean(sum(t.*log(y) + (1-t).*log(1-y)))

which resulted in ce = 0.1100. Why are perf and ce unequal? Do I have an error in my calculation?
Prashant Kumar answered on 2025-11-20
For mutually exclusive target classes and a softmax output layer, the standard cross entropy is

Xent1 = -sum(t.*log(y))
For non-mutually exclusive targets and a logsig output, the corresponding form for crossentropy is
Xent2 = -sum(t.*log(y) + (1-t).*log(1-y))
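As a quick illustration (a minimal sketch with made-up numbers, not data from the question), the two forms give different values even on a tiny one-hot example:

% Hypothetical 3-class targets for N = 2 samples
% (rows are classes, columns are samples, per the toolbox convention)
t = [1 0 0; 0 1 0]';              % 3x2 one-hot targets
y = [0.8 0.1 0.1; 0.1 0.7 0.2]';  % 3x2 outputs; each column sums to 1
Xent1 = mean(-sum(t.*log(y)))                     % ~0.2899
Xent2 = mean(-sum(t.*log(y) + (1-t).*log(1-y)))   % ~0.5595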
For your example I get
clear all, clc
[x, t] = iris_dataset;
[O, N] = size(t)                            % [3 150]: O classes, N samples
minmax0 = repmat([0 1], 3, 1)
% Verify the targets are 0/1 and mutually exclusive (columns sum to 1)
checkt1 = max(abs(minmax(t) - minmax0))     % [0 0]
checkt2 = max(abs(sum(t) - ones(1,N)))      % 0
net = patternnet(10);
rng(0)
[net, tr, y] = train(net, x, t);
% Verify the outputs also span [0,1] and sum to 1 per column (softmax)
checky1 = max(abs(minmax(y) - minmax0))     % [2.4214e-4 1.8807e-3]
checky2 = max(abs(sum(y) - ones(1,N)))      % 2.2204e-16
perf = crossentropy(net, t, y)              % 0.033005
Xent1 = mean(-sum(t.*log(y)))               % 0.049552
Xent3 = mean(-sum((1-t).*log(1-y)))         % 0.049464
Xent2 = mean(-sum(t.*log(y) + (1-t).*log(1-y)))  % 0.099015
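Comparing these results, perf = 0.033005 is exactly Xent2/O = 0.099015/3, and the same factor of O = 3 shows up in your numbers (0.1100/3 ≈ 0.0367). A hedged check of that reading, run right after the script above:

% Average the full binary formula over all O*N elements instead of
% summing over classes first
XentAll = mean(mean(-(t.*log(y) + (1-t).*log(1-y))))  % 0.033005
% XentAll matches perf, which suggests crossentropy normalizes by
% numel(t) (all O*N entries) rather than by the number of samples N.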
Unfortunately, none of the following reveals the exact formula that crossentropy uses:

help crossentropy
doc crossentropy
type crossentropy
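Absent a documented formula, one working hypothesis consistent with all of the numbers above is that crossentropy divides the full binary sum by numel(t). A minimal sketch of that hypothesis (myCrossentropy is an illustrative name, not a toolbox function):

myCrossentropy = @(t,y) sum(sum(-(t.*log(y) + (1-t).*log(1-y)))) / numel(t);
% On the run above, myCrossentropy(t,y) reproduces
% perf = crossentropy(net,t,y) up to floating-point rounding.
% Treat this as a sketch of the apparent behavior, not the toolbox's
% definitive implementation.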