Applying z-score before scaling to [0,1]?

Sepp - 2022-06-09T12:53:09+00:00

Hello. I'm currently using a neural network for classification of a dataset. Before classification, either the data points or the features should be normalized. The toolbox I'm using for the neural network requires all values to be in the range [0,1]. Does it make sense to first apply the z-score and then scale to the range [0,1]? Second, should I normalize along the feature vectors or along the data points (whether applying the z-score or scaling to [0,1])?

Expert Answer

John Michell answered 2025-11-20

It is well known (e.g., see the comp.ai.neural-nets FAQ) that the most efficient MLP nets are those which have
  • 1. Bipolar sigmoid hidden-node transfer functions, e.g., TANSIG (== TANH), NOT LOGSIG!
  • 2. Bipolar-scaled input variables, for example
  •   a. normalized to [-1,1] via MAPMINMAX (MATLAB's default), or
  •   b. standardized to zero mean/unit variance via MAPSTD or ZSCORE.
  • 3. Initial weight assignments that ensure the initial hidden-node outputs are in the linear region of the sigmoid.
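In Python terms (standing in for the MATLAB functions MAPMINMAX and MAPSTD/ZSCORE named above; the function names below are illustrative, not a real API), the two bipolar scalings in item 2 can be sketched as:

```python
import numpy as np

def map_minmax(X, lo=-1.0, hi=1.0):
    """Linearly rescale each column (feature) to [lo, hi],
    analogous to MATLAB's mapminmax default of [-1, 1]."""
    xmin, xmax = X.min(axis=0), X.max(axis=0)
    return lo + (hi - lo) * (X - xmin) / (xmax - xmin)

def standardize(X):
    """Standardize each column to zero mean and unit variance,
    analogous to MATLAB's mapstd / zscore (sample std, ddof=1)."""
    return (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
```

Both operate per feature (per column), which is the usual convention for MLP inputs and also answers the asker's second question.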
Before training, I always use the functions MINMAX (NOT mapminmax), ZSCORE, and PLOT to eliminate or modify outliers and incorrect data.
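A rough Python equivalent of that pre-training screen (a per-feature range check plus z-score flagging; the |z| > 3 threshold and the function name are illustrative choices, not part of the answer):

```python
import numpy as np

def screen_features(X, z_thresh=3.0):
    """Return per-feature [min, max] ranges and indices of suspect points.

    The range table plays the role of MATLAB's minmax summary; points
    whose z-score magnitude exceeds z_thresh are flagged for inspection.
    """
    # Per-feature range: one [min, max] row per column of X
    ranges = np.column_stack((X.min(axis=0), X.max(axis=0)))
    # Standardize each column, then flag large |z| values
    z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    outliers = np.argwhere(np.abs(z) > z_thresh)  # (row, feature) pairs
    return ranges, outliers
```

The flagged points would then be plotted and either corrected or dropped by hand, as the answer suggests.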
 
Even though I prefer standardization, I accept MATLAB's [-1,1] default, which I assume is taken into account by MATLAB's default weight initialization. (I guess I should check this ... I've been burned by other logical assumptions).
 
BOTTOM LINE: Always use centered inputs and tansig hidden layer functions for MLPs. [If you don't, people may point at you and laugh (:>( ].
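Back to the original question: feeding a z-scored feature into a toolbox that demands [0,1] can be sketched as below (Python, illustrative name). Note that because both steps are affine per feature, the composition is mathematically identical to min-max scaling the raw data straight to [0,1]; the intermediate z-score only changes the result if you clip or edit outliers in between.

```python
import numpy as np

def zscore_then_unit(X):
    """Per feature (column): standardize, then rescale to [0, 1]."""
    # Step 1: z-score each column (sample std, ddof=1)
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    # Step 2: map each standardized column onto [0, 1] for the toolbox
    zmin, zmax = Z.min(axis=0), Z.max(axis=0)
    return (Z - zmin) / (zmax - zmin)
```

So the two-step recipe does no harm, but on its own it buys nothing over a direct [0,1] min-max scale per feature.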
 

