Nonlinear Regression 2

Examine Quality and Adjust the Fitted Nonlinear Model

There are diagnostic plots to help you examine the quality of a model. plotDiagnostics(mdl) gives a variety of plots, including leverage and Cook's distance plots. plotResiduals(mdl) plots the residuals, the differences between the data and the fitted model.

There are also properties of mdl that relate to the model quality. mdl.RMSE gives the root mean square error between the data and the fitted model. mdl.Residuals.Raw gives the raw residuals. mdl.Diagnostics contains several fields, such as Leverage and CooksDistance, that can help you identify particularly interesting observations.
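As a minimal sketch (assuming mdl is a fitted NonLinearModel such as the one created below), you can pull these quality measures out programmatically:

rmse = mdl.RMSE;                        % root mean squared error of the fit
res  = mdl.Residuals.Raw;               % raw residuals, one per observation
lev  = mdl.Diagnostics.Leverage;        % leverage of each observation
cd   = mdl.Diagnostics.CooksDistance;   % Cook's distance of each observation

% Flag observations with unusually large Cook's distance
% (3*mean(cd) is a common rule of thumb, not a hard threshold)
suspect = find(cd > 3*mean(cd))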

This example shows how to examine a fitted nonlinear model using diagnostic, residual, and slice plots.

Load the sample data.

load reaction

Create a nonlinear model of rate as a function of reactants using the hougen.m function.

beta0 = ones(5,1);
mdl = fitnlm(reactants,...
    rate,@hougen,beta0);
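For reference, hougen.m implements the Hougen-Watson reaction-rate model. The following sketch shows an equivalent anonymous function, assuming the standard form shipped with the toolbox, so you can see what is being fit:

% Hougen-Watson rate model (assumed equivalent to hougen.m):
% rate = (b1*x2 - x3/b5) / (1 + b2*x1 + b3*x2 + b4*x3)
hougenfun = @(b,x) (b(1)*x(:,2) - x(:,3)/b(5)) ./ ...
    (1 + b(2)*x(:,1) + b(3)*x(:,2) + b(4)*x(:,3));
mdlAlt = fitnlm(reactants,rate,hougenfun,beta0);   % should match the fit above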

Make a leverage plot of the data and model.

plotDiagnostics(mdl)

Figure: Case order plot of leverage, showing the leverage of each observation and a reference line.

There is one point that has high leverage. Locate the point.

[~,maxl] = max(mdl.Diagnostics.Leverage)
maxl = 6
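Observation 6 has the largest leverage. As a quick sketch, you can use this index to look at the underlying data:

reactants(maxl,:)   % predictor values for the high-leverage observation
rate(maxl)          % its observed response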

Examine a residuals plot.

plotResiduals(mdl,'fitted')

Figure: Plot of residuals vs. fitted values.

Nothing stands out as an outlier.
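Other residual views can also help check the error assumptions. A brief sketch, assuming the standard plot types accepted by plotResiduals:

plotResiduals(mdl,'histogram')     % distribution of the residuals
plotResiduals(mdl,'probability')   % normal probability plot of the residuals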

Use a slice plot to show the effect of each predictor on the model.

plotSlice(mdl)

Figure: Prediction Slice Plots, one panel for each of the three predictors.

You can drag the vertical dashed blue lines to see the effect of a change in one predictor on the response. For example, drag the X2 line to the right, and notice that the slope of the curve in the X3 panel changes.
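You can reproduce this interaction programmatically by predicting over a grid of X3 values at two different X2 levels while holding X1 fixed. A sketch (the fixed values below are illustrative choices, not part of the original example):

x3grid = linspace(min(reactants(:,3)),max(reactants(:,3)),50)';
x1fix  = mean(reactants(:,1));     % hold X1 at its mean
x2lo   = min(reactants(:,2));
x2hi   = max(reactants(:,2));
yLow   = predict(mdl,[repmat(x1fix,50,1) repmat(x2lo,50,1) x3grid]);
yHigh  = predict(mdl,[repmat(x1fix,50,1) repmat(x2hi,50,1) x3grid]);
plot(x3grid,yLow,x3grid,yHigh)
legend('low X2','high X2'), xlabel('X3'), ylabel('Predicted rate')

The two curves differ in shape, which is the same X2-X3 interaction you see when dragging the lines in the slice plot.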

Predict or Simulate Responses Using a Nonlinear Model

This example shows how to use the methods predict, feval, and random to predict and simulate responses to new data.

Randomly generate a sample from a Cauchy distribution.

rng('default')
X = rand(100,1);
X = tan(pi*X - pi/2);
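This works because tan(pi*U - pi/2) is the inverse CDF of the standard Cauchy distribution applied to a Uniform(0,1) sample U. As a sketch, an equivalent draw (using the fact that a t distribution with 1 degree of freedom is the standard Cauchy) would be:

Xalt = trnd(1,100,1);   % 100 standard Cauchy random values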

Generate the response according to the model y = b1*(pi/2 + atan((x - b2)/b3)) and add noise to the response.

modelfun = @(b,x) b(1) * ...
    (pi/2 + atan((x - b(2))/b(3)));
y = modelfun([12 5 10],X) + randn(100,1);

Fit a model starting from the arbitrary parameters b = [1,1,1].

beta0 = [1 1 1]; % An arbitrary guess
mdl = fitnlm(X,y,modelfun,beta0)
mdl = 
Nonlinear regression model:
    y ~ b1*(pi/2 + atan((x - b2)/b3))

Estimated Coefficients:
          Estimate      SE       tStat       pValue  
          ________    _______    ______    __________

    b1     12.082     0.80028    15.097    3.3151e-27
    b2     5.0603      1.0825    4.6747    9.5063e-06
    b3       9.64     0.46499    20.732    2.0382e-37


Number of observations: 100, Error degrees of freedom: 97
Root Mean Squared Error: 1.02
R-Squared: 0.92,  Adjusted R-Squared 0.918
F-statistic vs. zero model: 6.45e+03, p-value = 1.72e-111

The fitted coefficients are within a few percent of the true parameter values [12,5,10].
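As a sketch, you can quantify this by comparing the estimated coefficients with the true values used to generate the data:

bhat  = mdl.Coefficients.Estimate;   % fitted [b1; b2; b3]
btrue = [12; 5; 10];
relerr = abs(bhat - btrue)./btrue    % relative error of each coefficient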

Examine the fit.

plotSlice(mdl)

Figure: Prediction Slice Plot for the single predictor X.

predict

The predict method predicts the mean responses and, if requested, gives confidence bounds. Find the predicted response values and the confidence intervals for the response at the X values [-15;5;12].

Xnew = [-15;5;12];
[ynew,ynewci] = predict(mdl,Xnew)
ynew = 3×1

    5.4122
   18.9022
   26.5161

ynewci = 3×2

    4.8233    6.0010
   18.4555   19.3490
   25.0170   28.0151

The confidence intervals are reflected in the slice plot.
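By default, predict gives confidence bounds for the mean response (the fitted curve). To get bounds for a new observation instead, which are wider because they also include the error variance, you can use the 'Prediction' name-value argument of predict; a sketch:

[ynewobs,ynewobsci] = predict(mdl,Xnew,'Prediction','observation')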

feval

The feval method predicts the mean responses. feval is often more convenient to use than predict when you construct a model from a dataset array.

Create the nonlinear model from a dataset array.

ds = dataset({X,'X'},{y,'y'});
mdl2 = fitnlm(ds,modelfun,beta0);

Find the predicted model responses (the model is a scaled Cauchy CDF) at the X values [-15;5;12].

Xnew = [-15;5;12];
ynew = feval(mdl2,Xnew)
ynew = 3×1

    5.4122
   18.9022
   26.5161
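dataset arrays are an older container type; the same workflow also works with a table, which is the more current choice. A sketch, assuming table input to fitnlm (the last table variable is treated as the response):

tbl  = table(X,y);
mdl3 = fitnlm(tbl,modelfun,beta0);
ynew = feval(mdl3,Xnew)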

random

The random method simulates new random response values, equal to the mean prediction plus a random disturbance whose variance equals the estimated error variance of the fit.

Xnew = [-15;5;12];
ysim = random(mdl,Xnew)
ysim = 3×1

    6.0505
   19.0893
   25.4647

Rerun the random method. The results change.

ysim = random(mdl,Xnew)
ysim = 3×1

    6.3813
   19.2157
   26.6541
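The run-to-run variation comes entirely from the simulated noise, so resetting the random number generator reproduces the draws. A sketch, assuming random adds Gaussian noise with standard deviation mdl.RMSE (as described above):

rng(1)                       % fix the generator so the draws repeat
ysim1 = random(mdl,Xnew);
rng(1)
ysim2 = random(mdl,Xnew);    % identical to ysim1

% Roughly equivalent manual simulation (assumption: noise std = mdl.RMSE)
ymanual = predict(mdl,Xnew) + mdl.RMSE*randn(size(Xnew));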

 
