Least Squares Regression in MATLAB Programming

MATLAB Illustration

Introduction

Least Squares Regression is a fundamental technique in statistics and data analysis used to find the best-fitting line or curve for a set of data points. In MATLAB, it is widely used for predictive modeling, data fitting, and trend analysis.

The goal is to minimize the sum of squared differences between the observed data and the model prediction:

S = \sum_{i=1}^{n} (y_i - \hat{y}_i)^2

where y_i are the observed values and ŷ_i are the predicted values from the regression model.
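To connect this formula to code, the sketch below computes S in MATLAB for an arbitrary candidate line, using the example data introduced in Step 1 (the slope and intercept values here are placeholders chosen only for illustration):

x = [1 2 3 4 5];
y = [2.2 2.8 3.6 4.5 5.1];
a = 0.7; b = 1.5;               % arbitrary candidate slope and intercept
y_hat = a*x + b;                % model predictions
S = sum((y - y_hat).^2);        % sum of squared residuals to be minimized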


Step 1: Define Data in MATLAB

Suppose you have the following dataset:

 
x = [1 2 3 4 5];
y = [2.2 2.8 3.6 4.5 5.1];

These vectors represent the independent variable x and the dependent variable y.


Step 2: Perform Least Squares Regression

For linear regression, the model is:

y = a*x + b

You can compute the coefficients using the backslash operator or polyfit() in MATLAB.
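Both approaches solve the same minimization problem. Setting the partial derivatives of S with respect to the coefficients to zero yields the normal equations; with the design matrix X = [x, 1] built in Method 2 below and θ = [a; b], the closed-form solution is:

(X^T X)\,\theta = X^T y \quad \Longrightarrow \quad \theta = (X^T X)^{-1} X^T y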

Method 1: Using polyfit()

 
coeff = polyfit(x, y, 1);   % 1 indicates a linear (degree-1) fit
a = coeff(1);
b = coeff(2);
fprintf('Slope: %.2f, Intercept: %.2f\n', a, b);

Method 2: Using Matrix Approach

 
X = [x' ones(length(x),1)];   % Design matrix
Y = y';
theta = (X'*X)\(X'*Y);        % Least squares solution (normal equations)
a = theta(1);
b = theta(2);
fprintf('Slope: %.2f, Intercept: %.2f\n', a, b);
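As noted above, MATLAB's backslash operator can also solve the least-squares problem directly, without forming X'*X explicitly; a minimal equivalent sketch:

theta = X \ Y;   % QR-based least squares solution, equivalent to (X'*X)\(X'*Y)
a = theta(1);
b = theta(2);

For well-conditioned problems both forms give the same coefficients, but the backslash form is generally preferred numerically.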

Step 3: Predict and Visualize

Predict the fitted values:

 
y_fit = a*x + b;
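If you computed the coefficients with polyfit in Step 2, polyval produces the same fitted values directly from the coefficient vector:

y_fit = polyval(coeff, x);   % evaluates the degree-1 polynomial, equivalent to a*x + b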

Plot the data and the fitted regression line:

 
scatter(x, y, 'filled')
hold on
plot(x, y_fit, 'r-', 'LineWidth', 2)
title('Least Squares Regression in MATLAB')
xlabel('X')
ylabel('Y')
legend('Data points', 'Fitted line')
grid on
hold off

Step 4: Evaluate Regression Accuracy

Compute R-squared to measure the goodness of fit:

 
SS_res = sum((y - y_fit).^2);     % residual sum of squares
SS_tot = sum((y - mean(y)).^2);   % total sum of squares
R_squared = 1 - SS_res/SS_tot;
fprintf('R-squared: %.4f\n', R_squared);

A higher R-squared (close to 1) indicates a better fit.
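Alongside R-squared, a simple complementary measure is the root-mean-square error (RMSE), sketched below; it reports the typical residual size in the same units as y:

RMSE = sqrt(mean((y - y_fit).^2));   % average magnitude of the residuals
fprintf('RMSE: %.4f\n', RMSE);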


Step 5: Nonlinear Regression (Optional)

MATLAB also supports nonlinear regression using fit() or nlinfit():

 
ft = fittype('a*exp(b*x)');
f = fit(x', y', ft, 'StartPoint', [1 0.1]);
plot(f, x, y)

This allows fitting more complex models beyond straight lines.
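Since nlinfit() is also mentioned above, here is a minimal sketch of the same exponential model fitted with nlinfit (note that fittype/fit require the Curve Fitting Toolbox, while nlinfit requires the Statistics and Machine Learning Toolbox); the starting values match those assumed for fit():

modelfun = @(b, x) b(1)*exp(b(2)*x);       % model a*exp(b*x), coefficients b = [a, b]
beta0 = [1 0.1];                           % initial guess, as above
beta = nlinfit(x', y', modelfun, beta0);   % beta(1) ~ a, beta(2) ~ b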


Conclusion

Least Squares Regression in MATLAB provides a simple yet powerful way to fit data, make predictions, and analyze trends. With functions like polyfit, matrix operations, and fit, you can handle both linear and nonlinear regression efficiently. This method is widely used in engineering, finance, and scientific research.
