Large Language Models (LLMs) like ChatGPT have transformed natural language processing, code generation, and data analysis. As of 2025, MATLAB makes it easier than ever to integrate powerful LLMs directly into your workflows using the official Large Language Models (LLMs) with MATLAB add-on, available via the Add-On Explorer or GitHub. The add-on supports OpenAI (ChatGPT), Azure OpenAI, and local models via Ollama, all without leaving the MATLAB environment.
Whether you're generating code, analyzing text, building chatbots, or running private offline models, this guide walks you through setup and hands-on examples.
Step 1: Install the LLMs with MATLAB Add-On
In MATLAB, open the Add-On Explorer (Home tab > Add-Ons), search for "Large Language Models (LLMs) with MATLAB", and click Install. Alternatively, clone the matlab-deep-learning/llms-with-matlab repository from GitHub and add it to your MATLAB path. The add-on provides functions like openAIChat, azureChat, and ollamaChat for seamless integration.
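If you go the GitHub route instead of the Add-On Explorer, a minimal setup sketch looks like the following (the clone location and folder name are assumptions; adjust them to where you actually cloned the repository):

```matlab
% Clone the repository once from your system shell:
%   git clone https://github.com/matlab-deep-learning/llms-with-matlab.git
% Then, in MATLAB, add the cloned folder to the path:
addpath("llms-with-matlab");   % path is an assumption; use your clone location
savepath;                       % optional: persist the path across sessions
```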
Connecting to OpenAI (ChatGPT API)
Access GPT models like gpt-4o, gpt-4o-mini, or gpt-3.5-turbo.
Example Code:
% Initialize ChatGPT connection
% (reads the OPENAI_API_KEY environment variable; or pass APIKey="...")
chat = openAIChat("You are a helpful MATLAB assistant.", ...
    Model="gpt-4o-mini");
% Generate a response
response = generate(chat, "Write a MATLAB function to plot a sine wave");
disp(response);
This can generate, explain, or debug MATLAB code instantly.
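As a sketch of how a generated reply might be put to work, the code can be written to a file and executed directly. The system prompt, function name, and file name below are illustrative assumptions, and the final call only succeeds if the model actually returns clean, code-only output:

```matlab
% Ask for code only, so the reply can be saved directly as a function file
chat = openAIChat("Reply with MATLAB code only, no explanations or markdown.", ...
    Model="gpt-4o-mini");
code = generate(chat, "Write a function plotSine() that plots one period of sin(x)");

% Save the reply as a function file and call it (assumes the model complied)
writelines(code, "plotSine.m");
plotSine();
```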
Connecting to Azure OpenAI
Ideal for enterprise users needing secure, compliant deployments.
Example Code:
% Connect to an Azure OpenAI deployment
chat = azureChat( ...
    Endpoint="https://your-resource.openai.azure.com/", ...
    Deployment="gpt-4o-deployment", ...
    APIKey="your-azure-key");
% dataSummary is a string describing your data, prepared earlier
response = generate(chat, "Summarize this signal processing data: " + dataSummary);
disp(response);
Use this option for sensitive data that must stay within Azure's governance and compliance boundary.
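To keep credentials out of scripts entirely, the key can be loaded from a local file that is excluded from version control. This is a sketch under assumptions: the file name is hypothetical, and the azureChat call mirrors the example above:

```matlab
% Read the API key from a local file kept out of version control
% ("azure_key.txt" is a hypothetical file containing only the key)
apiKey = strtrim(fileread("azure_key.txt"));

chat = azureChat( ...
    Endpoint="https://your-resource.openai.azure.com/", ...
    Deployment="gpt-4o-deployment", ...
    APIKey=apiKey);
```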
Running Local LLMs with Ollama
Run models offline for privacy and zero cost.
Example Code:
% Connect to a local Ollama model
% (requires Ollama running locally, e.g. after `ollama pull llama3`)
chat = ollamaChat("llama3");
% Query the local model
response = generate(chat, "Explain the Fourier transform in simple terms");
disp(response);
Supports tool calling, JSON output, and RAG workflows.
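One lightweight way to get structured output from a local model is to ask for JSON in the prompt and parse the reply with MATLAB's built-in jsondecode. This is a sketch, not a guaranteed workflow: the field names are illustrative, and jsondecode errors if the model wraps the JSON in extra text:

```matlab
chat = ollamaChat("llama3");
prompt = "Return only a JSON object with fields ""name"" and ""cutoff_hz"" " + ...
    "describing a low-pass filter. No other text.";
raw = generate(chat, prompt);

% Parse the reply into a MATLAB struct (fails if the model added extra text)
result = jsondecode(raw);
disp(result);
```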
Advanced Features