- Promote Your Books: Are you an author of a book on MATLAB or Simulink? Feel free to share your work with our community. We’re eager to learn about your insights and contributions to the field.
- Request Recommendations: Looking for a book on a specific topic? Whether you're diving into advanced simulations or just starting with MATLAB, our community is here to help you find the perfect read.
- Ask Questions: Curious about the MathWorks Book Program, or need guidance on finding resources? Post your questions and let our knowledgeable community assist you.
Local LLMs with MATLAB
Local large language models (LLMs), such as llama, phi3, and mistral, are now available in the Large Language Models (LLMs) with MATLAB repository through Ollama™! This is exciting news, and I can't think of a better introduction than to share the development with you.
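To give a flavor of what this looks like in practice, here is a minimal sketch of a local chat session, assuming Ollama is installed and running on your machine and using the `ollamaChat` interface from the LLMs with MATLAB repository (check the repository's README for the exact class and argument names in your release):

```matlab
% Prerequisite (assumption): the model has been pulled locally with Ollama,
% e.g. from a system terminal:
%   ollama pull mistral

% Create a chat object that talks to the local Ollama server
% (ollamaChat is the interface provided by the LLMs with MATLAB repository).
chat = ollamaChat("mistral");

% Send a prompt and display the generated text
txt = generate(chat, "Write a haiku about running LLMs locally with MATLAB.");
disp(txt)
```

Because the model runs entirely on your own hardware, no prompt or response ever leaves your machine, which is the main draw of the local workflow.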
Large Language Models (LLMs) with MATLAB
Connect MATLAB to Ollama™ (for local LLMs), OpenAI® Chat Completions API (which powers ChatGPT™), and Azure® OpenAI Services
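For comparison, connecting to a hosted model follows the same pattern; below is a minimal sketch using the repository's `openAIChat` interface (the optional arguments may differ between releases, and the API key shown is a placeholder you must replace with your own):

```matlab
% Assumption: a valid OpenAI API key is available in the environment
% variable OPENAI_API_KEY, e.g.
%   setenv("OPENAI_API_KEY", "sk-...")   % placeholder key

% Create a chat object backed by the OpenAI Chat Completions API,
% with an optional system prompt.
chat = openAIChat("You are a helpful MATLAB assistant.");

% Generate a response to a user prompt and display it
txt = generate(chat, "Explain what the backslash operator does in MATLAB.");
disp(txt)
```

Switching between the local (Ollama) and hosted (OpenAI or Azure OpenAI) backends is mostly a matter of swapping the chat object; the `generate` call stays the same.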