Local LLMs with MATLAB
Local large language models (LLMs), such as llama, phi3, and mistral, are now available in the Large Language Models (LLMs) with MATLAB repository through Ollama™! This is exciting news, and I can't think of a better introduction than simply sharing it with you.
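To give you a feel for what this enables, here is a minimal sketch of chatting with a local model from MATLAB. It assumes you have installed Ollama, pulled a model (for example, mistral) at the system prompt, and added the llms-with-matlab repository to your MATLAB path; ollamaChat and generate are the repository's documented entry points, but check its README for the exact options it currently supports.

```matlab
% Minimal sketch: chat with a locally served model from MATLAB via Ollama.
% Assumes the Ollama service is running, the "mistral" model has been
% pulled ("ollama pull mistral" at the system prompt), and the
% llms-with-matlab repository is on the MATLAB path.

chat = ollamaChat("mistral");                     % connect to the local model
prompt = "Explain in two sentences what a local LLM is.";
reply = generate(chat, prompt);                   % send the prompt and collect the reply
disp(reply)
```

Because the model runs entirely on your machine, no API key is required and your prompts never leave your computer.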
Even if you don't read any further (but I hope you do), here are a few pointers for getting the most out of the community when you come back with questions or results:

- Read the Community Guidelines: Understanding our community standards is crucial. Please take a moment to familiarize yourself with them. Keep in mind that posts not adhering to these guidelines may be flagged by moderators or other community members.
- Ask Technical Questions at MATLAB Answers: If you have questions related to MathWorks products, head over to MATLAB Answers (new question form - Ask the community). It's the go-to spot for technical inquiries, with responses often provided within an hour, depending on the complexity of the question and volunteer availability. To increase your chances of a speedy reply, check out our tips on how to craft a good question (link to post on asking good questions).
- Choosing the Right Channel: We offer a variety of discussion channels tailored to different contexts. Select the one that best fits your post. If you're unsure, the General channel is always a safe bet. If you feel there's a need for a new channel, we encourage you to suggest it in the Ideas channel.
- Reporting Issues: If you encounter posts that violate our guidelines, please use the 🚩Flag/Report feature (found in the 3-dot menu) to bring them to our attention.
- Quality Control: We strive to maintain a high standard of discussion. Accounts that post spam or consistently low-quality content may be subject to moderation, which can include temporary suspensions or permanent bans.
- Share Your Ideas: Your feedback is invaluable. If you have suggestions on how we can improve the community or MathWorks products, the Ideas channel is the perfect place to voice your thoughts.