You've probably heard about the DeepSeek AI models by now. Did you know you can run them on your own machine (assuming it's powerful enough) and interact with them from MATLAB?
In my latest blog post, I install and run one of the smaller models and start playing with it using MATLAB.
Larger models are no different to use, assuming you have a big enough machine...and for the largest models you'll need a HUGE machine!
Even tiny models, like the 1.5 billion parameter one I use in the blog post, are enough to demonstrate and teach a lot about LLM-based technologies.
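To give a flavour of what this looks like, here is a minimal sketch of talking to a locally hosted DeepSeek model from MATLAB using the Large Language Models (LLMs) with MATLAB add-on and an Ollama server. It assumes you've already pulled the model with Ollama; the model name and prompt are just examples, not necessarily what I used in the post.

% Connect to a locally running Ollama server hosting DeepSeek-R1 1.5B
% (assumes "ollama pull deepseek-r1:1.5b" has already been run)
chat = ollamaChat("deepseek-r1:1.5b");

% Ask a question and display the model's reply
response = generate(chat, "Explain what a transformer model is in one paragraph.");
disp(response)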
Have a play. Let me know what you think.
When I want to understand a problem, I'll often use different sources. I'll read different textbooks, blog posts, research papers and ask the same question to different people. The differences in the solutions are almost always illuminating.
I feel the same way about AIs. Sometimes, I don't want to ask *THE* AI...I want to ask a bunch of them. They'll have different strengths and weaknesses...different personalities, if you want to think of it that way.
I've been playing with the LMArena chatbot arena, and there really is a lot of difference between the answers returned by different models: https://lmarena.ai/?arena
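You can already approximate this "ask a bunch of them" idea locally with the same add-on: loop over a few models you've pulled into Ollama and put the same question to each of them. The model names below are just examples of what you might have installed.

% Ask the same question to several locally hosted models and compare the answers
question = "What are the main limitations of large language models?";
models = ["deepseek-r1:1.5b", "llama3.2", "mistral"];   % whatever you have pulled locally

for k = 1:numel(models)
    chat = ollamaChat(models(k));
    fprintf("=== %s ===\n%s\n\n", models(k), generate(chat, question));
end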
I think it would be great if the MATLAB Chat playground allowed the user to choose which AI they were talking to.
What does everyone else think?