I have a 128 GB M3 MacBook Pro. Sometimes I use Llama, other times LM Studio. The latter estimates that certain models of about 40-60 GB will run (some smaller Goliath variants come to mind among the ones I tried), but they ultimately didn't launch.
What is the best way to run models on a Mac?
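For context, the closest I've gotten outside LM Studio is a minimal sketch with the llama-cpp-python bindings, which use Metal on Apple Silicon. The model path below is a placeholder, and I'm assuming n_gpu_layers=-1 offloads every layer to the GPU:

```python
# Minimal sketch using llama-cpp-python with Metal offload on Apple Silicon.
# The model path is a placeholder for whatever GGUF file you have locally.
from llama_cpp import Llama

llm = Llama(
    model_path="models/mixtral-8x7b-instruct.Q4_K_M.gguf",  # placeholder path
    n_gpu_layers=-1,  # offload all layers to the GPU via Metal
    n_ctx=4096,       # context window; larger values use more unified memory
)

out = llm("Q: What is the capital of France? A:", max_tokens=32)
print(out["choices"][0]["text"])
```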
I really want to try Command R (any suggestions? A quick look left me confused about what to do where) and the new Mixtral, but I'm also happy to test anything on this hardware if somebody wants to suggest something.
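If it helps, my rough plan for Command R was to grab a quantized GGUF from Hugging Face and feed it to the snippet above. The repo and filename here are guesses on my part, not verified uploads:

```python
# Sketch of fetching a quantized GGUF from Hugging Face.
# Both repo_id and filename below are hypothetical; substitute a real upload.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="someuser/c4ai-command-r-v01-GGUF",  # hypothetical repo
    filename="command-r-v01.Q4_K_M.gguf",        # hypothetical filename
)
print(path)  # local cache path, ready to pass to Llama(model_path=path)
```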
Another question: is there a way to boot into a simpler OS (like a recovery mode) focused on running the LLM, so I can squeeze more out of the chip? I don't think I've heard of one, but I've also seen some interfaces on here I didn't recognize.
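The closest thing to "squeezing more out of the chip" I've seen mentioned is the iogpu.wired_limit_mb sysctl, which on recent Apple Silicon macOS builds reportedly raises the cap on how much unified memory Metal may wire for the GPU (the default is said to be roughly 70-75% of RAM). I haven't verified this on my own machine, but the idea would look something like this:

```python
# Hedged sketch: check and raise the Metal wired-memory cap via sysctl.
# Assumes the iogpu.wired_limit_mb key exists on your macOS version; verify
# first. The 110 GB figure is an illustrative value for a 128 GB machine.
import subprocess

LIMIT_MB = 110 * 1024  # leave ~18 GB for macOS itself

# Read the current limit (raises CalledProcessError if the key is absent)
current = subprocess.run(
    ["sysctl", "-n", "iogpu.wired_limit_mb"],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(f"current limit: {current} MB")

# Raising it needs root; the value resets to the default on reboot
subprocess.run(["sudo", "sysctl", f"iogpu.wired_limit_mb={LIMIT_MB}"], check=True)
```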