This post has been de-listed
It is no longer included in search results and normal feeds (front page, hot posts, subreddit posts, etc). It remains visible only via the author's post history.
Cline with local LLM on Mac
Post Body
Has anyone had any success using Ollama with Cline on a Mac? I have a MacBook Pro M3 Max, so it should handle local LLMs pretty decently. When I run Ollama it does respond, but it just repeats the same answer to every question, regardless of which model I choose. I also tried LM Studio; it works better there, but LM Studio's response time seems a bit higher than Ollama's.
Any suggestions on how to get Cline working decently with a local LLM on a Mac?
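One common cause worth checking (an assumption, not something confirmed in the post): Cline sends very long prompts, and Ollama's default context window of 2048 tokens truncates them, which often shows up as identical or looping replies. A minimal sketch of extending the context via an Ollama Modelfile follows; the model name `qwen2.5-coder:7b` is only an example, substitute whichever model is already pulled locally.

```shell
# Write a Modelfile that derives a larger-context variant of an existing model.
# (Model name is illustrative -- use whatever model you already have.)
cat > Modelfile <<'EOF'
FROM qwen2.5-coder:7b
PARAMETER num_ctx 32768
EOF

# Register the variant with Ollama (requires Ollama installed and running):
#   ollama create qwen2.5-coder-32k -f Modelfile
# Then select "qwen2.5-coder-32k" as the model in Cline's Ollama provider settings.
```

A larger `num_ctx` raises memory use, so on an M3 Max a value between 16384 and 32768 is a reasonable starting point to experiment with.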
Post Details
- Posted
- 1 week ago
- External URL
- reddit.com/r/ChatGPTCodi...