This post has been de-listed
It is no longer included in search results and normal feeds (front page, hot posts, subreddit posts, etc). It remains visible only via the author's post history.
Best and fastest 2-3b model I can run?
Post Body
So this space changes so fast it's nuts.
I have LM Studio and Open WebUI running on my PC with an 8 GB RTX 4060 GPU.
I want to run a small model that is as fast as possible, and as good as possible at text summarization and similar tasks, served as an API.
I know there's Unsloth, bitsandbytes (bnb), ExLlama, all these things. I'm just not up to date enough on what to run here.
Currently I'm using LM Studio with their Gemma 2B. It's alright, but I assume there's a much better solution out there? Any help would be greatly appreciated.
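For the "run it as an API" part: LM Studio can serve whatever model is loaded through an OpenAI-compatible HTTP server (by default at `http://localhost:1234/v1`). A minimal sketch of calling it for summarization, using only the standard library — the model name `gemma-2-2b-it` here is an assumption; substitute whatever model you actually have loaded:

```python
# Sketch: summarize text via LM Studio's OpenAI-compatible local server.
# Assumes LM Studio's server is running on its default port (1234) with
# a model loaded; the model name below is a placeholder, not guaranteed.
import json
import urllib.request

API_URL = "http://localhost:1234/v1/chat/completions"  # LM Studio default

def build_summary_request(text, model="gemma-2-2b-it", max_tokens=256):
    """Build an OpenAI-style chat-completions payload for summarization."""
    return {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "Summarize the user's text in a few sentences."},
            {"role": "user", "content": text},
        ],
        "max_tokens": max_tokens,
        "temperature": 0.2,  # low temperature keeps summaries more literal
    }

def summarize(text):
    """Send the payload to the local server and return the model's reply."""
    payload = build_summary_request(text)
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because the endpoint speaks the OpenAI wire format, the same payload works unchanged if you later swap LM Studio for another OpenAI-compatible server.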
Subreddit
r/LocalLLaMA
Post Details
- Posted: 2 weeks ago
- Reddit URL: reddit.com/r/LocalLLaMA/...