
This post has been de-listed

It is no longer included in search results or normal feeds (front page, hot posts, subreddit posts, etc.). It remains visible only via the author's post history.

Score
18
[Opinion] What's the best LLM for 12gb VRAM?
Post Flair
Opinion

Author Summary
roz303
Post Body

Hi all, been getting back into LLMs lately - I've been working with them for about two years, and locally off and on for the past year. My local server is a humble Xeon with 64GB of RAM and an RTX 3060 12GB. And, as we all know, what was SOTA three months ago might not be SOTA today. So I'd like your opinions: for scientific-oriented text generation (maybe code too, though tiny models aren't the best at that imo?), what's the best-performing model, or model-and-quant combination, for my little LLM server? Hugging Face links would be most appreciated too 🤗
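For context on the "model and quant" question: a rough sketch of how a ~7B model can be fit into 12GB of VRAM with 4-bit quantization via transformers + bitsandbytes. This is an illustration only, not a method from the thread: the model name (mistralai/Mistral-7B-Instruct-v0.2), prompt, and generation settings are assumptions picked for the example, and it presumes the transformers, bitsandbytes, and accelerate packages are installed.

```python
# Minimal sketch: loading a ~7B model in 4-bit NF4 so it fits in 12 GB VRAM.
# Back-of-envelope: 7B params * 4 bits ≈ 3.5 GB of weights, leaving headroom
# for dequantization buffers and the KV cache on a 12 GB card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "mistralai/Mistral-7B-Instruct-v0.2"  # example model, not a recommendation from the post

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                     # quantize weights to 4-bit at load time
    bnb_4bit_quant_type="nf4",             # NF4 quantization scheme
    bnb_4bit_compute_dtype=torch.float16,  # do the matmuls in fp16 on the 3060
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",                     # place layers on the available GPU
)

prompt = "Briefly explain what NF4 quantization trades off against fp16 inference."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same budget reasoning applies to GGUF quants run through llama.cpp: pick a quant level whose file size plus KV cache stays under the card's 12 GB.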

Author
Account Strength
100%
Account Age
9 years
Verified Email
Yes
Verified Flair
No
Total Karma
32,980
Link Karma
23,523
Comment Karma
9,126
Profile updated: 4 days ago
Posts updated: 2 months ago

Subreddit

Post Details

Location
We try to extract some basic information from the post title. This is not always successful or accurate, so please use your best judgment and compare these values against the post title and body for confirmation.
Posted
4 months ago