
This post has been de-listed

It is no longer included in search results and normal feeds (front page, hot posts, subreddit posts, etc.). It remains visible only via the author's post history.

Trying to figure out what GPU to buy...
Post Body

Hi r/deeplearning. I'm very interested in AI and computer vision, and I'm building a workstation so I can start doing my own projects. My problem is VRAM. I wanted to buy an AMD card because it's more bang for your buck and you get 16 GB of VRAM, but I found out that AMD's software stack is lacking, so Nvidia seems like the way to go (almost required, apparently).

The 3070 Ti (8 GB VRAM) was my choice before I found out how important VRAM is, so the 3060 Ti/3070/3070 Ti are out of the question. The 4000 series seems very overpriced, and anything older than the 3000 series seems outdated. That leaves me with the 3060 (12 GB / 3584 CUDA cores) and the 3080 (10 GB / 8704 CUDA cores). The RTX 3080 Ti falls outside my budget.

My thinking here: the 3080 has more than twice as many CUDA cores, but only 10 GB of VRAM (the 12 GB version is very hard to find in my country). The 3060 has more VRAM and is a lot cheaper, but again, it has fewer than half the CUDA cores of the 3080.

The obvious answer here seems to be to buy the 3080 now, and if VRAM ever ends up being a limitation, I can buy a riser and a 3060 and have a combined 22 GB of memory while running the 3060 externally. Does this seem like a good way of solving the problem?
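One way to sanity-check whether 10 GB vs. 12 GB actually matters for your projects is a back-of-envelope training-memory estimate. The sketch below is a rough assumption-laden heuristic, not a precise tool: it assumes fp32 weights, gradients, and Adam optimizer state (roughly 16 bytes per parameter) and deliberately ignores activations, batch size, and framework overhead, which can dominate in computer vision.

```python
def training_vram_gb(num_params: float, bytes_per_param: int = 16) -> float:
    """Rough VRAM estimate for training, in GiB.

    Assumes fp32 training with Adam:
      weights (4 B) + gradients (4 B) + two Adam moments (2 x 4 B) = 16 B/param.
    Activations and CUDA/framework overhead are NOT included, so treat this
    as a lower bound, not a precise requirement.
    """
    return num_params * bytes_per_param / 1024**3

# Example: a hypothetical ~350M-parameter model.
print(f"{training_vram_gb(350e6):.1f} GiB before activations")
```

Under these assumptions a ~350M-parameter model already needs about 5 GiB before any activations, so the gap between 10 GB and 12 GB mostly shows up in how large a batch size (or input resolution) you can afford.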

Author
Account Strength
50%
Account Age
1 year
Verified Email
Yes
Verified Flair
No
Total Karma
2,477
Link Karma
197
Comment Karma
2,280
Profile updated: 2 days ago
Posts updated: 3 months ago

Subreddit

Post Details

We try to extract some basic information from the post title. This is not always successful or accurate; please use your best judgement and compare these values with the post title and body for confirmation.
Posted
1 year ago