Hi r/deeplearning. I'm very interested in AI and computer vision, and I'm building a workstation so I can start doing my own projects. My problem is VRAM. I originally wanted to buy an AMD card because it's more bang for your buck and you get 16GB of VRAM, but I found out that AMD's software stack (ROCm) lags well behind CUDA, so Nvidia seems like the way to go (practically required, apparently).
The 3070 Ti (8GB VRAM) was my choice before I found out how important VRAM is, so the 3060 Ti/3070/3070 Ti are out of the question. The 4000 series seems very overpriced, and anything older than the 3000 series seems outdated. That leaves me with the 3060 (12GB, 3584 CUDA cores) and the 3080 (10GB, 8704 CUDA cores). The 3080 Ti falls outside my budget.
My thinking is that the 3080 has well over twice the CUDA cores but only 10GB of VRAM (the 12GB version is very hard to find in my country), while the 3060 has more VRAM and is a lot cheaper, but less than half the CUDA cores of the 3080.
The obvious answer to me seems to be buying the 3080 now, and if VRAM ever ends up being a limitation, buying a riser and a 3060 for a combined 22GB of memory, with the 3060 running externally. Does this seem like a good way of solving this problem?
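For what it's worth, my understanding is that the two cards wouldn't pool into a single 22GB space; frameworks like PyTorch make you place layers (or batches) on each device explicitly. Here's a minimal sketch of what splitting a model across the two cards might look like, assuming PyTorch, two visible CUDA devices, and that the 3080 shows up as cuda:0 and the 3060 as cuda:1 (the layer sizes are made up for illustration):

```python
import torch
import torch.nn as nn

# Hypothetical two-stage model split across both cards (model parallelism).
# Assumes two CUDA devices: the 3080 as cuda:0, the 3060 as cuda:1.
class SplitModel(nn.Module):
    def __init__(self):
        super().__init__()
        # First half of the network lives in the 3080's 10GB
        self.stage1 = nn.Sequential(
            nn.Linear(1024, 4096),
            nn.ReLU(),
        ).to("cuda:0")
        # Second half lives in the 3060's 12GB
        self.stage2 = nn.Linear(4096, 10).to("cuda:1")

    def forward(self, x):
        x = self.stage1(x.to("cuda:0"))
        # Activations are copied between cards between stages
        return self.stage2(x.to("cuda:1"))

model = SplitModel()
out = model(torch.randn(8, 1024))
print(out.shape, out.device)  # torch.Size([8, 10]) cuda:1
```

The catch, as far as I can tell, is that the activations cross PCIe between stages, which would presumably be even slower with the 3060 hanging off a riser.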