Situation: A hospital client has upgraded their VM infrastructure to the cloud. They have asked me to configure their 20x S9150 cards to work with Shark for AI workload distribution. That's 320GB of beautiful image-generating glory spread across 880 Compute Units, AKA 56,320 Stream Processors.

**Use Case:** They want to give kids and adult patients the ability to generate (censored, of course) images after brain trauma or other neurological issues. They also want to incorporate it in the psychiatric unit to help patients express themselves when they are having trouble communicating.
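To make the workload-distribution part concrete, here is a minimal sketch of how generation jobs could be round-robined across the cards. This is an illustration, not the actual deployment: `make_dispatcher`, `run_job`, and `generate_fn` are all hypothetical names. The one real detail it leans on is that ROCm builds of PyTorch expose AMD GPUs through the `cuda` device namespace.

```python
# Hypothetical round-robin dispatcher for image-generation jobs across
# every ROCm-visible GPU. On ROCm builds, PyTorch exposes AMD cards
# through the "cuda" device API, so cuda:0..cuda:19 would map to the
# twenty S9150s described above.
import itertools

import torch


def make_dispatcher():
    n_gpus = torch.cuda.device_count()
    if n_gpus == 0:
        raise RuntimeError("No ROCm-visible GPUs found")
    return itertools.cycle(range(n_gpus))


def run_job(generate_fn, prompt, dispatcher):
    # generate_fn stands in for whatever Shark pipeline call actually
    # renders the image; only the device routing is sketched here.
    device = torch.device(f"cuda:{next(dispatcher)}")
    return generate_fn(prompt, device=device)
```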
I get it, there is a duty to stockholders and time is money. But for customers who have the 2x 4GB Radeon Pro Duo, the 16GB or 32GB W9100, the 8GB 290X, 390, or 390X, the 4GB Fury, or the 8GB W7100 or 8GB W8100, there may be a sense of disappointment. There is money in those customers too, and it's being lost. They may opt for an alternate product that didn't orphan device architectures for internal political reasons (Raja, Polaris, Vega, etc.).
For server cards, there are the FirePro 8GB S7100X and S7150, the 16GB S7150 x2, the 12GB S9100, the 16GB S9150, the 32GB S9170, and the 2x 4GB S9300 x2 (Fiji with HBM).
These customers bought AMD's highest-end products at a time when everyone thought AMD was about to fail.
I get that there was some "Raja Drama" where previous projects were halted to pivot to GCN4 and Vega.
But everyone should be over that by now.
TL;DR: Lots of great cards for TensorFlow/Torch/PyTorch on ROCm/HIP. They just need the actual driver support to finally be built (a quick sanity check is sketched below).
**AND this would benefit other applications like Blender, and a heck of a lot more.**
AND this might get those Bristol Ridge APUs (GCN3 graphics, sold into 2018) some support as well for shared-memory compute.
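For anyone wanting to check whether a given card actually has working support today, here is roughly the sanity check referenced in the TL;DR, written against a ROCm build of PyTorch. It is a sketch, not official AMD tooling; on unsupported GCN parts, the on-device allocation at the end is typically where things fall over.

```python
# Verify that PyTorch's ROCm/HIP backend actually sees and can use the GPU.
import torch

print("HIP runtime:", torch.version.hip)        # None on CPU/CUDA-only builds
print("Visible GPUs:", torch.cuda.device_count())
for i in range(torch.cuda.device_count()):
    print(f"  cuda:{i} ->", torch.cuda.get_device_name(i))

# The real test: allocate and compute on-device. Without kernel/driver
# support for the card, this is where unsupported GCN parts fail.
x = torch.ones(1024, device="cuda")
print("Sum on GPU:", x.sum().item())
```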