This post has been de-listed
It is no longer included in search results and normal feeds (front page, hot posts, subreddit posts, etc.). It remains visible only via the author's post history.
1,058 points
Chinese company trained GPT-4 rival with just 2,000 GPUs — 01.ai spent $3M compared to OpenAI's $80M to $100M
Post Details
- Posted: 2 months ago
- External URL: tomshardware.com/tech-in...
There's a graph somewhere showing how fast it would be to train AlexNet on modern hardware, with all the software efficiency gains included. It would take just seconds. Does anybody remember that graph?