Score: 211
[N] Huggingface/nvidia release open source GPT-2B trained on 1.1T tokens
Post Body
https://huggingface.co/nvidia/GPT-2B-001
Model Description
GPT-2B-001 is a transformer-based language model. GPT refers to a class of transformer decoder-only models similar to GPT-2 and GPT-3, while 2B refers to the total trainable parameter count (2 billion) [1, 2].
This model was trained on 1.1T tokens with NeMo.
Requires NVIDIA Ampere- or Hopper-generation GPUs.
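Since the post body is terse, here is a minimal sketch of how one might fetch the checkpoint and verify the hardware requirement. The checkpoint filename and the NeMo loading path mentioned in the comments are assumptions, not stated in the post (which only gives the repo URL); `hf_hub_download` and `torch.cuda.get_device_capability` are standard APIs.

```python
# Minimal sketch: download the checkpoint and check the GPU requirement.
# The filename below is an assumption based on the repo layout, not stated
# in the post; check the repository's file listing before running.
import torch
from huggingface_hub import hf_hub_download

if not torch.cuda.is_available():
    raise SystemExit("CUDA GPU required")

# Ampere GPUs report compute capability 8.x, Hopper 9.x.
major, minor = torch.cuda.get_device_capability()
if major < 8:
    raise SystemExit("GPT-2B-001 requires an Ampere- or Hopper-class GPU")

ckpt_path = hf_hub_download(
    repo_id="nvidia/GPT-2B-001",
    filename="GPT-2B-001_bf16_tp1.nemo",  # assumed filename
)

# The .nemo archive is loaded with the NeMo toolkit (e.g. NeMo's
# MegatronGPTModel.restore_from), not with transformers' AutoModel;
# see NVIDIA's NeMo documentation for the trainer/precision setup.
print(f"Checkpoint saved to {ckpt_path}")
```

Because training was done with NeMo, the release appears to be a .nemo archive rather than a standard transformers-format model, so loading it goes through the NeMo toolkit.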
Post Details
- Posted: 1 year ago
- External URL: reddit.com/r/MachineLear...