

3 points
Contribution of improved algorithms/software to CGI quality over time?
Post Body

Everyone knows that in CGI, as in all fields of computing, Moore's law has been a boon. On top of that, though, there have been many improvements from better algorithms, software, architectures, and so on, which let us exploit the computational power we have more effectively.

My question is: how much of the gains we've seen over the last 10, 15, or 20 years in CGI can be attributed to improved software? I'm aware this is not an easy question to answer quantitatively, so perhaps it's better asked in terms of hardware: if I were to dredge up an average computer from 10, 15, or 20 years ago and install modern software (or the closest equivalents), what sort of image quality could I expect? How would pre-rendered versus real-time rendered images and video compare? How would having an old GPU compare to having none at all? And to what degree do modern efficient algorithms and engines rely on a certain baseline of computational horsepower to unlock their potential?
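
For context on how I'm framing the attribution question: if speedups compose roughly multiplicatively (total speedup ≈ hardware speedup × software speedup), then benchmarking the same scene across old/new hardware and old/new renderers would let you back out the software factor. A rough sketch, with purely hypothetical numbers:

```python
# Back-of-the-envelope decomposition, assuming total speedup factors
# multiplicatively into a hardware part and a software/algorithmic part.
# All numbers below are placeholders, not real benchmarks.

def implied_software_speedup(total_speedup: float, hardware_speedup: float) -> float:
    """Software/algorithmic factor implied by the total and hardware-only speedups."""
    return total_speedup / hardware_speedup

# Hypothetical: a frame that took 1000 s to render 15 years ago takes 2 s today
# (500x total), while raw compute grew ~100x over the same span.
print(implied_software_speedup(500.0, 100.0))  # -> 5.0x attributed to software
```

Obviously real comparisons are muddied by changes in resolution, noise levels, and feature sets, which is partly why I'm asking.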

I'm aware that the answers to these questions probably vary from case to case, so don't be afraid to get into the weeds if need be.

Author
Account Strength: 100%
Account Age: 8 years
Verified Email: Yes
Verified Flair: No
Total Karma: 5,906 (Link Karma: 381, Comment Karma: 5,513)
Profile updated: 5 days ago
Posts updated: 10 months ago

Post Details

Posted: 6 years ago