Everyone knows that in CGI, as in every field of computation, Moore's law has been a boon. On top of that, though, we have the many improvements in algorithms, software, and architectures that let us better exploit the computational power we have.
My question is: how much of the gains we've seen over the last 10, 15, or 20 years in CGI can be attributed to improved software? I'm aware this is not an easy question to answer quantitatively, so perhaps it's better asked in terms of hardware: if I were to dredge up an average computer from 10, 15, or 20 years ago and install modern software (or the closest equivalents that would run), what sort of image quality could I expect? How would pre-rendered and real-time rendered images and videos compare? How much difference would an old GPU make versus none at all? And to what degree do modern, efficient algorithms and engines rely on a certain baseline of computational horsepower to unlock their potential?
I'm aware that the answers to these questions probably vary from case to case, so don't be afraid to get into the weeds if need be.