So I'm wondering about the effectiveness of using Variable Refresh Rate (VRR) technology on an external display vs. locking the FPS of games. I'm leaving it open-ended on either tech so both camps can chime in.
The use case would be a 4K display with a minimum refresh rate of 60 Hz, with the laptop's games capped at 60 fps. I'm not playing esports games where higher fps is needed, and I'd most likely be happy with 30 fps.
I'm a noob when it comes to VRR. As I understand it, the display and computer coordinate so the panel's refresh rate tracks the game's frame rate on the fly, instead of me having to set this up manually. I figure it's most helpful when you're struggling to hit a certain frame rate rather than exceeding it. So the display can do 144 Hz, but you can only output 100 fps with 1% lows of 80 fps, so the display adjusts itself to something like 100 Hz to match.
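To make my mental model concrete, here's a toy sketch I put together (my own illustration with made-up numbers and hypothetical helper functions, not anything from a real display driver) of how I picture a fixed 60 Hz panel sampling frames vs. a VRR panel refreshing whenever a frame lands:

```python
# Toy model: GPU renders at ~80 fps (12.5 ms/frame) on a 60 Hz panel.
# All numbers are illustrative, not measurements.

REFRESH_HZ = 60
VBLANK = 1000 / REFRESH_HZ  # ~16.67 ms between refreshes on a fixed panel

def frames_shown_fixed(frame_times_ms, n_refreshes=8):
    """With vsync on a fixed-rate display, each refresh shows the most
    recent completed frame, so some frames repeat and some never appear."""
    shown = []
    for k in range(1, n_refreshes):
        t = k * VBLANK
        ready = [i for i, ft in enumerate(frame_times_ms) if ft <= t]
        if ready:
            shown.append(ready[-1])
    return shown

def frames_shown_vrr(frame_times_ms):
    """A VRR display (within its supported range) refreshes as each
    frame arrives, so every frame is shown exactly once."""
    return list(range(len(frame_times_ms)))

# Frame completion times for 8 frames at 80 fps (one every 12.5 ms):
frames = [12.5 * i for i in range(1, 9)]

print(frames_shown_fixed(frames))  # → [0, 1, 3, 4, 5, 7, 7] (2 and 6 dropped, 7 repeated)
print(frames_shown_vrr(frames))   # → [0, 1, 2, 3, 4, 5, 6, 7] (even pacing)
```

The uneven repeats/drops in the fixed-refresh case are what I understand people mean by judder, and VRR avoiding them is the whole appeal.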
Does VRR matter if you can outdo the display's refresh rate? Say the computer can output 120 fps but the display is limited to 60 Hz. Does VRR do anything, or is there even a problem with visuals in this case?
How about if the computer can only do 30 fps while the non-VRR display runs at 60 Hz? I'm thinking this is where VRR would be useful.
Thanks for the help
Posted 2 years ago to reddit.com/r/GamingLapto...