When you're using a laptop and plug it into external monitors, it takes a while, often with stretches of black screen or weird formatting, until the screens become usable.
Why is that? It doesn't make sense to me intuitively, since the screens are being redrawn 60 times a second anyway and windows and content are constantly changing. It's just the initialization that seems to take so long. Why?
When you connect a monitor, several things happen in sequence, and each step takes time:

1. The graphics card detects the hotplug event and notifies the OS.
2. The OS reads the monitor's identification data (name, supported resolutions, refresh rates) over the cable. This is the EDID handshake, and it can take a noticeable fraction of a second, especially through docks and adapters.
3. The OS decides how to use the new display (extend or mirror) and tells the graphics card to set a video mode on it. Mode-setting briefly blanks the output, which is where the black screens come from.
4. The OS reorganizes the desktop, potentially moving windows between displays.
5. Applications react to the resolution change and re-render or rearrange, causing temporary glitches until everything settles.

The 60 Hz refresh only applies once a mode is already set; the slow part is the negotiation and reconfiguration that has to happen before the first usable frame.
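To make step 2 concrete, here's a minimal sketch that reads the EDID blob the Linux DRM subsystem exposes under /sys/class/drm and pulls out each connected monitor's manufacturer and preferred resolution. It assumes a Linux machine with DRM-capable graphics drivers (connector names like card0-HDMI-A-1 vary per machine); the byte offsets follow the standard EDID 1.3/1.4 base-block layout. Treat it as an illustration, not production code:

```python
#!/usr/bin/env python3
"""Print manufacturer and preferred resolution for each connected monitor.
Minimal sketch: assumes Linux, which exposes EDID blobs in /sys/class/drm."""

from pathlib import Path

# Every valid EDID base block starts with this fixed 8-byte header.
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def parse_edid(edid: bytes):
    """Extract manufacturer ID and preferred mode from a 128-byte EDID block."""
    if len(edid) < 128 or edid[:8] != EDID_HEADER:
        return None
    # Bytes 8-9: manufacturer ID, three letters packed as 5-bit codes (A = 1).
    mfg_word = (edid[8] << 8) | edid[9]
    mfg = "".join(chr(((mfg_word >> shift) & 0x1F) + ord("A") - 1)
                  for shift in (10, 5, 0))
    # Bytes 54-71: first detailed timing descriptor = the preferred mode.
    # A zero pixel clock would mean this block is a display descriptor,
    # not a timing, so bail out in that case.
    if edid[54] == 0 and edid[55] == 0:
        return None
    # Active pixel counts are split into a low byte plus a high nibble.
    h_active = edid[56] | ((edid[58] & 0xF0) << 4)
    v_active = edid[59] | ((edid[61] & 0xF0) << 4)
    return mfg, h_active, v_active

for connector in sorted(Path("/sys/class/drm").glob("card*-*")):
    status = (connector / "status").read_text().strip()
    edid = (connector / "edid").read_bytes()  # empty if nothing attached
    info = parse_edid(edid)
    if status == "connected" and info:
        mfg, w, h = info
        print(f"{connector.name}: {mfg}, preferred mode {w}x{h}")
```

On a typical laptop this prints one line for the built-in panel (the eDP connector) and one per plugged-in display. The point is that all of this identification data has to be fetched and validated before the OS can even pick a resolution, let alone start drawing at 60 Hz.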