Resolution and HDR are the two specs that sound the most important and that marketing departments lie about the most. Resolution decides how sharp text and detail look. HDR decides how rich and punchy bright scenes can be. Both interact with screen size in ways the box never explains.
Resolution basics
Resolution is the count of pixels across × down. Common gaming monitor resolutions:
| Name | Pixels | Total pixels | Common sizes |
|---|---|---|---|
| 1080p (FHD) | 1920 × 1080 | 2.1 M | 22"–25" |
| 1440p (QHD) | 2560 × 1440 | 3.7 M | 27" |
| 4K (UHD) | 3840 × 2160 | 8.3 M | 27"–32"+ |
| Ultrawide 1440p | 3440 × 1440 | 5.0 M | 34" |
| Ultrawide 4K | 3840 × 1600 or 5120 × 2160 | 6.1 M / 11 M | 38"–40" |
| Super-ultrawide | 5120 × 1440 or 5120 × 2160 | 7.4 M / 11 M | 49" |
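The "Total pixels" column is plain width × height arithmetic; a quick sketch (the dictionary and function names are my own):

```python
# Total-pixel arithmetic behind the table above: width x height.
RESOLUTIONS = {
    "1080p (FHD)": (1920, 1080),
    "1440p (QHD)": (2560, 1440),
    "4K (UHD)": (3840, 2160),
    "Ultrawide 1440p": (3440, 1440),
    "Super-ultrawide": (5120, 1440),
}

def megapixels(width: int, height: int) -> float:
    """Total pixel count in millions, rounded to one decimal place."""
    return round(width * height / 1_000_000, 1)

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {megapixels(w, h)} M")
```

Worth noticing in the output: 4K carries exactly four times the pixels of 1080p, which is why GPU load roughly quadruples when you step up.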
Pixel density — the spec that matters more than resolution
Pixel density (pixels per inch, PPI) is what your eyes actually perceive. A 4K display can be sharp (27") or coarse (50") depending on size. The right starting question is 'what PPI do I want?' — then resolution and size follow.
| Size | 1080p PPI | 1440p PPI | 4K PPI |
|---|---|---|---|
| 24" | 92 PPI (acceptable) | 122 PPI (good) | 184 PPI (excessive) |
| 27" | 82 PPI (visibly grainy) | 109 PPI (good) | 163 PPI (excellent) |
| 32" | 69 PPI (poor) | 92 PPI (acceptable) | 138 PPI (good) |
| 34" UW | 82 PPI (2560 × 1080; poor) | 109 PPI (good) | — |
Rough rule of thumb: 90 PPI is the floor for text legibility at desk distance. 110+ PPI is genuinely sharp. 140+ PPI is 'retina-class' and you stop noticing pixels at any normal distance. Almost no one needs > 200 PPI on a desktop.
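Every PPI figure above comes from one formula: the diagonal pixel count divided by the diagonal size in inches. A minimal sketch, with the legibility bands taken from the rule of thumb above (function names are my own):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count / diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

def verdict(density: float) -> str:
    """Legibility bands from the rule of thumb above."""
    if density < 90:
        return "below the text-legibility floor"
    if density < 110:
        return "acceptable"
    if density < 140:
        return "genuinely sharp"
    return "retina-class"

print(round(ppi(3840, 2160, 27)), verdict(ppi(3840, 2160, 27)))  # 163 retina-class
```

Plug in any size/resolution pair from a spec sheet before buying; it's the cheapest way to avoid a grainy 27" 1080p panel.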
Scaling — when high-PPI fights your OS
At 4K on a 27" monitor, Windows and macOS scale the UI to 150–200% so text and icons aren't tiny. Modern Windows handles this well for most apps; older apps and some Java/Electron apps can look blurry at non-100% scaling. macOS handles scaling significantly better. Linux ranges from 'fine' to 'rough' depending on desktop environment.
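One way to see why the scale factor matters: it sets your logical workspace, not the panel. A sketch, assuming simple linear scaling (real OSes are less direct — macOS in particular renders at 2× and downsamples for fractional factors — but the usable-space arithmetic holds):

```python
def effective_workspace(width_px: int, height_px: int, scale_pct: int) -> tuple[int, int]:
    """Logical desktop area after UI scaling at scale_pct percent.
    A simplification of how Windows/macOS actually composite."""
    return round(width_px * 100 / scale_pct), round(height_px * 100 / scale_pct)

# A 27" 4K panel at 150% scaling gives the workspace of a 1440p monitor,
# drawn with 4K sharpness:
print(effective_workspace(3840, 2160, 150))  # (2560, 1440)
```

At 200% the same panel behaves like a very sharp 1080p desktop, which is why some users run 4K at 125–150% as a compromise.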
What HDR actually is
Standard dynamic range (SDR) content is mastered for ~100 nits of peak brightness in the sRGB color space at 8 bits per channel. HDR (High Dynamic Range) content is mastered for 1000+ nits, a wider color space (DCI-P3 or Rec. 2020), and 10-bit color. The result: brighter highlights, deeper shadows, and more saturated colors that look closer to real life.
For HDR to actually look like HDR, three things need to happen together: the content must be mastered in HDR, your monitor must be able to display it (a real HDR panel, not just 'HDR support'), and your OS/game must send the HDR signal correctly. All three break independently — HDR on Windows has been notorious for misconfiguration since launch.
HDR tiers — what the labels mean
- DisplayHDR 400
- VESA's lowest tier. Requires 400 nits peak and 8-bit color; local dimming is not required. Functionally a marketing label — most DisplayHDR 400 monitors look identical in SDR and 'HDR' mode. Avoid as a buying signal.
- DisplayHDR 600
- 600 nits peak, local dimming required. Genuine HDR experience, though limited. Common on better mainstream gaming monitors.
- DisplayHDR 1000
- 1000 nits peak, more rigorous color requirements, real local dimming. Real HDR. Common on Mini-LED monitors.
- DisplayHDR True Black 400 / 500
- OLED-specific tier. Lower peak brightness numbers (400/500 nits) but with true blacks and per-pixel dimming. Real HDR despite the lower numbers — contrast is what your eyes actually perceive.
- HDR10
- The most common open HDR signal format. Static metadata (one set of brightness/color targets per piece of content). Supported by basically everything.
- HDR10+
- Open format with dynamic metadata (per-scene targets). Better than HDR10. Supported on Samsung devices and some streaming services.
- Dolby Vision
- Proprietary HDR with dynamic metadata and 12-bit color targets. The best HDR format. Widely supported on TVs and streaming, rare on PC monitors (some recent Apple displays support it).
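The static-vs-dynamic metadata distinction above can be sketched in a few lines. MaxCLL/MaxFALL are real HDR10 static-metadata fields (CTA-861.3, carried alongside SMPTE ST 2086 mastering-display data); the scene values and the tone-mapping function below are simplified illustrations, not any standard's actual algorithm:

```python
# HDR10 carries ONE static metadata block for the whole title:
hdr10_static = {
    "max_cll_nits": 1000,   # MaxCLL: brightest single pixel in the content
    "max_fall_nits": 400,   # MaxFALL: brightest average frame light level
}

# HDR10+ and Dolby Vision instead send per-scene (dynamic) targets.
# Scene values here are made up for illustration.
dynamic_scenes = [
    {"scene": "night interior", "max_nits": 120},
    {"scene": "desert exterior", "max_nits": 950},
]

def tone_map_ceiling(scene_max_nits: int, panel_peak_nits: int) -> int:
    """With per-scene metadata, the display only has to compress highlights
    in scenes that actually exceed what the panel can show."""
    return min(scene_max_nits, panel_peak_nits)
```

With static metadata a 600-nit panel tone-maps the whole film as if every scene could hit 1000 nits; with dynamic metadata the night scene passes through untouched.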
Color gamut and accuracy
- sRGB
- The standard color space for the web and most older content. Almost every monitor covers 100% sRGB.
- DCI-P3
- The cinema color space, used by HDR content. Wider than sRGB. Good HDR monitors cover 90–98% DCI-P3.
- Adobe RGB
- Print/photography color space. Important for photo editing. Independent of DCI-P3 in shape — a monitor can cover one well and the other poorly.
- Rec. 2020
- Ultra-wide color space targeted by future HDR. No consumer display covers more than ~80% of it today; 98% DCI-P3 ≈ 75% Rec. 2020.
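The relative sizes of these gamuts can be estimated from their primaries' CIE 1931 xy chromaticity coordinates, which are published in the respective standards. Note the triangle-area ratio below is a crude proxy — published coverage percentages are usually computed in the more perceptually uniform CIE 1976 u'v' space, which is why they land a little higher than this xy estimate:

```python
# CIE 1931 xy chromaticity of the R, G, B primaries for each color space.
PRIMARIES = {
    "sRGB":      [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":    [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec. 2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(pts) -> float:
    """Shoelace formula for the area of the gamut triangle in xy space."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

p3_vs_2020 = triangle_area(PRIMARIES["DCI-P3"]) / triangle_area(PRIMARIES["Rec. 2020"])
print(f"DCI-P3 is ~{p3_vs_2020:.0%} of Rec. 2020 by xy area")  # ~72%
```

The same arithmetic shows sRGB is only about half of Rec. 2020's xy area, which is how far consumer displays still have to go.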
Quantum dots — wider color, same brightness
Quantum dots (QD) are nanoscale particles that convert one wavelength of light into another with very high purity. Adding a QD layer to an LCD (QLED) or OLED (QD-OLED) widens the color gamut without changing brightness. QLED and QD-OLED panels typically cover 95%+ DCI-P3.
Reading a real-world spec sheet
Pretend you're looking at the Dell U3225QE, an example monitor. The interesting specs are:
- Resolution + size — 31.5" 4K → ~140 PPI → excellent sharpness.
- Panel type — IPS Black (an LG Display panel technology that debuted in Dell monitors; 2000:1 native contrast, roughly twice standard IPS).
- Peak brightness — 600 nits → True HDR territory.
- Color coverage — 99% sRGB, 99% DCI-P3, 100% Rec. 709 → great for SDR and HDR.
- Refresh — 120 Hz (not 144) → fine for productivity, mid for gaming.
- Ports — Thunderbolt 4, DP 2.1, HDMI 2.1 → covers any GPU and laptop.
Read every monitor spec sheet the same way: pull the four or five things that decide your use case (sharpness, panel, HDR, refresh, ports, ergonomics), ignore the rest.
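That reading habit can be sketched as a checklist. The thresholds below are this guide's rules of thumb, not industry standards, and the `review` function is purely illustrative:

```python
# A sketch of "pull the specs that decide your use case" as code.
# Thresholds are this guide's rules of thumb, not industry standards.
def review(spec: dict) -> list[str]:
    notes = []
    density = (spec["width"] ** 2 + spec["height"] ** 2) ** 0.5 / spec["diagonal_in"]
    notes.append(f"{density:.0f} PPI: " + ("sharp" if density >= 110 else "check text clarity"))
    notes.append("real HDR possible" if spec["peak_nits"] >= 600 else "treat HDR badge as marketing")
    notes.append("smooth for gaming" if spec["refresh_hz"] >= 120 else "productivity-first refresh")
    return notes

# The Dell U3225QE figures from the walkthrough above:
u3225qe = {"width": 3840, "height": 2160, "diagonal_in": 31.5,
           "peak_nits": 600, "refresh_hz": 120}
for note in review(u3225qe):
    print(note)
```

Swap in any candidate monitor's numbers; a spec sheet that fails two of the three checks usually isn't worth reading further.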
More monitor guides
- Monitor panel types: IPS, VA, TN, OLED
  The technology behind every monitor on the shelf — what each panel type is good at, what it's bad at, and which to buy for which job.
- Refresh rate, response time, and motion clarity
  Hz, GtG vs MPRT, BFI, sample-and-hold blur, VRR — what actually makes a screen feel smooth, and what's just numbers on a box.
