
Resolution, pixel density, and HDR

1080p / 1440p / 4K, why pixel density matters more than resolution, and how to read the HDR tiers without being misled.

Resolution and HDR are the two specs that sound the most important and that marketing departments lie about the most. Resolution decides how sharp text and detail look. HDR decides how rich colors can be and how bright highlights can get. Both interact with screen size in ways the box never explains.

Resolution basics

Resolution is the count of pixels across × down. Common gaming monitor resolutions:

Name | Pixels | Total pixels | Common sizes
1080p (FHD) | 1920 × 1080 | 2.1 M | 22"–25"
1440p (QHD) | 2560 × 1440 | 3.7 M | 27"
4K (UHD) | 3840 × 2160 | 8.3 M | 27"–32"+
Ultrawide 1440p | 3440 × 1440 | 5.0 M | 34"
Ultrawide 4K | 3840 × 1600 or 5120 × 2160 | 6.1 M / 11 M | 38"–40"
Super-ultrawide | 5120 × 1440 or 5120 × 2160 | 7.4 M / 11 M | 49"

Pixel density — the spec that matters more than resolution

Pixel density (pixels per inch, PPI) is what your eyes actually perceive. A 4K display can be sharp (27") or coarse (50") depending on size. The right starting question is 'what PPI do I want?' — then resolution and size follow.

Size | 1080p PPI | 1440p PPI | 4K PPI
24" | 92 PPI (acceptable) | 122 PPI (good) | 184 PPI (excessive)
27" | 82 PPI (visibly grainy) | 109 PPI (good) | 163 PPI (excellent)
32" | 69 PPI (poor) | 92 PPI (acceptable) | 138 PPI (good)
34" UW | 82 PPI (poor) | 109 PPI (good) | –

Rough rule of thumb: 90 PPI is the floor for text legibility at desk distance. 110+ PPI is genuinely sharp. 140+ PPI is 'retina-class' and you stop noticing pixels at any normal distance. Almost no one needs > 200 PPI on a desktop.
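
If a size/resolution combination isn't in the table, the arithmetic is easy to run yourself: PPI is the pixel diagonal divided by the physical diagonal in inches. A minimal sketch in Python (the labels are just this guide's rule-of-thumb thresholds, not an industry standard):

  import math

  def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
      """Pixels per inch: pixel diagonal divided by the physical diagonal."""
      return math.hypot(width_px, height_px) / diagonal_in

  def sharpness(value: float) -> str:
      """Rough labels using the rule-of-thumb thresholds above."""
      if value < 90:
          return "below the legibility floor"
      if value < 110:
          return "usable"
      if value < 140:
          return "genuinely sharp"
      return "retina-class"

  for w, h, d in [(1920, 1080, 32), (2560, 1440, 24), (3840, 2160, 27)]:
      p = ppi(w, h, d)
      print(f"{w}x{h} at {d}\": {p:.0f} PPI, {sharpness(p)}")
  # 1920x1080 at 32": 69 PPI, below the legibility floor
  # 2560x1440 at 24": 122 PPI, genuinely sharp
  # 3840x2160 at 27": 163 PPI, retina-class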

Watch out
A 27" 1080p monitor is too low-PPI for modern eyes — text will look noticeably soft. If you're stuck at 1080p, go 24". If you want 27", go 1440p or 4K.

Scaling — when high-PPI fights your OS

At 4K on a 27" monitor, Windows and macOS scale the UI to 150–200% so text and icons aren't tiny. Modern Windows handles this well for most apps; older apps and some Java/Electron apps can look blurry at non-100% scaling. macOS handles scaling significantly better. Linux ranges from 'fine' to 'rough' depending on desktop environment.
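
One way to think about scaling is "effective workspace": the OS still renders every physical pixel, but lays out text and windows as if the screen were smaller. A quick sketch of that arithmetic (the scale factors are the common Windows/macOS presets; nothing monitor-specific is assumed):

  def effective_workspace(width_px: int, height_px: int, scale: float) -> tuple[int, int]:
      """Logical layout size after OS scaling; rendering still uses every physical pixel."""
      return round(width_px / scale), round(height_px / scale)

  for scale in (1.0, 1.25, 1.5, 2.0):
      print(f"{int(scale * 100)}%:", effective_workspace(3840, 2160, scale))
  # 100%: (3840, 2160)  -> everything is tiny at 27"
  # 125%: (3072, 1728)
  # 150%: (2560, 1440)  -> same layout space as a 1440p monitor, but sharper
  # 200%: (1920, 1080)  -> same layout space as 1080p, maximum sharpness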

What HDR actually is

Standard dynamic range (SDR) content is mastered for ~100 nits of peak brightness with 8-bit color in the sRGB color space. HDR (High Dynamic Range) content is mastered for 1000+ nits, a wider color space (DCI-P3, Rec. 2020), and 10-bit color. The result: brighter highlights, deeper shadows, and more saturated colors that look closer to real life.
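
The jump from 8-bit to 10-bit is bigger than it sounds, because each extra bit doubles the number of brightness steps per channel. Quick back-of-the-envelope arithmetic (nothing here is monitor-specific):

  def color_depth(bits_per_channel: int) -> tuple[int, int]:
      """Steps per channel and total displayable colors for an RGB signal."""
      steps = 2 ** bits_per_channel
      return steps, steps ** 3

  print(color_depth(8))   # (256, 16777216)     ~16.7 million colors (typical SDR)
  print(color_depth(10))  # (1024, 1073741824)  ~1.07 billion colors (HDR10)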

For HDR to actually look like HDR, three things need to happen together: the content must be mastered in HDR, your monitor must be able to display it (a real HDR panel, not just 'HDR support'), and your OS/game must send the HDR signal correctly. All three break independently — HDR on Windows in particular has been notorious for misconfiguration ever since the OS first added it.

HDR tiers — what the labels mean

DisplayHDR 400
VESA's lowest tier. Requires 400 nits peak, 8-bit color, no local dimming. Functionally a marketing label — most DisplayHDR 400 monitors look identical in SDR and 'HDR' mode. Avoid as a buying signal.
DisplayHDR 600
600 nits peak, local dimming required. Genuine HDR experience, though limited. Common on better mainstream gaming monitors.
DisplayHDR 1000
1000 nits peak, more rigorous color requirements, real local dimming. Real HDR. Common on Mini-LED monitors.
DisplayHDR True Black 400 / 500
OLED-specific tier. Lower peak brightness numbers (400/500 nits) but with true blacks and per-pixel dimming. Real HDR despite the lower numbers — contrast is what your eyes actually perceive.
HDR10
The most common open HDR signal format. Static metadata (one set of brightness/color targets per piece of content). Supported by basically everything.
HDR10+
Open format with dynamic metadata (per-scene targets). Better than HDR10. Supported mainly by Samsung devices and some streaming services.
Dolby Vision
Proprietary HDR with dynamic metadata and 12-bit color targets. The best HDR format. Widely supported on TVs and streaming, rare on PC monitors (some recent Apple displays support it).
Tip
On a monitor box, 'DisplayHDR 400' is often the first sign of fake HDR. 'DisplayHDR 600' or higher (or True Black 400 for OLED) is where HDR starts being a real feature.

Color gamut and accuracy

sRGB
The standard color space for the web and most older content. Almost every monitor covers 100% sRGB.
DCI-P3
The cinema color space, used by HDR content. Wider than sRGB. Good HDR monitors cover 90–98% DCI-P3.
Adobe RGB
Print/photography color space. Important for photo editing. Independent of DCI-P3 in shape — a monitor can cover one well and the other poorly.
Rec. 2020
Ultra-wide color space targeted by future HDR. No consumer display covers more than ~80% of it today; 98% DCI-P3 ≈ 75% Rec. 2020.

Quantum dots — wider color, same brightness

Quantum dots (QD) are nanoscale particles that convert one wavelength of light into another with very high purity. Adding a QD layer to an LCD (QLED) or OLED (QD-OLED) widens the color gamut without changing brightness. QLED and QD-OLED panels typically cover 95%+ DCI-P3.

Reading a real-world spec sheet

Pretend you're looking at the Dell U3225QE, an example monitor. The interesting specs are:

  • Resolution + size — 31.5" 4K → ~140 PPI → excellent sharpness.
  • Panel type — IPS Black (a Dell/LG Display innovation, 2000:1 native contrast — twice normal IPS).
  • Peak brightness — 600 nits → real HDR territory (DisplayHDR 600 class).
  • Color coverage — 99% sRGB, 99% DCI-P3, 100% Rec. 709 → great for SDR and HDR.
  • Refresh — 120 Hz (not 144) → fine for productivity, mid for gaming.
  • Ports — Thunderbolt 4, DP 2.1, HDMI 2.1 → covers any GPU and laptop.

Read every monitor spec sheet the same way: pull the handful of specs that decide your use case (sharpness, panel, HDR, refresh, ports, ergonomics) and ignore the rest.
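
To make that habit concrete, here is a hypothetical sketch that runs a spec sheet, written out as a plain dict, through the same rules of thumb used in this guide. The field names and thresholds are illustrative choices, not a standard format:

  import math

  def review_spec(spec: dict) -> list[str]:
      """Apply this guide's rough thresholds to an illustrative spec dict."""
      notes = []
      ppi = math.hypot(spec["width_px"], spec["height_px"]) / spec["diagonal_in"]
      notes.append(f"{ppi:.0f} PPI" + (" (sharp)" if ppi >= 110 else " (check text clarity first)"))
      if spec["peak_nits"] >= 600 or spec["oled"]:
          notes.append("real HDR is plausible")
      else:
          notes.append("treat HDR as a checkbox, not a feature")
      notes.append("wide gamut" if spec["dci_p3_pct"] >= 90 else "sRGB-class color")
      notes.append(f'{spec["refresh_hz"]} Hz refresh')
      return notes

  # Numbers close to the example spec sheet above
  print(review_spec({
      "width_px": 3840, "height_px": 2160, "diagonal_in": 31.5,
      "peak_nits": 600, "oled": False, "dci_p3_pct": 99, "refresh_hz": 120,
  }))
  # ['140 PPI (sharp)', 'real HDR is plausible', 'wide gamut', '120 Hz refresh']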
