
HDR Image Explained: What It Is and How to Create One

An HDR image captures the full brightness range of a scene — from deep shadows under a bridge to brilliant clouds in a noon sky — in a single file. Standard cameras force a compromise: expose for the sky and lose the shadows, or expose for the shadows and blow the sky. HDR eliminates that trade-off by merging multiple exposures into one image that preserves detail everywhere.

The technique has been around since the 1850s (Gustave Le Gray combined two negatives of sea and sky), but modern HDR is a computational process: shoot bracketed exposures, merge them into a 32-bit file, and tone-map the result into something a screen can display. Phones now do this automatically. Desktop software gives you granular control. And HDR display formats like HDR10 and Dolby Vision are changing what "finished" even means.

Here is how all of it works.

What Makes an Image HDR

Dynamic range measures the ratio between the brightest and darkest tones an image can reproduce. A standard JPEG captures roughly 8 to 10 stops of dynamic range. The human eye perceives about 20 stops in a single glance (and up to 24 when adapting over time). An HDR image bridges that gap by storing 14 to 32 stops, depending on the source exposures and merge technique.
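Since a stop is a doubling of light, dynamic range in stops is just the base-2 logarithm of the brightness ratio. A quick sketch:

```python
import math

def dynamic_range_stops(brightest: float, darkest: float) -> float:
    """Dynamic range in photographic stops: each stop doubles the light,
    so the range is the base-2 log of the brightness ratio."""
    return math.log2(brightest / darkest)

# A scene with a 1,000,000:1 contrast ratio spans about 20 stops --
# roughly what the eye perceives in a single glance.
print(round(dynamic_range_stops(1_000_000, 1), 1))  # → 19.9
```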

The key concept is bracketed exposures — shooting the same scene at multiple exposure values (EVs). A typical bracket set is three frames:

  - -2 EV: underexposed, preserving highlight detail
  - 0 EV: the metered "normal" exposure
  - +2 EV: overexposed, lifting shadow detail

Some photographers shoot 5, 7, or even 9 brackets at 1-stop intervals for extreme scenes like cathedral interiors. The merge algorithm picks the best-exposed pixels from each frame and combines them into a single high-bit-depth file.
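The "pick the best-exposed pixels" idea can be sketched in a few lines — a toy version of the weighted merge real tools use (real implementations also estimate the camera's response curve, which this sketch skips):

```python
import numpy as np

def merge_brackets(frames, evs):
    """Toy HDR merge: weight each pixel by how well-exposed it is
    (triangle weight peaking at mid-gray), then average the recovered
    radiance. frames: float arrays in [0, 1]; evs: exposure values."""
    acc = np.zeros_like(frames[0])
    wsum = np.zeros_like(frames[0])
    for img, ev in zip(frames, evs):
        w = 1.0 - np.abs(img - 0.5) * 2.0  # trust mid-tones, distrust clipped pixels
        radiance = img / (2.0 ** ev)       # undo the exposure difference
        acc += w * radiance
        wsum += w
    return acc / np.maximum(wsum, 1e-6)
```

A pixel clipped to white in the +2 EV frame gets zero weight there, so its value comes entirely from the darker frames — which is exactly why the bracket recovers highlight detail.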

That merged file is a 32-bit EXR or HDR — technically accurate but impossible to display on a standard 8-bit monitor without tone mapping. Tone mapping compresses the enormous dynamic range into a viewable range while preserving the impression of contrast and detail. This is where the "HDR look" comes from, and where creative control lives.

When to Use HDR

HDR shines in specific situations:

  - High-contrast landscapes — a bright sky over a shadowed foreground
  - Interiors with windows — real estate and architecture shots where both the room and the view matter
  - Backlit scenes where the subject sits against a much brighter background
  - Extreme scenes like cathedral interiors, where stained glass meets dark stone

HDR is not useful for fast-moving subjects (the brackets won't align), low-contrast scenes (there is nothing to recover), or portraits (tone mapping often looks unnatural on skin). If your image histogram fits comfortably within the 0-255 range without clipping either end, a single well-exposed shot is fine.
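That clipping check can be done programmatically — a simple test on an 8-bit image (the 1% threshold here is an arbitrary heuristic, not a standard):

```python
import numpy as np

def needs_bracketing(img_8bit, clip_fraction=0.01):
    """True if more than ~1% of pixels are clipped at either end of the
    0-255 range -- a rough sign one exposure can't hold the scene."""
    total = img_8bit.size
    shadows_clipped = np.count_nonzero(img_8bit == 0) / total
    highlights_clipped = np.count_nonzero(img_8bit == 255) / total
    return shadows_clipped > clip_fraction or highlights_clipped > clip_fraction
```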

For related creative photography techniques, see double exposure photography and long exposure photography.

How to Create an HDR Image in Desktop Software

Photoshop 25.x: Merge to HDR Pro

Adobe Photoshop 25.x (Creative Cloud, proprietary license, $22.99/mo) includes Merge to HDR Pro — the most established desktop HDR merge tool.

  1. Shoot brackets. Use your camera's auto-bracketing mode (AEB). Three frames at -2/0/+2 EV is the starting point. Shoot on a tripod. Use a 2-second timer or remote shutter to avoid vibration.
  2. Import. Open Photoshop. Go to File > Automate > Merge to HDR Pro.
  3. Select source images. Add your bracketed exposures. Check "Attempt to Automatically Align Source Images" — essential if you shot handheld.
  4. Set bit depth. Choose 32-bit for maximum data retention, or 16-bit if you want to tone-map inside Photoshop immediately.
  5. Tone-map. If you selected 16-bit, the tone mapping dialog appears. Adjust Edge Glow (radius and strength), Tone and Detail (gamma, exposure, detail), and Color (vibrance, saturation). Start with the defaults, then reduce Detail to avoid the over-processed "HDR look" that screams 2010.
  6. Save. Export as a 16-bit TIFF or PNG for editing, or compress to JPEG for web delivery.

Pro tip: For the cleanest results, merge to 32-bit first, then use Image > Mode > 16 Bits/Channel with the "Local Adaptation" method. This gives you a curves control on the tone-mapping that Merge to HDR Pro's direct 16-bit output does not.

Lightroom Classic 13.x: HDR Merge

Adobe Lightroom Classic 13.x (proprietary, $11.99/mo as part of Photography Plan) offers a faster, more photographer-friendly HDR workflow.

  1. Import brackets into Lightroom.
  2. Select all bracketed frames for one scene. Right-click and choose Photo Merge > HDR (or press Ctrl+H / Cmd+H).
  3. Configure options. Auto Align corrects handheld misalignment. Auto Settings applies Lightroom's best-guess tone mapping. Deghost Amount handles moving objects between frames — set to Low for slight leaf movement, High for pedestrians.
  4. Merge. Lightroom produces a .dng file — a 16-bit floating-point DNG that lives in your catalog alongside the source frames.
  5. Edit. The merged DNG responds to every Lightroom slider with far more headroom than a single RAW file. Push Highlights to -100 and Shadows to +100 without banding or noise blowup.

Lightroom's HDR merge is non-destructive and produces smaller files than Photoshop's 32-bit EXR output. The trade-off: fewer manual tone-mapping controls. For most photography workflows, Lightroom is the better choice. For compositing or pixel-level control, use Photoshop.

Aurora HDR 2022

Skylum Aurora HDR 2022 (proprietary, $99 one-time purchase) is a dedicated HDR editor built specifically for this workflow.

  1. Import 3-9 bracketed exposures.
  2. Aurora auto-aligns and merges them.
  3. Apply one of 100+ HDR presets — organized by style (realistic, dramatic, architecture, landscape).
  4. Fine-tune with layer-based editing: HDR Enhance, Color Toning, Glow, Polarizing Filter, Image Radiance.
  5. Export as JPEG, TIFF, or PNG.

Aurora produces more natural-looking results by default than most competitors. It also supports single-image HDR (tone-mapping a single RAW file as if it were merged), though results are limited to the dynamic range the sensor captured. Best suited for photographers who want HDR-specific tools without learning Photoshop.

GIMP 2.10.36: Free HDR With Plug-ins

GIMP 2.10.36 (GPLv3, free) does not have built-in HDR merge, but you can create HDR images through manual layer blending or the Exposure Blend plug-in.

  1. Open all bracketed exposures as layers in GIMP (File > Open as Layers).
  2. Align layers using Image > Align Visible Layers (or use Hugin's align_image_stack externally).
  3. Use layer masks and blending modes to combine the best-exposed regions from each layer. Paint white on the mask where you want that layer's exposure, black where you do not.
  4. Flatten and export.

This is manual and time-consuming compared to one-click merge tools, but it is free and gives you full control over which pixels come from which exposure. For automated HDR on a zero budget, Luminance HDR 2.6.1 (GPLv2, free) is a better choice — it supports proper merge-and-tone-map workflows with Mantiuk, Fattal, and Drago operators.
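The same mask-and-blend logic outside GIMP: a luminance mask that takes highlights from the darker frame and shadows from the brighter one (a minimal sketch, assuming two aligned frames normalized to [0, 1]; the squared mask is an arbitrary soft falloff):

```python
import numpy as np

def exposure_blend(dark_frame, bright_frame):
    """Blend two aligned exposures with a luminance mask: where the
    bright frame is blown out, fall back to the dark frame's detail."""
    mask = np.clip(bright_frame, 0.0, 1.0) ** 2  # 1 near highlights, 0 in shadows
    return mask * dark_frame + (1.0 - mask) * bright_frame
```

This is exactly what painting white and black on a GIMP layer mask does, just expressed as a per-pixel weight instead of a brush stroke.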

HDR on Phones: Automatic and Everywhere

Modern smartphones shoot HDR by default. The computational photography pipeline captures multiple frames in rapid succession and merges them before you ever see the result.

iPhone Smart HDR (iOS 18.x)

Apple's Smart HDR — now in its fifth generation on the iPhone 16 series — uses the A18 neural engine to capture and merge multiple frames automatically, applying scene segmentation so skies, faces, and shadows are tone-mapped independently.

Smart HDR is enabled by default in Settings > Camera. You can turn it off, but there is rarely a reason to. The neural processing is sophisticated enough that the "HDR" label has effectively disappeared from the camera UI — every photo is HDR.

Samsung Pro HDR (One UI 6.x)

Samsung's flagship Galaxy S24 and S25 series use a similar multi-frame merge approach through Scene Optimizer, which detects the scene type and merges bracketed frames automatically.

Samsung's implementation favors punchy contrast over Apple's more neutral rendering. For manual HDR control, use Pro mode in the Samsung Camera app — you can set AEB (auto exposure bracketing) and process the brackets externally.

Phone HDR Limitations

Phone HDR is genuinely impressive for casual shooting, but it has constraints:

  - No access to the intermediate frames — you get only the merged result
  - Aggressive noise reduction that can smear fine texture
  - 8-bit output rather than a 32-bit merge file
  - No manual tone-mapping control

HDR vs SDR: What Is the Difference?

SDR (Standard Dynamic Range) is everything we have been looking at on screens for decades — 8-bit color, 100 nits peak brightness, Rec. 709 / sRGB color space. HDR expands every dimension.

| Property | SDR | HDR |
| --- | --- | --- |
| Bit depth | 8-bit (256 levels per channel) | 10-bit (1,024 levels) or 12-bit (4,096 levels) |
| Peak brightness | 100-300 nits | 1,000-10,000 nits |
| Color space | sRGB / Rec. 709 | DCI-P3 / Rec. 2020 |
| Dynamic range | ~6-8 stops displayable | ~12-20+ stops displayable |
| Contrast ratio | 1,000:1 typical | 100,000:1+ (OLED) |
| Metadata | None | Static (HDR10) or dynamic (Dolby Vision, HDR10+) |
| Tone mapping | Done at creation time | Done at display time (with metadata guidance) |

The critical shift with HDR displays is when tone mapping happens. In SDR, the photographer or editor makes all tone-mapping decisions at export time and the image is fixed. In HDR, the image carries metadata that tells each display how to render the content for its specific capabilities. A 600-nit laptop and a 4,000-nit reference monitor receive the same file but render it differently.
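Display-side tone mapping can be sketched as a roll-off guided by that metadata — here the content's MaxCLL (the maximum content light level carried in HDR10 static metadata) and the display's peak brightness. The roll-off curve itself is illustrative, not any standard's:

```python
def display_tonemap(nits, max_cll, display_peak, knee=0.75):
    """Map content brightness (nits) to what this display can show.
    Below the knee, pass through unchanged; above it, compress the
    remaining content range into the display's remaining headroom."""
    knee_nits = display_peak * knee
    if nits <= knee_nits or max_cll <= display_peak:
        return min(nits, display_peak)
    t = (nits - knee_nits) / (max_cll - knee_nits)
    return knee_nits + t * (display_peak - knee_nits)
```

A 600-nit laptop squeezes a 4,000-nit highlight down to 600 nits; a 4,000-nit reference monitor passes it through untouched — the same file, rendered differently, which is the whole point of display-time tone mapping.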

This matters for photographers and web creators: if your audience views your work on HDR-capable displays (most phones and laptops sold since 2023), delivering HDR content means your work looks as intended across a wider range of screens.

HDR Display Formats Explained

HDR content needs a container format that carries the extended brightness and color data plus instructions for displays. Four formats dominate.

| Format | Bit Depth | Metadata | Max Brightness | License | Primary Use |
| --- | --- | --- | --- | --- | --- |
| HDR10 | 10-bit | Static (one set of values for entire content) | 10,000 nits (theoretical) | Open standard (royalty-free) | Baseline HDR for TVs, monitors, streaming |
| HDR10+ | 10-bit | Dynamic (per-scene or per-frame) | 10,000 nits | Samsung-led, royalty-free | Samsung TVs, Amazon Prime Video |
| Dolby Vision | 12-bit | Dynamic (per-scene) | 10,000 nits | Proprietary (Dolby license required) | Premium TVs, Apple devices, Netflix |
| HLG (Hybrid Log-Gamma) | 10-bit | None (backwards-compatible curve) | ~1,000 nits | BBC/NHK, royalty-free | Broadcast TV, live sports |

HDR10 is the universal baseline. Every HDR display supports it. Its limitation is static metadata — one set of brightness instructions for the entire video or image. A movie that opens in a dark cave and ends on a sunlit beach uses the same tone-mapping parameters throughout.

Dolby Vision solves this with dynamic metadata that adjusts per scene or per frame. It also supports 12-bit color, though most current displays render at 10-bit internally. The cost is a per-device licensing fee from Dolby, which means budget displays often skip it.

HDR10+ is Samsung's royalty-free answer to Dolby Vision — dynamic metadata without the license fee. Adoption is strong in Samsung's ecosystem but limited elsewhere.

HLG was designed for broadcast. Its clever trick: the signal is backwards-compatible with SDR displays. An SDR TV shows a reasonable image; an HDR TV shows the full range. No metadata negotiation required.

For still images on the web, the relevant formats are AVIF (supports HDR natively with 10/12-bit depth and wide color gamut) and JPEG XL (supports HDR via transfer functions like PQ and HLG). Standard JPEG and PNG are SDR-only. If you are preparing images for web delivery, converting to modern formats gives you both HDR capability and better compression.

Tone Mapping: Turning 32-Bit Data Into a Viewable Image

Tone mapping is the bridge between the 32-bit HDR merge (which no standard display can show directly) and an image you can actually look at. Every HDR workflow includes it, whether you control it manually or a phone algorithm handles it automatically.

Global vs Local Tone Mapping

Global operators apply the same curve to every pixel. They are fast and artifact-free but can look flat because they do not account for local contrast. The Reinhard operator and simple gamma curves fall into this category.

Local operators adjust based on neighboring pixels. They preserve local contrast (the texture of stone, the ridges on a leaf) while compressing global range. Fattal, Mantiuk, and the "Local Adaptation" option in Photoshop are local operators. The risk is halos — bright artifacts around high-contrast edges where the algorithm overcompensates.
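The difference is easy to see in code: a global Reinhard curve versus a toy local operator that compresses a blurred "base" layer and adds back detail relative to the local average. The box blur stands in for the Gaussian pyramids real operators use, and pushing `detail` too high produces exactly the halos described above:

```python
import numpy as np

def reinhard_global(lum):
    """Global operator: the same curve for every pixel, lum / (1 + lum)."""
    return lum / (1.0 + lum)

def tonemap_local(lum, radius=2, detail=1.0):
    """Toy local operator: compress a blurred base layer globally, then
    restore local detail (pixel / local average)."""
    pad = np.pad(lum, radius, mode="edge")
    k = 2 * radius + 1
    # Box blur as a stand-in for a proper Gaussian blur
    base = sum(
        pad[dy:dy + lum.shape[0], dx:dx + lum.shape[1]]
        for dy in range(k) for dx in range(k)
    ) / (k * k)
    detail_layer = lum / np.maximum(base, 1e-6)
    return reinhard_global(base) * detail_layer ** detail
```

On a flat region both operators agree; where a pixel differs from its neighborhood, the local operator keeps that contrast while the global curve flattens it.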

Tone Mapping Tips

  - Start from the default settings and make small moves — aggressive Detail and Strength values produce the over-processed look.
  - Watch high-contrast edges (rooflines against sky) for halos; back off local contrast if they appear.
  - Keep saturation modest — tone mapping already intensifies color.
  - Compare the result against the 0 EV frame; if the merge looks less believable than the single exposure, dial everything back.

HDR Image Tools Compared

| Tool | Price | License | HDR Merge | Tone Mapping | Batch Support | Best For |
| --- | --- | --- | --- | --- | --- | --- |
| Photoshop 25.x | $22.99/mo | Proprietary | Merge to HDR Pro | Global + Local | Yes (via Actions) | Pixel-level control, compositing |
| Lightroom Classic 13.x | $11.99/mo | Proprietary | HDR Merge to DNG | Auto + manual sliders | Yes (batch merge) | Photography workflows, catalog management |
| Aurora HDR 2022 | $99 one-time | Proprietary | Built-in | 100+ presets + manual | Limited | Dedicated HDR editing |
| Luminance HDR 2.6.1 | Free | GPLv2 | Built-in | Mantiuk, Fattal, Drago, Reinhard | Yes | Free, open-source HDR |
| GIMP 2.10.36 | Free | GPLv3 | Manual (layers) | Manual (curves) | No | Budget editing, no merge automation |
| Photomatix Pro 7.1 | $39.99 | Proprietary | Built-in | Extensive presets | Yes | Real estate, architecture photography |

For web delivery after HDR processing, compress your final images to keep page loads fast without visible quality loss. A well-compressed JPEG or WebP at 85% quality preserves the tonal detail that HDR work captured while staying under 200 KB for most web sizes.

Frequently Asked Questions

What does HDR mean for images?

HDR stands for High Dynamic Range. An HDR image captures a wider range of brightness levels than a standard photo — from the darkest shadows to the brightest highlights — by merging multiple exposures of the same scene. The result preserves detail that would be lost in a single exposure.

Do I need special equipment to shoot HDR?

No. Any camera with auto exposure bracketing (AEB) can shoot HDR source frames. Most DSLRs, mirrorless cameras, and even smartphones support it. A tripod helps for sharp alignment between frames, but modern software can align handheld brackets.

What is the difference between HDR and RAW?

RAW is a file format — unprocessed sensor data from a single exposure. HDR is a technique — merging multiple exposures to extend dynamic range. You can create HDR images from RAW brackets (best quality), JPEG brackets (lower quality), or even from a single RAW file using tone mapping (limited to the sensor's native dynamic range). See RAW vs JPEG for format details.

Is phone HDR the same as camera HDR?

Conceptually, yes — both merge multiple exposures. Practically, phone HDR is fully automatic with no access to intermediate frames, uses aggressive noise reduction, and outputs an 8-bit file. Camera HDR gives you the source brackets, 32-bit merge files, and full tone-mapping control.

Why do some HDR photos look unnatural?

Over-processed tone mapping. When local contrast, saturation, and detail sliders are pushed too high, the image develops halos around edges, hyper-saturated colors, and a grungy look. Subtle tone mapping that preserves the scene's natural appearance produces better results.

What file format should I save HDR images in?

For editing: save the 32-bit merge as EXR or 32-bit TIFF. For web: export as JPEG, WebP, or AVIF after tone mapping. AVIF supports 10-bit color natively, making it the best web format for preserving HDR tonal range. For archival: keep both the source brackets and the merged file.

Can I create an HDR image from a single photo?

Yes, but the result is limited. Tone-mapping a single RAW image can recover highlight and shadow detail that the camera captured but JPEG conversion would discard. This is sometimes called "single-exposure HDR" or "pseudo-HDR." The dynamic range cannot exceed what the sensor recorded, so the effect is modest compared to true multi-exposure HDR.

Do HDR images work on all screens?

Tone-mapped HDR images exported as JPEG, PNG, or WebP display correctly on any screen — they are standard SDR files. True HDR display (using the extended brightness and color) requires an HDR-capable monitor or phone screen, an HDR file format (AVIF, JPEG XL, or video containers like HEVC), and software that supports HDR rendering. Most phones and laptops sold since 2023 support at least HDR10.