High Dynamic Range (HDR) in Computer Vision, Photography and Embedded Vision
What is High Dynamic Range?
Dynamic Range is the ratio of maximum to minimum detectable signal within an image.
Wide Dynamic Range (WDR) describes sensors that have above-average dynamic range with a single exposure.
High Dynamic Range (HDR) describes images that are created with temporal, spatial, or split pixel multiplexing.
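As a rough illustration, dynamic range is usually quoted in decibels or stops. The Python sketch below computes both from a sensor's full-well capacity and read noise; the two numbers used here are illustrative placeholders, not values from any specific sensor.

```python
import math

full_well_e = 10000.0   # maximum detectable signal in electrons (illustrative value)
read_noise_e = 2.5      # noise floor, taken as the minimum detectable signal (illustrative)

ratio = full_well_e / read_noise_e
dr_db = 20.0 * math.log10(ratio)   # decibels, as sensor datasheets usually report it
dr_stops = math.log2(ratio)        # photographic stops (factors of two)
print(f"{dr_db:.1f} dB, {dr_stops:.1f} stops")   # -> 72.0 dB, 12.0 stops
```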
What does Dynamic Range Look Like?
Dynamic range limitations become noticeable when a camera's dynamic range is lower than the scene's dynamic range. Regions of the image will be too bright or too dim relative to the rest of the image, which has correct exposure.
Camera dynamic range limitations commonly occur in enclosed locations, such as looking out of a tunnel.
In this cathedral example, the low dynamic range image (left) has reduced feature visibility in the center. In the HDR image, the features and textures can be identified.
Hasinoff et al., "Burst photography for high dynamic range and low-light imaging on mobile cameras"
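One quick way to see whether a scene exceeds a camera's dynamic range is to measure how much of a captured frame is clipped. Below is a minimal sketch, assuming an 8-bit image; the thresholds and the helper name exposure_clipping_stats are illustrative choices, not part of any standard API.

```python
import numpy as np

def exposure_clipping_stats(image_8bit, low=5, high=250):
    """Return the fraction of crushed-black and clipped-white pixels."""
    img = np.asarray(image_8bit)
    underexposed = float(np.mean(img <= low))    # pixels lost in the noise floor
    overexposed = float(np.mean(img >= high))    # pixels saturated to white
    return underexposed, overexposed
```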
How is High Dynamic Range Achieved?
There are three different methods commonly used to compose High Dynamic Range (HDR) images:
- Temporal HDR uses multiple frames over time, each with a different exposure length or gain. This is also known as exposure bracketing (a minimal merging sketch follows this list).
- Spatial HDR uses different exposure and/or gain values for different rows or columns of pixels.
- Split pixel HDR uses newer Bayer patterns, such as Sony's Quad Bayer, with different pixel sensitivities, exposures, and/or gains.
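To make the temporal approach concrete, the sketch below merges aligned, linear bracketed frames into a single radiance map using a simple hat-shaped weighting. The weighting scheme and the helper name merge_exposures are illustrative choices, not a specific vendor's pipeline; frame alignment, demosaicing, and tone mapping are assumed to happen elsewhere.

```python
import numpy as np

def merge_exposures(frames, exposure_times, saturation=0.98):
    """Merge aligned, linear bracketed frames into one HDR radiance map.

    frames: list of float arrays normalized to [0, 1] (linear sensor response)
    exposure_times: matching list of exposure times in seconds
    """
    numerator = np.zeros_like(frames[0], dtype=np.float64)
    denominator = np.zeros_like(frames[0], dtype=np.float64)
    for frame, t in zip(frames, exposure_times):
        # Hat-shaped weight: trust mid-tones, down-weight pixels near the
        # noise floor or near clipping, where the measurement is unreliable.
        weight = np.clip(1.0 - 2.0 * np.abs(frame - 0.5), 1e-3, 1.0)
        weight = np.where(frame >= saturation, 1e-3, weight)
        # Dividing by exposure time converts pixel values to relative radiance.
        numerator += weight * (frame / t)
        denominator += weight
    return numerator / denominator

# Example: three bracketed exposures (short, medium, long)
# hdr = merge_exposures([short, medium, long_frame], [1/2000, 1/250, 1/30])
```

The merged radiance map still needs tone mapping before it can be displayed on a standard 8-bit screen.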
HDR Can Lead to Problematic Artifacts for Computer Vision and Machine Vision
The research team at Algolux demonstrates several common issues with HDR multiplexing techniques. Even with state-of-the-art HDR techniques, adverse artifacts can occur, creating edge cases and degrading classification and detection performance.
Nicolas Robidoux, Luis Eduardo García Capel, Dong-eun Seo, Avinash Sharma, Federico Ariza, and Felix Heide, "End-to-end High Dynamic Range Camera Pipeline Optimization," CVPR 2021.
References and Related Links
Nicolas Robidoux, Luis Eduardo García Capel, Dong-eun Seo, Avinash Sharma, Federico Ariza, and Felix Heide, "End-to-end High Dynamic Range Camera Pipeline Optimization," CVPR 2021.
Geese, Seger, and Paolillo, "Detection Probabilities: Performance Prediction for Sensors of Autonomous Vehicles"
View Other Image Quality and Computer Vision Topics
- Camera Exposure and Computer Vision
- Motion Blur
- High Dynamic Range
- Resolution and Sharpness
- Shading and Vignetting
- Noise
- Fisheye Distortion and Wide Angle Lenses
- Fringing and Chromatic Aberration
Trying to Determine Your Camera Requirements?
Use our free web-based AoV Calculator to determine your system's field of view requirements. Then, use the M12 Lens Calculator to match your requirements with the available lenses. Our Depth of Field Calculator also provides the hyperfocal distance and depth of field for every sensor and lens combination.
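For a rough sense of what these calculators compute, the standard formulas below give the horizontal angle of view of a rectilinear lens and the hyperfocal distance. The sensor width, focal length, f-number, and circle of confusion in the example are placeholder values, not recommendations for any particular system.

```python
import math

def angle_of_view_deg(sensor_width_mm, focal_length_mm):
    """Horizontal angle of view of a rectilinear (non-fisheye) lens."""
    return 2.0 * math.degrees(math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

def hyperfocal_distance_mm(focal_length_mm, f_number, coc_mm=0.005):
    """Focusing at the hyperfocal distance keeps everything from half that
    distance to infinity acceptably sharp."""
    return focal_length_mm ** 2 / (f_number * coc_mm) + focal_length_mm

# Example: a ~5.4 mm wide sensor with a 4 mm lens at f/2.0 (placeholder values)
print(angle_of_view_deg(5.4, 4.0))        # about 68 degrees
print(hyperfocal_distance_mm(4.0, 2.0))   # about 1.6 m (1604 mm)
```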
We also have a couple of other calculators that many engineers find interesting.