DJI Mavic, Air and Mini Drones

Mavic 2 Pro Photography – HDR

I merged 3 RAW files and converted to JPG in order to post. Again, you have to shoot the way you wish to. Most photographers believe that bracketing 7 exposure values is uncalled for. To each their own.

I don’t respond to threads to argue or debate but rather to share information based on my experiences to edify others. Good day, signing off this thread.

Good evening, gentlemen.

This is the one that makes no sense to me:

Bingo, you got it! I took the photo with an exposure value of 0, created two copies, one at +2 EV and the other at -2 EV, then merged them. Good eye!

Unless I'm missing something, you are describing one raw image, processed into three different exposures, and then recombined back into one image.
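A minimal NumPy sketch (the pixel values here are made up purely for illustration) shows why copies derived from one raw can't recover clipped detail the way a real bracket does:

```python
import numpy as np

# Two scene highlights with different true brightness, both beyond the
# sensor's clipping point, so a single raw records both as 1.0.
true_scene = np.array([2.0, 8.0])
raw = np.clip(true_scene, 0.0, 1.0)                # -> [1.0, 1.0]

# "Fake bracket": a -2 EV copy derived from the raw just rescales
# the already-clipped values, so the two highlights stay identical.
fake_under = np.clip(raw * 0.25, 0.0, 1.0)         # -> [0.25, 0.25]

# A real -2 EV *capture* would have exposed at 1/4 the light and
# actually recorded the difference between the two highlights.
real_under = np.clip(true_scene * 0.25, 0.0, 1.0)  # -> [0.5, 1.0]

print(fake_under, real_under)
```

The derived copy carries no information the original raw didn't already have, which is the point being made above.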
 
A few things from scanning this thread. M2 files are NOT 16-bit at all; that's a feature of the DNG file type.
In other words, DNG can contain 8 or 16 bits of data. If the actual camera is producing over 8 bits, it'll be put in a DNG container as 16-bit and reported as such.
I would guess the M2P has roughly 10 usable bits of image data depth, so it'll be reported as 16-bit by the DNG metadata.
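A quick sketch of the container-versus-content distinction (the 10-bit figure is the estimate from above, not a measured value):

```python
import numpy as np

SENSOR_BITS = 10     # assumed usable depth, per the rough estimate above
CONTAINER_BITS = 16  # DNG stores these samples as 16-bit integers

max_sensor_value = (1 << SENSOR_BITS) - 1        # 1023
max_container_value = (1 << CONTAINER_BITS) - 1  # 65535

# 10-bit sensor samples dropped into a 16-bit container: the dtype
# (and the DNG metadata) says 16-bit, but values never exceed 1023.
samples = np.array([0, 512, max_sensor_value], dtype=np.uint16)
print(samples.dtype, samples.max(), max_container_value)
```

The file honestly reports 16 bits per sample; it just doesn't follow that all 16 carry image data.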

Also, the 5-image HDR isn't great. DJI for some reason STILL refuses to allow manual configuration of the bracket range and fixes it at a very low 0.67 stops between each exposure. Quite often this isn't enough to produce a good dynamic range without burning out things like a low sun (and there's even less chance on the Zoom or older drones). On my DSLR (bigger sensor, bigger pixels, far better dynamic range) I normally do a 5-shot sequence using 2 EV steps between each frame for best results.
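To see how much the step size matters, here's a small sketch (the base shutter speed is an arbitrary example):

```python
# Shutter times for a bracket: each +1 EV doubles the exposure time.
def bracket_times(base_seconds, step_ev, shots=5):
    half = shots // 2
    return [base_seconds * 2 ** (step_ev * i) for i in range(-half, half + 1)]

base = 1 / 240  # arbitrary example base shutter speed

dji_fixed = bracket_times(base, 0.67)  # DJI's fixed 0.67 EV step
wide = bracket_times(base, 2.0)        # 2 EV steps, as on a DSLR

# Total EV spread between the darkest and brightest frames:
print((5 - 1) * 0.67)  # 2.68 EV with the fixed step
print((5 - 1) * 2.0)   # 8.0 EV with 2 EV steps
```

A fixed 0.67 EV step covers under 3 EV of extra range across the whole 5-shot bracket, versus 8 EV with 2 EV steps, which is why the fixed bracket so often fails on a low sun.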

In some situations the only way I can get a decent HDR out of the M2 is by manually taking a bracket, changing the EV between shots. Obviously this is much slower and leaves the image far more open to ghosting, as things have more time to move.

There's no mirror to cause vibration and the gimbal is pretty good, so as long as your shutter speed is sensible (DON'T USE ND FILTERS FOR STILLS!) it shouldn't affect image quality. What does cause problems are random movements such as leaves on trees, water ripples, sun reflections and so on.

People seem to be treating image alignment and image ghosting as the same thing. They're not. The first is easy to fix. The second is not.

Most photographers believe that bracketing 7 exposure values is uncalled for.

Yes, BUT see above. 7 shots at, say, a 2 EV difference: almost certainly not needed these days. However, with a tiny 0.67 EV bracket step, it may be.
 
Unless I'm missing something, you are describing one raw image, processed into three different exposures, and then recombined back into one image.

Yep - he's not creating HDR at all and isn't getting any more dynamic range, as it's not recorded. All that does is the equivalent of dragging the shadows slider to full, pulling highlights down to zero and stretching the histogram. It can help an image (if you then go to work increasing contrast and clarity) but it doesn't produce any more dynamic range. If something is burnt out, it's still going to be burnt out.
 
Yep - he's not creating HDR at all and isn't getting any more dynamic range, as it's not recorded. All that does is the equivalent of dragging the shadows slider to full, pulling highlights down to zero and stretching the histogram. It can help an image (if you then go to work increasing contrast and clarity) but it doesn't produce any more dynamic range. If something is burnt out, it's still going to be burnt out.

It's more pointless than that, because it creates three files, each with less data than the original raw, and then recombines them.
 
Also, the 5-image HDR isn't great. DJI for some reason STILL refuses to allow manual configuration of the bracket range and fixes it at a very low 0.67 stops between each exposure. Quite often this isn't enough to produce a good dynamic range without burning out things like a low sun (and there's even less chance on the Zoom or older drones). On my DSLR (bigger sensor, bigger pixels, far better dynamic range) I normally do a 5-shot sequence using 2 EV steps between each frame for best results.
Couldn't agree more, except to say that the 5-image DNG bracket (HDR post-processed) is always better than a single image as long as the shutter speed doesn't cause movement. I completely agree that DJI should allow the bracketing to be user-defined. I'm hopeful DJI might facilitate this, but the problem is, HDR is most valuable in lower-light photography, which can often mean the longest exposure of the five is more than one second.
 
A few things from scanning this thread. M2 files are NOT 16-bit at all; that's a feature of the DNG file type.
In other words, DNG can contain 8 or 16 bits of data. If the actual camera is producing over 8 bits, it'll be put in a DNG container as 16-bit and reported as such.
I would guess the M2P has roughly 10 usable bits of image data depth, so it'll be reported as 16-bit by the DNG metadata.

I keep explaining that but, apparently, not clearly enough.

Also, the 5-image HDR isn't great. DJI for some reason STILL refuses to allow manual configuration of the bracket range and fixes it at a very low 0.67 stops between each exposure. Quite often this isn't enough to produce a good dynamic range without burning out things like a low sun (and there's even less chance on the Zoom or older drones). On my DSLR (bigger sensor, bigger pixels, far better dynamic range) I normally do a 5-shot sequence using 2 EV steps between each frame for best results.

In some situations the only way I can get a decent HDR out of the M2 is by manually taking a bracket, changing the EV between shots. Obviously this is much slower and leaves the image far more open to ghosting, as things have more time to move.

That's what I do - it's pretty quick if you use the wheel.

There's no mirror to cause vibration and the gimbal is pretty good, so as long as your shutter speed is sensible (DON'T USE ND FILTERS FOR STILLS!) it shouldn't affect image quality. What does cause problems are random movements such as leaves on trees, water ripples, sun reflections and so on.

People seem to be treating image alignment and image ghosting as the same thing. They're not. The first is easy to fix. The second is not.

OK - did you not read any of my posts?
 
Couldn't agree more, except to say that the 5-image DNG bracket (HDR post-processed) is always better than a single image as long as the shutter speed doesn't cause movement.

Not always - wind moves leaves, the surface of water, and sunlight reflections between shots, which even a fast shutter can't cope with.

HDR will also inevitably increase noise.

I do agree, however, that it's going to be better than processing the same raw 3 times and merging (which was the old way of doing it before LR/PS gave you shadows and highlights sliders to do the same thing).
 
Not always - wind moves leaves, the surface of water, and sunlight reflections between shots, which even a fast shutter can't cope with.

HDR will also inevitably increase noise.

I do agree, however, that it's going to be better than processing the same raw 3 times and merging (which was the old way of doing it before LR/PS gave you shadows and highlights sliders to do the same thing).
The big change here was when LR/PS allowed merging the bracketed images into a single RAW image.
But yes, noise can be a problem especially in lower light images where shadow detail is drawn out.
 
They are 16-bit files. That has nothing to do with the actual dynamic range. As I pointed out, and you repeated for some reason, that's around 12 EV for that sensor.

I'm afraid they are not. You cannot magically get a true 16-bit file from a 12-bit sensor readout / 12-bit ADC. No mainstream cameras can shoot 16-bit RAWs - there are a handful of cameras that can, and they are mostly MF cameras that cost as much as a car. Most of them also use CCD sensors with off-sensor ADCs.

Here is a good read on how bit depth affects DR - I normally don't like DPreview but the author of this article did a pretty good job: Raw bit depth is about dynamic range, not the number of colors you get to capture
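The point about bit depth capping dynamic range can be sketched quickly; for a linear raw encoding, the ratio between the largest and smallest non-zero recordable values sets a hard theoretical ceiling:

```python
import math

# Theoretical dynamic-range ceiling, in stops, for a linear raw encoding:
# the largest recordable value is 2**bits - 1 and the smallest non-zero
# value is 1, so the ratio is just under `bits` stops.
def dr_ceiling_stops(bits):
    return math.log2(2 ** bits - 1)

for bits in (10, 12, 14, 16):
    print(bits, round(dr_ceiling_stops(bits), 2))
```

So a 12-bit ADC can never fill a 16-bit file with real data: roughly 12 stops is the ceiling regardless of the container's depth (and real-world DR is lower still, since noise swallows the bottom bits).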
 
I'm afraid they are not. You cannot magically get a true 16-bit file from a 12-bit sensor readout. No mainstream cameras can shoot 16-bit RAWs - there are a handful of cameras that can, and they are mostly MF cameras that cost as much as a car.

Here is a good read on how bit depth affects DR - I normally don't like DPreview but the author of this article did a pretty good job: Raw bit depth is about dynamic range, not the number of colors you get to capture
Yep, the 400 MP Hasselblad H6D-400c is 16-bit, but in Australia it will set you back $75K before you get a lens. I'm thinking of getting one as a point-and-shoot.
 
I'm afraid they are not. You cannot magically get a true 16-bit file from a 12-bit sensor readout. No mainstream cameras can shoot 16-bit RAWs - there are a handful of cameras that can, and they are mostly MF cameras that cost as much as a car.

Here is a good read on how bit depth affects DR - I normally don't like DPreview but the author of this article did a pretty good job: Raw bit depth is about dynamic range, not the number of colors you get to capture

We must be using language differently here. They are 16-bit files: 16 bits are assigned to each pixel. That doesn't mean they are all used, or that 16 EV of usable dynamic range is coming from the sensor.

The article you linked is good, but says exactly the same thing.
 
The Mavic Air still image quality is bad enough (I have one) that as long as you weren't pushing the file too much in post processing, I would much rather have the M2P file in most scenarios.

The Air is great for video, but stills don't hold a candle to the M2P. The Air uses a tiny 1/2.3" sensor (same as your cell phone or a cheap point & shoot camera) and is very limited by physics.

I can't comment on the Air's video quality, but if you took an HDR image from the Mavic Air in an environment with a wide dynamic range (darks to lights exceeding 12 EV) and a single shot from the Mavic 2 Pro of the same subject/environment, you would end up with a properly exposed Mavic Air photo of relatively low quality (by low quality I mean the number and quality of pixels, which determine how large you can print and how far you can push in post-processing). From the Mavic 2 Pro you would have either crushed blacks or blown whites on a high-quality 20-million-pixel image (minus the crush or blowouts) that could be stretched further in post-processing without introducing unwanted elements (noise, artifacting, chromatic aberration, green shift, halos, etc.). Which raises the question: why not just shoot the same bracket with the M2P and have the best of both worlds?

Other things to note, and apologies if this is duplicative:
10-bit, 12-bit and 16-bit are bit depths (colour precision); they do not extend the range of light the camera can capture in a still image, or video for that matter.
The Mavic Air uses a fixed f/2.8 aperture, which will ultimately reduce quality, particularly sharpness around the edges of your images, whereas the M2P's adjustable aperture lets you optimize sharpness across the entire image.
Higher-quality hardware captures better pixels. The sensor on the M2P is much higher quality than the Mavic Air's and will capture better pixels, particularly in dark areas that are not crushed!
Megapixels are key to high-quality images and the ability to push in post-processing. It's much easier to get the desired effect with 20 MP vs 12 MP.

Hope this is helpful!
 
Megapixels are key to high-quality images and the ability to push in post-processing. It's much easier to get the desired effect with 20 MP vs 12 MP.

Not sure I agree entirely with this. You can have tiny sensors that produce 'megapixels' but not necessarily high-quality images. Megapixels are much more about one's ability to crop without degrading image quality.
 
Yep, the "megapixel wars" of previous years are over. Megapixels don't define quality.

Pixel density and pixel size are what define quality, noise and so on, not the number of pixels.
 
Not sure I agree entirely with this. You can have tiny sensors that produce 'megapixels' but not necessarily high-quality images. Megapixels are much more about one's ability to crop without degrading image quality.

Totally agree with what you're saying, which is why I called that out right before my last comment. It's twofold: megapixel count and quality of pixels.

cheers
 
The 1" Sony sensor has way more DR than the 1/2.3" sensor in the Mavic Air.

HDRs are fairly simple - they are used to capture a larger dynamic range than can be done with a single exposure (since a camera can only expose for one thing at a time), or to avoid having to push a single photo too far in post processing.

The more DR you have, the fewer photos you need to make a similar HDR. With the Mavic 2 Pro, you can probably get away with -2EV, 0EV, +2EV for most HDRs as a rough starting point. You would want to take more photos with smaller EV steps to get a similar result out of the Air, but the 1" sensor in the M2P really is a lot better.
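To put rough numbers on that, here's a small sketch; the sensor DR figures are illustrative assumptions, not measured values:

```python
# Rough usable-DR coverage of a bracket: the single-shot DR plus the
# EV spread between the darkest and brightest frames.
def bracket_coverage(sensor_dr_ev, shots, step_ev):
    return sensor_dr_ev + (shots - 1) * step_ev

# Assumed single-shot DR: ~12 EV for the M2P's 1" sensor, ~10 EV for
# the Air's 1/2.3" sensor (illustrative figures only).
m2p_three_shot = bracket_coverage(12, 3, 2.0)  # -2/0/+2 EV bracket
air_five_shot = bracket_coverage(10, 5, 0.67)  # DJI's fixed step

print(m2p_three_shot, air_five_shot)  # 16.0 vs ~12.7 EV
```

Under these assumptions, a 3-shot 2 EV bracket on the M2P already out-covers a 5-shot fixed-step bracket on the Air, which matches the advice above.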

This is roughly the dynamic range difference between a Mavic 2 Pro and a Mavic Air:

Photographic Dynamic Range versus ISO Setting

Best results will always be to shoot in RAW and process the HDR in proper software. The auto HDRs the drone spits out are pretty bad.
Hi, just learning still and wanting better HDR photos. I am using video pro software with my M2P and am not sure what processing is involved. Pictures look great on the phone but don't look as good once transferred to the laptop.
cheers
Brian C
 