
ND Filters

A question about the filters: do you think that filters are really very important if you take ONLY PICTURES with a drone?
No. Unless you need to slow the shutter down for, say, a waterfall.
A CPL is nice, but you really have to watch for the hokey look when the sun angle splits the sky, leaving one side deep blue and the other not so much. It's a real issue with the wide-angle lenses on drones.
 
I have the Freewell variable ND filter. Works great, and you don't have to remove it to change strength; just twist.
Where'd you get that for the Mini 3? I've been looking.
 
A question about the filters: do you think that filters are really very important if you take ONLY PICTURES with a drone?
No. You almost never want an ND filter for stills; you want a high shutter speed and low ISO.
Hyperlapse is maybe an exception.

A circular polariser can be handy for stills though.

NDs are really only for slowing shutter speeds for smooth video.
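
For video, the usual yardstick is the 180-degree shutter rule (shutter time of roughly 1/(2 x frame rate)); that rule isn't stated in the post above, but it's the reason NDs matter for smooth footage. Here's a minimal Python sketch of picking an ND strength, assuming you've already metered the bare-lens shutter speed (the function name and numbers are my own illustration):

```python
import math

def nd_for_target(metered_shutter: float, fps: float) -> int:
    """Suggest an ND factor to land near the 180-degree shutter rule.

    metered_shutter: correctly exposed shutter time with no filter,
    in seconds (e.g. 1/2000). fps: the video frame rate.
    """
    target = 1.0 / (2.0 * fps)        # 180-degree rule: 1/60 s at 30 fps
    ratio = target / metered_shutter  # how much longer the shutter must get
    stops = max(0, round(math.log2(ratio)))
    return 2 ** stops                 # ND factor doubles per stop of cut

print(nd_for_target(1 / 2000, 30))   # -> 32, i.e. an ND32 gets you near 1/60
```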
 
Do you have a link? I can't find it.
That person HAD to be talking about another bird, since nobody has it in stock. Even the folks that make it - Freewell - don't have it listed yet.
I've had mine on pre-order since the first part of July.
 
I have to admit I was skeptical about using ND filters. I ordered the DJI set and received it yesterday. I used them today and the difference is staggering: the colors are much more saturated and not blown out. The fixed, wide aperture (f/1.7, I believe) is just too much for a bright, sunny day. Many thanks to you all for pushing the use of ND filters on the M3P. Cheers.
 
ND filters have absolutely no effect on colour saturation (they're *neutral* density filters).
They also have absolutely no effect on dynamic range so do nothing to stop things getting blown out.

You're seeing a placebo effect.

NDs purely allow a slower shutter speed to be used for a correct exposure. Nothing more.
 
ND filters have absolutely no effect on colour saturation (they're *neutral* density filters).
They also have absolutely no effect on dynamic range so do nothing to stop things getting blown out.

You're seeing a placebo effect.

NDs purely allow a slower shutter speed to be used for a correct exposure. Nothing more.
Amen. The same effect could have been obtained by reducing the ISO or increasing the shutter speed by a corresponding number of stops.

And since ND filters can never be optically perfect, they add a small but finite amount of distortion or blur, plus two more surfaces for introducing flare into shots with intense light sources.
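
To put numbers on "a corresponding number of stops": each stop halves the light, so shutter time, ISO, and ND strength trade off directly. A minimal sketch (my own illustration; aperture is left out because it's fixed on these drones):

```python
import math

def exposure_stops(shutter: float, iso: float, nd_factor: float = 1.0) -> float:
    """Relative image brightness in stops; +1 means one stop brighter.

    shutter is the time in seconds; nd_factor is the filter strength
    (ND8 -> 8). The zero point is arbitrary: only the difference
    between two calls matters.
    """
    return math.log2(shutter) + math.log2(iso / 100) - math.log2(nd_factor)

print(exposure_stops(1 / 100, 100))               # -6.64
print(exposure_stops(1 / 400, 400))               # -6.64: same exposure
print(exposure_stops(4 / 100, 100, nd_factor=4))  # -6.64: the ND cancels too
```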
 
ND filters have absolutely no effect on colour saturation (they're *neutral* density filters).
They also have absolutely no effect on dynamic range so do nothing to stop things getting blown out.

You're seeing a placebo effect.

NDs purely allow a slower shutter speed to be used for a correct exposure. Nothing more.
Well, strictly true considering the filter in isolation.

However, they are not used in isolation, but rather as part of a system, and relevant to this discussion is the non-linear response of the sensor near saturation.

Because of this, color can be distorted at light intensities that exceed the sensor's detection range.

Color changes near saturation because, for example, red and blue can be in range while green exceeds the sensor's capacity, casting a magenta hue on the recorded pixel.

So, while the attenuation through the ND filter is uniform across RGB, the response of the sensor is not at its limits. An ND filter can bring the incident light fully back into the linear detection range for all wavelengths.

This truly does result in more saturation and richer color, but most importantly, accurate relative color across the entire image.
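
As a toy illustration of that mechanism (the channel values are made up, with full well normalized to 1.0):

```python
def record(r: float, g: float, b: float, nd: float = 1.0):
    """Attenuate uniformly by the ND factor, then clip at full well (1.0)."""
    return tuple(min(c / nd, 1.0) for c in (r, g, b))

scene = (0.9, 1.8, 0.9)      # hypothetical light: green exceeds full well
print(record(*scene))        # (0.9, 1.0, 0.9): green clipped, magenta cast
print(record(*scene, nd=4))  # (0.225, 0.45, 0.225): true 1:2:1 ratio kept
```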
 
Well, strictly true considering the filter in isolation.

However, they are not used in isolation, but rather as part of a system, and relevant to this discussion is the non-linear response of the sensor near saturation.

Because of this, color can be distorted at light intensities that exceed the sensor's detection range.

Color changes near saturation because, for example, red and blue can be in range while green exceeds the sensor's capacity, casting a magenta hue on the recorded pixel.

So, while the attenuation through the ND filter is uniform across RGB, the response of the sensor is not at its limits. An ND filter can bring the incident light fully back into the linear detection range for all wavelengths.

This truly does result in more saturation and richer color, but most importantly, accurate relative color across the entire image.
Unfortunately, none of what you claim is anywhere near relevant.
The sensor is not, and never is, anywhere near its saturation point.

So no, an ND filter does absolutely nothing to colour saturation or dynamic range at all.

No general shooting conditions operate anywhere near sensor saturation at all.
That is by design.

ND filters slow a shutter speed. That's it.

Taking a sensor that's 3 full stops from saturation, adding a filter and taking it to 7 stops away is doing absolutely nothing to sensor response at all.


Anyone claiming to see differences is seeing a placebo effect and nothing more.
 
Unfortunately, none of what you claim is anywhere near relevant.
The sensor is not, and never is, anywhere near its saturation point.

So no, an ND filter does absolutely nothing to colour saturation or dynamic range at all.

No general shooting conditions operate anywhere near sensor saturation at all.
That is by design.

ND filters slow a shutter speed. That's it.

Taking a sensor that's 3 full stops from saturation, adding a filter and taking it to 7 stops away is doing absolutely nothing to sensor response at all.


Anyone claiming to see differences is seeing a placebo effect and nothing more.
Okay. I seriously misunderstand this, and would like to understand, if you can explain it.

I just took these two pictures. No filter at all; let's forget that for the moment, because it's understanding the operation and limits of the sensor that you say I've got wrong.

First photo taken with Auto:
DJI_0168.JPG

Second taken in Pro, again no filter, deliberately over-exposed:

DJI_0169.JPG

Detail is completely lost in the sky, and it's truly not there in the data (no color grading brings it out).

Could you please explain what is going on with the sensor at the photoreceptors that results in that error in the data?

My explanation is that the sensor has a limited range of exposure it can measure, and in this case that limit has been reached, so the values measured for the different colors under the Bayer filter are all the same, pegged to the max value and resulting in pure white pixels after demosaicing, even though the light hitting those photoreceptor sites clearly still carried detail and color.

What part of the above is wrong?

If you agree with my explanation, then the question reduces to: can a bright sunny day overexpose part of the image within the ISO and shutter speed ranges available?

I say YES. And if so, then color saturation, accuracy, and dynamic range can be affected. Hence ND filters can improve these aspects of an image if there are overexposed pixels.

I think the mistake you are making is thinking of the sensor as uniformly exposed, while of course it is not. Yet EV is determined by some averaging method over all the pixels, leaving the possibility of portions being overexposed. ND filters help with this too.

Second picture taken again, with the exact same Pro settings, but with an ND16 filter:

DJI_0170.JPG

ND16 was too much, but that's not the point... the rich color is back, along with the detail above the horizon, etc.
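
To make that failure mode concrete, here's a minimal Python sketch of the clipping described above (a toy model with made-up numbers, assuming a 12-bit raw ceiling):

```python
FULL_WELL_DN = 4095  # hypothetical 12-bit raw ceiling

def photosite(photons: float, gain: float = 1.0) -> int:
    """Map incident light to a raw value, clipping at the sensor ceiling."""
    return min(int(photons * gain), FULL_WELL_DN)

# Two different sky intensities, both overexposed: each pegs at 4095,
# so the difference between them is simply not in the data any more.
print(photosite(5000), photosite(9000))            # 4095 4095

# Cutting the light 16x (as an ND16 would) keeps both photosites in
# range, so the detail survives into the raw file.
print(photosite(5000 / 16), photosite(9000 / 16))  # 312 562
```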
 
Okay. I seriously misunderstand this, and would like to understand, if you can explain it.

I just took these two pictures. No filter at all; let's forget that for the moment, because it's understanding the operation and limits of the sensor that you say I've got wrong.

First photo taken with Auto:
DJI_0168.JPG

Second taken in Pro, again no filter, deliberately over-exposed:

DJI_0169.JPG

Detail is completely lost in the sky, and it's truly not there in the data (no color grading brings it out).

Could you please explain what is going on with the sensor at the photoreceptors that results in that error in the data?

My explanation is that the sensor has a limited range of exposure it can measure, and in this case that limit has been reached, so the values measured for the different colors under the Bayer filter are all the same, pegged to the max value and resulting in pure white pixels after demosaicing, even though the light hitting those photoreceptor sites clearly still carried detail and color.

What part of the above is wrong?

If you agree with my explanation, then the question reduces to: can a bright sunny day overexpose part of the image within the ISO and shutter speed ranges available?

I say YES. And if so, then color saturation, accuracy, and dynamic range can be affected. Hence ND filters can improve these aspects of an image if there are overexposed pixels.

I think the mistake you are making is thinking of the sensor as uniformly exposed, while of course it is not. Yet EV is determined by some averaging method over all the pixels, leaving the possibility of portions being overexposed. ND filters help with this too.

Second picture taken again, with the exact same Pro settings, but with an ND16 filter:

DJI_0170.JPG

ND16 was too much, but that's not the point... the rich color is back, along with the detail above the horizon, etc.
It's all about exposure. You can change the amount of light that reaches the sensor by changing the time that the shutter is open, or by adding or removing something that restricts the light passing through the lens. Whichever means you use to over-expose or under-expose, the result is the same.
 
Okay. I seriously misunderstand this, and would like to understand, if you can explain it.
You are looking at photos that are under-exposed, over-exposed, and correctly exposed.
The under- and over-exposed ones were shot at incorrect exposure settings because you used manual settings that were wrong.
If you want to properly compare with/without ND filters, you need to expose the image correctly.
Try your comparison with autoexposure.
That way the camera will adjust the shutter speed to achieve correct exposure.

With the ND filter, it will have to use a longer shutter speed to compensate for the reduced light levels caused by the ND filter.
But both images should look the same.
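
To put a hypothetical number on that compensation:

```python
# Made-up numbers: auto-exposure simply multiplies the metered shutter
# time by the ND factor, so the sensor records the same total light
# with or without the filter.
metered = 1 / 1600          # bare-lens shutter picked by auto-exposure
nd_factor = 16
print(metered * nd_factor)  # 0.01, i.e. 1/100 s: equal exposure
```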
 
Unfortunately, none of what you claim is anywhere near relevant.
The sensor is not, and never is, anywhere near its saturation point.

So no, an ND filter does absolutely nothing to colour saturation or dynamic range at all.

No general shooting conditions operate anywhere near sensor saturation at all.
That is by design.

ND filters slow a shutter speed. That's it.

Taking a sensor that's 3 full stops from saturation, adding a filter and taking it to 7 stops away is doing absolutely nothing to sensor response at all.


Anyone claiming to see differences is seeing a placebo effect and nothing more.

This is incorrect in terms of the color cast. You are correct that it has no effect on the dynamic range of the sensor, as that is entirely separate.

Most ND filters can give a little bit of a color cast, especially cheap ones and really strong ones. It's the same in the traditional photography world; I've been dealing with it for around 20 years now. The same goes for polarizers. It's usually not severe, nor is it a big deal, but not every ND filter is 100% color neutral, nor is every ND filter made to the same quality standards. It's most obvious with long exposures, and buying good quality filters mitigates the issue to a level that is very easily corrected in post.

If you don't want to take it from me, here is an article from Hoya, probably the best-known filter manufacturer (they also make the filters for most OEMs), and they literally say: "Almost all ND filters have some sort of color shift."

 
Detail is completely lost in the sky, and it's truly not there in the data (no color grading brings it out).

Anything that is over- or under-exposed beyond the dynamic range of the sensor will not be recoverable. The better the sensor's dynamic range, the less noise is introduced when more modest recoveries are made. Shadows are also easier to recover than highlights on most modern sensors. Shooting LOG footage for video actually lets the user choose where they want more dynamic range (within the sensor's limits): in the highlights or in the shadows. This is totally different from a RAW photograph, however.

Could you please explain what is going on with sensor at the photoreceptors that results in that error in the data?

My explanation is the sensor has a limited range of exposure it can measure, and in this case that limit has been reached – so that the values measured for different colors under the Bayer filter are all the same, pegged to the max value, even though clearly there is detail and color there, resulting in pure white pixels after demosaicing. Yet the light hitting those photo receptor sites was not.

What part of the above is wrong?

That is pretty much correct. In a completely overexposed image, every 8-bit RGB value is (255, 255, 255): everything is pure white and the photosites are saturated. There will be zero detail and nothing can be recovered. This can also happen to just parts of an image.

If you agree with my explanation, then the question reduces to, can a bright sunny day overexpose part of the image within the iso and shutter speed ranges available?

It depends what's in the image. Every exposure is a balancing act, and without an HDR image or blended exposures, a camera can only expose for one specific value. A normal bright sunny day is no problem for any camera, but unless the scene is 100% perfectly evenly lit with 100% perfect reflective properties, there will always be parts of the image that are over and under exposed relative to middle grey. If, say, you have an otherwise normal photo but there is some chrome in it, or the sun is in the corner, those items are going to be completely blown out if the exposure is set such that the rest of the image appears properly exposed. Alternatively if you exposed specifically for the sun or that piece of chrome, the rest of the image would be black.

I say YES. And if so, IF, then color saturation, accuracy, and dynamic range can be affected. Hence ND filters can improve these aspects of an image, if there are overexposed pixels.

The dynamic range capability of the sensor never changes, and will be highest at base ISO. Base ISO is where the sensor's full well capacity (FWC) is the largest and the most light is collected without becoming oversaturated. In the simplest terms, the larger the FWC the better the dynamic range. Increasing ISO adds digital gain and tells the camera the FWC is reached at an earlier stage (after collecting less light), which is why high ISO images get noisy. Adding a ND filter has absolutely no effect on the dynamic range of the sensor, but it will help you get a usable image if your desired exposure lies outside the Aperture/ISO/Shutter speed parameters of the camera.
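
A common engineering shorthand for that relationship (a simplification with made-up electron counts, not DJI's spec sheet) is that per-pixel DR is roughly log2(FWC / read noise), with higher ISO shrinking the usable well:

```python
import math

def dynamic_range_stops(full_well_e: float, read_noise_e: float,
                        iso: float, base_iso: float = 100) -> float:
    """Simplified per-pixel DR in stops: log2(usable full well / read noise).

    full_well_e and read_noise_e are electrons at base ISO (hypothetical
    values). Raising ISO divides the usable well; real sensors also
    change read noise with gain, which this toy model ignores.
    """
    usable_well = full_well_e / (iso / base_iso)
    return math.log2(usable_well / read_noise_e)

print(dynamic_range_stops(20000, 2.0, iso=100))  # ~13.3 stops
print(dynamic_range_stops(20000, 2.0, iso=800))  # ~10.3 stops
```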

For example, if you take a photo at ISO 100, f/1.7, and a 1/2000 sec shutter speed, it will be exactly the same as an image taken at ISO 100, f/1.7, and 1/250 with an ND8 (three stops) attached. You will have the same image and the same dynamic range. Color accuracy will likely change a little bit, as it does with most NDs, but not enough that it can't be easily corrected.
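
Running the numbers on that example as a quick sanity check (a trivial sketch; shutter time divided by ND factor gives relative light at fixed aperture and ISO):

```python
import math

def total_light(shutter: float, nd_factor: float = 1.0) -> float:
    """Relative light reaching the sensor at a fixed aperture and ISO."""
    return shutter / nd_factor

bare = total_light(1 / 2000)                  # ISO 100, f/1.7, no filter
filtered = total_light(1 / 250, nd_factor=8)  # ISO 100, f/1.7, ND8
print(math.isclose(bare, filtered))           # True: identical exposure
```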
 
I think some of this comes from non-linear behavior near the extremes, but that's just my speculation.

Every sensor is different here. Some are extremely linear, especially within their native ISO ranges; others are not. The extremes are where ISO ranges are usually expanded (not available on drones), and you can see below that the expanded ranges are denoted by hollow circles. I don't have a chart for the M3P, but just as an example, here is what the DR and read noise curves look like for the M3. The M3P will be similar but not the same; most modern sensors manufactured by Sony Semiconductor have a fairly similar curve shape, but obviously the 4/3 sensor in the M3 outperforms the M3P sensor by a wide margin:

[Attached chart: Mavic 3 photographic dynamic range vs. ISO, with expanded ISO ranges shown as hollow circles]

Note the dip in the curve around ISO 800, which means it's a dual-gain sensor (tech that Sony licenses from Aptina):

[Attached chart: Mavic 3 read noise vs. ISO]
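
For intuition only, here's a toy model of why a dual-gain sensor produces that dip (every number is invented, not a Mavic 3 measurement):

```python
def read_noise_e(iso: float, switch_iso: float = 800) -> float:
    """Toy dual-conversion-gain model: input-referred read noise (e-).

    All numbers are illustrative. Below switch_iso the
    low-conversion-gain path dominates; at switch_iso the sensor flips
    to high conversion gain and the noise steps down.
    """
    if iso < switch_iso:
        return 3.0 * (100 / iso) ** 0.25    # slowly improving with gain
    return 1.2 * (switch_iso / iso) ** 0.1  # lower plateau after the switch

for iso in (100, 400, 640, 800, 1600):
    print(iso, round(read_noise_e(iso), 2))  # note the step down at 800
```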
 
