“…it takes longer for the same amount of light to collect on the sensor with the same clarity from a further object than a nearer object. So the exposure time must be increased to allow the light from the far-off object to accumulate on the sensor.” It’s possible I didn’t use the correct term, but the OP is using a Mavic Pro, which has a fixed aperture and fixed focal length, so neither of those things can be changed.
Like I said, maybe not the right technical term, but since a far-away object reflects light over a more dispersed angle than a closer object, it takes longer for the same amount of light to collect on the sensor with the same clarity from a further object than from a nearer one. So the exposure time must be increased to allow the light from the far-off object to accumulate on the sensor.
Same idea as using ND filters for astrophotography, where by using a long exposure the additive effect of the light on the sensor cuts through the atmospheric haze and allows greater clarity of the light from a distant object.
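For what it’s worth, the additive part of that is real: in a long exposure the signal builds up linearly while photon (shot) noise only grows as its square root, so signal-to-noise improves with exposure time. A quick back-of-envelope sketch in Python, with a made-up photon rate rather than anyone’s real numbers:

```python
import math

# Toy model of a shot-noise-limited pixel: signal grows linearly with
# exposure time, Poisson shot noise grows as sqrt(signal), so the
# signal-to-noise ratio improves as sqrt(exposure time).
photon_rate = 100.0  # photons per second on one pixel (assumed value)

for t in (1, 4, 16, 64):  # exposure times in seconds
    signal = photon_rate * t
    shot_noise = math.sqrt(signal)  # Poisson noise ~ sqrt(counts)
    print(f"t={t:3d}s  signal={signal:6.0f}  SNR={signal / shot_noise:6.1f}")
```

Quadrupling the exposure time doubles the SNR, which is why long exposures (or stacking) pull faint detail out of the noise.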
Not sure what this is called.
If that were true, how could you ever take a picture with something in the foreground and something in the deep background and have them both be correctly exposed? You couldn’t, ever: everything in the background would be underexposed if the near subject was correctly exposed. Yet we see landscape photos, for example, with foreground objects and background objects both exposed correctly. Or sports photos, or wildlife... The reason is that the per-pixel brightness of an extended subject doesn’t fall off with distance: the total light reaching the lens drops with the square of the distance, but the subject’s image on the sensor shrinks by the same factor, so the exposure it needs stays the same.
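Here’s a toy Python sanity check of that (made-up numbers and a thin-lens approximation, nothing from the actual Mavic specs): the total light the lens gathers from a subject falls as 1/d², but the subject’s image area on the sensor shrinks as 1/d² too, so the light per unit sensor area comes out constant:

```python
# Toy sanity check (assumed numbers, thin-lens approximation):
# light gathered from a subject falls off as 1/d^2, but the subject's
# image area on the sensor also shrinks as 1/d^2, so light per unit
# sensor area -- the thing exposure actually responds to -- is constant.
subject_area = 1.0    # m^2 of subject surface (assumed)
luminance = 1000.0    # arbitrary light units emitted per m^2 of subject
focal_length = 0.005  # metres, a small-drone-camera ballpark (assumed)

for d in (2.0, 10.0, 50.0):  # subject distance in metres
    flux_at_lens = luminance * subject_area / d**2  # inverse-square law
    magnification = focal_length / d                # thin lens, d >> f
    image_area = subject_area * magnification**2    # subject's area on sensor
    per_area = flux_at_lens / image_area            # what sets exposure
    print(f"d={d:5.1f} m   flux at lens={flux_at_lens:8.3f}   "
          f"per sensor area={per_area:12.1f}")
```

The per-sensor-area number is identical at every distance, which is why a fixed-aperture camera like the Mavic’s doesn’t need longer exposures for farther subjects (atmospheric haze aside).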