Okay. Apparently I seriously misunderstand this, and I'd like to understand it, if you can explain.
I just took these two pictures. No filter at all; let's set that aside for the moment, because it's my understanding of the operation and limits of the sensor that you say I've got wrong.
First photo taken with Auto:
View attachment 153667
Second taken in Pro, again no filter, deliberately over-exposed:
View attachment 153670
Detail is completely lost in the sky, and it's genuinely not there in the data (no amount of color grading brings it back).
Could you please explain what is going on at the sensor's photoreceptor sites that results in that error in the data?
My explanation is that the sensor has a limited range of exposure it can measure, and in this case that limit has been reached, so the values measured for the different colors under the Bayer filter all come out the same, pegged at the maximum. After demosaicing, that produces pure white pixels, even though there was clearly detail and color in the light hitting those photoreceptor sites.
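To make what I mean concrete, here's a minimal sketch of that clipping model (all numbers are hypothetical, the full-well ceiling is a made-up 12-bit value, and NumPy is just standing in for the sensor; this is not the actual camera pipeline):

```python
import numpy as np

# Toy model: linear sensor response that clips at a full-well maximum.
FULL_WELL = 4095  # hypothetical 12-bit raw ceiling

def expose(scene_radiance, exposure):
    """Scale scene radiance by exposure and clip at the sensor's ceiling."""
    raw = scene_radiance * exposure
    return np.clip(raw, 0, FULL_WELL)

# A bright sky pixel: red, green, blue radiances differ (there IS color).
sky_rgb = np.array([3000.0, 3600.0, 4000.0])

print(expose(sky_rgb, 1.0))  # [3000. 3600. 4000.] -> detail survives
print(expose(sky_rgb, 2.0))  # [4095. 4095. 4095.] -> all channels pegged;
                             # this demosaics to pure white, and the
                             # original color ratios are unrecoverable
```

Once all three channels hit the ceiling, no grading can recover the ratios between them, which is exactly why the sky detail is "truly not there in the data."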
What part of the above is wrong?
If you agree with my explanation, then the question reduces to: can a bright sunny day overexpose part of the image within the ISO and shutter speed ranges available?
I say YES. And if so, then color saturation, color accuracy, and dynamic range can all be affected. Hence ND filters can improve these aspects of an image when there are overexposed pixels, as the sketch below illustrates.
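To put rough numbers on that (a back-of-the-envelope sketch; the only fact assumed is the standard convention that an NDx filter cuts the light by a factor of x, so ND16 is 4 stops):

```python
import math

def nd_stops(nd_factor):
    """Stops of light reduction for an NDx filter (ND16 -> 4 stops)."""
    return math.log2(nd_factor)

# If the sky needs, say, 2 stops less exposure to stay below clipping,
# but the shutter is already at its fastest and ISO at its base value,
# an ND filter is the only remaining way to cut the exposure.
for nd in (2, 4, 8, 16):
    print(f"ND{nd}: {nd_stops(nd):.0f} stops")
```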
I think the mistake you are making is treating the sensor as uniformly exposed, when of course it is not. The metered EV is determined by some averaging over the pixels, which leaves open the possibility that portions of the frame are overexposed even when the average looks fine. ND filters help with this too.
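A toy illustration of that averaging point (hypothetical numbers and a crude mean-based meter, not any real camera's metering algorithm):

```python
import numpy as np

FULL_WELL = 4095  # same hypothetical 12-bit raw ceiling as above

# Toy scene: 90% mid-tone foreground, 10% very bright sky.
scene = np.concatenate([np.full(900, 500.0), np.full(100, 5000.0)])

# A crude average meter picks the exposure that puts the scene MEAN at
# mid-gray, saying nothing about the brightest pixels.
target_mean = FULL_WELL / 2
exposure = target_mean / scene.mean()

raw = np.clip(scene * exposure, 0, FULL_WELL)
print(f"exposure multiplier: {exposure:.2f}")
print(f"clipped pixels: {(raw == FULL_WELL).sum()} of {raw.size}")
```

The average lands comfortably in range while one pixel in ten is pegged at the ceiling, which is exactly "portions being overexposed" despite a correct-looking meter reading.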
Second picture taken again, with the exact same Pro settings, but with an ND16 filter:
View attachment 153673
ND16 was too much, but that's not the point... the rich color is back, so is the detail above the horizon, etc.