
Adjusting a polarizer to reduce glare on water, with camera pointed straight down?

scubaddictions

Location: Hawaii, the Big Island
All,

I'm embarking on a project to capture local dive sites, high resolution merged panoramas shot from just under 400' AGL. Think Google Maps satellite view, but higher detail. I can't wrap my head around the best way to set the adjustable polarizer to cut glare off the surface of the water. I can fiddle with it while it's on land, but the actual shots will be taken with the gimbal pointed straight down. Will I see enough color change while rotating the polarizer looking at the ground to make the adjustment effort worthwhile? Should I hold the drone over a pan of water to make my adjustments? Is there some other photographer's trick to accurately find the best position for an adjustable polarizer?

I imagine I can do some trial and error: lots of take-offs, a quick view of the glare, land it for an adjustment, then repeat. The adjustment point of the polarizer should remain at least roughly the same from dive site to dive site, as the drone will always be facing north and shooting at roughly the same time of day, on sunny mornings before the clouds and wind roll in.

Thanks!

Ryan
 
Hi, Ryan,
Polarizing filters work best when the source of light (the Sun) is at right angles to the direction your camera is facing. They work worst when the Sun is overhead and your camera is pointing down.

Shooting straight down in morning light should turn out well for you no matter how the polarizing filter is rotated.
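To put a rough number on that: glare off a water surface is only strongly polarized near Brewster's angle (about 53 degrees from vertical for water) and is almost unpolarized when you're looking straight down at it, so at nadir a polarizer has very little to remove no matter how it's rotated. Here's a minimal sketch using the standard Fresnel equations (plain Python, purely illustrative; the refractive index is the usual textbook value for water, not something measured from your shots):

import math

def glare_polarization(theta_deg, n1=1.0, n2=1.33):
    # Degree of polarization of sunlight reflected off flat water,
    # computed from the Fresnel reflectances for s- and p-polarized light.
    # theta_deg is the reflection angle measured from the vertical.
    ti = math.radians(theta_deg)
    tt = math.asin(n1 * math.sin(ti) / n2)  # Snell's law for the refracted ray
    rs = ((n1 * math.cos(ti) - n2 * math.cos(tt)) /
          (n1 * math.cos(ti) + n2 * math.cos(tt))) ** 2
    rp = ((n1 * math.cos(tt) - n2 * math.cos(ti)) /
          (n1 * math.cos(tt) + n2 * math.cos(ti))) ** 2
    return (rs - rp) / (rs + rp)

for theta in (0, 15, 30, 45, 53, 60, 75):
    print(f"{theta:2d} deg from vertical: polarization = {glare_polarization(theta):.2f}")

# Prints ~0.00 looking straight down and ~1.00 near 53 deg (Brewster's angle),
# which is why the filter can't do much with the gimbal pointed at nadir.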
 
Makes sense. I did notice some glare to the east in my earlier captures as the sun wasn't yet overhead. I'm hoping I can squeeze a bit more glare reduction out of the filters.
Leave the filter at home.
Careful positioning of the camera can go a long way to reducing glare like this:
DJI_0299a-L.jpg

DJI_0308a-L.jpg
 
Leave the filter at home.
Careful positioning of the camera can go a long way to reducing glare like this:
Agreed, it's all about positioning of the drone/camera. It's best to have the sun behind the drone, as is the case with most photography.
 
I'm embarking on a project to capture local dive sites, high resolution merged panoramas shot from just under 400' AGL.
The other issue you might have trouble with is trying to stitch water.
Stitching won't be too much of a problem if there's enough hard detail for the software to match points.
But large areas of smooth water have no detail at all, and the software won't like waves and/or ripples either, because it won't be able to match what it thinks are points in one image to the same points in an adjacent image.
 
The other issue you might have trouble with is trying to stitch water.
Stitching won't be too much of a problem if there's enough hard detail for the software to match points.
But large areas of smooth water have no detail at all, and the software won't like waves and/or ripples either, because it won't be able to match what it thinks are points in one image to the same points in an adjacent image.

This is a good point, and something I haven't fully figured out yet. Here's an example, a screenshot from within ICE. It appears to my eye to have more than ample overlap. If I pick 'Structured Panorama' I can choose the layout, the starting point, and the direction the camera traveled, like this:

ICE structured pano.jpg

However the stitch fails, and claims it was only able to stitch 6 of the 12 images. That's a real puzzler; based on the picture above, it seems I did most of the work for ICE.

If I let it choose its own structure, I get this:

ICE simple pano.jpg

As you can see, it seems hell-bent on ignoring the water in the upper left corner.

It seems I still need to figure out a method using the 'Structured Panorama' stitch, but haven't stumbled onto one that works yet.
 
This is a good point, and something I haven't fully figured out yet. Here's an example, a screenshot from within ICE. It appears to my eye to have more than ample overlap.

However the stitch fails, and claims it was only able to stitch 6 of the 12 images. That's a real puzzler; based on the picture above, it seems I did most of the work for ICE.
...
As you can see, it seems hell-bent on ignoring the water in the upper left corner.
It seems I still need to figure out a method using the 'Structured Panorama' stitch, but haven't stumbled onto one that works yet.
ICE is a simplified stitching program.
It doesn't show you what's going on under the hood.
Here's an example of how stitching programs work:
i-wJrcw9t-XL.jpg

They identify and match pairs of distinct points that are in adjacent images.
But a water surface doesn't have any distinct points that will be visible in adjacent images.
Here's another pair of images from the same mapping project:
i-fBGfdqj-XL.jpg

Notice that none of the points are in the water part of the images.
The images were shot only two seconds apart, but because the water surface is constantly moving, there are no identifiable pairs of points that can be matched.
Water surfaces are always going to cause problems for stitching programs.
With no matching points, the stitching program just gives up.
Where you have large all-water areas, you can get holes in the mosaic.
Or the stitching program might just paste in the water it can't match, like this:
i-rxskDFn-L.jpg
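If you want to see that matching step for yourself, here's a minimal sketch of the same idea using OpenCV's ORB feature detector (this is not ICE's actual algorithm, and the filenames are just placeholders for two of your overlapping frames):

import cv2

# Placeholder filenames - substitute two adjacent frames from the pano set.
img1 = cv2.imread("frame_07.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame_08.jpg", cv2.IMREAD_GRAYSCALE)

# Detect feature points in each frame, then match their descriptors.
orb = cv2.ORB_create(nfeatures=2000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

matches = []
if des1 is not None and des2 is not None:
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

print(f"keypoints: {len(kp1)} vs {len(kp2)}, matched pairs: {len(matches)}")
# Frames full of reef or shoreline return hundreds of matched pairs;
# frames that are mostly open water return almost none, and those are
# exactly the places where the stitch falls apart.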
 
That's good to know! I guess I had just hoped that if I instructed ICE specifically as to where the water image goes, it would at least make the attempt and blend the edges like usual. Especially in my sample above, in the original images labeled 7, 8, and 9, there's still a lot of visible variation in the shades of blue, namely the reef below shallow water, so it surprises me that ICE can't pick out points of reference. In fact, in its best attempt it didn't just ignore the area around image 7 (where the water is most regular), it ignored ALL of the dark blue reef regions from all three images.

Is there a way around this, or do I just need to avoid water? Is there any way to manually set a few distinct points to help things along? Luckily, there should only be a few sites that have important sub-surface features shallow enough for a drone to capture from above.

Thanks!
 
One other little nugget I picked up from the ICE FAQ is that structured panorama is designed to be used with an automated/robotic image capture method. My capture method so far has been purely manual. Knocking around the idea of getting into Litchi to speed up my capture times with an automated flight.
 
That's good to know! I guess I had just hoped that if I instructed ICE specifically as to where the water image goes, it would at least make the attempt and blend the edges like usual. Especially in my sample above, in the original images labeled 7, 8, and 9, there's still a lot of visible variation in the shades of blue, namely the reef below shallow water, so it surprises me that ICE can't pick out points of reference.
No single stitching program is perfect all the time.
It's amazing how they can appear so smart but then fail on something that doesn't look very difficult.
ICE is appalling at matching sea horizons - it often renders them looking like steps, but it's great for a lot of other things.
Reef below shallow water is going to be tricky since the view is through the ripples.
Is there a way around this, or do I just need to avoid water? Is there any way to manually set a few distinct points to help things along? Luckily, there should only be a few sites that have important sub-surface features shallow enough for a drone to capture from above.
The easiest thing to try is a different stitching program to see if a different stitching algorithm handles things better.
Or you could try a more sophisticated program which allows you to manipulate control points to help out (but that's getting complicated).
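If you're comfortable with a little Python, OpenCV's built-in stitcher is also a free way to test whether a different algorithm copes better with your set. A rough sketch (the folder name is just a placeholder for wherever your 12 frames live):

import glob
import cv2

# Placeholder folder - point it at the frames from the failed ICE attempt.
paths = sorted(glob.glob("dive_site_pano/*.jpg"))
images = [cv2.imread(p) for p in paths]

# SCANS mode assumes a roughly flat scene shot from above, which suits nadir mosaics.
stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
status, mosaic = stitcher.stitch(images)

if status == cv2.Stitcher_OK:
    cv2.imwrite("dive_site_mosaic.jpg", mosaic)
    print("stitched", len(images), "images")
else:
    # Like ICE, it will still bail out over frames where it can't find matching points.
    print("stitch failed, status code:", status)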

If you put the images from your example on dropbox or similar and pm me a link, I'll have a play with them and see if some of my toys handle it better than ICE does.
One other little nugget I picked up from the ICE FAQ is that structured panorama is designed to be used with an automated/robotic image capture method. My capture method so far has been purely manual. Knocking around the idea of getting into Litchi to speed up my capture times with an automated flight.
Rather than Litchi, I'd go for a proper mapping app like DroneDeploy to do the flying.
It will work out a grid with proper overlaps and then fly it for you.
Very fast, very simple, very accurate. And it's free.
Here's a screenshot with DD part way through flying a grid for another project - a 61-acre site with 190 images:
i-sqt6QCQ-L.jpg
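If you'd rather script your own grid, the geometry is simple enough to work out by hand. Here's a rough sketch (the field-of-view numbers and overlap targets are placeholders, not DroneDeploy's actual settings; plug in your own camera's specs):

import math

def grid_plan(agl_m, hfov_deg, vfov_deg, front_overlap=0.75, side_overlap=0.65):
    # Ground footprint of a single nadir photo at the given altitude,
    # then the spacing needed to hit the requested overlaps.
    width = 2 * agl_m * math.tan(math.radians(hfov_deg) / 2)
    height = 2 * agl_m * math.tan(math.radians(vfov_deg) / 2)
    return {
        "footprint_m": (round(width, 1), round(height, 1)),
        "line_spacing_m": round(width * (1 - side_overlap), 1),    # distance between grid lines
        "photo_spacing_m": round(height * (1 - front_overlap), 1), # distance between shutter triggers
    }

# Placeholder numbers: about 115 m AGL (just under 400') and a roughly 66 x 46 degree camera FOV.
print(grid_plan(agl_m=115, hfov_deg=66, vfov_deg=46))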
 
If you put the images from your example on dropbox or similar and pm me a link, I'll have a play with them and see if some of my toys handle it better than ICE does.

Rather than Litchi, I'd go for a proper mapping app like DroneDeploy to do the flying.


Message sent, I appreciate it!

DroneDeploy sounds like fantastic software, but it's beyond the budget of the project. I'm muddling through with adequate success flying manually. BTW, each map only covers roughly 1-2 acres. A starting price of $1200 a year for DroneDeploy.... Yowza!

A few bucks for Litchi by comparison (if you think it would be of any value on this project) would be a cinch.

Thanks again!

Ryan
 
DroneDeploy sounds like fantastic software, but it's beyond the budget of the project. I'm muddling through with adequate success flying manually. BTW, each map only covers roughly 1-2 acres. A starting price of $1200 a year for DroneDeploy.... Yowza!
Correction ... DroneDeploy only costs money if you use their processing.
But it's free to use the app to plan, fly and capture the images.
For small sites like the ones you're planning, you don't really need that sort of processing anyway.
 
