Ok, I’m just thinking out loud here because I like your project and already know lots of applications for it, but looking at your two pictures above I’m not convinced they were taken at two markedly different times. Look at the way both of your arms, neck, and legs are aligned.
What it looks like to me is that, due to the optical camera’s wider lens and aspect ratio, you are getting barrel distortion indicative of a wide-angle lens.
View attachment 71850
HOWEVER, we know that the camera of the regular Mavic 2 Zoom applies lens correction to the JPEG images automatically to correct for this. It does not correct the raw files, but instead embeds this metadata in the .dng so that it can be read by a post-processing tool. Some editors apply it automatically; others do not.
So if you are assuming that the optical camera has an 85-degree FOV and adjusting for distortion in your algorithm, this might be an issue, since the JPEG has already been corrected for that distortion.
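To make the distortion point concrete, here is a minimal sketch of the Brown-Conrady radial model that lens-correction metadata of this kind typically encodes. The coefficients are placeholders I made up for illustration, not the Mavic’s actual values:

```python
import math

def apply_radial_distortion(x, y, k1, k2):
    """Brown-Conrady radial model: map *undistorted* normalized image
    coordinates (x, y) to their *distorted* positions on the sensor.
    k1 and k2 are radial coefficients; negative k1 gives barrel
    distortion. The values used below are placeholders, not the
    camera's real calibration."""
    r2 = x**2 + y**2
    factor = 1 + k1 * r2 + k2 * r2**2
    return x * factor, y * factor

# A point near the edge of the frame moves noticeably under even
# mild barrel distortion, which is why an algorithm that assumes an
# ideal 85-degree FOV drifts toward the image edges:
x_u, y_u = 0.8, 0.0                       # normalized coordinate, centre = 0
x_d, y_d = apply_radial_distortion(x_u, y_u, k1=-0.05, k2=0.0)
print(x_d)  # < 0.8: the point is pulled toward the image centre
```

If the JPEG has already been undistorted in-camera, applying a second correction on top of it over-corrects in exactly this edge-of-frame region.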
Can you take a raw image, process it into a JPEG without lens corrections, and see if it matches up better?
View attachment 71855
Glad that you like the project; keep thinking out loud!

I did a simple test of waving my arm to confirm that the cameras are not 100% synchronized. The image pair below shows the thermal camera and the visual camera seeing my hand in two very different places.


However, when I accounted for that and analysed image pairs where I stood completely still, accuracy improved but was still off. I think you might be right that barrel distortion has something to do with the misses I experience. Unfortunately, the Mavic 2 Enterprise Dual can only save JPEG, but I might be able to reverse the lens distortion correction or take it into account in the position calculations.
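For the position calculations, the projection model matters as much as the distortion itself. A lens-corrected JPEG approximates an ideal rectilinear projection, where a pixel offset maps to the *tangent* of the viewing angle rather than to the angle directly. A minimal sketch, assuming a hypothetical 3840-pixel-wide frame and the 85-degree horizontal FOV mentioned above:

```python
import math

def pixel_to_bearing(px, image_width, hfov_deg):
    """Convert a horizontal pixel coordinate to a bearing angle (in
    degrees) from the optical axis, assuming an ideal distortion-free
    rectilinear projection -- roughly what a lens-corrected JPEG gives.
    hfov_deg is the horizontal field of view (85 deg assumed here;
    treat all numbers as placeholders)."""
    # Normalized coordinate in [-1, 1], 0 at the image centre
    x = (px - image_width / 2) / (image_width / 2)
    # Rectilinear model: x is proportional to tan(angle), not to angle
    half_fov = math.radians(hfov_deg / 2)
    return math.degrees(math.atan(x * math.tan(half_fov)))

print(pixel_to_bearing(1920, 3840, 85))   # centre pixel: 0.0
print(pixel_to_bearing(3840, 3840, 85))   # edge pixel: ~42.5, half the FOV
```

A simple linear pixel-to-angle mapping diverges from this most strongly between the centre and the edge, which would produce position errors of the kind described even on a perfectly corrected image.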
I will have to do some more research on how the cameras and lenses work. Thanks for the input and help so far; I'll get back to you with the results!