This is great. I'm really impressed with this little thing, and I find these new features trickling through quite exciting.

What I find hard to believe, though, is that I have the Osmo Pocket, which looks identical to the Mini's camera, yet it can shoot 4K and capture RAW. I understand the Mavic Mini is basically the bottom tier of DJI's current drones, so they won't put these features in because people wouldn't upgrade. It's still annoying, as I'm sure this tiny thing could kick out 4K, or at least capture RAW. Anyway, despite this I still love it and am looking forward to going out today to try this new update from Litchi.
 
Tried the new Litchi update on my iPhone 7; I haven't updated my iPad Pro yet. With the upgrade, the in-app pano stitching can handle larger file sizes (low/med/high options), with the iPhone stitching up to 90 megapixels in-app and the iPad Pro up to 220 megapixels. I actually tested the low-res 360-degree auto pano using the iPhone 7 indoors in my living room! There was a low-altitude warning, but it did a great job. There are three in-app viewing options: flat/sphere/tiny planet. Impressive program!

I also like the new feature that lets you set a C1 custom function on the Mini's RTH button with a single press. I set it up to switch speed modes, which is useful when flying FPV in virtual reality with split-screen goggles: I can change speeds without removing the phone from the goggles. Now I'm wondering if C2/C3 custom switching is a possibility...
I can't find any info about a C1 custom function on the RTH button. Where did you see that?

Thanks a lot

R.
 
Dronelink can now do bracketed 360 for true HDR panoramas. Combine with a good stitcher and the results are excellent. There are scripts for both AEB (for convenience) and fixed exposure bracketing (for the best stitching results).
 
The info about the custom option came in an email from Litchi about the upgrade as part of the iOS beta testing.

Go into the settings and scroll all the way down to the bottom. Without an aircraft connected there are many custom options for the C1, C2, C3 (and more) buttons. When the Mini is connected there is one button, C1 (RTH), and you can set it to toggle whatever option you want with a single press. That makes me wonder, since it's normally used to pause or cancel RTH with a short press... I'll have to check it out.

Have yet to try the "fixed high quality tracking" with the mini.

Email from Litchi (screenshot attached: Beta 2-9-0.jpg)
 
Thanks a lot for the info. As an Android beta tester, I have never received an email like that.

I will check whether the C1 custom button works on Android.

Best regards

R.
 
In the iOS test beta I'm really liking the new Litchi pano options, but I have a question. In the Panorama Database, when you view a low-res stitched 360 pano the default view is "Flat". You also have the option to build a "Tiny Planet" in-app, and to view the regular pano with "View in 360", which loads the sphere. The app viewer shows the pano scrolling, and you can control the scrolling (up/down and around) and zoom in, plus you can toggle "Control Mode", which gives you full viewing control via your iPhone's orientation. Great viewer! You can also email/save/print etc.

On my iPhone, so far I have been able to save the 360 pano as a JPG and as a Tiny Planet (it's a 29-second/20 MB movie, and the planet actually spins). My question is: how do I get the cool "View in 360" sphere saved and viewed somewhere other than in Litchi? I have seen similar JPG panos converted for the square up/down-and-scroll viewing on Instagram, but I've never used Instagram. I'm guessing a special viewer is needed to display the 360x110 image. Any ideas or links?

Litchi is a great app that has the mini doing things that I never expected it would be capable of.

EDIT: I've been doing some research about viewers for devices and desktop and embedding in a webpage. Have lots of reading to do.
Within Litchi you can share the pano to Facebook and it will show up as a sphere (better resolution when viewed on mobile devices). Alternatively, you could just save the 360 photo to your device and view/share it in Google Photos, which has a built-in 360 viewer. You can also upload the file to websites like Roundme, 360cities, etc.; the output file from Litchi should not require any post-processing, unless you want to edit the photo itself (in which case you need to be careful not to strip the metadata).
 
Very nice! Perhaps I missed it in one of your posts, but what software did you use for that 360 pano?
 

So with the DroneLink app you can do automated AEB, and combine auto AEB with an automated 360 panorama? That's something I'm very interested in, as AEB somewhat compensates for the lack of RAW: post-processing can produce an HDR image with more dynamic range. And it can also do fixed-exposure bursts of several images automatically? I often use that technique to process image bursts into a combined stack (it can be done in Photoshop or Affinity Photo), which produces a single image that can be enlarged and yields more resolution and detail while reducing noise. It's an effective technique that basically increases the resolution of the 12 MP images.
I took a look at the DroneLink website, but it seems a bit complicated to operate. It appears to work based on scripts; is that difficult, or is it easy to configure all that automated scripting without programming knowledge?
I'm very interested in buying the DroneLink app if it does all that automatically (panoramas, exposure bracketing, bursts, etc.), since those are the things I do most, and the personal-use version is only $20. But I'm concerned it will be quite complicated; looking at DroneLink's website, I couldn't understand how the app works and is configured. It looks very complex, as if it needs programming skills.
Is it as difficult to use as it looks? I'll buy it if I can operate it without too much fiddling and complex programming.
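The burst-stacking technique mentioned above can be sketched in Python with NumPy (a minimal sketch, assuming the burst frames are already aligned and loaded as arrays; `stack_frames` is a hypothetical helper, not part of any of these apps):

```python
import numpy as np

def stack_frames(frames):
    """Average a burst of aligned frames to reduce noise.

    frames: list of HxWx3 uint8 arrays (aligned burst shots).
    Averaging N frames reduces random sensor noise by roughly sqrt(N).
    """
    stack = np.stack([f.astype(np.float64) for f in frames], axis=0)
    return np.clip(stack.mean(axis=0), 0, 255).astype(np.uint8)

# Example with synthetic noisy frames of a flat gray scene:
rng = np.random.default_rng(0)
truth = np.full((4, 4, 3), 128.0)
frames = [np.clip(truth + rng.normal(0, 20, truth.shape), 0, 255).astype(np.uint8)
          for _ in range(16)]
result = stack_frames(frames)
# result is close to the flat 128-gray scene, with far less noise per pixel
```

Upscaling each frame before averaging (the variant described above for gaining detail) is the same idea with an interpolation step added first.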
 

I used a free Windows program called "Image Composite Editor" (ICE). It was produced by Microsoft Research, and although it's no longer updated, it's a very good 360 stitching program and very easy to work with. You can download it here.
However, after ICE stitches the 360 image, you need to add a blank strip above it, since a spherical panorama has to cover 360x180 degrees (a final image with a 2:1 ratio) and the Mini's gimbal cannot rotate all the way up to capture the upper section of the sky. That can be done with any advanced editing software (Photoshop, Affinity Photo, the free GIMP, etc.) by enlarging the image canvas above the image to add the blank "sky" and then cropping the whole image to a 2:1 ratio. Then there's another free program called "SkyFill" that automatically fills that blank section with an "artificial" sky of the same color. SkyFill can be obtained here.
And finally, I used Photomatix Pro to adjust the final 360x180 spherical panorama to improve the dynamic range, lightening the shadows and darkening the overexposed parts so the final panorama has an even, balanced, natural tone.
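The canvas-extension step can be sketched in Python with NumPy (a hypothetical helper, assuming the stitched pano is loaded as an RGB array covering the full 360 degrees horizontally but only the lower part of the sphere vertically):

```python
import numpy as np

def pad_to_equirect(pano, fill=(200, 220, 255)):
    """Pad a partial equirectangular pano to a full 2:1 (360x180) frame.

    pano: HxWx3 uint8 array covering 360 degrees horizontally but missing
    the top of the sky (the Mini's gimbal can't look straight up).
    The missing sky is filled with a flat placeholder colour; a tool like
    SkyFill can then replace it with a plausible sky.
    """
    h, w = pano.shape[:2]
    full_h = w // 2                  # 2:1 ratio -> height = width / 2
    pad = max(full_h - h, 0)         # blank rows to add at the top
    sky = np.tile(np.array(fill, dtype=pano.dtype), (pad, w, 1))
    return np.vstack([sky, pano])

# e.g. a 360x110-degree pano: 110/180 of the full height is captured
pano = np.zeros((1100, 3600, 3), dtype=np.uint8)
full = pad_to_equirect(pano)
# full.shape == (1800, 3600, 3)
```

This mirrors the manual canvas-enlarge-and-crop step in Photoshop or GIMP described above.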
 
Thanks for the information!
 
Yep, it can be done. I have a bunch of them for you over here.
40 degrees of the vertical 180 is blank, which is OK, because otherwise the drone would be taking pictures of itself and that wouldn't be useful anyway. I use 360 skies from the internet to cover that blank spot. It appears that images of skies in equirectangular projection are very popular among CG, 3D and game design people, so they are easily obtainable. VR panoramas are absolutely jaw-dropping when seen through a VR headset. Awesome experience.
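In equirectangular projection, elevation maps linearly to pixel rows, so that blank 40-degree zone occupies a fixed fraction of the image height. A quick sketch of the arithmetic (Python, with hypothetical image sizes):

```python
def blank_rows(full_height, missing_degrees=40, total_degrees=180):
    """Rows of a full 360x180 equirectangular image with no coverage.

    Equirectangular projection maps elevation linearly to rows, so a
    40-degree blank zone at the zenith is 40/180 = 2/9 of the height.
    """
    return round(full_height * missing_degrees / total_degrees)

print(blank_rows(1800))  # 400: the top 400 rows of a 1800-px-tall pano are blank
```

That top band is exactly where a downloaded equirectangular sky image gets composited in.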
 
You can upload the "raw" pano to Kuula and view it by scrolling left/right/up/down. There's also a Tiny Planet option (upper-right icon). I then did a video screen capture while moving around in Kuula. I don't recall it at the moment, but there's also a Windows program that can display it.
 
@ff22 and @vctech, thanks for your input on viewing Litchi "sphere" and tiny planet images. I'm having a great time using auto pano in Litchi. I like how in one respect it's similar to Quickshots (Dronies etc.) in the Fly app, where you go to where you want to start, basically push GO, and the 360 pano executes and then auto-stitches in low res.

I've been playing with my iPhone 7 and a cheap VR headset. The phone just slips in and you focus on a split screen with the HUD info displayed. The problem is that I have to fiddle with my bifocal glasses when taking the VR headset off to find and focus on the Mini. I just discovered that with the new Litchi upgrade, one of the many options is to set C1 (the RTH switch on the controller) to "Toggle VR". So when viewing VR you can press C1 to immediately switch the view to the phone's front camera so you can see LOS, basically the same as if you took the headset off. I'll be flying a lot more VR now that I can switch back and forth between LOS and VR using the switch. Amazing!!
 

Yes, there's a free standalone Windows program called PanGazer that displays full 360 spherical and partial panoramas on the PC's screen. You can get it here. The only drawback is that it doesn't display a smooth, anti-aliased image; the details look roughly over-sharpened. There may be other programs for viewing panoramic images; I'll need to search.
 
Sorry for the slightly late reply. I am in no way affiliated with Dronelink and am also quite a new DL user. I do have some programming experience, so I didn't have much trouble modifying an existing 360 panorama function for HDR imaging. It's important to note that you do not have to be a programmer to use DL.

You basically have two possible workflows. You can design a complete mission, with a fixed take-off point, at home using their web app (no programming knowledge needed, though you do have to watch a few video tutorials). There are also "on-the-fly" functions: scripts other users have written containing a list of (possibly quite complicated) drone commands that you can simply run while in the air. They are something like flexible missions without a defined starting point (they start from wherever your drone currently is).

For example, to take the 360 HDR pano you would simply bring the drone into the air (you can fly manually with DL, the same as with Litchi or DJI Fly), move to the location you want, and then run the function from my or someone else's public function repository (a few clicks from within the DL app on the phone). My function asks for the shortest exposure you want to start at (for example 1/4000 s). It then takes a series of photos (enough to build a 360x110 pano) at 1/4000 s, 1/1000 s (+2 EV) and 1/240 s (+4 EV). Currently there is no way for the script to read the current exposure value, so you have to set it yourself.
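The exposure-bracketing arithmetic works out as follows (a sketch of the EV math only, not Dronelink's actual code; note that +4 EV from 1/4000 s is nominally 1/250 s, which camera shutter scales round to values like 1/240 s):

```python
def bracket_shutters(base_denominator, stops=(0, 2, 4)):
    """Shutter speeds for fixed exposure bracketing.

    Starting from the shortest exposure 1/base_denominator, each +1 EV
    step doubles the exposure time (halves the denominator), so +2 EV
    quadruples it. Returns the nominal denominators; real cameras round
    these to the nearest value on their own shutter scale.
    """
    return [base_denominator / 2 ** ev for ev in stops]

print(bracket_shutters(4000))   # [4000.0, 1000.0, 250.0]
```

So a 0/+2/+4 EV bracket from 1/4000 s lands on the 1/4000, 1/1000, 1/240 series described above.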

I have also built a script where you fly the drone to multiple locations and orientations, mark them, and then the drone automatically takes a mini HDR panorama / ultra-wide photo (3x2 pictures) from each of those points. What's cool about Dronelink is that users can build quite complicated scripts and publish them for others to use. With Litchi or DJI Fly you have to plead and wait for the app developers to add new functionality. There are currently not a lot of users extending the functionality this way, but hopefully that will change.

What I am currently not happy with is what seems to be a control-loop tuning problem on the Mavic Mini that produces quite choppy motion (the automated flight is not smooth enough for videography unless you fly extremely slowly or apply motion smoothing in post). This can be fixed, and hopefully will be in the future. For taking pano photos there is no such problem.
 
Tested today. The RTH programmable button is not implemented in the current Litchi for Android version.

I hope they will fix it soon...

R.
 
Hey, any chance that you can share your HDR 360 scripts? ;-)

I found that I can get DL missions using the Path profile to fly smoothly when I use a latest-generation phone compared to an old one:
i.e. I was using an old iPhone 8 Plus because it is big, but paths were choppy; I swapped to my iPhone 11 and it's smooth running the same path as before.

I don't like the way DL jumps really hard between functions, though; it would be MUCH nicer if it interpolated the movement to make it smooth:
i.e. if you have a goto point, then a path or 360 or whatever, there's a really hard, fast move between them.
 
From the web app, browse public repositories and sort by most stars; my repository (360 HDR, same username) should be near the top. I am not sure, but I think you have to copy the functions into your own repository to be able to see them from within the phone app.
I do have a two-year-old phone, but with a flagship CPU from its time. If that is not fast enough to do a few hundred calculations every second, I definitely consider it a DL bug. I have no intention of buying a new phone when my current one is already capable of a few trillion floating-point operations per second.
 
Great, thanks heaps, I'll take a look.

Yeah, I hear you. I would have thought the iPhone 8 would be plenty fast enough, but the 11 definitely works better.
 