
Get three times your photo quality from stills...

How do you stitch jpegs? Also, how can you stretch a jpeg plan view so that there is no distortion?
I use PTGUI Pro, which works well and I'm very used to it.


You could also look at Affinity Photo, which is currently on sale. I don't use it for stitching panoramas because I'm used to PTGUI, but I use it for virtually all my other editing needs.


I've used Hugin, which is free. I converted to PTGUI years ago, but wouldn't have done that without experience with Hugin first.
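For anyone who would rather script the stitch than click through a GUI, here is a minimal sketch of driving Hugin's command-line tools (pto_gen, cpfind, autooptimiser, pano_modify, nona, enblend) from Python. The folder and file names are placeholders, and exact flags can vary between Hugin versions, so treat it as a starting point rather than a recipe.

```python
import subprocess
from pathlib import Path

# Overlapping JPEGs straight off the drone (placeholder folder)
shots = sorted(str(p) for p in Path("pano_shots").glob("*.jpg"))

def run(*cmd):
    """Run one Hugin command-line tool and stop if it fails."""
    print(" ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. Create a Hugin project file from the source images
run("pto_gen", "-o", "pano.pto", *shots)

# 2. Find control points between overlapping frames
run("cpfind", "--multirow", "-o", "pano.pto", "pano.pto")

# 3. Optimise image positions, lens parameters and photometrics
run("autooptimiser", "-a", "-m", "-l", "-s", "-o", "pano.pto", "pano.pto")

# 4. Set the output canvas size and crop automatically
run("pano_modify", "--canvas=AUTO", "--crop=AUTO", "-o", "pano.pto", "pano.pto")

# 5. Remap each frame, then blend the remapped TIFFs into the final panorama
run("nona", "-m", "TIFF_m", "-o", "remapped", "pano.pto")
tiffs = sorted(str(p) for p in Path(".").glob("remapped*.tif"))
run("enblend", "-o", "panorama.tif", *tiffs)
```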

 
That isn't how it works. You're not getting any more detail than a single image taken at the same distance can give you.
It's entirely about field of view.
Each individual section of that image has the same resolution and detail as the single source.

Panoramas just allow a bigger field of view at a given distance.
 
Why not just upsample with Gigapixel AI? I've used it to upres an image to be printed at very large dimensions, and it worked wonderfully.
 
That isn't how it works. You're not getting any more detail than a single image taken at the same distance can give you.
It's entirely about field of view.
Each individual section of that image has the same resolution and detail as the single source.

Panoramas just allow a bigger field of view at a given distance.
For a long time stitched panoramas were the way to get increased resolution, because you could use a longer lens and end up with the same field of view as a short lens.

So for a drone like the Mavic 2 Zoom they will get you better resolution, assuming you are stitching zoomed-in images. With a fixed-focal-length lens like most drones, all you get is wider field of view.

There's a technique that superimposes multiple images of the same scene and uses statistical techniques to increase the resolution by looking at the small differences between each image to extract more detail. I'm blanking on the name — there was an app that used it a decade or so ago but that bookmark is long gone on an old computer.
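To put a number on the "longer lens, same field of view" point: the horizontal angle of view of a rectilinear lens is 2·atan(sensor width / (2·focal length)), so doubling the focal length roughly halves the angle of view, and a 2 x 2 grid of telephoto shots covers about the same scene with roughly four times the pixels. A quick sketch, with a sensor width and focal lengths that are purely illustrative rather than any specific drone's:

```python
import math

def horizontal_fov_deg(sensor_width_mm, focal_length_mm):
    """Horizontal angle of view for a rectilinear lens."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

sensor_w = 6.3                               # illustrative small-sensor width, mm
wide = horizontal_fov_deg(sensor_w, 4.5)     # "wide" lens
tele = horizontal_fov_deg(sensor_w, 9.0)     # lens twice as long

print(f"wide lens FOV: {wide:.1f} deg")      # ~70 deg
print(f"tele lens FOV: {tele:.1f} deg")      # ~39 deg
# The tele FOV is roughly half the wide FOV, so a 2x2 grid of tele frames
# (ignoring overlap) covers a similar view with about 4x as many pixels.
```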
 
For a long time stitched panoramas were the way to get increased resolution, because you could use a longer lens and end up with the same field of view as a short lens.

So for a drone like the Mavic 2 Zoom they will get you better resolution, assuming you are stitching zoomed-in images. With a fixed-focal-length lens like most drones, all you get is wider field of view.

There's a technique that superimposes multiple images of the same scene and uses statistical techniques to increase the resolution by looking at the small differences between each image to extract more detail. I'm blanking on the name — there was an app that used it a decade or so ago but that bookmark is long gone on an old computer.
That's called image stacking. It's used an awful lot with astrophotography.
 
That's called image stacking. It's used an awful lot with astrophotography.
I've done image stacking for HDR images too. I use Photomatix to compose those, but I think that one is HDR-specific.
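For anyone curious what plain stacking looks like in code, here is a minimal align-and-average sketch in Python with OpenCV. It assumes a folder of hand-held frames of the same scene (the paths are placeholders). Averaging the aligned frames suppresses random noise, which is why astrophotographers lean on it so heavily; it does not add resolution.

```python
import glob
import cv2
import numpy as np

# Load the burst of frames (placeholder path)
paths = sorted(glob.glob("frames/*.jpg"))
frames = [cv2.imread(p) for p in paths]

ref = frames[0]
ref_gray = cv2.cvtColor(ref, cv2.COLOR_BGR2GRAY)
h, w = ref_gray.shape

aligned = [ref.astype(np.float64)]
criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 100, 1e-6)
for frame in frames[1:]:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    warp = np.eye(2, 3, dtype=np.float32)
    # Estimate the small shift/rotation between this frame and the reference
    _, warp = cv2.findTransformECC(ref_gray, gray, warp,
                                   cv2.MOTION_EUCLIDEAN, criteria)
    # Warp the frame back onto the reference frame's grid
    warped = cv2.warpAffine(frame, warp, (w, h),
                            flags=cv2.INTER_LINEAR + cv2.WARP_INVERSE_MAP)
    aligned.append(warped.astype(np.float64))

# Averaging the aligned frames is the "stacking" step: noise averages out,
# the scene content stays put.
stacked = np.mean(aligned, axis=0).astype(np.uint8)
cv2.imwrite("stacked.jpg", stacked)
```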
 
That isn't how it works. You're not getting any more detail than a single image taken at the same distance can give you.
It's entirely about field of view.
Each individual section of that image has the same resolution and detail as the single source.

Panoramas just allow a bigger field of view at a given distance.
I have been employing this technique ever since DSLR cameras were a thing. You do have to move in closer, and your perspective may be wider, but it gives you the ability to take higher-resolution images than you normally would. It works better with telephoto lenses, and when they apply the Matrice 30 technology to the next Mavic-series drone to make this possible, it is going to provide a completely different way to take high-quality images at a customizable range of focal lengths and formats.
 
I have been employing this technique ever since DSLR cameras were a thing. You do have to move in closer, and your perspective may be wider, but it gives you the ability to take higher-resolution images than you normally would
Stitching images together isn't doing anything to improve image quality or increase resolution.
The number of pixels captured per inch is unchanged, as is anything you might think of as image quality.
You are simply stitching images together to display a wider view (at the same resolution and image quality).
 
Shooting vertically and stitching the files together has been a Photoshop technique for the past 20 years.
Make sure each photo overlaps by at least 15%, open them all in Photoshop via File > Automate > Photomerge, and it builds the panorama.
 
This begs the questions "how is photo quality measured?" and "what is the optimal resolution?" The answers are involved, but I think it's fair to say that greater resolution, meaning more pixels, is not in and of itself a measure of quality.

As to how many pixels one needs, that depends largely on the intended output device, e.g. you don't need a 30 MP image for a computer display; most of it will be discarded when the image is scaled down.

A fact less known to the masses is that the more pixels they jam onto a chip of a given size, the lower the light-gathering ability and quality of each pixel, i.e. fewer, larger pixels can give better image quality than more, smaller pixels, in a classic case of less is more.

I guess what I'm saying is that more pixels doesn't necessarily mean better quality. Quantity and quality are not the same and should not be confused.
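As a rough illustration of how the output device drives the pixel budget, here is the arithmetic in a tiny Python sketch; the display and print sizes are just examples, not recommendations:

```python
# Rough pixel budgets for two common output targets
display_4k = 3840 * 2160                         # a 4K monitor, viewed full screen
print_a3_300ppi = (16.5 * 300) * (11.7 * 300)    # A3 print at 300 ppi

print(f"4K display:        {display_4k / 1e6:.1f} MP")      # ~8.3 MP
print(f"A3 print @300 ppi: {print_a3_300ppi / 1e6:.1f} MP") # ~17.4 MP
# A 30 MP file is far more than a 4K screen can show, but a big,
# close-viewed print can genuinely use the extra pixels.
```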
 
I just realised that rather than taking a horizontal pic at the 48 or 12 MP resolution, you can flip the camera to vertical and take three or four overlapping shots and stitch them. You now have a pretty high-resolution image rivalling, if not exceeding, the quality of the Mavic 3 single image.
Also, I use Topaz DeNoise AI software (about $50), which will dramatically clean and sharpen a marginal photo. It's very easy to use and almost too good to be true.
 
For a long time stitched panoramas were the way to get increased resolution, because you could use a longer lens and end up with the same field of view as a short lens.

So for a drone like the Mavic 2 Zoom they will get you better resolution, assuming you are stitching zoomed-in images. With a fixed-focal-length lens like most drones, all you get is wider field of view.

There's a technique that superimposes multiple images of the same scene and uses statistical techniques to increase the resolution by looking at the small differences between each image to extract more detail. I'm blanking on the name — there was an app that used it a decade or so ago but that bookmark is long gone on an old computer.
Was it PhotoAcute? It's no longer supported but I still have two versions of that program. It does super-resolution imaging (SR), which is a type of stacking but it specifically attempts to increase resolution by sub-pixel estimation.

And you're correct; I still stitch 250mm telephoto shots from my DSLR to get very high-resolution images. But the most common panorama I shoot is just 3 or 4 images in portrait orientation, stitched to produce something around 3:4 or 3:5 landscape, so it's like a wide-angle lens on a higher resolution camera.
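The "sub-pixel estimation" part is what separates super-resolution from plain stacking. A crude but illustrative version is shift-and-add: upscale every frame first, align them on the finer grid, then average, so the small hand-held shifts between frames land on different sub-pixel positions and contribute real detail. Below is a minimal sketch using the same OpenCV alignment idea as the stacking example earlier; the file names are placeholders, and dedicated SR tools like PhotoAcute are considerably more sophisticated than this.

```python
import glob
import cv2
import numpy as np

SCALE = 2  # work on a 2x grid so sub-pixel shifts become usable

paths = sorted(glob.glob("burst/*.jpg"))
frames = [cv2.imread(p) for p in paths]

# Upscale every frame onto the finer grid first ("shift" happens between frames)
big = [cv2.resize(f, None, fx=SCALE, fy=SCALE, interpolation=cv2.INTER_CUBIC)
       for f in frames]

ref = big[0]
ref_gray = cv2.cvtColor(ref, cv2.COLOR_BGR2GRAY)
h, w = ref_gray.shape

accum = [ref.astype(np.float64)]
criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 200, 1e-7)
for frame in big[1:]:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    warp = np.eye(2, 3, dtype=np.float32)
    # Estimate the (now magnified) shift between this frame and the reference
    _, warp = cv2.findTransformECC(ref_gray, gray, warp,
                                   cv2.MOTION_EUCLIDEAN, criteria)
    aligned = cv2.warpAffine(frame, warp, (w, h),
                             flags=cv2.INTER_LINEAR + cv2.WARP_INVERSE_MAP)
    accum.append(aligned.astype(np.float64))

# Averaging on the fine grid is the "add" in shift-and-add
sr = np.mean(accum, axis=0).astype(np.uint8)
cv2.imwrite("super_res.jpg", sr)
```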
 
With all due respect to Bussty's calculations, the last line should have read: 28896860 / 15681600 = 1.84. The mistake came when he multiplied 12192768 x 2.37; the result should have been 28896860.

To fill the same view as the Mavic 3's image (how, by backing up?) would mean an image height of 4032 pixels instead of 2970. At a ratio of 16:9, the width would be 7168 pixels. That means you would need 7168 / 3024 = 2.37 vertical shots to cover the same area. The resulting image would have (4032 x 7168) / (5280 x 2970) = 28901376 / 15681600 = 1.84x the number of pixels. Bussty's thinking is correct. This doesn't include the picture overlap required for stitching, of course, but it gives the correct number of resulting pixels.
 
Al
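A quick sanity check of those numbers as a minimal Python sketch, using the same 16:9 figures quoted above and ignoring the overlap between frames:

```python
# Mavic 3 single frame in 16:9 mode
mavic3 = 5280 * 2970            # 15,681,600 px

# Panorama stitched from 3024 x 4032 portrait frames, built out to the
# same 16:9 view: height 4032, width 4032 * 16 / 9 = 7168
pano_w = round(4032 * 16 / 9)   # 7168
shots_needed = pano_w / 3024    # ~2.37 portrait frames (ignoring overlap)
pano = 4032 * pano_w            # 28,901,376 px

print(f"shots needed: {shots_needed:.2f}")
print(f"panorama / single frame: {pano / mavic3:.2f}x the pixels")  # ~1.84x
```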
With all due respect to Bussty's calculations, the last line should have read: 28896860 / 15681600 = 1.84. The mistake came when he multiplied 12192768 x 2.37; the result should have been 28896860.

To fill the same view as the Mavic 3's image (how, by backing up?) would mean an image height of 4032 pixels instead of 2970. At a ratio of 16:9, the width would be 7168 pixels. That means you would need 7168 / 3024 = 2.37 vertical shots to cover the same area. The resulting image would have (4032 x 7168) / (5280 x 2970) = 28901376 / 15681600 = 1.84x the number of pixels. Bussty's thinking is correct. This doesn't include the picture overlap required for stitching, of course, but it gives the correct number of resulting pixels.
I was basing it off the 16:9 view (a very popular photo format, for me at least). In that mode the Mavic 3 is only doing 5280 x 9/16 = 2970 pixels vertically, compared to the 4032 vertical pixels once the Mini 3's camera is in vertical mode.

I did some tests yesterday, and with any system, whether it's a drone or a DSLR, if you can flip the camera to vertical and "paint" the same area a single shot would cover (you will need to either move forward or use a tele lens to match the single-shot framing), you will see a considerable overall image improvement.

Cheers
 
This begs the questions "how is photo quality measured?" and "what is the optimal resolution?" The answers are involved, but I think it's fair to say that greater resolution, meaning more pixels, is not in and of itself a measure of quality.

As to how many pixels one needs, that depends largely on the intended output device, e.g. you don't need a 30 MP image for a computer display; most of it will be discarded when the image is scaled down.

A fact less known to the masses is that the more pixels they jam onto a chip of a given size, the lower the light-gathering ability and quality of each pixel, i.e. fewer, larger pixels can give better image quality than more, smaller pixels, in a classic case of less is more.

I guess what I'm saying is that more pixels doesn't necessarily mean better quality. Quantity and quality are not the same and should not be confused.
This is true if all you are doing is posting to social media, where most people are going to look at it on their phone. I do a few canvases and large prints for people, and that is where it starts to matter. My experience is that the bigger the print, the more people love to go and have a close look at any detail in it. It's nice when the image holds up in those cases.

Your comments above mirror what I thought about the new 48 MP sensors compared to 12 MP, but this test changed that a bit for me...


I'm really looking forward to receiving the Mini 3 and seeing what we can get out of it.

Cheers
 
you will see a considerable overall image improvement.
You seem to have confused increasing the number of pixels with improving image quality.
They are not the same thing.
If you are really hankering for more pixels, these days you can do that with software, rather than going to the trouble of doing this with a camera.
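For what it's worth, here is what "more pixels from software" looks like at its most basic: a plain Lanczos resize in OpenCV. AI upscalers like Gigapixel or Topaz do something far smarter, but this minimal sketch (the paths and the 2x factor are arbitrary) makes the pixels-versus-detail distinction concrete:

```python
import cv2

img = cv2.imread("single_shot.jpg")           # placeholder input file
up = cv2.resize(img, None, fx=2.0, fy=2.0,    # 2x more pixels per side
                interpolation=cv2.INTER_LANCZOS4)
cv2.imwrite("single_shot_2x.jpg", up)
# Note: this adds pixels, not detail, which is exactly the distinction
# being argued in this thread.
```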
 
You seem to have confused increasing the number of pixels with improving image quality.
They are not the same thing.
If you are really hankering for more pixels, these days you can do that with software, rather than going to the trouble of doing this with a camera.
Hey Meta4

So a couple of things...

Firstly, can you post a series of stitched vertical images that matches a single-image view from the same sensor, to demonstrate that an increased pixel count doesn't increase image quality? Seeing is believing and you might be surprised.

Secondly, that software you are talking about... Imagine using that on the stitched result above... you would have even better image quality.

The aim here is to get the best possible result from a small sensor and that is what I am exploring.

Cheers
 
Firstly, can you post a series of stitched vertical images that matches a single-image view from the same sensor, to demonstrate that an increased pixel count doesn't increase image quality? Seeing is believing and you might be surprised.
I've been stitching panoramas for as long as I've had digital cameras.
It's great for covering a bigger area, but the image quality doesn't change ... the result just covers a wider angle of view.
It keeps the same image quality as the frames that I stitch.
It doesn't improve image quality.

Secondly, that software you are talking about... Imagine using that on the stitched result above... you would have even better image quality.
You really are stuck on the mistaken idea that more pixels = better image quality.
But image quality isn't the number of pixels.
If you use a dinky little sensor camera and shoot lots of frames to form a composite, the big stitched image will still have the same dinky image quality, but just be bigger.
The aim here is to get the best possible result from a small sensor and that is what I am exploring.
If you want the best image quality, you get that from using the camera with the sensor that gives the best image quality.
You don't magically improve mediocre image quality by stitching multiple mediocre images.
To sum up in one sentence ... more pixels doesn't equal improved image quality.
 
