The Story of My First Mavic Mini Flight with an Unexpected Ending
How did you get your MM to shoot 4K?
I understand how software works, and I understand how optics work, but I don't get how software can "create" a higher resolution than the original raw data provided by the optics (unless you change the optics). When I record an audio track at a 44.1 kHz sampling rate to produce a .wav file, a typical 3-4 minute song is about 30 MB in size. After compression (roughly 10 to 1) to MP3 format, the size is down to about 3 MB. Most people can't hear the difference, but through good headphones, I can. As far as I know, there is no software that can take an audio file originally recorded at 44.1 kHz and magically convert it to a 96 kHz rate. Maybe I'm just not technically smart enough, but I'm skeptical of software taking video at 2.7K resolution and converting it to 4K without adding some kind of "artifacts" that were not part of the original. What am I missing?
I understand how software works, and I understand how optics work, but I don't get how software can "create" a higher resolution than the original raw data provided by the optics
You create new pixels. A red pixel next to a black one? Put a dark red in between. Upscaling is just stretching the image to the new resolution and filling in appropriate pixels in between. Real upscalers are more sophisticated than that, obviously.
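The "put a dark red in between" idea is just linear interpolation. Here is a minimal sketch (toy values, not a real upscaler) that doubles the width of one row of pixel values by inserting the average of each neighboring pair:

```python
# Minimal sketch of linear-interpolation upscaling: keep the original pixels
# and insert the average of each neighboring pair "in between".
# A red channel value of 255 next to a black 0 yields a darker 127 between them.
def upscale_row_2x(row):
    out = []
    for a, b in zip(row, row[1:]):
        out.append(a)
        out.append((a + b) // 2)  # new pixel halfway between the two originals
    out.append(row[-1])
    return out

red_and_black = [255, 0]              # one bright value next to a dark one
print(upscale_row_2x(red_and_black))  # -> [255, 127, 0]
```

A real 2.7K-to-4K upscaler does the same thing in two dimensions (and with smarter filters than a plain average), which is why the new pixels look plausible but contain no detail the lens never captured.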
That sounds like exactly what the compression software does, in reverse: it looks at all of the frequencies a millisecond before and after, then removes most of those that didn't change. Possibly a better audio equivalent would be to mix four tracks of 22 kHz audio into a single 22 kHz file, or mix them into an "upscaled" 44 kHz file.
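That audio analogy can be sketched in a few lines (toy integer samples, no real DSP): mix tracks sample-by-sample, then "upscale" the mix to double the rate by inserting interpolated samples. The result has more samples, but no frequency content the original rate couldn't carry.

```python
# Sketch of the audio analogy: mix low-rate tracks, then naively "upsample"
# the mix to twice the rate by inserting interpolated samples in between.
def mix(tracks):
    # Average the tracks sample-by-sample into one track.
    return [sum(samples) / len(tracks) for samples in zip(*tracks)]

def upsample_2x(samples):
    # Keep each sample and insert the midpoint before the next one.
    out = []
    for a, b in zip(samples, samples[1:]):
        out.extend([a, (a + b) / 2])
    out.append(samples[-1])
    return out

tracks = [[0, 4, 2], [2, 0, 6]]   # two toy low-rate tracks
mixed = mix(tracks)               # [1.0, 2.0, 4.0]
print(upsample_2x(mixed))         # -> [1.0, 1.5, 2.0, 3.0, 4.0]
```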
Upscalers can also use information in the video frames before and after the current frame to more accurately guess what the new pixels should be, filling the "gaps" between original pixels.
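A minimal sketch of that idea (toy numbers; it assumes a static scene, whereas real temporal upscalers also do motion compensation): blend the spatial guess from the current frame with the values at the same spot in the previous and next frames.

```python
# Sketch of temporal upscaling: estimate an in-between pixel by combining
# the spatial guess from the current frame with the values seen at that
# position in the previous and next frames. Assumes a static scene.
def temporal_guess(prev_val, cur_left, cur_right, next_val):
    spatial = (cur_left + cur_right) / 2   # guess from the current frame only
    temporal = (prev_val + next_val) / 2   # guess from neighboring frames
    return (spatial + temporal) / 2        # blend the two estimates

# The new pixel sits between values 100 and 140 in the current frame;
# the neighboring frames saw 118 and 122 at that spot.
print(temporal_guess(118, 100, 140, 122))  # -> 120.0
```

Because several frames each sample the scene at slightly different positions, combining them can recover genuine detail that no single frame holds, which is why multi-frame upscaling can beat purely spatial interpolation.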