DJI Mavic, Air and Mini Drones

HUGE difference in video quality between micro SD & Go4 app recording

4K is overkill anyway. I'm still experimenting with 1080 and it may be more than adequate for excellent HD videos.


It won't be overkill in a year or two. 4K TVs are now heavily discounted and 5K screens are already out there. My guess is that much more 4K will be shot in the next 12 to 18 months, and then you might just wish you had kept the 4K.

You can compress 4K down to 1080, but you can't take it up. Better to have it in storage now at 4K.
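The "can compress down, but not up" point can be sketched in a few lines of plain Python. This is a toy illustration, not how a real video scaler is implemented: each output pixel averages a 2x2 block of source pixels (the exact 3840x2160 to 1920x1080 ratio), and once four values collapse into one, the fine detail is gone for good.

```python
# Toy sketch: halve width and height by averaging each 2x2 block
# of brightness values, as a 4K -> 1080 downscale does (2:1 in
# each dimension). The averaged detail cannot be recovered by
# scaling back up.

def downscale_2x(frame):
    """Average each 2x2 block of a 2D list of brightness values."""
    out = []
    for y in range(0, len(frame), 2):
        row = []
        for x in range(0, len(frame[0]), 2):
            total = (frame[y][x] + frame[y][x + 1] +
                     frame[y + 1][x] + frame[y + 1][x + 1])
            row.append(total / 4)
        out.append(row)
    return out

# A 4x4 toy "frame" with fine detail (alternating dark/bright pixels).
frame = [
    [0, 255, 0, 255],
    [255, 0, 255, 0],
    [0, 255, 0, 255],
    [255, 0, 255, 0],
]
small = downscale_2x(frame)
print(small)  # every 2x2 checkerboard block averages to flat grey
```

Real scalers use smarter filters than a box average, but the information loss works the same way.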
 
4K is overkill anyway. I'm still experimenting with 1080 and it may be more than adequate for excellent HD videos.

I used to think that as well, but from my limited experience the 4K video from the Mavic (downsampled by my PC and TV to 1080) is an order of magnitude better in quality than native playback of footage recorded at 1080p.
 
OK, so here is how to tell cheaply:

Stick it on an SD card, go to a TV shop, and tell them you're looking for a new TV but want to check which ones can play your 4K footage. Stick it in all their TVs, find one that plays it, and have a watch.

Or find someone with a new laptop.

In general, very few laptops have a 4K display, and very few have decent video processors; those that do are expensive. No matter what you do, if you do not have a native 4K display, any 4K content is going to be compromised. Upscaling, downscaling, any scaling is going to introduce anomalies.
 
It won't be overkill in a year or two. 4K TVs are now heavily discounted and 5K screens are already out there. My guess is that much more 4K will be shot in the next 12 to 18 months, and then you might just wish you had kept the 4K.

You can compress 4K down to 1080, but you can't take it up. Better to have it in storage now at 4K.

5K is never going to see a broadcast. It's good for Apple and great for editing. 4K isn't even broadcast yet; bandwidth is expensive, etc.

As far as broadcasting is concerned, the resolution effectively has to double each time just for the eye to see a difference. 4K is sold but not broadcast unless it's upscaled. When and if television broadcasting moves past 4K, it will go to 8K.

Short story: 4K is still a waste of money, and full 1080 is fine. But yeah, it's not far off. I doubt the networks are in any hurry, though, as they only recently adopted 720/1080 (ATSC 1.0) after a long, long run with standard definition, and that transition took 15+ years. 4K would be ATSC 3.0.

The end result is that TV manufacturers pushed well ahead of broadcast capabilities and standards. Why? For the same reason the iPhone has looked the same the last few years: to make money. It will come eventually, but aside from internet streaming, OTA 4K is still a ways off.

Everything we see as far as DirecTV 4K etc. is fake: it's all upscaled, not native.
 
No matter what you do, if you do not have a native 4K display, any 4K content is going to be compromised. Upscaling, downscaling, any scaling is going to introduce anomalies.

True, but it's still going to be better than if you recorded it at 1080p. I don't know much about video, but I know a lot about photography, and I would have thought the same principles apply.

A digital video, the same as a digital image, is quite simply a collection of digital data compressed into a file. Simply put, the more data you squeeze into that file (regardless of the output), the more information you have recorded and hence the better the image will be.

I have a D810 capable of shooting a 36 MP image, which is good to print at billboard size. But if I print it at the tiny size of, say, A3, the full 36 MP file is still going to look better than if I had shot it at 12 MP, which is all you need for A3.

In photography, and video, more information is always better.
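As a rough sanity check of the A3 figure above, here's the arithmetic in Python. The 240 dpi print density is my assumption for a print viewed at normal distance, not a universal standard; change the dpi and the megapixel requirement changes accordingly.

```python
# Rough check: how many megapixels does an A3 print need at a
# given print density? (240 dpi is an assumption here.)

A3_INCHES = (16.5, 11.7)  # A3 paper, approx. 420 x 297 mm

def megapixels_needed(width_in, height_in, dpi):
    """Pixels needed to print width_in x height_in at dpi, in MP."""
    return (width_in * dpi) * (height_in * dpi) / 1_000_000

mp = megapixels_needed(*A3_INCHES, dpi=240)
print(f"A3 at 240 dpi needs about {mp:.1f} MP")  # about 11.1 MP
```

That lands close to the 12 MP figure quoted above, which is why a 36 MP file has detail to spare at A3.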
 
5K is never going to see a broadcast.

That sounds like one to preserve for posterity, like "no one will ever need more than 8 MB of storage" ;)
They'll probably invent a new nano-sync-double-reflex transmission method and go straight to 8K.
There won't be any need, but eventually it will be the default.
By then micro SD cards will still be $10 but hold 20 TB, so no problem.
 
True, but it's still going to be better than if you recorded it at 1080p. I don't know much about video, but I know a lot about photography, and I would have thought the same principles apply.

A digital video, the same as a digital image, is quite simply a collection of digital data compressed into a file. Simply put, the more data you squeeze into that file (regardless of the output), the more information you have recorded and hence the better the image will be.

I have a D810 capable of shooting a 36 MP image, which is good to print at billboard size. But if I print it at the tiny size of, say, A3, the full 36 MP file is still going to look better than if I had shot it at 12 MP, which is all you need for A3.

In photography, and video, more information is always better.

Agree with you here. The 4K has far higher image quality, so downward compression affects it less and it will still give a better picture on downscaling.

36 MP? What camera are you running?
 
True, but it's still going to be better than if you recorded it at 1080p. I don't know much about video, but I know a lot about photography, and I would have thought the same principles apply.

A digital video, the same as a digital image, is quite simply a collection of digital data compressed into a file. Simply put, the more data you squeeze into that file (regardless of the output), the more information you have recorded and hence the better the image will be.

I have a D810 capable of shooting a 36 MP image, which is good to print at billboard size. But if I print it at the tiny size of, say, A3, the full 36 MP file is still going to look better than if I had shot it at 12 MP, which is all you need for A3.

In photography, and video, more information is always better.

The ability to interpolate a photograph pixel by pixel has been around for a long time. It takes a good graphics processor and intensive algorithms.

To do the same frame by frame with video is not going to happen in anything like real time on a computer, short of a collection of Crays. Think about the rendering process when you finish an edit: that is the sort of time a frame-by-frame job takes. If you do render with a powerful algorithm and a good computer, then you would do best to render in the native format of the display you plan to use. Just tossing 4K video at a computer is not going to give the same results in real time. A good example is gamers: they have powerful computers, and they choose a native resolution for gameplay rather than trying to rescale on the fly.
 
The ability to interpolate a photograph pixel by pixel has been around for a long time. It takes a good graphics processor and intensive algorithms.

To do the same frame by frame with video is not going to happen in anything like real time on a computer, short of a collection of Crays. Think about the rendering process when you finish an edit: that is the sort of time a frame-by-frame job takes. If you do render with a powerful algorithm and a good computer, then you would do best to render in the native format of the display you plan to use. Just tossing 4K video at a computer is not going to give the same results in real time. A good example is gamers: they have powerful computers, and they choose a native resolution for gameplay rather than trying to rescale on the fly.

I'd love to be able to reply and have a conversation, but I'm nowhere near smart enough to understand that post, lol.
 
I'd love to be able to reply and have a conversation, but I'm nowhere near smart enough to understand that post, lol.

You are familiar with still photography and understand that having more pixels than necessary can be beneficial for photographs: the algorithm (program) scaling the picture can interpolate (smooth) for displays not capable of showing all the information. So let's say you have the pixel width and height now designated 4K (just an extension of the various resolutions from the old TV formats to the newer ones), and let's say your display is only able to show 1K, or 1080 as we call it. The rendering engine can take adjacent pixels in the original and smooth them to display nicely on the 1080 screen. So the original 8.3 megapixels is reduced little piece by little piece until the final product has only a little over 2 megapixels; over 8 million pixels are analysed for differences, edges, etc.

If you have a 4K video and a 1080 display and you want to display the video in near real time (only a little delay at the beginning, but no delays between frames), then at 30 frames per second the rendering program and computer (often a graphics chip) have to process a bare minimum of 249 million pixels per second. To render those with a crisp outcome, the number will be much higher still, as the computer attempts to determine edges and so on. A small delay in opening a photo to be rendered and displayed is never a problem, as we are used to waiting a little for a photo to appear; that cannot happen in a video.

Now, if you really want to screw it up, shoot that 4K video at 24 frames per second and display it on a device with a 60 frame-per-second refresh. The old two-frames-then-three-frames cadence, the same pulldown used to show movie film on 60 Hz displays, means a whole lot more processing. For sure something will have to give, and I can guarantee it will be video quality.
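The numbers above can be verified in a couple of lines. The UHD 4K frame size and the 3:2 pulldown cadence are standard figures; everything else here is just multiplication.

```python
# UHD 4K is 3840x2160, about 8.3 megapixels per frame, so a 1080
# display scaling it in real time at 30 fps has to touch roughly
# 249 million source pixels every second.
w, h, fps = 3840, 2160, 30
pixels_per_frame = w * h                    # 8,294,400 ~ 8.3 MP
pixels_per_second = pixels_per_frame * fps  # ~248.8 million
print(f"{pixels_per_second / 1e6:.0f} million pixels/second")

# The 24 fps -> 60 Hz case needs 3:2 pulldown: frames are shown
# alternately for 3 and 2 display refreshes, so 24 film frames
# fill exactly 60 refreshes.
refreshes = sum(3 if i % 2 == 0 else 2 for i in range(24))
print(refreshes)  # 60
```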
 
You are familiar with still photography and understand that having more pixels than necessary can be beneficial for photographs: the algorithm (program) scaling the picture can interpolate (smooth) for displays not capable of showing all the information. So let's say you have the pixel width and height now designated 4K (just an extension of the various resolutions from the old TV formats to the newer ones), and let's say your display is only able to show 1K, or 1080 as we call it. The rendering engine can take adjacent pixels in the original and smooth them to display nicely on the 1080 screen. So the original 8.3 megapixels is reduced little piece by little piece until the final product has only a little over 2 megapixels; over 8 million pixels are analysed for differences, edges, etc.

If you have a 4K video and a 1080 display and you want to display the video in near real time (only a little delay at the beginning, but no delays between frames), then at 30 frames per second the rendering program and computer (often a graphics chip) have to process a bare minimum of 249 million pixels per second. To render those with a crisp outcome, the number will be much higher still, as the computer attempts to determine edges and so on. A small delay in opening a photo to be rendered and displayed is never a problem, as we are used to waiting a little for a photo to appear; that cannot happen in a video.

Now, if you really want to screw it up, shoot that 4K video at 24 frames per second and display it on a device with a 60 frame-per-second refresh. The old two-frames-then-three-frames cadence, the same pulldown used to show movie film on 60 Hz displays, means a whole lot more processing. For sure something will have to give, and I can guarantee it will be video quality.

OK, thanks. I sort of follow your premise, but I don't fully understand where you're going with the processing-time part or what your recommendation is. When I play a 4K-recorded video on my PC (without a 4K monitor), isn't that downsampling it to 1080 on the fly? It isn't taking any load time, so I'm guessing I've misunderstood what you're saying?

Are you saying that it's better to record at 1080 when you are playing it on a 1080 screen, or that it's better to record in 4K and play it on a 1080 screen? That hasn't been my experience... not to my eyes, anyway.
 
OK, thanks. I sort of follow your premise, but I don't fully understand where you're going with the processing-time part or what your recommendation is. When I play a 4K-recorded video on my PC (without a 4K monitor), isn't that downsampling it to 1080 on the fly? It isn't taking any load time, so I'm guessing I've misunderstood what you're saying?

Are you saying that it's better to record at 1080 when you are playing it on a 1080 screen, or that it's better to record in 4K and play it on a 1080 screen? That hasn't been my experience... not to my eyes, anyway.

It will depend on the size of the screen and how the original resolution corresponds to the screen resolution. 4K down to 1K is going to be easier than other oddball ratios, and 30 to 60 frames per second is going to be easier than oddball ratios, so there is no hard and fast rule. If you like what you are getting using 4K, by all means use that. However, don't be surprised if a friend says, "WTF kind of crap video is that?" Say your friend does not use a computer but a medium-size TV that only has 720 resolution. Now, instead of an integer ratio of pixels, it is a ratio of 5.31. Which pixels do we discard? Which are critical to our eye not seeing jaggies or halos or whatever else we can resolve?

It seems a few people are discovering better video using 2.7K. One wonders if there really are 8.3 megapixels being recorded for every frame by the Mavic camera at 4K.
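The integer-versus-oddball-ratio point can be illustrated with a quick scale-factor calculation. The 2720-pixel width for the Mavic's 2.7K mode is my assumption; the point is only that some source widths divide evenly into a 1920-wide panel and some don't.

```python
# Linear scale factor from a few source widths down to a 1920-wide
# (1080p) panel. Integer factors let the scaler merge whole pixel
# blocks; fractional factors force it to interpolate between pixels.
target_w = 1920
sources = {"4K (3840)": 3840, "2.7K (2720)": 2720, "1080 (1920)": 1920}
for name, w in sources.items():
    factor = w / target_w
    kind = "integer" if factor.is_integer() else "fractional"
    print(f"{name}: scale factor {factor:.3f} ({kind})")
```

So 4K to 1080 is a clean 2:1, while 2.7K to 1080 is a fractional ratio the scaler has to interpolate across.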
 
I don't buy the logic that a 4K image will result in a superior 1080 image, if that's what you are saying.

In fact, the quality may (or may not) be degraded when those pixels have been rather brutally compressed, as is the case with 4K.

Anyway, the proof is in the pudding and all that. I'm doing my own experiments, centering on the aliasing of vertical ship masts for now, and may share my results later if I get organized enough, haha.
 
A partial update as my experiments continue: shooting at 2.7K is pretty darn good. I don't see the flickering artifacts at 1080, and the file size is significantly smaller and easier to handle.

But don't take my word for it. Experiment yourself and see how the same subject matter looks at a number of resolutions and frame rates; then we can all compare our results. I think it would be interesting.
 
I was watching the same video clip from my micro SD card (after copying it onto my PC) and the auto-recorded clip on my iPad (found in the 'editor' section of the Go4 app), and I was stunned at the difference.

The onboard video from the SanDisk Extreme card (the recommended micro SD for the Mavic) was total crap: hyper-sharp, with tons of dropouts, and very jumpy. I would have assumed the opposite, if anything. Anyway, does anyone know how to fix this issue?

I know it cannot be my video settings, because I had just optimized them in the app, and this same video clip that Go4 cached onto my iPad mini is perfect in every way.

You're joking, right?
 