
Technical H265 Question

Dakrisht

So I’m aware that you can shoot either H264 or H265 on the M2P. And that H265 offers better compression and allows for a 10-bit color space. Great.

But can anyone provide actual technical information as to why you’d use H265 over H264 when shooting NORMAL vs D-Log M?

You’re still shooting in an 8-bit color space but using a “newer codec.” I’ve read that H265 will “give better quality and details and reduce noise and artifacts,” but how? Where is the technical data to back up these claims?

Why shoot H265 over H264 in NORMAL is the question - and I’m searching for an answer aside from the “it’s a newer codec, everyone will switch to it soon.”
 
I'm no expert and can't give a great answer, but in my case I have a Mavic Air shooting H264 and my PC can edit and play the footage fine.
My PC won't do anything well with H265, so hardware might be a big issue if your PC can't handle H265.
I don't know much about the color differences, though.
 
..and I’m searching for an answer aside from the “it’s a newer codec,


Definitely no expert here and honestly, if someone wants to shoot in Normal I would suggest using H.264, simply because most folks do not have hardware that can easily and smoothly edit H.265 footage yet. The upside is the improved compression, which will dramatically reduce the file size, but the jury is out on how much the IQ improves with the newer codec in Normal. However, with the capacity of storage devices today, the reduced file size won't be a concern for most people.

Other than this info, I don't know of any reason to use the H.265 codec for filming in Normal 8-bit mode.
 
There is no noticeable difference between H264 and H265 at the same bitrate (100 Mbps).

Theoretically H265 should compress better at bitrates above 50 Mbps at 4K and reduce compression artifacts. That is why DJI offers H265 for the 10-bit image: it compresses the additional color data at the same quality/bitrate as H264.
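For anyone who wants actual numbers rather than impressions, here is a rough sketch (not a definitive test) of how you could check this yourself with ffmpeg, assuming ffmpeg with libx264/libx265 is installed and "clip.mp4" is just a placeholder name for a Normal-profile clip off the card: transcode the same source to H.264 and H.265 at the same bitrate, then compare each result against the source with SSIM.

```python
# Rough sketch (not a definitive test): transcode one source clip to H.264 and
# H.265 at the same bitrate, then measure SSIM of each result against the source.
# Assumes ffmpeg with libx264/libx265 is on the PATH; "clip.mp4" is a placeholder
# name for a Normal-profile clip straight off the card.
import subprocess

SRC = "clip.mp4"
BITRATE = "100M"  # roughly the M2P's recording bitrate

def encode(codec: str, out_file: str) -> None:
    subprocess.run(
        ["ffmpeg", "-y", "-i", SRC, "-c:v", codec, "-b:v", BITRATE, "-an", out_file],
        check=True,
    )

def ssim_vs_source(encoded: str) -> None:
    # The ssim filter prints an "SSIM ..." summary line; values closer to 1.0
    # mean the encode is closer to the original source.
    subprocess.run(
        ["ffmpeg", "-i", encoded, "-i", SRC, "-lavfi", "[0:v][1:v]ssim", "-f", "null", "-"],
        check=True,
    )

encode("libx264", "h264_test.mp4")
encode("libx265", "h265_test.mp4")
ssim_vs_source("h264_test.mp4")
ssim_vs_source("h265_test.mp4")
```

If the two SSIM figures come out essentially identical, that backs up the "no visible difference at the same bitrate" claim; H.265's efficiency shows up when you lower the bitrate, not when you hold it fixed.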
 
There is no quality difference in Normal with H265; you aren't adding any data, you're just making the existing data take up less space with a more efficient compression algorithm.

In 10-bit D-Log, H265 is used to keep the 100 Mbps bitrate and manageable file sizes despite all the extra information.

If you're shooting in normal just use H264, file sizes aren't big, storage is so cheap it's almost free (both memory cards and general storage), and any somewhat current hardware can play/edit it without issue.
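To put rough numbers on "file sizes aren't big" (back-of-envelope only, assuming the nominal 100 Mbps bitrate):

```python
# Back-of-envelope: what ~100 Mbps works out to on the card.
bitrate_mbps = 100                      # megabits per second (nominal M2P video bitrate)
mb_per_minute = bitrate_mbps / 8 * 60   # bits -> bytes, then seconds -> minutes
gb_per_hour = mb_per_minute * 60 / 1000
print(f"~{mb_per_minute:.0f} MB per minute, ~{gb_per_hour:.0f} GB per hour")
# ~750 MB per minute, ~45 GB per hour -- the same for H.264 and H.265 here,
# since both record at the same capped bitrate; H.265 just spends those bits better.
```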

When I'm on vacation I shoot in Normal/H264 because I don't have to worry about correcting the horizons and I like to edit in LumaFusion, which is only an 8bit space currently anyway. Need to watch your exposure more carefully though.
 
CD: Thanks for the clarification, as always. For your sake (and for all the others using LumaFusion), I'm guessing it will be a long time, if ever, before LF supports 10-bit due to processing requirements.
 
  • Like
Reactions: CanadaDrone

I think the processing is there, as the newest iPad Pros should be up to the task. They told me they are working on making the workspace 10-bit; this is what they said exactly in their correspondence to me: "We will be adding 10-bit processing through our pipeline in a future update." No idea what the timeline is, though.

Lens correction, on the other hand, they said is on their list, but they weren't working on it at this time. The 10-bit workspace is of limited value when you still need to use a separate program to correct the lens distortion, at least with regard to the M2P and how that is currently handled.
 
  • Like
Reactions: brett8883
CD: Good news and bad news I guess. I still can't believe that the latest iPad Pro has the horsepower to edit 10-bit H.265 but I am by no means a CPU expert. Anyway, for your sake, I hope both updates come sooner than expected.

I must admit, I can see the validity of editing on a large Retina iPad with a stylus, and I'm sure it would be perfect for all but the most critical editing tasks and more than good enough for many people. But, until it's possible, I'll wait on upgrading the wife's older iPad Pro 12 and just stick with the new desktop machine with dual 4K displays and Adobe's Ransomware. :)
 
  • Like
Reactions: CanadaDrone

The iPad Pro actually isn't that good for editing, it's just incredibly convenient when traveling, which for me personally is when it sees all its use. Dealing with storage is annoying, the screens aren't good for critical editing (from a calibration and brightness uniformity standpoint), and they aren't nearly as fast as even a very modest PC. It helps that the apps are written specifically to take advantage of the iPad hardware, though. I couldn't imagine trying to edit photos on an iPad, especially in large numbers, but for non-professional video it's quite good IMO. Stick with your desktop machine - it's better unless you need to travel with it :)
 
  • Like
Reactions: kilomikebravo

You’d be shocked. I know I was. The iPad Pro can edit photos and video better than all but the most powerful desktop computers. In fact, even my iPhone 8 renders H.265 faster than real time.

It’s mind boggling.
 
  • Like
Reactions: Pigeon Camera

The iPad Pro consistently benchmarks faster than 90% of current laptops, including some of the newer MacBook Pros. No doubt the A12X is a monster processor.

Currently there's no 10-bit color support on the iPad, and personally I'm not a big fan of Luma, as DaVinci Resolve is leaps and bounds better, but the iPad is probably the future for most non-studio edits.

Apple did just launch a new MBP today with an 8-core i9 CPU so that machine should fly but again, $3500 vs. $1200 for a similar but not that comparable device (iPad Pro).
 

Is it just that LumaFusion can’t display 10-bit images in 10-bit, or can it not even open them? I guess for me, if it’s just that it can’t display 10-bit, that’s not a huge deal since you won’t be rendering into 10-bit anyway.

As a photographer I work with 16- and 32-bit images all the time. I can’t see what the original looks like, but that doesn’t mean I can’t edit the way the final product looks.

But hey, I have a workflow that’s perfect for you! How it works is you just open your video in DaVinci Resolve, pick a frame to color grade, and then do what you’d normally do to it. Then, instead of rendering in DR, just export the LUT from Resolve and send it to your iPad via AirDrop or a network connection.

The iPad will already recognize a LUT as something LumaFusion can use, so it will give you the option to open it in LumaFusion. Then you just apply the LUT to the same video and BOOM, you’ve just edited in DR and rendered in LumaFusion.

The same works for Photoshop too btw.
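As a side note, and purely as an illustrative sketch, the exported .cube file isn't tied to LumaFusion at all; it's just a 3D lookup table, so any tool with a 3D-LUT filter can apply the same grade. For example, on a desktop with ffmpeg installed (file names here are hypothetical):

```python
# Illustrative sketch only: a .cube LUT exported from Resolve is a 3D lookup
# table, so any tool with a lut3d filter can apply the same grade. File names
# ("flight.mp4", "grade.cube") are hypothetical.
import subprocess

subprocess.run(
    ["ffmpeg", "-i", "flight.mp4",
     "-vf", "lut3d=file=grade.cube",   # look up each pixel's RGB in the 3D table
     "-c:v", "libx264", "-crf", "18",
     "graded.mp4"],
    check=True,
)
```

The lut3d filter does essentially the same thing LumaFusion does when you drop the LUT onto a clip: each pixel's RGB value is looked up in the table and replaced with the graded value.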

Now does anybody know if you can use keyframes in Lumafusion?
 

I like that idea. A custom LUT for that particular edit. Works well?

I don’t shoot much D-Log video since I don’t want to spend time color grading it nor do I have the need for it - normal actually works pretty well for my applications. Main focus is on still photography with this drone, 90% of my work / fun are stills.

Luma downsamples to 8-bit and still doesn’t support 10-bit HEVC, even though Apple gave iOS the ability to support it back in December. I’m not quite sure why they haven’t pushed an update to support 10-bit color for 6+ months; I’m sure they’re working on it, but perhaps it's a bit more complex than we think.

I wish Blackmagic made a light version of Resolve for the iPad, but it’s probably a micro-niche market and not worth it for them to spend the resources to code a native app.
 
Stick with your desktop machine - it's better unless you need to travel with it


CD: I wonder, have you ever written a NON-informative post? <smile> I swear, you have more data in that noggin' of yours than almost anyone I know!

Anyway, thanks for the confirmation. As someone who USED to travel every other week for work (retirement has now saved me from that arduous drill), I shall, as you suggested, stick with the desktop and let the wife have fun playing games on the iPad. <bigger grin>
 
  • Like
Reactions: CanadaDrone
I guess for me, if it’s just that it can’t display 10-bit, that’s not a huge deal since you won’t be rendering into 10-bit anyway.


Brett: I think it was mentioned in this thread or another that LF strips two bits from a 10-bit file during import (although it was CanadaDrone who told us this first), and no, you can't display 10-bit on the vast majority of displays. But isn't the whole point of 10-bit the advantages it gives you during grading? For me anyway, it IS a "huge deal" and one of the main reasons I bought the M2P in the first place. Had I been happy with 8-bit, I would have purchased the Zoom.
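To make the grading argument concrete with a rough, illustrative calculation (the 25% figure below is just an assumed example of how much of the tonal range a heavy grade might stretch to the full output range):

```python
# Rough illustration of grading headroom: if a grade stretches, say, a quarter
# of the captured tonal range across the full output range, an 8-bit source has
# far fewer distinct levels to spread out than a 10-bit one, which is where
# banding comes from.
def levels_after_stretch(bit_depth: int, fraction_of_range_used: float) -> int:
    total_levels = 2 ** bit_depth
    return int(total_levels * fraction_of_range_used)

for bits in (8, 10):
    print(bits, "bit:", levels_after_stretch(bits, 0.25), "distinct levels stretched to full range")
# 8 bit: 64 distinct levels stretched to full range   -> visible banding risk
# 10 bit: 256 distinct levels stretched to full range -> much smoother gradients
```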


Apple did just launch a new MBP today with an 8-core i9 CPU so that machine should fly but again, $3500 vs. $1200 for a similar but not that comparable device (iPad Pro).


DAK: I recently upgraded my Windoze desktop with an i9-9900, 64GB of memory, two M.2 drives, and an NVIDIA GTX 1070 Ti, all for less than $1500. I never cease to be amazed at what Mac fans will pay for hardware.
 

Right, if it just couldn’t display them then THAT wouldn’t have been a huge deal, but if it converts them to 8-bit then I get it. Seems stupid; why the heck do they do that?

You can’t ever compare the hardware of a PC to the hardware of a Mac. It’s like comparing a Tesla to a Ferrari. It just doesn’t work the same way. You have to compare them on how fast they get from A to B.
 
Seems stupid; why the heck do they do that?


Brett: It is simply because mobile devices do not have the horsepower to edit 10-bit video.

It’s like comparing a Tesla to a Ferrari.


Not sure if that's the best analogy, amigo. Seems to me that Macs use Intel processors as well, no? Personally, I think the comparison is more like Ford and Cadillac. You get a prettier interface with one but you pay dearly for it, while the "Ford" Windoze machine does exactly the same things; it just does them more cheaply.
 
You’d be shocked. I know I was. The iPad Pro can edit photos and video better than all but the most powerful desktop computers. In fact, even my iPhone 8 renders H.265 faster than real time.

It’s mind boggling.

I'm actually not shocked as I also have a 'proper' video/photo editing PC, so I know how much better it can be. However, the main source of confusion is almost always with how the benchmarks work :) What so many people don't realize (not saying this is you) is that the Geekbench scores that are so often touted (this is how they are compared to laptop CPUs) are architecture benchmarks, not actual performance comparisons. Apple (and other manufacturers) spin this in various ways. It is the only way you can technically do an "apples to apples" comparison with much more powerful CPUs, because it takes things like wattage out of the equation. Obviously, the 8W CPU or whatever the A12X is gets absolutely demolished by a 35W CPU found in some of the most common laptops, and 90W+ desktop CPUs are in another universe. Performance per watt, however, is on par with entry-level ultra low voltage laptop CPUs, most of which are in the range of 8W to 15W, similar to the A12X. In that context, the iPad Pros with the A12X are technically delivering "laptop performance" in a tablet.

Apple's CPU architecture is very efficient, but so is Intel/AMD, and the X86 platform is exponentially more capable at this time. When you are comparing architectures though, both of which have been so incredibly optimized, they are similar. The difference then becomes core count, wattage, clock speed, etc. which give enormous advantages to the non-mobile world.

My 2018 iPad Pro 11" takes about 8 seconds to load a single, small, 20MP RAW file into Affinity. My PC will load dozens of 46MP 14-bit RAW files from my FF DSLR into Photoshop in less time than that. Any kind of serious photo editing on the iPad is absolutely excruciating. There is no file system, there is no way to batch process, you can't calibrate the screen, you can't use custom color profiles, everything has to be funneled through the Photos app, you can't use third-party plug-ins, etc. Video editing is a better story with programs like LumaFusion, but that has its limitations as well (8-bit only, no distortion correction, etc.). Even a relatively cheap laptop makes a much faster photo editor than the iPad Pros, but the iPad Pros do handle basic video tasks quite well.

The iPads can do certain things very quickly, as some apps are very well optimized to run on the hardware and perform specific tasks. The notion that they are more powerful than even a modest laptop, though, is not true, and Apple works very hard to make you think it is by touting things like Geekbench scores, which give customers numbers to look at, so higher must mean better, which it technically does, but it is almost never interpreted in the proper context.
 
CD: I wonder, have you ever written a NON-informative post? <smile> I swear, you have more data in that noggin' of yours than almost anyone I know!

Anyway, thanks for the confirmation. As someone who USED to travel every other week for work (retirement has now saved me from that arduous drill), I shall, as you suggested, stick with the desktop and let the wife have fun playing games on the iPad. <bigger grin>

I am glad you find my posts helpful :) My background and personal hobbies just happen to translate well into the drone world (professional photography, IT, home theater, building computers, etc.) All of those things are fairly interconnected.
 
The notion that they are more powerful than even a modest laptop, though, is not true...

I respect your opinion and you seem very knowledgeable about a lot of things, but just to test your theory I opened a 135.7MB 32-bit HDR .tiff file in Lightroom mobile and it opened right up, and I was able to edit it in real time.

I also opened the same photo as a 16-bit HDR .DNG; it also opened just fine and I was able to edit it in real time.

I exported them to the Files app, because iOS does in fact have a file system, and the JPEGs rendered instantaneously, something that would have taken my MacBook Pro a second or two. Also, I was using an iPhone 8, not an iPad Pro.

I still prefer to do them on my computer because that's what I'm used to, but it seems the Fusion chips are plenty capable of doing these things; I'm not sure why it was giving you so much trouble.

There may not be a program for batch edits on the iPad (maybe Photoshop?), but if not, it's just a matter of time.

Before you dismiss the idea, try opening the same 4K video file on your iPad and in your video editor of choice on your computer. Do a few basic adjustments on both platforms, then render both in H.265 and see how the rendering times compare. This is the test that made my jaw drop, and I bet yours will too if you think the iPad Pro can be bested by a modest PC.

There's certainly room for improvement on the programming side. I would like to see Adobe make a full-featured Premiere Pro for the iPad Pro, but to say the hardware can't handle it just hasn't been my actual experience when using it. I can render a 5-minute 4K 30fps video into H.265 in just a couple of minutes with my iPhone 8, which isn't as powerful as the iPad Pro. I don't know of any modest PC that is capable of that.
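For the desktop half of that test, here's a minimal sketch, assuming ffmpeg with libx265 is installed and "test_4k30.mp4" stands in for your clip; the iPad/iPhone side you'd simply time by hand in LumaFusion.

```python
# Sketch of the desktop half of the render-time test suggested above.
# Assumes ffmpeg with libx265 is installed; "test_4k30.mp4" is a placeholder
# for the same 4K/30 clip you render on the iPad.
import subprocess, time

start = time.perf_counter()
subprocess.run(
    ["ffmpeg", "-y", "-i", "test_4k30.mp4",
     "-c:v", "libx265", "-b:v", "50M", "-an",
     "hevc_out.mp4"],
    check=True,
)
print(f"H.265 encode took {time.perf_counter() - start:.1f} s")
```

One caveat: libx265 is a software encoder, while the iPhone/iPad renders HEVC on a dedicated hardware block, so swapping in a hardware encoder such as hevc_nvenc or hevc_videotoolbox (if your machine has one) makes for a fairer comparison.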
 