
24, 25, or 30 FPS

I know you are limited to 1/fps exposure time (well, technically not, but practically yes) and I agree that high shutter rates are worse for poor lighting conditions (for example, actual slomo cameras require insanely bright studio lighting). That wasn't my question.

I asked how you think the 180 shutter rule relates to a digital shutter, where there's no mechanical shutter plate.
 
I know you are limited to 1/fps exposure time and I agree that high shutter rates are worse for poor lighting conditions. That wasn't my question. I asked how you think the 180 shutter rule relates to a digital shutter, where there's no mechanical shutter plate.

How you capture the frame, or whether there is a physical shutter plate involved, really doesn't matter. Regardless of how it's achieved, or on what medium, you are going to end up with a sequence of still images to be played back at a given rate to simulate motion. The relevant variable you control is how long each of those images takes to capture (and by inference, how long the sensor is idle between each restart of the expose/read process).

The 180 shutter rule still applies to digital because it provides a guide as to a good starting point for what your exposure time needs to be at a given frame rate to strike the balance I outlined above. That starting point then lets you select appropriate camera settings (aperture/ISO) and add an ND filter if you feel one is necessary. Yes, there are exceptions (especially for things like propellers, for instance), and also scenarios where you may want to break the rule on purpose for a specific "feel" to the video, but covering every case isn't the point of a "rule of thumb", is it?
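To make that concrete, here's a quick illustrative sketch (Python; the frame rates are just examples) of the starting-point shutter the rule suggests:

```python
# Illustrative sketch: the 180-degree rule as a starting point.
# exposure time = (shutter angle / 360) / frame rate

def target_shutter(fps, angle=180):
    """Suggested exposure time in seconds for a given frame rate."""
    return (angle / 360) / fps

for fps in (24, 25, 30, 60):
    t = target_shutter(fps)
    print(f"{fps} fps -> 1/{round(1 / t)} s")  # 24 -> 1/48, 30 -> 1/60, etc.
```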

Here's RED's take on the subject, if you are interested.
 
How you capture the frame, or whether there is a physical shutter plate involved, really doesn't matter.
You don't know what you're talking about. In digital video, your sensor is always active. By "shutter", you only control the sampling frequency. There's no shutter angle in digital. Rolling shutter *effect* is an artifact of DSP, not of an analog shutter plate. Et cetera.

The 180 shutter rule still applies to digital because it provides a guide as to a good starting point...


"it applies because non sequitur", sigh.

No, how does it apply mechanically? Do you know where the 180 shutter rule comes from, and what a shutter angle is? Why was rolling shutter conceived in the first place? What's a global shutter and how does it differ from rolling shutter? What shutter angle does your Mavic shoot at (trick question)?
 
Brace yourselves, the essay is coming!

Consider analog video.
  1. Shutter is a physical plate that controls whether your film is being exposed or not.
  2. Shutter angle is the open part of your shutter that exposes the film.
    • A 180 shutter is a rotating (rotary disc) shutter that exposes the film for 1/2 of its rotation.
  3. Shutter speed is the time a single frame of your film is being exposed for.
  4. Frame rate is the speed your film moves along the film gate.

Now, for digital video with a consumer-grade camcorder.
  1. There is no physical shutter.
  2. There is no shutter angle.
    • it's technically equivalent to a 360 shutter, which is impossible in analog.
    • shutter angle doesn't apply because electronic image sensors are always active.
  3. Shutter speed is the time for which a digital sensor will be integrating cell charges into a frame.
  4. Frame rate is a property of a video file that tells the decoder how fast it should play.
    • a digital sensor itself has no awareness of frame rate even being a thing

In analog cameras, shutter angle is completely unrelated to frame rate.
  • A 15 degree shutter and a 180 degree shutter will both take a whole revolution to:
    • expose a single frame of a film
    • give time for the film to move so another frame can be exposed
  • The revolution speed of a rotary shutter is strictly bound to frame rate
    • the purpose of a shutter is to allow a frame of a film to move into place
    • it has to be synchronized with the speed the film is moving at
  • The shutter angle of a rotary shutter determines exposure time!
    • 1/exposure = frame rate multiplied by 360/angle!
    • 24 x 360/180 = 48! So a 1/48 exposure.
Now, in our digital video cameras, there's obviously no shutter, and you control exposure directly, because you can, because electronics is amazing.
  • if you want 1/200 shutter with 25fps, just set 1/200 shutter. No fiddling with changing metal plates in your camera.
  • if you set 1/25 shutter while recording at 20fps, you will simply leave 5/25, or 1/5, of each second unexposed (=uncaptured movement).
  • if you shoot 1/48 shutter while recording at 24fps, you're simply discarding half of the motion data that the sensor could produce.
  • the smaller the "shutter angle", or the ratio of exposure time to frame time, the less motion blur you get (assuming constant frame rate), but the more motion data you discard (see the sketch after this list).
  • jumps between frames are less visible at high frame rates, just because you have more of them and it's easier on the eye to produce the illusion of movement.
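
To put numbers on that ratio, here's a minimal illustrative sketch (Python; the exposure/fps pairs are just examples) converting a digital exposure time into its equivalent "shutter angle" and the fraction of motion captured:

```python
# Illustrative sketch: equivalent "shutter angle" of a digital exposure,
# and the fraction of each frame interval actually captured.

def effective_angle(exposure_s, fps):
    """Equivalent shutter angle in degrees (360 = the whole frame interval)."""
    return 360 * exposure_s * fps

for exposure, fps in ((1 / 48, 24), (1 / 200, 50), (1 / 60, 30)):
    angle = effective_angle(exposure, fps)
    captured = angle / 360  # fraction of the frame time the sensor is exposing
    print(f"1/{round(1 / exposure)}s at {fps}fps -> {angle:.0f} degrees, "
          f"{captured:.0%} captured, {1 - captured:.0%} discarded")
```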

So, you say, why not shoot at 1/fps all the time? After all, it captures ALL the motion and there should be no jump between frames!
Well, the problem is, sometimes you get too much light and it overwhelms your sensor (or physical film), and you end up with an overexposed picture.

If you have an adjustable aperture, you just increase the f-number, that is, you make your aperture smaller. This lets you use longer exposure times, thus capturing more motion blur, and you end up with more natural movement without jumps between frames.

Or you can lower your gain, but only if you're not already at your sensor's base (minimum) gain.

But what if we don't have an adjustable aperture? We use an ND filter. The purpose of an ND filter is to limit the amount of light entering the lens, letting us use longer exposure times and better capture movement - we end up with more blur and fewer jumps between frames, and we don't overexpose on sunny days at low shutter speeds.
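
As a rough worked example (hypothetical numbers; assuming the usual ND naming where ND8 = 3 stops): say your fixed-aperture camera meters 1/400 in bright sun, but you want the 1/48 the 180 rule suggests at 24fps:

```python
import math

# Illustrative sketch: how much ND you need to hold a target shutter speed
# at a fixed aperture and gain. Each stop of ND halves the light.

def nd_stops(metered_shutter_s, target_shutter_s):
    return math.log2(target_shutter_s / metered_shutter_s)

stops = nd_stops(1 / 400, 1 / 48)
print(f"Need ~{stops:.1f} stops")  # ~3.1 -> an ND8 (3-stop) filter is closest
```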

If you made it to this point, go get yourself a treat. :D

So, to sum things up, you should always shoot at the maximum available frame rate and the slowest available shutter speed (longest exposure) if the conditions allow. This lets you capture the scene completely, with maximum available temporal resolution (you record short events and they don't get stretched in time). If you don't have an adjustable aperture, use an ND filter in high-light conditions, or shorten the exposure. Only lower the frame rate if you don't have enough light (it's too dark), as lowering the frame rate lets you use longer exposure times.

Now keep in mind, the lower our framerate, the more we will have to decrease shutter time (=increase shutter speed) if our entire scene moves between frames, so our recording doesn't end up as an unrecognizable mess. It's easier to visualise if we think about a 10fps recording from someone waving their camera around, where each frame is exposed for 1/10th of a second. Of course we don't want that.

"But I still wanna shoot 1/60 at 30fps!!!1" Well if you're fine with discarding half of what happens in the scene... go ahead, I guess? Can't stop ya. But be careful with events shorter than 1/30th of a second. Blink (your shutter) and you miss it. You have discarded that data. It's gone forever.
 
You don't know what you're talking about. In digital video, your sensor is always active. By "shutter", you only control the sampling frequency. There's no shutter angle in digital. Rolling shutter *effect* is an artifact of DSP, not of an analog shutter plate. Et cetera.

There's a difference between light falling on the sensor and that light being *recorded*; I thought it was pretty clear from the context that I was referring to the latter, but apparently not. E.g. if you are exposing at 1/200s at 50fps (0.25s of *exposure* per second), then 75% of the light hitting the sensor is not being recorded. If it *were* being recorded, then we'd have a lot more problems with blown highlights, no?

"it applies because non sequitur", sigh.

It's a *rule of thumb*. It is, by definition, intended solely as a starting point that will give reasonable results in the majority of circumstances and not something to be taken as a literal thing to be followed without question in every situation like a law of physics. See the "Sunny 16 rule", and many others. We're also dealing with a creative medium here, so there isn't really even a "right" setting other than where necessary for an intended effect.

No, how does it apply mechanically? Do you know where the 180 shutter rule comes from, and what a shutter angle is? Why was rolling shutter conceived in the first place? What's a global shutter and how does it differ from rolling shutter? What shutter angle does your Mavic shoot at (trick question)?

I think the link to RED's site I provided covers most of that pretty well, other than the shutters (rolling = read the sensor line by line, typically CMOS; global = snapshot the entire sensor as-is then read the data off, typically CCD - since you asked), but the key bit, and the definition I was using, is "Although current cameras don't necessarily control shutter speed in this way, the shutter angle terminology has persisted as a simple and universal way of describing the appearance of motion blur in video."

Yes, it's a terminological legacy of the film/mechanical-shutter era, but, as RED notes, what it represents in terms of results is still valid for digital. Understanding that, and how it relates to the type of sensor you have, allows you more creative control, which is the other aspect I was trying to get across.
 
There's a difference between light falling on the sensor and that light being *recorded*; I thought it was pretty clear from the context that I was referring to the latter, but apparently not. E.g. if you are exposing at 1/200s at 50fps (0.25s of *exposure* per second), then 75% of the light hitting the sensor is not being recorded. If it *were* being recorded, then we'd have a lot more problems with blown highlights, no?

Yes, I get it, and you should always go for capturing as much of the frame time as possible. Discarding half of the frame time makes no sense. You're losing information. Yes, you will get too much blur at low framerates when using 360 shutter. That's what high framerate is for. If you can increase framerate before decreasing exposure, do it!

Yes, if you record at 120fps with a 360 shutter angle and re-encode it to 60fps, you will get one of these:
- skip every 2nd frame, resulting in de facto 60fps at a 180 angle (best option IMO)
- interpolate 2 frames into 1 (blended frames, looks meh IMO)
- screen tearing (unacceptable IMO)

This should be adjustable in your editing software. Keep in mind not all TVs/software players can degrade gracefully.
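
Here's a minimal sketch of the first option (illustrative only; the frame list is a stand-in): dropping every 2nd frame of 360-shutter 120fps footage leaves 60fps where each exposure covers half of the new frame interval, i.e. an effective 180 angle:

```python
# Illustrative sketch: decimating 360-shutter 120fps footage to 60fps by
# keeping every 2nd frame. Each kept frame was exposed for 1/120s, but now
# spans a 1/60s interval -> an effective 180-degree shutter.

frames_120 = list(range(12))   # stand-in for 12 decoded source frames
frames_60 = frames_120[::2]    # keep every 2nd frame

angle = 360 * (1 / 120) / (1 / 60)
print(frames_60, f"-> effective shutter angle: {angle:.0f} degrees")
```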
 
Yes, I get it, and you should always go for capturing as much of the frame time as possible. Discarding half of the frame time makes no sense. You're losing information. Yes, you will get too much blur at low framerates when using 360 shutter. That's what high framerate is for.

It makes perfect sense to discard some of the frame time if you have a fast enough moving subject, even at the 60fps most prosumer drones max out at. It can become essential if you (or your client) want sharp freezeframes and/or the option to extract stills from the video. A 360 shutter at 60fps means each frame is exposed for 1/60s; how far do you think a fast-moving vehicle can move in 1/60s? Far enough to render it as an unrecognisable blur, that's how far.

It's not all about nice static landscapes, you know. :)
 
I missed the point? Dude, it's the first thing I said in this thread. How the sensor reads things and how it's recorded digitally are two different things. I think you're confusing concepts.
 
I have been following this thread with fascination. I have a long history in the technical side of video dating back to analog and the transition to digital. In my very humble opinion, the frame rate discussion (resolution plays a part as well) is more about the creative purpose than the technical details. I use a scale from fantasy to reality. If you are creating a movie to enjoy, it is more about presenting the fantasy; if you are creating an informative (or sports) video, you are presenting the reality. Fantasy = lower frame rates (and lower resolution); reality = faster frame rates (and higher resolution).
 
So is there a difference between the P4P camera and the 2 Pro because of the different shutters? I hope I'm asking that right.
 
@Vegasdad Not only the shutter (global shutter P4P vs. electronic shutter M2P) but the sensor is different too, so different results are to be expected. ;)
They do share the same sensor size (1"), the same aspect ratio (3:2) and the same resolution (5472 x 3648), but due to the work with Hasselblad, they may behave differently concerning colours, sensitivity, etc., which needs to be taken into account.

Whether we can actually see the difference in the results is another story; as I don't have the P4P, I cannot tell.
 
I am an engineer who transfers and restores movie film.

fps

I have transferred film that was shot at 12 fps (1928 film). At that speed you can actually see individual frames and there is no illusion of smooth motion. However, it is still enjoyable to watch. Here's an example:


If the frame rate is the standard sound film speed of 24 fps, you no longer see individual frames, but the motion is still not smooth.

However ...

It is not "worse" nor is it "better" than higher speeds.

This is the thing that is missed in many of the posts. Many shows and movies are still filmed at 24 fps, not because that was the way it was done when movie film was the only way to capture motion, but because it imparts a "once removed" feeling to the action. Put another way, it does not look like reality, and that is actually a really good thing for putting the viewer in the right state of mind to believe the fantasy that is being projected.

By the time you get to 60 fps (or 30 fps interlaced), the motion is fluid and that "once removed" feeling is gone. There is very little change you will perceive by going to higher frame rates.

Spatial Resolution

720x480 NTSC resolution on a 55" TV screen does not look crisp. 1920x1080 HD resolution looks really sharp.

But here's the thing: continuing to add pixels doesn't provide a better looking picture on that 55" TV. The reason is that the ability to resolve additional detail depends on the size of the screen and the viewing distance. For a 55" screen viewed at 10-15 feet, 4K video (4096 x 2160) won't provide you with the sense of a superior picture. You might see a very, very small improvement, but in most cases you won't see a thing.

For a 55" TV screen, 4K is mostly a wasted effort.

If you go to a huge screen, or if you view at a really close distance, then you will see more detail. Also, 4K is a really good acquisition format because you can crop in post and deliver in HD, and you can do many other tricks once you "have more pixels than you need." Always shoot in 4K, when you can, even if you plan to deliver in 1920x1080 HD.
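
If you want to sanity-check the viewing-distance argument, here is a rough back-of-the-envelope sketch (it assumes the common rule of thumb that 20/20 vision resolves about 60 pixels per degree, and a 16:9 panel; the exact numbers are only illustrative):

```python
import math

# Rough sanity check of the viewing-distance argument.
# Assumes ~60 pixels per degree is all a 20/20 eye can resolve.

def pixels_per_degree(h_pixels, screen_width_in, distance_in):
    h_angle = 2 * math.degrees(math.atan(screen_width_in / (2 * distance_in)))
    return h_pixels / h_angle

width = 55 * 16 / math.hypot(16, 9)      # ~47.9" wide for a 55" 16:9 panel
for name, px in (("1080p", 1920), ("4K UHD", 3840)):
    print(f"{name}: {pixels_per_degree(px, width, 120):.0f} px/deg at 10 ft")
# 1080p already lands around ~85 px/deg, past the ~60 px/deg acuity limit,
# so the extra 4K pixels are invisible at this size and distance.
```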

Compression

This is the killer for most people. The quality and amount of compression can absolutely kill detail and introduce huge artifacts that are hard to ignore. It is really hard to find video sources that are not highly compressed. The compression in my Mavic 2 Pro is not very well done, and is definitely an issue compared to the compression in most prosumer dedicated video cameras. Those cameras, of course, have virtually no weight limits compared to an aircraft, and can afford more CPU horsepower to do a better job on compression.

So, as to the answer to the OP's question:

24 fps works well if you want a "once removed" feeling to your video. It is an international standard that can be played anywhere.

25 fps is a European standard for video that grew out of their original PAL SD video standard. If you live in the US or if your distribution includes the USA, you should avoid this frame rate.

30 fps is the North American (and Japan) NTSC video standard. It is actually quite rare to have 30 fps progressive video, because most video is interlaced which gives you an effective 60 fps temporal feel. Since 30 fps does not give much more of a fluid feel compared to 24 fps, I would avoid this one as well. In this day and age, most people are probably better off not shooting 30 fps interlaced (sometimes called 60i) and I don't even know if the M2 Pro has this option. Instead, you should shoot 60 fps progressive (a.k.a. 60p).

So my advice is this: shoot 24 fps when you want cinematic, "once removed" feeling, and shoot 60 fps for everything else. Avoid all other frame rates.
 
I am an engineer who transfers and restores movie film.

...

So my advice is this: shoot 24 fps when you want cinematic, "once removed" feeling, and shoot 60 fps for everything else. Avoid all other frame rates.

Great information and I appreciate you sharing your expertise. I love the detail and the history.

I believe the 30fps, as well as all of the frame rates, on the Mavic 2 are progressive, not interlaced. See page 65 of the latest manual.
 
I believe the 30fps, as well as all of the frame rates, on the Mavic 2 are progressive, not interlaced. See page 65 of the latest manual.

Yes, it's 30p; the M2 has no interlaced modes at all. You can manipulate that into a fairly reasonable 60i or 60p in post in a pinch, but the only real reason for doing that is that you're low on storage space for some reason.

Unless you have a specific need to shoot at a given frame rate (whether creative or technical), the "ideal" choices will usually be either 24fps for the cinematic effect, or the fastest possible frame rate for a given resolution for maximum sharpness (subject to available light).

One very valid reason for choosing a lower frame rate is that higher framerates (and bitrates for that matter) require more CPU to encode. That translates into more battery usage, which in turn means a reduced flight time, so if you need the maximum possible flight time and can get away with fewer fps, then it's an option worth considering.
 
The MP2 is more complex than just the frame rate... 100 Mb/s h264 looks worse than 100 Mb/s h265; however, the processing on the PC/Mavic (well, not the Mavic, as it's unprocessed, hence the barrel roll) is more demanding for x265.

24 (23.976) is the 'Film' speed, as said, so will give you a more cinematic look
25 is PAL, Europe's standard (our power is 50Hz; interlaced fields fire at 50Hz on CRT TVs, giving 25 full frames)
30 (29.97) NTSC is American TV (the reason it's not exactly 30 is the 0.1% slowdown - 30/1.001 - introduced when colour was added to the legacy US TV system, btw) - mostly chosen because US power is 60Hz, see PAL
50 NTSC (legacy) a format that was 'tried' but didn't really make it in the USA
60 / 120 - 'Modern' speeds for high framerate stuff; in the UK some TV sports are broadcast at 60

What you shoot then comes down to how you want it to look: 24fps will give you a nice cinematic look and 'feels' more professional, as it's what people expect when viewing movies. 30 feels more TV-like. 60 is for action/fast-moving subjects, or for halving the speed in post for slow(ish) motion, and 120 for super slow motion.

Just remember, as people have said above, use the 180-degree rule for shutter speed so you get nice motion blur; otherwise it feels very 'clinical' if using too fast a shutter, or very smeary if too slow. Although 180 is a STARTING point for this, and sometimes different frame rates require different shutter speeds too (https://wipster.io/blog/debunking-the-180-degree-shutter-rule is a decent read)

For example, a good proportion of YouTube videos are in 24fps, as people want to go for the cinematic look.

I tend to shoot 2.7k/60 for the most part, using ND filters to stop down to the required shutter speeds (on my Air, as I've just upgraded to the MP2), but as the MP2 has the ability to change the f-stop, you could potentially get away with stopping down instead (just like normal photography), at the expense of sharpness.

Technically, with the codecs in use, 24fps videos will have more information in each frame (as it's shot at 100 Mb/s no matter the frame rate), so it should be clearer than shooting at 60 (as you have more bandwidth per frame); however, flicking over to h265 kind of negates that, as it compresses so much more info per frame, so your 100 Mb/s is 'sort of' equivalent to 140 Mb/s h264 (HEVC/h265/x265 tends to give a 30-40% bandwidth improvement). But remember the MP2 shoots a 10-bit colour profile in h265 and only 8-bit in h264, so swings and roundabouts, as you lose some of your bandwidth improvement to the colour profile.
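
To put rough numbers on the bandwidth-per-frame point (an average only; real encoders spend bits unevenly across I/P/B frames):

```python
# Rough average of bits available per frame at a constant 100 Mb/s.
# Real codecs allocate unevenly across frame types; this is only a mean.

BITRATE = 100e6  # bits per second

for fps in (24, 30, 60):
    print(f"{fps} fps -> ~{BITRATE / fps / 1e6:.2f} Mb per frame")
# 24 fps -> ~4.17 Mb/frame vs 60 fps -> ~1.67 Mb/frame: 2.5x fewer bits
# per frame at 60, before codec efficiency is taken into account.
```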

Basically, after all that rambling, try them all yourself and see what you like; it's all personal preference. I'm used to shooting in LOG formats on all my cameras, and it was certainly the best way to get un-blown-out skies on the Air; however, my first flight of the MP2 yesterday (see recent VLOG) showed how nice the h264 'default' profile looks on it. I was genuinely surprised how nice it looked straight out of the MP2. Although in post, using DLOG (the Hasselblad 10-bit h265 log format) shows how much information is stored in the video for you to work with (although it kicks even my i9/1080ti rig's butt on processing in Premiere!)

So, now I'm not sure what I'll use. I may revert to out-of-the-box h264 for some stuff and use DLOG only when I know I'll need it; I'll have to fly some more (and it's piddling it down here!) to know for sure, I suppose.

Anyhow, I'll link the vlog; skip towards the end for the footage if you don't want to see me waffling on. I have a side-by-side of DLOG raw and colour-graded in-flight footage, as well as some of the default out-of-camera h264 footage in there to look at.

 
I am new to video, coming from a stills-shooting background, and this thread has been very informative. Thanks to all.
 
