That's why I facepalm every time there's some debate whether to go 24, 25 or 30 fps and everyone is talking about motion blur only. Heck, it's clear as day. In the Lighthouse video you can see the guy is recording at 4K 30 fps, but if you right-click the video and check "Stats for nerds", you see YouTube is actually serving a 25 fps video. PLUS it's playing on your 60 Hz screen, which refreshes 60 times per second. You record 30 fps, throw 5 frames away each second to get 25 fps, and then play it back on a 60 "fps" monitor. Of course you are gonna see stuttering: you have 25 frames to display and 60 refresh slots to display them in. Paint every frame twice and you've only filled 50 slots, so some frames have to be painted three times to fill out the whole second.
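To make that arithmetic concrete, here's a minimal Python sketch (my own illustration, not anything from the video) that counts how many 60 Hz refreshes each frame of a 25 fps video stays on screen, assuming the screen simply repeats the most recent frame at every refresh (no interpolation, no variable refresh rate):

```python
from collections import Counter

def repaint_cadence(video_fps: int, display_hz: int = 60) -> list[int]:
    """How many display refreshes each video frame stays on screen,
    assuming the screen just repeats the latest frame at every refresh."""
    # For each of the display_hz refreshes in one second, work out which
    # video frame is the most recent one at that moment.
    frame_at_refresh = [refresh * video_fps // display_hz for refresh in range(display_hz)]
    counts = Counter(frame_at_refresh)
    return [counts[frame] for frame in sorted(counts)]

print(repaint_cadence(25))
# [3, 2, 3, 2, 2, 3, 2, 3, 2, 2, ...]  -> 15 frames shown twice, 10 shown three times
```

That lumpy 3-2-3-2-2 pattern is exactly the stutter: identical-length moments in the video spend different amounts of time on screen.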
That's why some videos are smooth. You record at 30 fps, edit on a 30 fps timeline, and then serve a 30 fps video to a 60 Hz screen, so every frame is painted exactly twice. It's consistent and stutter-free.
If you record at anything other than 30 or 60 fps (or another rate that divides evenly into 60), or edit your timeline at anything other than 30 or 60 fps, then on a 60 Hz screen (the vaaaaast majority of them) you will always see stuttering.
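As a rough sanity check (again my own sketch, same simple frame-repeat assumption as above), you can tell whether a frame rate maps evenly onto a 60 Hz screen just by checking whether it divides 60:

```python
def cadence_on_60hz(video_fps: int, display_hz: int = 60) -> str:
    """Report whether every frame gets the same number of refreshes on a 60 Hz screen."""
    if display_hz % video_fps == 0:
        return f"{video_fps} fps: even, every frame shown {display_hz // video_fps}x"
    low = display_hz // video_fps
    return f"{video_fps} fps: uneven, frames shown a mix of {low}x and {low + 1}x"

for fps in (24, 25, 30, 60):
    print(cadence_on_60hz(fps))
# 24 fps: uneven, frames shown a mix of 2x and 3x
# 25 fps: uneven, frames shown a mix of 2x and 3x
# 30 fps: even, every frame shown 2x
# 60 fps: even, every frame shown 1x
```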
People are arguing about 24 vs 25 vs 30 fps in terms of how much motion blur it lets them capture, while completely ignoring the question "And what medium will I be watching the final output on?". Sure, if you are filming for a cinema and you know the projector will run at 24 fps, feel free to record at 24 fps and enjoy stutter-free motion blur. But if your goal is to view the video on a PC screen (or TV), do yourself a favour and record at 30 fps, and edit on a 30 fps timeline.