
New PC build

Hmmm, more to think about. The i7 in my build satisfies the minimum requirements for Pinnacle. My problem is that I stopped keeping current with processors around the last generation of Pentiums. Much to catch up on.

Graphics cards though, nada. From what I read, even though these cards are meant for gaming, getting a better card will improve performance for the editing software. I do use effects. Pinnacle does use proxy files, as does my current software.

So in the end, just keep throwing money at it. And I thought the drone was expensive.....

Graphics cards are more specialised than the processor, but a few years ago companies realised there were certain tasks a graphics card could do better than a general-purpose processor, typically heavy compute functions that involve performing large numbers of simple calculations. You can actually buy high-end 'graphics' cards from Nvidia and AMD that have no graphics output at all, as they use the graphics core purely for computing.
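As a rough illustration of the kind of workload that suits a graphics card, here's a minimal sketch comparing the same simple arithmetic on the CPU and the GPU. It assumes the CuPy library and an Nvidia card with CUDA, which are my own choices for the example and have nothing to do with any particular editing package.

```python
# A minimal sketch, assuming NumPy, the CuPy library and an NVIDIA GPU with
# CUDA are installed -- none of this is specific to any editing software.
import time

import numpy as np
import cupy as cp

n = 50_000_000  # fifty million elements: lots of tiny, independent calculations

a_cpu = np.random.random(n).astype(np.float32)

# CPU version with NumPy
t0 = time.perf_counter()
r_cpu = np.sqrt(a_cpu) * 2.0 + 1.0
cpu_s = time.perf_counter() - t0

# Same arithmetic on the GPU with CuPy, spread across thousands of GPU cores
a_gpu = cp.asarray(a_cpu)
t0 = time.perf_counter()
r_gpu = cp.sqrt(a_gpu) * 2.0 + 1.0
cp.cuda.Stream.null.synchronize()  # wait for the GPU to finish before stopping the clock
gpu_s = time.perf_counter() - t0

print(f"CPU: {cpu_s:.3f} s   GPU: {gpu_s:.3f} s")
```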

That's why I was asking what the spec of your previous system was, as you may end up spending a lot of money for not much improvement. I upgraded from an old hex-core i7 3930K to a 12-core Ryzen 9 3900X, which not only has twice the number of cores but each of those cores is much quicker as well. However, when I had to disable GPU acceleration in my video editing software because of a rendering problem, I was surprised to find render performance significantly slower, so the graphics card had been doing much of the work, not the Ryzen CPU as I'd thought. I've not used the Pinnacle software, so I'd check with its users which hardware gives the most benefit.
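If you want to see for yourself which part is doing the heavy lifting, one rough test is to time the same clip through a software encoder and a GPU hardware encoder. The sketch below assumes ffmpeg is installed on the PATH with NVENC support, and "clip.mp4" is just a placeholder for your own footage, so treat it as an illustration rather than a recipe.

```python
# Time the same clip through ffmpeg's software encoder (libx264, CPU) and
# NVIDIA's hardware encoder (h264_nvenc, GPU). Assumes ffmpeg is on the PATH
# and was built with NVENC support; "clip.mp4" is a placeholder file name.
import subprocess
import time

def encode_time(codec: str, outfile: str) -> float:
    """Encode the clip with the given video codec and return the elapsed seconds."""
    cmd = ["ffmpeg", "-y", "-i", "clip.mp4", "-c:v", codec, "-an", outfile]
    t0 = time.perf_counter()
    subprocess.run(cmd, check=True, capture_output=True)
    return time.perf_counter() - t0

print(f"libx264 (CPU encode):    {encode_time('libx264', 'out_cpu.mp4'):.1f} s")
print(f"h264_nvenc (GPU encode): {encode_time('h264_nvenc', 'out_gpu.mp4'):.1f} s")
```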

Quadro cards use the same or very similar cores to the GeForce cards but are certified for professional use; in practice that means you'll get a higher-performance GeForce card for the same price as a Quadro. In this case the Quadro 620 looks to be slower than a GTX 1050, yet for the same price in the UK you can get the much faster GeForce GTX 1660. However, you may already have the card, or there may be a reason you've chosen it, which is why I was asking earlier.
 
Usually the problem is with the monitor.

If the monitor is not 4K, then it'll bottleneck.

Even if you have a 4K video card, you need the 4K monitor for a 4K video. You can always reduce down, but not upscale.
 
That was definitely part of my problem. I'll be getting a 27" 4K monitor as my primary monitor. Second monitor, one of my old ones.
 
Wow, new PC up and running, all software and data moved successfully....and wow is all I can say. Ten seconds from pushing the power button to desktop, ready to roll. I love this, mind you the 27" monitor doesn't hurt.

Now I'm spending my time learning how to use Pinnacle Studio. Feels like I'm back in school watching all these "Instructional Videos". Ah well, I have the time.....
 

I'm not sure what you mean about a non-4K monitor bottlenecking. The monitor is just an output device, so it has no effect on rendering or editing performance; technically you don't even need a monitor connected to the PC for that type of work. You only need a 4K monitor if you want to view 4K video at its native resolution, but you can still easily watch it on a lower-resolution screen. All my video work is 4K, yet I don't use 4K monitors because I dislike the amount of scaling needed to make the interface a normal size.
 
There’s a lot of misunderstanding about the number of cores in a CPU and system performance, especially with video editing.

The video editing software will only use multiple cores IF it is written to make use of them, regardless of the support provided by the operating system.

The software, by design, has to be able to benefit from breaking its processing into multiple threads that can be run in parallel. It's this process that can provide a speed increase over a single-core CPU, which can only run one process at a time.

Video editing/rendering is ideal for this type of parallel processing as many video frames can be worked on at the same time. One core can decode a compressed format such as H.264/H.265 while another works on colour or effects processing, for example.
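A toy sketch of that idea, purely for illustration: it spreads a trivial per-frame "grade" across CPU cores with Python's multiprocessing, with NumPy arrays standing in for decoded frames. No real editor works exactly like this, but it shows why the software has to be written for parallel work before extra cores help.

```python
# Toy sketch only: NumPy arrays stand in for decoded frames, and the "grade"
# is a trivial gamma tweak. Real editors manage this kind of split internally.
from multiprocessing import Pool

import numpy as np

def grade_frame(frame: np.ndarray) -> np.ndarray:
    """Stand-in for colour/effects work: a simple per-pixel gamma adjustment."""
    return np.clip((frame / 255.0) ** 0.8 * 255.0, 0, 255).astype(np.uint8)

if __name__ == "__main__":
    # Eight fake 1080p frames filled with random pixel values
    frames = [np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
              for _ in range(8)]

    # Each worker process grades a different frame at the same time -- extra
    # cores only help because the work has been split up this way.
    with Pool() as pool:
        graded = pool.map(grade_frame, frames)

    print(f"processed {len(graded)} frames")
```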

DaVinci Resolve is one such application, but throwing more cores at it beyond what it is written to use will make absolutely no difference.

However, most parallel image processing is handled by the GPU. With hundreds of GPU cores available, the throughput is enormous. But again, the application MUST support GPU processing, and most of the professional ones do, such as Resolve, Premiere Pro, Media Composer and FCP.

Spending money on a supported, powerful graphics board, RAM and SSDs is a far better bet than the marketing hype around many of the recent high-core-count CPUs, as Barbara has witnessed.
 
If you try to watch a 4K video on something less than a 4K monitor, sometimes it still chokes even if you scale down. It might be the software still trying to render the whole 4K frame, and an inefficiency in doing so. It's just better to have a 4K monitor for 4K video to make sure things work out efficiently. It's still easier to downscale from a bigger picture, but upscaling just sucks. Look at one of those cheap cop cams: it has an automatic infrared mode, but it upscales from 720p to 1080p, and the thing is just junk. An upscale goes faster because it isn't processing any new blocks, it just blur-enlarges the existing ones.
 

You're contradicting yourself: you first claim it's more work to downscale from 4K to 1440p, then you say it's easier to downscale. You were correct the second time. It's easy to downscale a signal, and it's less work for a graphics card to render 4K output at 1440p or lower because there are fewer pixels for it to draw. I've never come across a system that 'chokes' playing 4K at 1440p, and if you're seeing that I can only assume there's something very wrong with the configuration. I've been working with 4K on 1440p systems for many years and never had a single issue, while some of my low-end Core-m/ULV systems can manage 4K playback at 1080p or 1440p output but struggle at native 4K because that's a bit beyond their hardware.

If you want the extra resolution then a 4K monitor is worth considering, but it's certainly not the case that it gives you better performance; it's the opposite, because rendering at 4K is harder work than at lower resolutions. Any system capable of video editing is unlikely to have 4K playback issues anyway unless there is something wrong with it.
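To put rough numbers on that, simple pixel arithmetic is enough:

```python
# Plain pixel-count arithmetic: rendering at 4K means drawing far more pixels
# per frame than rendering at 1440p or 1080p.
uhd_w, uhd_h = 3840, 2160  # 4K UHD: 8,294,400 pixels per frame
for name, (w, h) in {"1440p": (2560, 1440), "1080p": (1920, 1080)}.items():
    ratio = (uhd_w * uhd_h) / (w * h)
    print(f"4K UHD has {ratio:.2f}x the pixels of {name} "
          f"({uhd_w * uhd_h:,} vs {w * h:,} per frame)")
```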
 
I'm assuming you are using Windows or Mac, which hide a lot of what's going on behind the scenes from the user. If you watch a video on YouTube, it selects a suitable resolution so that you aren't playing 4K video in a browser window that's only showing a 720p frame. I did a video the other day that needed some remastering, using only open-source ffmpeg rather than a full-blown frame-by-frame video editor; it just processed the frames without my seeing each one as it worked.

4K does take a lot more work than 1080p, yes, and running 1080p on a 4K screen is less work. Modern video cards do the processing in hardware (the GPU) and leave the CPU to manage the operating system and programs.

Upscaling is taking the image and trying to produce missing elements by fudging them, elements that would have been picked up in a larger, more detailed video or picture. It just smudges the watercolors even more. Movies were shot at much higher resolution but formatted to fit your TV screen, which means you lost some of the detail, especially in wide formats.
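For what it's worth, here's a quick sketch of scaling both ways with ffmpeg's scale filter. It assumes ffmpeg is installed on the PATH, and the input file names are placeholders, not real files.

```python
# Upscaling only interpolates between the pixels that are already there; it
# can't invent the detail a higher-resolution capture would have recorded.
# Assumes ffmpeg is on the PATH; input file names are placeholders.
import subprocess

def scale(infile: str, outfile: str, width: int, height: int) -> None:
    """Rescale a video to the given resolution using ffmpeg's scale filter."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", infile, "-vf", f"scale={width}:{height}", outfile],
        check=True,
    )

# 720p blown up to 1080p: more pixels, but only interpolated copies of old ones.
scale("source_720p.mp4", "upscaled_1080p.mp4", 1920, 1080)

# 4K scaled down to 1080p: detail is discarded, but what's kept stays sharp.
scale("source_4k.mp4", "downscaled_1080p.mp4", 1920, 1080)
```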
 

Seems your understanding of video and formats is a little lacking. 'It just smudges the watercolours even more' is just nonsense, as are many other comments you have made.

It doesn't help people who are trying to learn these technologies on the forum when posts such as these show a clear misunderstanding.
 