Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules. |
|
|
Thread Tools | Search this Thread | Display Modes |
24th November 2024, 17:50 | #20961 | Link | |
Registered User
Join Date: Aug 2020
Location: Pennsylvania
Posts: 133
|
Quote:
In fact, yesterday I tried setting Eco Mode using Ryzen Master as a test and ended up having to pull the CMOS battery and reset the BIOS after Ryzen Master rebooted and nothing came up. Screwed it all up, so I gave up on that idea. |
|
25th November 2024, 09:15 | #20962 | Link |
Silent Reader
Join Date: Dec 2003
Location: Germany
Posts: 306
|
For example:
5950X with 99% power plan = 3.4 GHz on all cores (stock clock)
5950X with 100% power plan = 4.25 GHz on all cores (the value differs per processor)

So you get around 25% higher clock. If power consumption were only 25% higher too, there would be no impact on instructions per watt (IPW); in practice, lower clocks at much lower wattage give a higher IPW. Encoding is slower but uses less power (cheaper) overall. In the BIOS there are settings for THM, TDC, EDC and PPT to play around with, and with Curve Optimizer you can reduce wattage too (or increase clock at stock wattage). My per-core values range from -17 to -24 with stable encoding. |
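The efficiency argument above can be sketched in a few lines. The clock speeds are taken from the post; the wattage figures (120 W and 200 W) are made-up illustrative assumptions, not measurements of a real 5950X:

```python
# Rough sketch of the clock-vs-power efficiency trade-off described above.
# Clocks are from the post; wattages are assumed for illustration only.
def clock_per_watt(clock_ghz: float, watts: float) -> float:
    """Throughput proxy (all-core clock) per watt; higher = more efficient."""
    return clock_ghz / watts

capped = clock_per_watt(3.4, 120)    # 99% power plan: stock clock
boosted = clock_per_watt(4.25, 200)  # 100% power plan: boosted clock
# ~25% more clock for ~67% more power -> lower efficiency when boosted
print(capped > boosted)  # True under these assumed figures
```

The point is simply that boost clocks cost disproportionately more power than they return in throughput, so capping the clock raises work done per watt.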
26th November 2024, 13:59 | #20963 | Link | ||
Registered User
Join Date: Mar 2011
Posts: 440
|
Quote:
Quote:
|
||
26th November 2024, 18:20 | #20964 | Link |
Registered User
Join Date: Sep 2006
Posts: 179
|
Before setting the Windows Balanced power plan's maximum processor state to 99%, my girlfriend would remark that the four machines participating in DE sounded like a vortex. They also pumped out quite a bit of heat. Now they're silent, run cooler, and still get plenty of FPS.
|
27th November 2024, 20:34 | #20965 | Link | |
Registered User
Join Date: Feb 2002
Posts: 162
|
Quote:
Last edited by stryker412; 27th November 2024 at 20:36. |
|
27th November 2024, 21:08 | #20966 | Link |
Registered User
Join Date: Aug 2020
Location: Pennsylvania
Posts: 133
|
Seeing how there is very little difference between the 3900X and 5900X, I doubt there would be much difference between the 3700X and 5700X in encoding performance. Remember, encoding speed depends a lot on core and thread count, so consider stepping up from the 3700X to a 5900X or 5950X to get more cores. You will have to account for additional cooling though, either an AIO or a Noctua NH-D15, but that also brings the case into play: whether it can fit an AIO or the 165mm cooler height of the Noctua.
Last edited by rlev11; 27th November 2024 at 22:30. |
27th November 2024, 21:22 | #20967 | Link |
Registered User
Join Date: Aug 2020
Location: Pennsylvania
Posts: 133
|
Since we have been talking about power usage lately, I found something interesting. Instead of changing the maximum CPU state in the power plan to 99%, leave it at 100%, then edit the registry here:
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Power\PowerSettings\54533251-82be-4824-96c1-47b60b740d00\be337238-0d82-4146-a960-4f3749d470c7

Change the "Attributes" DWORD there to "2". A new setting then appears under advanced power settings (in the legacy Control Panel power options): "Processor performance boost mode". Set it to "Disabled" and CPU power drops considerably. This works for both AMD and Intel CPUs and forces everything to run at the base clock speed (setting the maximum state to 99% didn't do anything for me on my Intel systems).

I have found that on my AMD boxes there is little performance difference with a considerable power decrease. On the Intel systems, which have a much lower base clock (a larger spread from base to boost speed), it hampers performance a bit more. On my i7-14700, disabling boost dropped the P-cores to 2 GHz and the E-cores to 1.5 GHz, but CPU power dropped from 175 watts to 35 watts; I lost about 30-40% performance.

Just something else to consider if you're looking to lower power usage quickly without messing with BIOS settings. If I get some time this weekend, I'll run some numbers comparing the same video/chunk on each machine with boost enabled and disabled, work out FPS versus watts used, and see where things shake out.

Last edited by rlev11; 27th November 2024 at 21:59. |
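For anyone who wants to script that registry change rather than click through regedit, it can be saved as a .reg file and imported. This is a sketch, not a tested tweak file: the GUID path is copied verbatim from the post, so double-check it against your own registry before merging.

```reg
Windows Registry Editor Version 5.00

; Unhides "Processor performance boost mode" in the advanced power options
; of the legacy Control Panel power-plan dialog (value 2 = visible).
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Power\PowerSettings\54533251-82be-4824-96c1-47b60b740d00\be337238-0d82-4146-a960-4f3749d470c7]
"Attributes"=dword:00000002
```

After importing, the boost-mode setting should appear under "Processor power management" in the power-plan advanced settings, where it can be set to Disabled.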
28th November 2024, 02:05 | #20968 | Link |
Not new to this...
Join Date: Nov 2024
Posts: 12
|
Why do ppl buy multi-core CPUs, then hobble them just to save some energy/money??
Why not buy a lesser CPU and run it at 100%? What's the point in having a 16C CPU that is hobbled down to be equivalent to a 12C?? |
29th November 2024, 16:24 | #20969 | Link | |
Registered User
Join Date: Mar 2011
Posts: 440
|
Quote:
And as someone else said, their machines won't sound like a jet turbine just to gain another 5% performance at double the power usage, most likely killing the processor sooner rather than later. Last edited by Ryushin; 30th November 2024 at 14:09. |
|
30th November 2024, 02:01 | #20970 | Link | |
Not new to this...
Join Date: Nov 2024
Posts: 12
|
Quote:
You run your system as you see fit, and I will too. AFAIK, current-gen CPUs are designed to run hot and fast, and really good cooling is almost mandatory, but by the time they have degraded to an unusable level it's probably time to upgrade anyway!! But I doubt there'd be too many users that run their system(s) for days, let alone weeks, at a time, and those that do should probably be running server-grade systems, NOT desktops. |
|
2nd December 2024, 13:22 | #20971 | Link | |
Registered User
Join Date: Oct 2001
Posts: 465
|
Quote:
|
|
5th December 2024, 12:59 | #20972 | Link |
Registered User
Join Date: Mar 2014
Posts: 8
|
Is there a way to force files shorter than 60 seconds to still be encoded distributively?
The problem I'm having is that my main RipBot controlling computer, which I don't usually encode with but like to use as the central server, has an iGPU (UHD 770). Any video using KNLMeansCL gets extra blurry, with halos around the edges of things, when encoded on the UHD 770; with KNLMeansCL off, or when I use any of my other computers with discrete GPUs, I don't have that issue. Example of the issue: https://imgur.com/a/X09dQrh Last edited by hardkhora; 5th December 2024 at 13:14. |
5th December 2024, 14:39 | #20973 | Link | |
Registered User
Join Date: Oct 2001
Posts: 465
|
Quote:
do you really need nlmeans noise reduction on this footage? Looks fine to me.... I usually use CPU degraining on anime; nlmeans rarely does a better job. |
|
5th December 2024, 18:18 | #20974 | Link | |
Registered User
Join Date: Mar 2014
Posts: 8
|
Quote:
I was trying to understand whether there is a way to force files shorter than 60 seconds to still be encoded distributively. I also tried updating the iGPU driver, and that didn't make a difference. |
|
5th December 2024, 19:28 | #20975 | Link | |
Registered User
Join Date: Oct 2001
Posts: 465
|
Quote:
if you don't run an encoding server on the master PC, shouldn't it pass the one chunk onto one of the network encoders? Have you tried that? |
|
5th December 2024, 20:49 | #20976 | Link | |
Registered User
Join Date: Mar 2014
Posts: 8
|
Quote:
I tried setting the chunk size to 0, but it still won't pass it to the encoding servers as I'd expect. |
|
Yesterday, 05:53 | #20978 | Link | |
Not new to this...
Join Date: Nov 2024
Posts: 12
|
Quote:
And by CPU degraining, you must be referring to good old MDegrain? You should be using SMDegrain; it's just better and faster in most cases. |
|
Today, 08:31 | #20979 | Link | |
Registered User
Join Date: Oct 2024
Location: Scotland
Posts: 3
|
Quote:
It doesn't have to work as hard to sit at the speed limit, which puts less strain on the engine overall. |
|
Today, 19:58 | #20980 | Link |
Registered User
Join Date: Aug 2020
Location: Pennsylvania
Posts: 133
|
So I got a chance to do a full test of FPS and wattage on my entire encoding farm with CPU boost enabled and disabled. I did separate runs with a 4K source (3840x1600, light degraining) and a 1920x1080 source with heavy degraining, just to see how they compare. The link below is the spreadsheet with the results. I took the FPS for each run from the distributed-encoding window, and for each CPU it was the same chunk number, so the comparison matches up. I also timed each run and saw about a 10-12% increase in total encoding time without boost, which is close to the FPS difference. Total combined power usage dropped about 48%.
So my math may be fuzzy on the end result, but it looks like it is much more cost-efficient to run at least the AMD processors with boost disabled, since their spread is minimal. Just using my raw numbers: if an encode runs for 1 hour at full power, I use approximately 106,380 watt-minutes in total. Running with boost off and taking 15% longer (69 minutes), I get 63,135 watt-minutes for the same encode. I am not advocating anything with this; I was just curious about the difference and how it all fits together. And besides, as others have mentioned, your computers don't sound like a turboprop taking off. https://docs.google.com/spreadsheets...#gid=983798900 Last edited by rlev11; Today at 20:03. |
Tags |
264, 265, appletv, avchd, bluray, gui, iphone, ipod, ps3, psp, ripbot264, x264 2-pass, x264 gui, x264_64, x265, xbox360 |