Welcome to Doom9's Forum, THE place to be for everyone interested in DVD conversion.

#1 | 19th November 2017, 20:43
Neillithan
Banned
 
Join Date: Feb 2007
Posts: 124
60fps (x265 vs nvenc)

Hi. I've been recording gameplay footage of Battlefield 4 with Nvidia GeForce Experience at 2560x1080 @ 60 fps and a 50,000 kbps bitrate, then using XMedia Recode to convert it to 10,000 kbps (for storage purposes).
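The storage motivation is easy to quantify. A back-of-envelope calculator (my own illustration, decimal gigabytes, not anything from the tools mentioned here):

```python
def gb_per_hour(bitrate_kbps: float) -> float:
    """Decimal GB of video produced per hour at a given bitrate."""
    return bitrate_kbps * 1000 / 8 * 3600 / 1e9

print(gb_per_hour(50_000))  # 22.5 GB/h for the raw capture
print(gb_per_hour(10_000))  # 4.5 GB/h after re-encoding
```

At a fifth of the bitrate, an hour of footage drops from roughly 22.5 GB to 4.5 GB, which is why the re-encode is worth the hassle.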

I'm mostly here to report my very unprofessional findings and ask a couple of questions.

I started off converting with NVENC because it converts much faster than x265 (no surprise there). After watching the resulting MKV file, I noticed the framerate behaving erratically, especially in very noisy scenes, such as when a player kills you and your screen shows a 0.5-second noise overlay effect to indicate that you have died.

After a lot of experimentation back and forth between NVENC and x265, I determined that x265 produces a much smoother-playing video, but its quality during the noisy part isn't as high as NVENC's. I have a theory as to why this is happening.

NVENC allows you to specify a "maximum bitrate", and I would typically set that to 50,000 (no reason in particular; it didn't seem to influence the resulting file size, so I kept that value). As it turns out, when this noisy 0.5 s scene occurs, the encoder uses the opportunity to spend a huge amount of bitrate, as evidenced by the NVENC video being much higher quality than the x265 one (only during this noisy scene).

So, I lowered the max bitrate to 16,000 in an attempt to stabilize the framerate during playback. Results were better, but still unimpressive. For whatever reason, the framerate is still erratic during this 0.5 s noisy part and I can't figure out why.

I tried some of the preset settings for NVENC, including "Low Latency, High Performance", but was unable to match the perfectly stable playback framerate that x265 gave me.

For x265, I chose a 10,000 kbps bitrate with the fastdecode tune and the medium preset. This produces a fantastic result.

I'm wondering if anybody here has had a chance to test NVENC with 60 fps footage, especially extremely noisy footage?

Does anybody have any theories as to why NVENC produces a choppy, unstable, inconsistent playback framerate for 60fps footage, compared to x265?

Does anybody have any suggestions that would help NVENC maintain a consistent playback framerate during these highly noisy scenes?

It would be nice to be able to compress 60fps gameplay footage quickly using NVENC without the obvious framerate issues that occur.

I'm using Media Player Classic Home Cinema with the internal HEVC decoder (H/W decoding enabled) and the LAV and ffdshow filters (default settings) on Windows 10. I have a six-core i7-8700K CPU, 64 GB DDR4 RAM, and a GTX Titan X GPU.

Last edited by Neillithan; 19th November 2017 at 20:51.
#2 | 20th November 2017, 12:23
Neillithan
Banned
 
Join Date: Feb 2007
Posts: 124
I've zipped up 3 videos for comparison: https://www.dropbox.com/s/vdbasyaf0o...rison.zip?dl=0

I've included an x265 video, an NVENC video, and an original video for comparison.

I encoded these using similar settings in XMedia Recode: 10,000 kbps bitrate, 10,500 kbps max bitrate, fastdecode for x265, Low Latency / High Performance for NVENC, 3 reference frames. About as similar as I can get them.

You will see that the x265 video plays flawlessly, whereas the NVENC video has playback problems during the noisy part.

Any ideas anyone?

Edit: It looks to me like XMedia Recode is ignoring the "maximum bitrate" value for NVENC. When I use a constant bitrate mode, the playback is flawless, but the video looks like absolute crud. I did not include a constant-bitrate version in the zip for comparison, only a variable-bitrate one. So I'm going to assume that "maximum bitrate" for NVENC in XMedia Recode is broken and try contacting the software's author about it. If that's the cause, the encoder could be throwing a huge bitrate at the noisy part of the video, which would explain the poor playback performance.

Last edited by Neillithan; 20th November 2017 at 12:32.
#3 | 21st November 2017, 17:24
birdie
.
 
 
Join Date: Dec 2006
Posts: 116
What I've found on the net: "Are you trying to use compatibility mode on your game capture source? Turn that off. Also enable vsync or some other form of FPS limiter to manage system load".

I guess during those intensive scenes the GPU load becomes a lot higher, which leaves fewer resources for encoding.
#4 | 22nd November 2017, 18:12
Neillithan
Banned
 
Join Date: Feb 2007
Posts: 124
Quote:
Originally Posted by birdie
What I've found on the net: "Are you trying to use compatibility mode on your game capture source? Turn that off. Also enable vsync or some other form of FPS limiter to manage system load".

I guess during those intensive scenes GPU load becomes a lot higher which means less resources are left for encoding.
Ah, seems you have misunderstood. Allow me to clarify.

When recording gameplay footage with Nvidia GeForce Experience, the result is a 50,000 kbps video with little to no problems.

I am downconverting the video to 10,000 kbps using NVENC HEVC to reduce the file size. The playback issues only appear after the downconversion.

x265 remains flawless by comparison; it's just aggravating to convert a video at 8 fps when NVENC can handle 130 fps.

Hopefully that makes sense.
#5 | 23rd November 2017, 00:58
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,261
NVENC gets that speed by having terrible quality at reasonable bitrates. If it did respect your bitrate limit the quality would be terrible for the noisy parts.

Using NVENC twice in one workflow is a terrible idea; generational quality loss is very high with NVENC. Why don't you simply capture to the final format if you need to use NVENC?

You should be able to run x264 software encoding for a much better game capture than NVENC HEVC, then re-encode that with x265 for much improved quality, if a lot slower.
__________________
madVR options explained
#6 | 23rd November 2017, 01:37
Neillithan
Banned
 
Join Date: Feb 2007
Posts: 124
Quote:
NVENC gets that speed by having terrible quality at reasonable bitrates. If it did respect your bitrate limit the quality would be terrible for the noisy parts.
I suppose you're probably right about that. When I chose constant bitrate at 10,000, the noisy part did look quite terrible. However, I saw the bitrate climb upwards of 30-40,000 kbps during the noisy part, which is just absurd. If NVENC respected my maximum bitrate and capped it at 16,000 or 20,000, the result would be acceptable, I think.

Quote:
Using NVENC twice in one workflow is a terrible idea, generational quality loss is very high with NVENC.
I'm aware of this.

Quote:
Why don't you simply capture to the final format if you need to use NVENC?
Because the idea is to capture the game footage at a *very* high bitrate (NVENC H.264) and then downconvert to a lower bitrate with a better codec (NVENC HEVC).

The initial capture runs on the GPU via Nvidia GeForce Experience, which has a negligible performance impact.

If I simply captured the footage in NVENC H.264 at 5,000-10,000 kbps, it would look like balls compared to the "workflow" I've chosen. As far as I can tell, GeForce Experience does not capture in NVENC HEVC, only NVENC H.264, which kinda sucks.

Quote:
You should be able to run x264 software encoding for a much better game capture compared to NVENC HEVC
x264 will *always* be better than NVENC H.264 or NVENC HEVC, undoubtedly, but the performance cost lands on the CPU, which is something you generally do not want when playing a game and capturing at 60 fps. I realize lots of people do this on Twitch, but in my experience there's a *very* noticeable performance impact (mostly mouse lag, or some kind of input delay), and BELIEVE me when I say this: I have spent *countless* hours over the years tweaking OBS and trying every possible combination of settings. GeForce Experience is simply unbeatable for recording gameplay footage in terms of performance. The way Twitch users get around this problem is by dedicating a second computer to the realtime 60 fps x264 encoding. Unfortunately for me, I don't have a second PC lying around for that.

Quote:
, then reencode that with x265 for much improved quality, if a lot slower.
Just about the only thing I agree with here. I just wish NVENC HEVC wouldn't dedicate 30,000 kbps to a 0.5 s noisy scene, causing a massive sync delay / framerate dip during playback. It seems so silly and easy to prevent, yet no value I enter for max bitrate seems to have any effect. Unfortunate.

I've contacted the author of Xmedia Recode to report this bug, but as usual, he never responds to my e-mails and I'll be lucky if he fixes it (if that's even possible).

Last edited by Neillithan; 23rd November 2017 at 01:49.
#7 | 23rd November 2017, 04:22
JohnLai
Registered User
 
Join Date: Mar 2008
Posts: 445
@Neillithan,

Let me get this right.

You are capturing gameplay footage via Nvidia GeForce Experience using H.264 at 50,000 kbps.
Next, you want to transcode that H.264 footage to HEVC using Nvidia NVENC... through XMedia Recode?

Why don't you use rigaya's NVEncC software for transcoding? StaxRip provides a GUI for rigaya's command-line tools, with many options for quality and bitrate control.
#8 | 23rd November 2017, 18:22
Neillithan
Banned
 
Join Date: Feb 2007
Posts: 124
Quote:
Originally Posted by JohnLai
@Neillithan,

Let me get this right.

You are capturing gameplay footage via Nvidia Experience using H264 with 50 000kbps bitrate.
Next, you want to convert/transcode that H264 footage to HEVC using Nvidia NVENC.....through XMedia Recode?
Correct.

Quote:
Why don't you use Rigaya nvencc software for transcoding? Staxrip provides GUI for rigaya command line software. There are many options for quality and bitrate control.
Good idea. I have downloaded StaxRip and found "Command Line - NVIDIA H.265". I'm assuming this is rigaya's NVEncC?

I used this for the codec configuration:

Code:
NVEncC64 --sar %target_sar% --codec h265 --vbr 10000 --max-bitrate 17500 -i "%script_file%" -o "%encoder_out_file%"
The resulting file never exceeded 17,500 kbps according to the MPC statistics, and the video plays back very well! Very positive results so far; thanks for turning me onto this! Going to test constant quality with it next.
#9 | 23rd November 2017, 18:34
Neillithan
Banned
 
Join Date: Feb 2007
Posts: 124
Quote:
The resulting file never exceeded 17500 bitrate according to the MPC statistics
Hmm, I spoke too soon. It seems max-bitrate works sometimes, and sometimes not at all (with StaxRip's Command Line - NVIDIA H.265).

I just tested a constant quantizer value of 28, and the bitrate spikes up to 48,000 kbps during the noisy part. I can't have this, because 48,000 kbps spikes make the video desync and the framerate fluctuate.

It seems max-bitrate is completely ignored when using CQP, and only somewhat obeyed when using VBR. This is very unpredictable. Why does max-bitrate seem to *influence* rather than *dictate*? If I set a VBR of 15,000 and a max bitrate of 17,500, the bitrate spikes up to 20,000 during the noisy part. I actually do not understand what's going on here.

If I set the VBR to 10,000 and the max bitrate to 15,000, it does a pretty good job of obeying the max. MPC reports a bitrate of 16,000 at one point, about 1,000 over my cap, but that's tolerable; as long as it's not going up to 48,000 I'm fine. The goal here is to prevent the video from desyncing or having framerate issues, and when the bitrate has massive spikes, even my *very* fast PC can't keep up.

The only thing I'd like to be able to do now is set a keyframe interval of 60 (one keyframe per second) for slightly more accurate seeking.
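The keyframe interval itself is simple arithmetic: GOP length = fps × seconds between keyframes. A tiny sketch of the math (NVEncC exposes the setting as `--gop-len`, but verify that against your build's help output):

```python
def gop_len(fps: float, seconds: float = 1.0) -> int:
    """GOP length giving one keyframe every `seconds` of video."""
    return round(fps * seconds)

print(gop_len(60))         # 60: one keyframe per second at 60 fps
print(gop_len(23.976, 10)) # 240: roughly the common 10-second default
```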

Converting this 2560x1080@60fps video at 123 fps. Glorious.

Last edited by Neillithan; 23rd November 2017 at 18:51.
#10 | 23rd November 2017, 18:51
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 9,422
Max bitrate in the VBV model can be a bit tricky. First off, you need to define the window over which the bitrate is measured; a different window gives you different min/max values.

If it really violates the max bitrate in VBV terms, you would need a VBV analyzer to tell you that.
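A minimal leaky-bucket sketch of the VBV idea (my own illustration, not code from any encoder): the decoder buffer refills at the max rate and each frame drains its coded size, so a run of oversized frames underflows the buffer even when the long-run average bitrate looks fine.

```python
def vbv_underflow_frame(frame_bits, maxrate_kbps, bufsize_kbits, fps):
    """Return the index of the first frame that underflows the VBV
    buffer, or None if the stream fits. The buffer starts full,
    refills at maxrate, and each frame drains its coded size."""
    buf_cap = bufsize_kbits * 1000.0    # buffer size in bits
    fill = maxrate_kbps * 1000.0 / fps  # bits added per frame interval
    buf = buf_cap                       # start with a full buffer
    for i, bits in enumerate(frame_bits):
        buf = min(buf + fill, buf_cap)  # refill, capped at buffer size
        buf -= bits                     # frame is removed for decoding
        if buf < 0:
            return i                    # underflow: VBV violated here
    return None

# 60 fps, 17,500 kbps max rate, 17,500 kbit buffer (the thread's settings):
steady = [150_000] * 100    # ~9 Mbps average: always fits
spiky = [2_000_000] * 20    # ~120 Mbps burst: underflows after a few frames
print(vbv_underflow_frame(steady, 17500, 17500, 60))  # None
print(vbv_underflow_frame(spiky, 17500, 17500, 60))   # 10
```

This is why a short noisy burst can blow past the "max bitrate" as reported by a player's statistics window while still being debatable in VBV terms: it all depends on the buffer size and measurement window.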
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
#11 | 24th November 2017, 04:22
JohnLai
Registered User
 
Join Date: Mar 2008
Posts: 445
Quote:
Originally Posted by Neillithan
Hmm, I spoke too soon. It seems like max-bitrate is working sometimes, and sometimes not at all (for Staxrip Command Line - Nvidia h.265)

I just tested a Constant Quantizer value of 28, and the bitrate spikes up to 48,000 during the noisy part. I can't have this because 48,000 bitrate spikes cause the video to desync and have framerate fluctuations.

It seems like, Max-bitrate is completely ignored when using CQP... and is only somewhat obeyed when using VBR. This is very unpredictable. Why does "max-bitrate" seem to *influence* rather than *dictate*? If I set a VBR of 15000 and a max bitrate of 17500, then during the noisy part, the bitrate spikes up to 20,000. I actually do not understand what's going on here.

If I set the VBR to 10,000 and the max bitrate to 15,000... it does a pretty good job of somewhat obeying the max bitrate. MPC reports a bitrate of 16000 at one point. It's about 1000 over my max bitrate, but that's tolerable. As long as it's not going up to 48,000 I'm fine. The goal here is to prevent the video from descyning or having framerate related issues, and when the bitrate has massive spikes, my *very* fast PC can't even keep up.

The only thing I'd like to be able to do now is set a keyframe interval of 60 (1 keyframe every second), for slightly more accurate seeking.

Converting this 2560x1080@60fps video at 123 fps. Glorious.
Keyframe? I assume you are talking about I-frame placement...

[screenshot omitted]

But I don't recommend that. I prefer the standard 10 seconds × fps rule for GOP length (the default option, for efficiency reasons) and automatic I- and P-frame placement within the GOP via the lookahead method. My version of VBR-quality control mode.

[screenshot omitted]

In the screenshot, set your desired bitrate constraint first (the Max Bitrate and Video Bitrate values must match each other), then set VBR Quality.

Normally, I just set both bitrates to the Lv5.1 value of 38,400 kbps and set a VBR Quality value of 27 to 30.
But for your requirements... you might want to set the bitrate limit to 10,000 kbps.
#12 | 5th December 2017, 10:40
Neillithan
Banned
 
Join Date: Feb 2007
Posts: 124
@JohnLai,

Just wanted to say thanks for the pictures, they were very useful. I've been doing a slew of different tests, trying to cut corners here and there, and I've discovered some interesting things along the way.

I found out that hardware decoding with MPC-HC is utter trash compared to good old CPU decoding. I disabled H/W decoding in the LAV video decoder, and a lot of the performance issues I had with high bitrate spikes disappeared.

With that figured out, I can go back to 60 fps encodes with NVENC at a higher max bitrate. I'm still working out the kinks with StaxRip at 60 fps.

For 23.976 fps movies, no matter what I try, NVENC just destroys certain movies, whereas for others it does a VERY acceptable job. NVENC seems very touchy about the input footage you give it.

I'm currently using the x265 medium preset with the grain tune, 2-pass (fast first pass), a 40,000 kbps max bitrate, a 50,000 VBV buffer, and a video bitrate of 12,000 kbps. I have compared this to a 3-pass encode, and the results are close. Obviously 3-pass wins, but who wants to do three passes? That's insane.
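One way to sanity-check bitrate choices like these is bits per pixel, a crude but common heuristic (my own calculator, not something from the thread): bitrate divided by pixels per second.

```python
def bits_per_pixel(bitrate_kbps: float, w: int, h: int, fps: float) -> float:
    """Average bits spent per pixel per frame; higher means more headroom."""
    return bitrate_kbps * 1000 / (w * h * fps)

# The thread's 2560x1080 @ 60 fps footage:
print(round(bits_per_pixel(12_000, 2560, 1080, 60), 3))  # 0.072 (encode)
print(round(bits_per_pixel(50_000, 2560, 1080, 60), 3))  # 0.301 (capture)
```

The capture has roughly four times the bits-per-pixel budget of the encode, which is consistent with the quality gap the encoders have to bridge here.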

Comparing x265 (12,000 kbps, 2-pass, medium preset) to NVENC (12,000 kbps VBR) is no contest; x265 wins pretty much every time.

Now, one interesting thing to note: I simply cannot use constant quality with x265. There's a particular scene I've been using for all of my comparisons, and CQ utterly destroys its quality, even at a CQ of 13. If anybody would like the scene in question, PM me and I will send you a sample for your own testing.