Old 24th April 2012, 10:29   #10601  |  Link
jmone
Registered User
 
Join Date: Dec 2007
Posts: 652
Quote:
Originally Posted by jakmal View Post
If you are going to use madVR, I don't suggest using DDR3-1333.
Interesting, as I get 0 dropped frames on 50i/50p/60i material on the i7-2600K (HD 3000 iGPU) using DDR3-1333 with MN. FYI, I agree with your EDID comments; I need to use a DVI (or HDMI) Detective Plus to stop issues when the "source" changes (AVR switches away). I'm just not sure the upgrade to IVB is worth it...

EDIT: Mem is Kingston KVR1333D3N9K2/8G 8GB KIT 1333MHz (PC3-10600) DDR3 NON-ECC CL9 240pin

Last edited by jmone; 24th April 2012 at 10:37.
jmone is offline   Reply With Quote
Old 24th April 2012, 10:33   #10602  |  Link
jakmal
Registered User
 
Join Date: Jul 2010
Location: Sunnyvale, CA
Posts: 51
Quote:
Originally Posted by egur View Post
EVR uses what the driver recommends as the best algorithm. This is HW and driver specific. Different GPUs can give completely different results.
EVR-CP is an open source project that implements EVR's interfaces.
Using EVR (in SNB/IVB) you'll get:
* Context adaptive scaling
* Deinterlacing and all video post processing algorithms found in the iGPU control panel.
Does this mean that context adaptive scaling isn't made available through the DXVA calls that EVR-CP makes? I looked at the scaling performance of EVR just now; it is pretty neat, and I think the quality is the same as that of the PowerDVD playback screenshot.

Is there any way to bring up statistics similar to what EVR-CP shows when pressing Ctrl-J?
__________________

Ganesh T S
Sr. Editor, AnandTech Inc.
jakmal is offline   Reply With Quote
Old 24th April 2012, 10:40   #10603  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Quote:
Originally Posted by nevcairiel View Post
Considering IVB is even native 1600 now, getting 1333 seems just dumb.
I'd rather wait and spend the money on a good 28nm GPU.
Or on Trinity.
aufkrawall is offline   Reply With Quote
Old 24th April 2012, 11:09   #10604  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,342
Quote:
Originally Posted by aufkrawall View Post
I'd rather wait and spend the money on a good 28nm GPU.
How is memory speed related to an external GPU?
A good GPU won't make bad RAM go fast.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

Last edited by nevcairiel; 24th April 2012 at 11:21.
nevcairiel is offline   Reply With Quote
Old 24th April 2012, 11:23   #10605  |  Link
egur
QuickSync Decoder author
 
Join Date: Apr 2011
Location: Atlit, Israel
Posts: 916
Quote:
Originally Posted by jakmal View Post
Does this mean that context adaptive scaling isn't made available through the DXVA calls that EVR-CP makes? I looked at the scaling performance of EVR just now; it is pretty neat, and I think the quality is the same as that of the PowerDVD playback screenshot.

Is there any way to bring up statistics similar to what EVR-CP shows when pressing Ctrl-J?
The quality should be the same as the PowerDVD/WinDVD renderers, as they'll use the HW scaler as well. They might opt for different video processing settings or even add their own algorithms into the mix.

For subtitles I use VobSub (2.0, if I remember correctly), so I don't rely on the renderer. Not optimal, but it works just fine, and I mostly watch TV shows and movies with subs.

I don't watch live TV through my HTPC, so deinterlacing is hardly ever used.

EVR-CP produced some performance problems (i7-2600) in some situations and provided lower quality than EVR.
Sync issues exist, but they are not that bad and mostly insignificant. It's not optimal, but the whole setup is very simple (no dGPU).

People have different setups and watch different content.
For my needs the top priorities are scaling quality and proper luma levels, which work very well for me. I use very mild NR and sharpening, and mild settings on total color control as well.
__________________
Eric Gur,
Processor Application Engineer for Overclocking and CPU technologies
Intel QuickSync Decoder author
Intel Corp.
egur is offline   Reply With Quote
Old 24th April 2012, 11:25   #10606  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Quote:
Originally Posted by nevcairiel View Post
How is memory speed related to an external GPU?
A good GPU won't make bad RAM go fast.
I hadn't read the whole conversation, so I assumed RAM speed was somehow related to iGPU/shared memory behavior.

I don't see why one would need >1333 MHz RAM to avoid dropped frames with madVR and hardware decoding like CUDA/QS.
Even software decoding should work.
aufkrawall is offline   Reply With Quote
Old 24th April 2012, 11:30   #10607  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,342
If your CPU's memory controller is officially rated for 1600 MHz, you really should get 1600 (or above) RAM; otherwise you're just wasting performance. It's not like there is a big price gap (if one at all).
That the iGPU benefits from memory speed more than most CPU applications do is one factor, but not the whole argument.
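To put rough numbers on the bandwidth side of this (a back-of-the-envelope sketch, assuming dual-channel DDR3 with 64-bit channels and ideal peak transfer rates; real-world throughput is lower):

```python
def ddr3_peak_gbs(mt_per_s: int, channels: int = 2, bus_bytes: int = 8) -> float:
    """Theoretical peak bandwidth in GB/s for DDR3 at the given rate (MT/s)."""
    return mt_per_s * bus_bytes * channels / 1000.0

for speed in (1333, 1600, 2133):
    print(f"DDR3-{speed}: {ddr3_peak_gbs(speed):.1f} GB/s peak")
# DDR3-1333: 21.3 GB/s, DDR3-1600: 25.6 GB/s, DDR3-2133: 34.1 GB/s
```

On SNB/IVB the CPU cores and the iGPU share this same bus, which is why the iGPU is the first thing to feel the step from 1333 to 1600.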
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
nevcairiel is offline   Reply With Quote
Old 24th April 2012, 11:47   #10608  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Quote:
Originally Posted by nevcairiel View Post
If your CPU's memory controller is officially rated for 1600 MHz, you really should get 1600 (or above) RAM; otherwise you're just wasting performance. It's not like there is a big price gap (if one at all).
I've just tested my i5-2500K with 1033 MHz RAM and there were no dropped frames either.
The test videos weren't easy: 1080p, 60 fps, high bitrates, and pure software decoding.

My assumption: if frames drop at low memory frequencies, the problem is probably caused by something else.

Quote:
Originally Posted by nevcairiel View Post
That the iGPU benefits from memory speed more than most CPU applications do is one factor, but not the whole argument.
Maybe the money would be better spent on water cooling for the CPU.
Then you can run higher clocks, which will gain you much more.

Last edited by aufkrawall; 24th April 2012 at 11:58.
aufkrawall is offline   Reply With Quote
Old 24th April 2012, 12:28   #10609  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,342
You really don't get the point. Oh well.

The point is that 1600 MHz RAM isn't more expensive than 1333 MHz RAM; both are about the same price around here, so when you get 1333 instead, you're just doing it wrong.

That the iGPU really benefits from 1600 MHz over 1333 MHz is a bonus, irrelevant to that point (although important to know if you want to use the iGPU).
And no, overclocking won't help if your memory bandwidth is the limit.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

Last edited by nevcairiel; 24th April 2012 at 12:34.
nevcairiel is offline   Reply With Quote
Old 24th April 2012, 13:43   #10610  |  Link
SamuriHL
Registered User
 
SamuriHL's Avatar
 
Join Date: May 2004
Posts: 5,351
Guess I'm glad I sprang for the fast RAM.

Sent from my Xoom using Tapatalk 2
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED
SamuriHL is offline   Reply With Quote
Old 24th April 2012, 14:28   #10611  |  Link
STaRGaZeR
4:2:0 hater
 
Join Date: Apr 2008
Posts: 1,302
Quote:
Originally Posted by nevcairiel View Post
EVR doesn't offer proper subtitle support, and its vsync accuracy is a mess.
What accuracy? lol
__________________
Specs, GTX970 - PLS 1440p@96Hz
Quote:
Originally Posted by Manao View Post
That way, you have xxxx[p|i]yyy, where xxxx is the vertical resolution, yyy is the temporal resolution, and 'i' says the image has been irremediably destroyed.
STaRGaZeR is offline   Reply With Quote
Old 24th April 2012, 14:51   #10612  |  Link
andyvt
Registered User
 
Join Date: Jan 2010
Posts: 265
Quote:
Originally Posted by nevcairiel View Post
If it's by design, it's a bad design.
Why offer an option that doesn't work? Even more so, one that resets on its own all the time?
The rationale behind the feature was that they wanted it to reset when a new display was detected.

Obviously that doesn't work with HDMI displays because every time the system resyncs with the display it "detects" a new display. The important thing is that they are now aware of the use case and issue with the current approach and have at the very least documented a change request. While I don't personally find that feature valuable, it should work properly for those who do.

Quote:
Originally Posted by nevcairiel View Post
Anyway, it's sad that such a trivial issue will force me to install a GPU in this box. Well, to be honest, the iGPU wasn't fast enough for madVR to be 100% reliable anyway; 720p60 content wasn't handled fast enough.
Sadly the AnandTech HTPC review didn't use 720p content, because with madVR 1080p actually has lower load than 720p (1080p does not need luma scaling).
What did you set the fixed GPU memory size to in the BIOS? What scaling settings are you using?

I did most of my testing with 480i content and as long as FSE was enabled it worked quite well.
__________________
babgvant.com
Missing Remote
andyvt is offline   Reply With Quote
Old 24th April 2012, 14:53   #10613  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Quote:
Originally Posted by nevcairiel View Post
You really don't get the point. Oh well.
I did get it, no worries.

Quote:
Originally Posted by nevcairiel View Post
The point is that 1600 MHz RAM isn't more expensive than 1333 MHz RAM; both are about the same price around here, so when you get 1333 instead, you're just doing it wrong.
Did I say anything to the contrary?
I don't think so.
But you also spoke of RAM above 2000 MHz, which doesn't make much sense at all.

Quote:
Originally Posted by nevcairiel View Post
And no, overclocking won't help if your memory bandwidth is limiting.
There is none, not really even with 1333 MHz and IVB.
As long as you don't plan to do massive encoding, anything faster than 1600 MHz is pointless, a waste of money.
aufkrawall is offline   Reply With Quote
Old 24th April 2012, 15:39   #10614  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,342
Quote:
Originally Posted by andyvt View Post
What did you set the fix GPU memory size to in BIOS? What scaling settings are you using?

I did most of my testing with 480i content and as long as FSE was enabled it worked quite well.
Of course I increased the memory, to the maximum, which seemed to be 1024 MB on my board.

As I explained to jakmal earlier, 720p content is more computationally expensive to upscale to 1080p. He also confirmed that a 720p60 camcorder clip does not play flawlessly for him either.
My scaling settings were mentioned earlier in the thread: MN for chroma, Lanczos3 for luma.
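A rough way to see why 720p is the harder case (my own sketch, not madVR's actual pipeline: it assumes a 4:2:0 source, so the chroma plane is stored at half resolution and always needs upscaling to the display size, while luma only needs scaling when the source isn't already at display resolution; `scaler_output_pixels` is a hypothetical helper):

```python
def scaler_output_pixels(src_w: int, src_h: int,
                         dst_w: int = 1920, dst_h: int = 1080) -> int:
    """Output pixels the scalers must produce per frame for a 4:2:0 source:
    chroma is always upscaled to display size; luma only when the source
    resolution differs from the display resolution."""
    chroma = dst_w * dst_h
    luma = dst_w * dst_h if (src_w, src_h) != (dst_w, dst_h) else 0
    return chroma + luma

p720 = scaler_output_pixels(1280, 720)    # luma + chroma scaling passes
p1080 = scaler_output_pixels(1920, 1080)  # chroma scaling only
print(p720 / p1080)  # 2.0
```

And on top of the doubled scaler output, 720p60 pushes 60 frames per second through that load, versus 24 for typical 1080p film content.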

Quote:
Originally Posted by aufkrawall View Post
As long as you don't plan to do massive encoding, anything faster than 1600 Mhz is pointless, waste of money.
Actually, if you plan to use the iGPU, it does benefit from faster memory. 2133 may be overkill, but 1866 will still show benefits.
It's not like 2133 memory is extremely expensive or anything; I paid 60€ for the 8GB kit. A similar kit (same brand and line) at 1600 was 50€ at the time. I don't mind the 10€; I never tried to go cheap on my PCs, and I'll not start now. If I run a 300€ CPU, 10€ for better RAM won't hurt.
Thinking about it, RAM has become so extremely cheap over the years...
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
nevcairiel is offline   Reply With Quote
Old 24th April 2012, 15:46   #10615  |  Link
DragonQ
Registered User
 
Join Date: Mar 2007
Posts: 934
Quote:
Originally Posted by aufkrawall View Post
There is none, not really even with 1333 MHz and IVB.
As long as you don't plan to do massive encoding, anything faster than 1600 MHz is pointless, a waste of money.
I briefly considered getting an i3 (Sandy Bridge) and 1866 MHz RAM when building my HTPC, but then I realised that a Celeron (Sandy Bridge), 1333 MHz RAM*, and a GT 430 was cheaper and offered other advantages (such as hardware decoding with CUVID, better hardware deinterlacing, and custom refresh rates).

The i3 route would've given me a better CPU, but I had no need for that when using hardware decoding and deinterlacing, so I considered it a waste of money.

If I were making the decision now, it'd be tougher because of the introduction of QuickSync and DXVA2 into LAV Filters, plus the fact that a better CPU (i3 vs Celeron) would help with on-the-fly transcoding etc.


*Celerons only support running DDR3 SDRAM at 1066 MHz, but 1333 MHz RAM was actually cheaper at the time.
__________________
TV Setup: LG OLED55B7V; Onkyo TX-NR515; ODroid N2+; CoreElec 9.2.7
DragonQ is offline   Reply With Quote
Old 24th April 2012, 15:46   #10616  |  Link
SamuriHL
Registered User
 
SamuriHL's Avatar
 
Join Date: May 2004
Posts: 5,351
That's why I opted for the 2133 RAM I got. It wasn't cheap, but I will be doing TONS of encoding work on that machine. Personally, I'll be using my 5870 for output, but I do want to try to get QS working for decoding and encoding in Media Converter. That would ROCK.
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED
SamuriHL is offline   Reply With Quote
Old 24th April 2012, 15:53   #10617  |  Link
andyvt
Registered User
 
Join Date: Jan 2010
Posts: 265
Quote:
Originally Posted by nevcairiel View Post
Of course i increased the memory, to the maximum which seemed to be 1024MB on my board.
Common sense isn't common.

Quote:
Originally Posted by nevcairiel View Post
As i explained to jakmal earlier, 720p content is more computationally expensive to upscale to 1080p. He also confirmed that a 720p60 camcorder clip does not work flawlessly for him either.
My scaling was mentioned earlier in the thread, MN for Chroma, Lanczos3 for Luma.
Ganesh and I did not see identical results during testing.

The stats were generated w/ your settings and FSE enabled w/ DDR3-1333. While it was running, the render queue was in the 14-16 range. I dropped it back to windowed to take the screenshot.

[screenshot]

Not perfect, but not terrible either. When I get a chance I'll see if the results are any different w/ DDR3-1600.
__________________
babgvant.com
Missing Remote
andyvt is offline   Reply With Quote
Old 24th April 2012, 16:23   #10618  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Quote:
Originally Posted by nevcairiel View Post
Actually, if you plan to use the iGPU, it does benefit from faster memory. 2133 may be overkill, but 1833 will still show benefits.
Its not like 2133 memory is extremely expensive or anything, i paid for the 8GB kit 60€. A similar kit (same brand and line) with 1600 was 50€ at the time. I don't mind the 10€, i never did try to go cheap on my PCs, and i'll not start now. If i run a 300€ CPU, the 10€ for better RAM won't hurt.
Thinking about it, RAM has become so extremely cheap over the years....
Yeah, it's not a big difference.
But I'm a bit of a penny pincher when it's not about GPU power.
I paid 40€ for 2x 4 GB 1333 MHz in December. When I got SNB I just overclocked it to 1600.

Quote:
Originally Posted by andyvt View Post
Common sense isn't common.

Ganesh and I did not see identical results during testing.

The stats were generated w/ your settings and FSE enabled w/ DDR3-1333. While it was running, the render queue was in the 14-16 range. I dropped it back to windowed to take the screenshot.

Not perfect, but not terrible either. When I get a chance I'll see if the results are any different w/ DDR3-1600.
Try using the new FSE mode with 16 frames presented in advance.
It helped me with an H.264 I444 60 fps video.
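Why presenting frames in advance helps (a toy model, not madVR's actual scheduler: a frame misses only when its cumulative render time overruns its vsync deadline, and a pre-filled queue of N frames pushes every deadline N vsync periods later; `dropped_frames` is a hypothetical name):

```python
def dropped_frames(render_ms, vsync_ms=16.7, presented_in_advance=16):
    """Count frames whose cumulative render time overruns their deadline."""
    finish = 0.0   # when the renderer is done with frame i (ms since start)
    drops = 0
    for i, r in enumerate(render_ms):
        finish += r
        # rendering N frames ahead shifts each deadline N vsync periods later
        deadline = (i + presented_in_advance) * vsync_ms
        if finish > deadline:
            drops += 1
    return drops

# 60 fps clip rendered at ~16 ms/frame with one 100 ms spike mid-playback:
times = [16.0] * 50 + [100.0] + [16.0] * 49
print(dropped_frames(times, presented_in_advance=1))   # spike causes drops
print(dropped_frames(times, presented_in_advance=16))  # 0 -- queue absorbs it
```

The deeper queue only hides transient spikes; if the average render time exceeds the vsync interval, frames drop regardless of queue depth.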

Last edited by aufkrawall; 24th April 2012 at 17:15.
aufkrawall is offline   Reply With Quote
Old 24th April 2012, 16:45   #10619  |  Link
andyvt
Registered User
 
Join Date: Jan 2010
Posts: 265
Quote:
Originally Posted by aufkrawall View Post
Try using the new FSE mode with 16 frames presented in advance.
It helped me with an H.264 I444 60 fps video.

No dropped frames with that setting selected. The presentation glitch occurs as playback begins.
__________________
babgvant.com
Missing Remote
andyvt is offline   Reply With Quote
Old 24th April 2012, 16:56   #10620  |  Link
noee
Registered User
 
Join Date: Jan 2007
Posts: 530
Pardon the butt-in, but are you guys using the latest madVR? I thought the "GPU ram in use..." item was removed recently...
noee is offline   Reply With Quote