Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules. |
7th May 2016, 08:53 | #37781 | Link | ||
QB the Slayer
Join Date: Feb 2011
Location: Toronto
Posts: 697
|
Quote:
Quote:
QB
__________________
Last edited by QBhd; 7th May 2016 at 08:56. |
7th May 2016, 09:48 | #37782 | Link |
Is this for real?
Join Date: Mar 2016
Location: Norway
Posts: 168
|
It is, but Chroma 128 doesn't tax my card as hard as image doubling, at least not when super-xbr is used for doubling/quadrupling. I have tried different settings...
__________________
My HTPC : i9 10900K | nVidia RTX 4070 Super | TV : Samsung 75Q9FN QLED |
7th May 2016, 14:05 | #37783 | Link |
Registered User
Join Date: Sep 2014
Posts: 280
|
Just read the presentation of the new GeForce cards: the 1070 as strong as a 980 Ti, and the 1080 stronger than a Titan... it seems we will get a good jump in performance. Now let's see what AMD has got, and then we'll see how much madVR benefits from these more powerful GPUs when upscaling 1080p to 4K on my machine.
__________________
Intel i5 6600, 16 GB DDR4, AMD Vega RX56 8 GB, Windows 10 x64, Kodi DS Player 17.6, MadVR (x64), LAV Filters (x64), XySubfilter .746 (x64) LG 4K OLED (65C8D), Denon X-4200 AVR, Dali Zensor 5.1 Set |
7th May 2016, 15:02 | #37785 | Link |
Registered User
Join Date: Oct 2015
Posts: 99
|
I currently use Super XBR 75 for doubling with some profiles where I can't afford NNEDI3. I'm considering whether I should try switching to Super XBR anti-bloat 25 because many seem to like it.
Any pointers as to the strengths/weaknesses of each, i.e. with what material would I best see their benefits or drawbacks? I've understood they should look pretty close. |
7th May 2016, 15:34 | #37786 | Link | |
Registered User
Join Date: Sep 2015
Posts: 60
|
Quote:
The real performance cards are due out next year. The next Ti should allow some seriously high settings for pushing 1080p@60 onto a 4K screen. Speaking of which, I'm actually glad I downgraded my monitor back to 1080p (the 4K one I had was too hazy for my liking). I can use pretty much any setting I like now with my 980 Ti and I can barely notice any difference. 4K is a gimmick; beneficial to gamers perhaps, but you'd need a good 60-70" screen to appreciate the difference for video watching.

On another note, IMO NNEDI3 is a complete waste of resources and I think super-xbr looks miles better. I had all the settings ramped up in one of my profiles for 720p using NNEDI3; the GPU was at about 99% with no dropped frames and no real improvement on screen. Changing to super-xbr is less resource-heavy and sharpens my images enough to look 1080p. Guess it really is down to personal taste.
7th May 2016, 15:38 | #37787 | Link |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,342
|
The current estimates (from their presentation information and the core/clock ratio) go around 15-20% over a TitanX/980Ti at stock, with easy 20% OC headroom, which personally I find substantial. Only benchmarks will reveal the real information either way however, and anything else is just speculation, both positive and negative.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders |
7th May 2016, 16:01 | #37788 | Link |
Registered User
Join Date: May 2012
Posts: 447
|
They said that at the end, but I think that was taking their VR viewport trick into account (since that adds another ~40% performance in VR). Still, the 1070 won't exactly be a pushover either.
__________________
Test patterns: Grayscale yuv444p16le perceptually spaced gradient v2.1 (8-bit version), Multicolor yuv444p16le perceptually spaced gradient v2.1 (8-bit version) |
7th May 2016, 16:16 | #37789 | Link | |
Registered User
Join Date: Dec 2002
Location: Region 0
Posts: 1,436
|
Quote:
7th May 2016, 16:33 | #37790 | Link | |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,342
|
Quote:
__________________
LAV Filters - open source ffmpeg based media splitter and decoders Last edited by nevcairiel; 7th May 2016 at 17:02. |
7th May 2016, 20:34 | #37793 | Link | |
Registered User
Join Date: Dec 2002
Location: Region 0
Posts: 1,436
|
Quote:
I use it on low, switching to med during fades, and I use that setting for all content.
7th May 2016, 22:56 | #37794 | Link | |
Registered User
Join Date: Dec 2011
Posts: 1,812
|
Quote:
When you use fade detection, I recommend not disabling the performance/quality trade-off option for it; otherwise you will need longer CPU and GPU queues.
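To see why disabling that trade-off calls for longer queues, here is a toy Python model (not madVR's actual pipeline; the frame times, queue mechanics, and numbers are invented for illustration): the display consumes one frame per vsync interval out of a pre-filled queue, and a frame is dropped whenever the renderer hasn't finished it in time. An occasional slow frame, such as a heavier analysis pass, is absorbed by a deeper queue:

```python
def dropped_frames(render_times_ms, queue_depth, vsync_ms=41.7):
    """Toy model: the queue starts pre-filled with `queue_depth` frames,
    so frame i must be rendered by (i + queue_depth) * vsync_ms.
    Returns how many frames miss that deadline (i.e. are dropped)."""
    finished = 0.0  # cumulative render time so far
    drops = 0
    for i, t in enumerate(render_times_ms):
        finished += t
        if finished > (i + queue_depth) * vsync_ms:
            drops += 1
    return drops

# steady 30 ms frames with one 130 ms spike (say, one expensive
# analysis frame when the trade-off is disabled)
times = [30.0] * 5 + [130.0] + [30.0] * 20

print(dropped_frames(times, queue_depth=1, vsync_ms=40.0))  # shallow queue drops frames
print(dropped_frames(times, queue_depth=3, vsync_ms=40.0))  # deeper queue absorbs the spike
```

The average render time is well under budget either way; only the queue depth decides whether the one spike turns into visible drops.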
8th May 2016, 02:39 | #37796 | Link | |
Registered User
Join Date: Jun 2015
Posts: 25
|
Quote:
Is half precision enough for NNEDI3? If so, Pascal should be very fast at it.
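For reference on the precision side of that question: IEEE half precision keeps an 11-bit significand, roughly 3 significant decimal digits, while one 8-bit video code value is 1/255 ≈ 0.004. A quick numpy check (this only illustrates FP16 granularity; it says nothing about madVR's actual internals):

```python
import numpy as np

# machine epsilon of IEEE binary16: 2**-10
eps16 = float(np.finfo(np.float16).eps)
print(eps16)  # 0.0009765625

# rounding a value to half precision loses digits beyond the 3rd/4th,
# but the error stays well under one 8-bit code value (1/255)
x = 0.123456789
x16 = float(np.float16(x))
print(x16, abs(x16 - x))
```

So for values in the [0, 1] range of normalized 8-bit video, FP16 quantization error is far below one source code value; whether that holds through all of NNEDI3's intermediate sums is a separate question.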
8th May 2016, 06:47 | #37797 | Link |
Registered User
Join Date: Oct 2015
Location: Brasil, SP, São Paulo
Posts: 154
|
I've found some strange but good behaviour in madVR.
When I'm downscaling a 1080p source, I get some great extra performance if I set chroma to reconstruction (soft); all the other algorithms have bigger performance hits when downscaling 1080p to 1366x768. Even Mitchell-Netravali is slower than reconstruction (soft) in this scenario: I'm getting almost 10 ms less using reconstruction (soft). Similarly, if I uncheck "don't use linear light for dithering" I get better render times than when that option is checked... Does anyone know why?
__________________
Desktop, i5 2500, 8GB, N570 GTX TF III PE/OC Asus X555LF, i7-5500U, 6GB Ram, Nvidia 930m/HD 5500 Windows 8.1 Pro x64 |
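On the linear-light part of the post above: dithering in linear light means decoding each pixel out of gamma space and re-encoding it afterwards, which adds per-pixel pow() work, one plausible reason that checkbox changes render times. A sketch using the standard sRGB transfer function (madVR's internal processing may well differ; this just shows the round trip involved):

```python
def srgb_to_linear(c):
    # piecewise sRGB EOTF; the pow() is the costly part on a GPU
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    # inverse transfer, with another pow() per pixel
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1.0 / 2.4) - 0.055

# dithering in linear light = decode, dither, re-encode for every pixel
x = 0.5
roundtrip = linear_to_srgb(srgb_to_linear(x))
print(round(roundtrip, 6))  # 0.5
```

That is two transcendental operations per pixel on top of the dither itself, which is why one would normally expect linear-light dithering to cost time rather than save it; the report above is the opposite, hence the question.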
8th May 2016, 11:56 | #37799 | Link |
Registered User
Join Date: Oct 2015
Posts: 99
|
SSIM downscaling is not working when deinterlacing: it always reverts to Bicubic 150. It says so in the OSD, and switching between 1D and 2D also makes no difference in rendering times.
Looking at some older posts, it looks like at one point SSIM may not have been working at all after doubling (?), but it does work now, apart from when (DXVA) deinterlacing is going on. I'm just wondering whether this behavior is to be expected or not.
8th May 2016, 12:51 | #37800 | Link |
Registered User
Join Date: Sep 2005
Posts: 29
|
A question regarding HDR: why does it currently not work, and how would it theoretically work in the future? If I have a file encoded in HDR and I'm connected to an HDR TV, where does the information that it's an HDR stream get stuck?
Let's assume HDR10 and not the more complex Dolby Vision. As I understand it, all of the characteristics are globally defined, just like a gamma curve, so theoretically the GPU makers could just provide an HDR output toggle in the driver settings and it would work correctly even without passing through any metadata from the player? (Though if you had a windowed player and enabled that mode, the rest of your desktop would probably scorch your eyeballs.)
Tags |
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling |