huhn | 25th August 2020, 02:13 | Post #16
Adaptive sync for gaming is not a special feature anymore; HDMI can do it, but it's rarely used well with HDMI displays. Usually only a 48-60 Hz range is exposed, which in practice means something like 50-58 Hz once you leave a bit of headroom. In short: not useful at all yet.
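Just to put numbers on that, here is a rough Python sketch. The 48-60 and 50-58 windows are the ones from above; the integer repeat factors are my own assumption of how a player would multiply the frame rate to land inside the window:

```python
# Rough sketch: which common video frame rates land inside an HDMI VRR window
# when the player repeats each frame an integer number of times?
# 48-60 Hz is the advertised range; 50-58 Hz is the stricter "practical"
# window once you keep a little headroom at both ends.

ADVERTISED = (48.0, 60.0)
PRACTICAL = (50.0, 58.0)
RATES = [23.976, 24.0, 25.0, 29.97, 30.0, 50.0, 59.94, 60.0]

def first_fit(fps, lo, hi, max_repeat=4):
    """Return the smallest integer repeat factor that puts fps*n into [lo, hi]."""
    for n in range(1, max_repeat + 1):
        if lo <= fps * n <= hi:
            return n
    return None

for fps in RATES:
    a = first_fit(fps, *ADVERTISED)
    p = first_fit(fps, *PRACTICAL)
    print(f"{fps:>7.3f} fps | advertised 48-60: {'x%d' % a if a else 'no fit'}"
          f" | practical 50-58: {'x%d' % p if p else 'no fit'}")
```

Run it and you'll see that with the practical window most film and TV rates simply have no clean fit, which is the whole problem.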

HDMI 1.3 was always able to do 1080p 60 Hz at 12 bit, TVs were able to accept that kind of signal, and it was used on PCs with consumer-grade hardware.

HDMI 2.0 is the first widely used version that couldn't do the target refresh rate/resolution at 12 bit; for 23p, 12 bit is a default feature.
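Back-of-the-envelope check, nothing official: this assumes RGB/4:4:4 output, the standard CTA-861 pixel clocks for these modes, and the published maximum TMDS character rates (340 MHz for HDMI 1.3/1.4, 600 MHz for HDMI 2.0). With deep color the TMDS rate scales with bits per component:

```python
# TMDS character rate check for 12 bit RGB/4:4:4 (no 4:2:0, no DSC).
MODES = {
    "1080p60":     148.5,    # MHz pixel clock (CTA-861 timing)
    "2160p23.976": 296.703,
    "2160p60":     594.0,
}
LINKS = {"HDMI 1.3": 340.0, "HDMI 2.0": 600.0}  # max TMDS character rate, MHz

def tmds_rate(pixel_clock_mhz, bpc):
    """Deep-color TMDS rate: the clock scales with bits per component / 8."""
    return pixel_clock_mhz * bpc / 8.0

for mode, clk in MODES.items():
    rate = tmds_rate(clk, 12)
    for link, limit in LINKS.items():
        ok = "fits" if rate <= limit else "does NOT fit"
        print(f"{mode} @ 12 bit: {rate:7.1f} MHz on {link} (max {limit:.0f}) -> {ok}")
```

Which is exactly the picture above: 1080p60 12 bit fits HDMI 1.3, 4K 23p 12 bit fits HDMI 2.0, 4K60 12 bit fits neither.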

If I take into consideration that every TV I have tested in my life, except one, performs much worse in terms of banding when 10 bit or more is sent to the display instead of 8 bit, I don't even know what the point of it is. Guess what: my new X900H gets "destroyed" when 10 bit is sent into it instead of 8 bit, and I wasn't expecting anything else.
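If anyone wants to see what the display is actually being asked to resolve in that comparison, here is a rough sketch of the kind of shallow ramp I'd look at; rendering it out as a real 8/10 bit test pattern is up to whatever tool you prefer:

```python
# Shallow grayscale ramp quantized to 8 and 10 bit: how many distinct code
# values does the panel have to reproduce cleanly? (numpy only; not a full
# test pattern generator.)
import numpy as np

WIDTH = 3840
# a ramp covering only ~10% of the signal range, where banding shows first
ramp = np.linspace(0.45, 0.55, WIDTH)

for bits in (8, 10):
    levels = (1 << bits) - 1
    quantized = np.round(ramp * levels).astype(np.uint16)
    steps = len(np.unique(quantized))
    print(f"{bits} bit: {steps} distinct code values across the ramp "
          f"({WIDTH / steps:.1f} px per step)")
```

The 10 bit version should obviously look smoother; the point is that on most panels I've tested it doesn't.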

Hasn't it been shown that downscaling is a terrible compression algorithm, and that 4:4:4 with a higher chroma QP offset setting is generally better for quality than 4:2:0, even for luma, because more of the bitrate is spent on luma?
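To show what I mean, here is a quick numpy sketch of the chroma path only: a hard colour edge like game UI or text, box-downscaled 2x the way 4:2:0 throws samples away, then scaled back up. The filter choice is my own simplification; real encoders and renderers use better ones, but the information is gone either way, the filter only changes how the damage is smeared around:

```python
# Simulate the 4:2:0 chroma round trip on a hard edge and measure the error.
import numpy as np

H, W = 64, 64
chroma = np.zeros((H, W), dtype=np.float64)
chroma[:, W // 2 + 1:] = 1.0      # hard vertical edge on an odd column,
                                  # exactly where subsampling can't represent it

# 4:2:0 style 2x2 box downscale ...
small = chroma.reshape(H // 2, 2, W // 2, 2).mean(axis=(1, 3))
# ... and nearest-neighbour upscale back to full resolution
restored = np.repeat(np.repeat(small, 2, axis=0), 2, axis=1)

err = np.abs(restored - chroma)
print(f"max error : {err.max():.3f}")     # 0.5 right on the edge
print(f"mean error: {err.mean():.4f}")
print(f"pixels touched: {(err > 0).sum()} of {err.size}")
```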

The chroma scalers in madVR weren't created for fun, and they were discussed with real-world examples. If I really have to I can hunt some down... or we just take an esports computer game: gaming is already bigger than Hollywood, so there is lots and lots of content where 4:4:4 matters a lot.

Pretty much every new LG TV that isn't low end can already do, or is planned to do, 4K 120 Hz 4:4:4.
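Same napkin math as above for that mode, this time as payload data rate; the 4K120 pixel clock and the effective link rates are the standard published numbers, and DSC is ignored:

```python
# 4K 120 Hz 4:4:4 payload vs. effective link rates:
# HDMI 2.0: 18 Gbit/s minus 8b/10b overhead  -> 14.4 Gbit/s
# HDMI 2.1 FRL: 48 Gbit/s minus 16b/18b      -> ~42.7 Gbit/s
PIXEL_CLOCK_4K120 = 1188e6          # Hz, CTA-861 timing incl. blanking
LINKS_GBPS = {"HDMI 2.0": 14.4, "HDMI 2.1 FRL": 48 * 16 / 18}

for bpc in (8, 10, 12):
    gbps = PIXEL_CLOCK_4K120 * 3 * bpc / 1e9   # 3 components at 4:4:4
    verdicts = ", ".join(f"{name}: {'ok' if gbps <= cap else 'too much'}"
                         for name, cap in LINKS_GBPS.items())
    print(f"4K120 4:4:4 {bpc:2d} bit needs {gbps:5.1f} Gbit/s -> {verdicts}")
```

So 8 and 10 bit fit on HDMI 2.1 without tricks; 12 bit is just over the line without DSC.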

Just for fun: NVIDIA can do 4:4:4 hardware encoding. It's limited, but it can do it.
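Something like this, as a sketch only. It assumes an ffmpeg build with NVENC enabled and a GPU that exposes the 4:4:4 mode; the file names and bitrate are placeholders:

```python
# Drive NVENC 4:4:4 through ffmpeg from Python. The only point here is that
# h264_nvenc accepts -pix_fmt yuv444p at all, not that this is a good preset.
import subprocess

cmd = [
    "ffmpeg", "-y",
    "-i", "input.mkv",
    "-c:v", "h264_nvenc",      # hevc_nvenc also offers a 4:4:4 mode
    "-pix_fmt", "yuv444p",     # keep full-resolution chroma
    "-b:v", "20M",
    "out_444.mkv",
]
subprocess.run(cmd, check=True)
```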

Modern computers are very good at 16-256 bit, so I don't see a huge issue here, and aren't people already doing this with AviSynth and VapourSynth? We are talking about processing at 16-32 bit, right, not encoding?
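For example, the usual VapourSynth pattern, as a minimal sketch. BlankClip stands in for a real source filter so it runs anywhere; the actual filtering would go in the middle:

```python
# Convert to 16 bit 4:4:4 for processing, then dither back down for output.
import vapoursynth as vs

core = vs.core

clip = core.std.BlankClip(width=1920, height=1080, format=vs.YUV420P8)
clip = core.resize.Spline36(clip, format=vs.YUV444P16)    # up to 4:4:4, 16 bit
# ... processing at 16 bit goes here ...
clip = core.resize.Spline36(clip, format=vs.YUV420P10,
                            dither_type="error_diffusion")  # back down for output
clip.set_output()
```

Feed that to vspipe and into whatever encoder you like; the processing itself stays at 16 bit the whole way.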