Hi,
I'm trying to encode an avs+ x64 16-bit script with x264-10bit, but I can't get a dithered 10-bit encode; all I get is the interleaved output...
Script
Code:
setmemorymax(4096)                       # cap AviSynth memory at 4 GB
SetFilterMTMode("DEFAULT_MT_MODE", 2)    # default MT mode: multi-instance
SetFilterMTMode("dss2", 3)               # serialize the source filter
dss2("test1.avs_lossless.mp4", preroll=100, pixel_type="YV12")
dither_convert_8_to_16()                 # promote to stacked 16-bit (Dither package)
dither_out()                             # stacked -> interleaved 16-bit for piping
Prefetch(4)                              # start 4 prefetch threads (avs+ MT)
CLI
Code:
"%~dp0x264-10bit.exe" --input-depth 16 --input-res 1920x1080 --preset ultrafast --qp 0 -o %1_lossless_10_64.mp4 %1
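For comparison, the workflow usually recommended for dither_out() is to not let x264 open the .avs directly, but to pipe the interleaved 16-bit frames through avs2yuv in raw mode and use x264's raw demuxer. A sketch, assuming avs2yuv.exe sits next to the batch file and a 24000/1001 source (adjust --fps to match yours):

Code:

:: avs2yuv -raw emits the raw interleaved 16-bit stream on stdout;
:: x264's raw demuxer then needs resolution, depth and frame rate
:: given explicitly, and reads the stream from "-" (stdin).
"%~dp0avs2yuv.exe" -raw %1 -o - | "%~dp0x264-10bit.exe" --demuxer raw --input-depth 16 --input-res 1920x1080 --fps 24000/1001 --preset ultrafast --qp 0 -o %1_lossless_10_64.mp4 -

If that pipe produces a correct 10-bit encode, the problem is in how x264's AviSynth input path reads the x64 script rather than in the script itself.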
edit: using avs4x26x with avs+ x86, everything seems to work fine
Code:
"%~dp0avs4x26x.exe" -L "%~dp0x264-10bit.exe" --qp 0 --preset ultrafast -o %1_lossless_10.mp4 --input-depth 16 %1