2nd August 2016, 13:07   #1
Motenai Yoda
Registered User
Join Date: Jan 2010
Posts: 709
Feed x264 with 16bit avs

Hi,
I'm trying to encode a 16-bit AviSynth+ x64 script with x264-10bit, but instead of a dithered 10-bit encode I only get an interleaved one...

Script
Code:
setmemorymax(4096)
SetFilterMTMode("DEFAULT_MT_MODE", 2)
SetFilterMTMode("dss2", 3)

# source decoded as 8-bit YV12
dss2("test1.avs_lossless.mp4", preroll=100, pixel_type="YV12")

# promote to stacked 16-bit, then emit the interleaved 16-bit format for the encoder
dither_convert_8_to_16()
dither_out()

Prefetch(4)
CLI
Code:
"%~dp0x264-10bit.exe" --input-depth 16 --input-res 1920x1080 --preset ultrafast --qp 0 -o %1_lossless_10_64.mp4 %1
edit: using avs4x26x and AviSynth+ x86, everything seems to work fine:
Code:
"%~dp0avs4x26x.exe" -L "%~dp0x264-10bit.exe" --qp 0 --preset ultrafast -o %1_lossless_10.mp4 --input-depth 16 %1

Last edited by Motenai Yoda; 2nd August 2016 at 13:23.