Well, I've read some documentation on 10-bit color depth support, the BD standard and compatibility with avisynth (including this doc), and I'm not sure it is a good idea to implement it. AFAIK, avisynth is currently still limited to 8-bit (although some filters may use more bits internally), and I wonder if encoding in 10-bit is really worthwhile if avisynth returns an 8-bit video anyway.
Perhaps it is possible to use various alternative filters within avisynth to output 10 or 12-bit, but honestly, I don't know how, and I'm not sure they would be compatible with the filters BD3D2MK3D requires. Again, given that the input BD is encoded in 8-bit (TV range), I don't think the gain would be major. Note also that most monitors and TVs are 8-bit only, so in the end you can only expect a more or less well dithered 8-bit picture, even if the whole processing and encoding chain is done correctly in 10 or 12-bit.
So, unless somebody can explain what I would have to do to support 10 or 12-bit color depth throughout the whole process, and can show me clearly that there is a big advantage, I will not do it. Sorry.
Anyway, if you really want to encode in 10 or 12-bit, you can replace the x264/x265 exe with the 10/12-bit version, as you know. You can even create several small cmd files that restore the 8-bit version or overwrite it with another build, and call the right batch file when necessary, before launching the encoding.
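For example, such a swap script could look like the sketch below (shown as a POSIX shell function for illustration; on Windows the equivalent would be a small .cmd file doing the same copy). The directory layout and file names (x264-8bit.exe, x264-10bit.exe) are my own assumptions, not something BD3D2MK3D ships with — the idea is simply to keep both builds side by side and copy the one you want over the exe that the tool actually calls:

```shell
#!/bin/sh
# Hypothetical sketch: keep both x264 builds next to each other and copy
# the desired one over the x264.exe that the encoding step invokes.
# File names and the tool directory are assumptions; adapt to your setup.

select_x264() {
    depth="$1"    # desired bit depth: 8 or 10
    tooldir="$2"  # directory where the encoder exe lives

    case "$depth" in
        8)  cp "$tooldir/x264-8bit.exe"  "$tooldir/x264.exe" ;;
        10) cp "$tooldir/x264-10bit.exe" "$tooldir/x264.exe" ;;
        *)  echo "usage: select_x264 8|10 TOOLDIR" >&2; return 1 ;;
    esac
    echo "x264.exe is now the ${depth}-bit build"
}
```

Run the matching script (e.g. `select_x264 10 ./toolset`) right before launching the encoding, and the 8-bit one afterwards to restore the default.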