Code:
logo0 =CoronaSequence(path).trim(0,end-start-1).FadeIn0(I).FadeOut0(O)
logo0a=logo0.addborders(x,y,clip.width,clip.height).crop(0,0,-logo0.width-x,-logo0.height-y).blur(Sblur)
logo1 =logo0a.converttoyv12().assumefps(Framerate(Clip))
s=ShowAlpha(logo0a,"YV12")
video1 = AverageLuma(s) == float(HexValue("FFFFFF")) ?
\ mt_merge(clip.trim(start,end-1),logo1,mt_lut(s," x "+String(Opac)+" * "),luma=true)
\ : mt_lutxy(clip.trim(start,end-1),logo1,Y=3, U=2, V=2,
\ "236 16 - 236 16 - x 16 - - 236 16 - y 16 - - * 236 16 - / - 16 + "+String(Opac)+" * x 1 "+String(Opac)+" - * + ")
return start >0 ? clip.trim(0,start == 1 ? -1 : start-1)+video1+clip.trim(end,0) : video1+clip.trim(end,0)}
I updated the code a bit, mostly by nesting functions. Do you think this is better for RAM usage?
Also, I want to detect alpha automatically so I can remove the alpha parameter.
Since there are no clip properties for alpha, I try comparing the average of ShowAlpha against white; if the comparison is false, the clip has a meaningful alpha channel.
The problem is that I'm getting "Invalid arguments" errors when I call AverageLuma, I think because of the ISSE requirements:
AverageLuma(ShowAlpha(logo0a,"YV12")) == float(HexValue("FFFFFF"))
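One thing worth checking besides ISSE: AverageLuma returns the mean luma as a float on a 0-255 scale, while HexValue("FFFFFF") is 16777215, so even when the call succeeds that equality can never be true. A possible sketch of the check on the right scale (the 250.0 threshold is my assumption, since the exact white level after ShowAlpha/ConvertToYV12 may be 235 or 255 depending on range handling, and exact float equality is fragile anyway):

Code:
# Sketch only: AverageLuma needs a YV12 clip (and ISSE); it reports
# the mean luma in the 0-255 range, so test against that scale.
s = ShowAlpha(logo0a, "YV12")
hasalpha = AverageLuma(s) < 250.0   # false = alpha plane is (near) fully white/opaque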