Increasing a video's gamma or adjusting its histogram can brighten a seemingly black nighttime surveillance video. However, with only 256 levels of luma, the results can fail to be forensically useful. My idea is to amplify subtle color changes in order to detect movement, enhance signage (logos, license plates, etc.), and serve other forensic needs. Besides the obvious use with low-quality surveillance video, the exaggerated color shifts would let anyone see normally imperceptible temperature changes, for example machinery heating up (and thus red incrementing by one in RGB space).
HOW IT WORKS:
The filter selects the upper-left-most pixel and reads its RGB values, along with those of the same pixel in the preceding and following "N" frames. It averages those temporal values and subtracts the average from the current frame's R, G, and B values at that pixel. It then multiplies the difference by "M" and adds the result back to the original RGB values; the resulting values are an exaggeration of that pixel's color deviation from its temporal average. The process is repeated for every pixel of every frame, which amplifies any changes that are occurring, even if the scene appears almost completely dark. The resulting scene will look oversaturated, and Poisson noise will be amplified, but the goal is to provide a forensic tool.
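To make the per-pixel math concrete, here is a minimal Python sketch of the amplification for a single channel of a single pixel. The function name and sample values are my own for illustration; this is not the AviSynth code itself:

```python
def amplify(value, neighborhood, magnifier):
    """Push a pixel's channel value away from its temporal average.

    value        -- the channel value (0-255) at this pixel in the current frame
    neighborhood -- the same pixel's values in the surrounding +/-N frames
                    (including the current frame)
    magnifier    -- the exaggeration factor "M"
    """
    avg = sum(neighborhood) / len(neighborhood)
    new = value + magnifier * (value - avg)
    # Clamp back into the valid 8-bit range.
    return max(0, min(255, round(new)))

# A pixel that brightens slowly from 100 to 104 across five frames:
frames = [100, 101, 102, 103, 104]
print(amplify(frames[2], frames, 4.0))  # 102: exactly average, so unchanged
print(amplify(frames[4], frames, 4.0))  # 104 + 4*(104-102) = 112
```

A one-count brightening per frame becomes an eight-count jump, which is the whole point of the filter.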
While the concept is correct, the original code below suffers from an Overlay memory leak (which I was previously unaware of) and is very slow. I am posting the original code to serve as a programming tutorial.
USAGE:
AviSource("D:\avs\test.avi").ConvertToRGB32()
A=RgbAmplifier(2,1.1)
StackHorizontal(last,A) #original and amplified side by side
Code:
function RgbAmplifier (clip clip, int range, float magnifier) {
GScript("""
clipfinal=clip.trim(1,1)
clip=clip.trim(1,1)+clip #resolves the trim (0,0) error
for (f=1,clip.framecount-1,1) { #frame 0 is the padding frame; stop at the last real frame
clip2=clip.trim(f,f)
rangemin=(f-range<1)?1:f-range
rangemax=(f+range>clip.framecount-1)?clip.framecount-1:f+range
for (x=0,clip2.width-1,1) {
for (y=0,clip2.height-1,1) {
red=clip.RT_RgbChanMedian(n=f, delta=0, x=x, y=y, w=1, h=1, interlaced=false, chan=0)
redavg=0.0
for (g=rangemin,rangemax,1){ redavg=redavg+clip.RT_RgbChanMedian(n=g, delta=0, x=x, y=y, w=1, h=1, interlaced=false, chan=0) }
redavg=redavg/(rangemax-rangemin+1)
newred=Round(red+magnifier*(red-redavg))
newred=newred>255?255:newred<0?0:newred
green=clip.RT_RgbChanMedian(n=f, delta=0, x=x, y=y, w=1, h=1, interlaced=false, chan=1)
greenavg=0.0
for (g=rangemin,rangemax,1){ greenavg=greenavg+clip.RT_RgbChanMedian(n=g, delta=0, x=x, y=y, w=1, h=1, interlaced=false, chan=1) }
greenavg=greenavg/(rangemax-rangemin+1)
newgreen=Round(green+magnifier*(green-greenavg))
newgreen=newgreen>255?255:newgreen<0?0:newgreen
blue=clip.RT_RgbChanMedian(n=f, delta=0, x=x, y=y, w=1, h=1, interlaced=false, chan=2)
blueavg=0.0
for (g=rangemin,rangemax,1){ blueavg=blueavg+clip.RT_RgbChanMedian(n=g, delta=0, x=x, y=y, w=1, h=1, interlaced=false, chan=2) }
blueavg=blueavg/(rangemax-rangemin+1)
newblue=Round(blue+magnifier*(blue-blueavg))
newblue=newblue>255?255:newblue<0?0:newblue
clip2=clip2.Overlay(clip2.BlankClip(width=1, height=1, pixel_type="RGB24", color=newred*65536+newgreen*256+newblue), x=x, y=y, opacity=1.0) #paint the amplified pixel back; this per-pixel Overlay is the slow, leaky part
}
}
clipfinal=clipfinal+clip2
}
return clipfinal.trim(1,0)
""")
}
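For readers who want to experiment with the same math outside AviSynth, here is a vectorized NumPy sketch that processes whole frames at once instead of one pixel at a time. The function name, parameter names, and array layout are my own; it is a re-implementation of the algorithm described above, not the filter itself:

```python
import numpy as np

def rgb_amplifier(frames, rng=2, magnifier=1.1):
    """Exaggerate each pixel's deviation from its temporal average.

    frames    -- array of RGB frames, shape (T, H, W, 3), dtype uint8
    rng       -- frames before/after to include in the average ("range")
    magnifier -- exaggeration factor ("magnifier")
    """
    frames = np.asarray(frames, dtype=np.float32)
    out = np.empty_like(frames)
    t = frames.shape[0]
    for f in range(t):
        # Clamp the averaging window at the clip boundaries.
        lo, hi = max(0, f - rng), min(t - 1, f + rng)
        avg = frames[lo:hi + 1].mean(axis=0)  # temporal average per pixel/channel
        out[f] = frames[f] + magnifier * (frames[f] - avg)
    return np.clip(out, 0, 255).astype(np.uint8)
```

Because the per-pixel work is done in array operations rather than nested script loops, this runs in a fraction of the time of the AviSynth version above.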