greyscale


wotef
15th December 2002, 16:30
what i would like to do is test the effect of creating a greyscale of a frame and then laying only the pure black and pure white pixels from the greyscale frame over any pixels from the original that were not originally black (or white) - i can't seem to do this using layer or mergechroma/mergeluma

could someone explain how greyscale works? is each pixel examined for its colour components (e.g. RGB, black=0,0,0 white=255,255,255) and then some formula is applied to alter it? i presume a YUY2 colour space changes this again?

Guest
15th December 2002, 23:19
For RGB, you calculate the luma according to a standard formula such as:

Luma = 0.299*R + 0.587*G + 0.114*B // or integer equivalent

Then you put the resulting luma value back as the value for R, G, and B. Any pixel with R, G, and B having the same value will be a shade of gray.

For YUV spaces, all you have to do is clear out the chroma bytes; value 128 means no color according to the YUV space equations, so you put 128 into U and V and leave Y unchanged.
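
A minimal sketch of both cases in plain Python (illustrative only, not AviSynth; the values are per-pixel bytes in the 0-255 range):

def grey_rgb(r, g, b):
    # Rec.601 luma from the formula above, rounded back to an integer byte
    luma = int(round(0.299 * r + 0.587 * g + 0.114 * b))
    return luma, luma, luma          # R = G = B -> a shade of grey

def grey_yuv(y, u, v):
    # keep luma, force both chroma bytes to 128 ("no colour")
    return y, 128, 128

print(grey_rgb(255, 255, 255))   # (255, 255, 255) - white stays white
print(grey_rgb(200, 40, 90))     # (94, 94, 94)    - a colour becomes grey
print(grey_yuv(16, 90, 240))     # (16, 128, 128)  - chroma cleared, luma untouched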

WarpEnterprises
16th December 2002, 00:05
Did you try ColorKeyMask (in cooperation with Layer)?
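
Conceptually, keying builds a per-pixel mask from a colour match (within some tolerance), and the composite then picks between the two clips per pixel. Very roughly, in plain Python (purely illustrative - the tolerance and helper names are made up, and this is not how ColorKeyMask/Layer are actually implemented):

def matches_key(pixel, key=(0, 0, 0), tol=10):
    # true if every channel is within `tol` of the key colour
    return all(abs(p - k) <= tol for p, k in zip(pixel, key))

def composite(original, greyscale, key=(0, 0, 0), tol=10):
    # take the greyscale pixel only where it is (near) pure black, else keep the original
    return [g if matches_key(g, key, tol) else o
            for o, g in zip(original, greyscale)]

orig = [(30, 10, 60), (250, 250, 250), (200, 120, 40)]
grey = [(8, 8, 8),    (250, 250, 250), (130, 130, 130)]
print(composite(orig, grey))   # [(8, 8, 8), (250, 250, 250), (200, 120, 40)]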

jang0
16th December 2002, 21:14
Originally posted by wotef
what i would like to do is test the effect of creating a greyscale of a frame and then laying only the pure black and pure white pixels from the greyscale frame over any pixels from the original that were not originally black (or white)

well, i could be wrong, but would that process change anything in the original frame? i think if you convert a frame into greyscale, only the pixels that were black or white in the original frame will be black/white in the greyscale frame; any other colors will be changed to a shade of grey. so if you replaced pixels in the original frame that were not black/white, they won't be (pure) black/white in the greyscale either, so this method wouldn't result in the described effect. or am i mistaken? perhaps i didn't understand properly what you want to do exactly.

scmccarthy
16th December 2002, 21:53
@jang0

No, that is precisely what will happen. You have described it very nicely. If you look again at what neuron2 wrote, you will see that it is true.

I did not want to say so because it is not a very interesting point. neuron2's explanation of how to derive the greyscale makes much more interesting reading.

Stephen

wotef
16th December 2002, 22:38
no need to be supercilious or conceited about what is "interesting", scmccarthy; i just wanted to experiment...

based on some tinkering i did with crazy colour tweaks of an analogue cap, i turned saturation up to 75 and was surprised to see a lot of activity in (what were normally) "black" / very dark areas of the frame; when i turned the original frame to greyscale (and tweaked brightness/contrast a bit more) and layered the black pixels back, all those fluctuations disappeared

anyway, i don't really understand colorkeymask but i think i did what i wanted to do (at least with the black pixels) with:

clip=avisource("E:\capture.avi")
# greyscale the frame, then raise contrast / drop brightness so near-black areas collapse to pure black
tweaked=clip.greyscale().tweak(cont=2,bright=-2)
# "darken" copies the overlay (tweaked) wherever it is darker than the original, so the blackened areas win
end=clip.layer(tweaked,"darken",255,1)
return end

scmccarthy
17th December 2002, 01:22
@wotef

I just meant that the answer that first occurred to me - that black is black and white is white before and after you convert to greyscale - while technically true and necessary to point out, can't help you much.

There could be random fluctuations between almost-black and black that become visible once you lighten the picture, I suppose. I did not think about Tweak from your original question.

Stephen

Guest
17th December 2002, 01:40
Originally posted by scmccarthy
If you look again at what neuron2 wrote, you will see that it is true.

Enter pedantic mode...

Given the rounding of integer calculations, it is quite possible that R=0/G=0/B=1 would map to luma=0 (0.299*0 + 0.587*0 + 0.114*1 = 0.114, which rounds to 0) and thence to black. That would be an example of a case where you get blacks that weren't there originally.

Of course, that is not very interesting. :)

What I find more interesting, and what I assume wotef is thinking about, is 'coring'. You might map any pixels with luma below a threshold to black, for example. Coring is a well-known technique used in TVs, and nowadays in videocams, especially in conjunction with sharpening circuits to reduce the effect of sharpening on low-level noise.
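
Per pixel it amounts to something like this (a rough Python sketch of the idea; the threshold of 20 is just an example):

def core(luma, threshold=20):
    # classic coring: anything below the threshold is forced to pure black
    return 0 if luma < threshold else luma

print([core(y) for y in (3, 12, 19, 20, 85)])   # [0, 0, 0, 20, 85]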

jang0
17th December 2002, 06:55
couldn't such a filter also be used to make the black edges in anime thicker and more distinct? or would that have a negative effect on compressibility when the borders between two areas are too abrupt?

wotef
17th December 2002, 10:04
first, sorry scm for reading your post wrong

ah, neuron2 has explained it (again) for me, though i wasn't aware it was called coring --> also that there's a coring vdub filter you-know-where

@neuron2 - why does your intro page call coring a "poor man's noise reduction"? isn't it a no-brainer quick win? wouldn't it be beneficial to use it prior to something like convolution3d, so that the latter doesn't waste time denoising pure black areas of the frame?

sh0dan
17th December 2002, 10:14
You should be very careful about what you do, since luma == 0 (or actually 16) DOESN'T map to black. If the chroma values are not 128/128 (hex 80/80) it might STILL have a color.

Try this in latest 2.5 alpha:

colorbars(16,16)
coloryuv(showyuv=true)
histogram()
Note how it still displays color on the first frame (luma=16) when U/V are small (top left) and when either U or V is high (right, bottom).

Luma < 16 visually (and conversion-wise) maps to luma = 16.
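
To put numbers on it, here is a quick Python sketch using the standard Rec.601 TV-range YUV->RGB conversion (illustrative only):

def yuv_to_rgb(y, u, v):
    # Rec.601, 16-235 luma / 16-240 chroma ("TV range")
    def clamp(x):
        return max(0, min(255, int(round(x))))
    r = 1.164 * (y - 16) + 1.596 * (v - 128)
    g = 1.164 * (y - 16) - 0.813 * (v - 128) - 0.391 * (u - 128)
    b = 1.164 * (y - 16) + 2.018 * (u - 128)
    return clamp(r), clamp(g), clamp(b)

print(yuv_to_rgb(16, 128, 128))   # (0, 0, 0)   - true black
print(yuv_to_rgb(16, 128, 240))   # (179, 0, 0) - luma 16, but clearly red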