It seems DivX H264 delivers around 15% more fps than ffdshow. Here, with a 720p source, I got about 10% more when the renderer is VMR and about 18% more with VMR9 (118 vs. 100 fps), which is interesting to me because I use subtitles (VMR9).
DivX and CoreAVC perform similarly, especially with VMR9.
Turning DeBand on in ffdshow (which is a must for me, because of the x264 issue with blocks in dark areas) drops its speed by about 40%. With DivX H264 there is no speed impact from toggling deblocking on/off, but the image does not change either: I still see the blocks. The same goes for CoreAVC (slightly more blocks than DivX).
ffdshow:
110 fps (VMR) (CPU @ 80%)
100 fps (VMR9) (CPU @ 80%)
DivX H264:
120 fps (VMR) (CPU @ 90%)
118 fps (VMR9) (CPU @ 65%) (Note: same result with deblock on/off and latency on/off)
CoreAVC:
120 fps (VMR) (CPU @ 65%)
119 fps (VMR9) (CPU @ 70%)
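As a sanity check on the percentages quoted above, here is a small Python sketch (decoder names and fps values taken straight from the list; the baseline choice of ffdshow is my assumption) that computes each decoder's relative gain over ffdshow per renderer:

```python
# fps values from the benchmark list above; ffdshow is used as the baseline.
ffdshow = {"VMR": 110, "VMR9": 100}
others = {
    "DivX H264": {"VMR": 120, "VMR9": 118},
    "CoreAVC": {"VMR": 120, "VMR9": 119},
}

def speedup(decoder_fps, baseline_fps):
    """Percentage fps gain of a decoder over the baseline."""
    return 100.0 * (decoder_fps - baseline_fps) / baseline_fps

for name, fps in others.items():
    for renderer in ("VMR", "VMR9"):
        gain = speedup(fps[renderer], ffdshow[renderer])
        print(f"{name} vs ffdshow ({renderer}): {gain:+.1f}%")
```

This works out to roughly +9% with VMR and +18-19% with VMR9 for both decoders.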
CPU 6600@2.40GHz (dual-core) - 1GB RAM DDR2 400MHz - GeForce GTS 8600
Last edited by MarcioAB; 19th May 2008 at 13:28.
Reason: included graphics card info