27th May 2016, 17:52   #38184
ashlar42
Registered User
Join Date: Jun 2007
Posts: 656
Quote:
Originally Posted by ashlar42
madshi, would you be so kind as to provide a full explanation of the video and audio clock calculations?

I know the "display" value shows the refresh rate after the clock deviation is taken into account, so if one knows the exact timings used, one can work back to the percentage of deviation. Is this right? How do you come up with the value for "display"? Why is it shown as an absolute value instead of a percentage, as is done for the audio clock?

What about the audio clock deviation? Both audio and video are measured against the system clock, right?

I'm using a GTX 660 with HDMI audio on board and I'm seeing very similar deviation values (based on the above assumptions) for both audio and video. That means that if I enter exact timings for the resolutions/refresh rates I need, I see excellent results when bitstreaming (1 frame drop/repeat every x days, or every 18 hours at worst), which makes sense according to your previous descriptions: if the two clocks (video and audio) deviate by a similar amount, playback is smooth.
I'd just like to understand the process fully.

Thanks so much for your help and for all the work you put in madVR.
Some more info: together with user hannes69 from the Kodi forums, I'm trying to understand exactly the process behind the numbers madVR displays.

Considering you have stated in the past that:

1) "display" shows the refresh rate adjusted for the video clock deviation measured against the system clock

2) "clock deviation" shows the deviation for the audio clock, again measured against the system clock.

Here is an example of what we found for 59.940 fps video content played back with bitstreamed audio at a 59.940 Hz display refresh rate.

Display: 59.94174
Clock Deviation: +0.00276%
1 frame drop every 3.17 hours
movie 59.940 fps

audio clock deviation 27.6ppm
video clock deviation: (59.94174 - 59.94) / 59.94 = 29.03ppm
total deviation: video clock dev - audio clock dev = 29.03ppm - 27.6ppm = 1.43ppm

frames in 3.17 hours = (3600 * 59.940) * 3.17 = 684,035.28

1 frame dropped every 684,035.28 frames corresponds to 1 / 684,035 = 0.0000014619, that is 1.46ppm (I guess this is well within the margin of error of madVR's approximation once it reaches the x.xx hours stage).
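
For reference, here is the same arithmetic as a small Python snippet (purely my own sanity check, assuming 59.940 as the nominal rate; nothing taken from madVR's source):

Code:
nominal   = 59.940
video_ppm = (59.94174 - nominal) / nominal * 1e6   # ~29.03 ppm
audio_ppm = 0.00276 * 1e4                          # 27.6 ppm
diff_ppm  = video_ppm - audio_ppm                  # ~1.43 ppm

frames        = 3600 * nominal * 3.17              # ~684,035 frames in 3.17 hours
drop_rate_ppm = 1e6 / frames                       # ~1.46 ppm, i.e. one drop per that many frames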

The above tells me that you are using the approximation of 59.940 in the internal calculations, and not the 60/1.001 value. Is this correct?
If the 60/1.001 value were used instead, we would have:

audio clock deviation 27.6ppm
video clock deviation: (59.94174 - (60/1.001)) / (60/1.001) = 28.03ppm
total deviation: video clock dev - audio clock dev = 28.03ppm - 27.6ppm = 0.43ppm

frames in 3.17 hours = (3600 * (60/1.001)) * 3.17 = 684,035.96

1 frame dropped every 684,036 frames corresponds to 1 / 684,036 = 0.0000014619, that is again 1.46ppm.
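
Only the nominal value changes in the snippet above:

Code:
nominal   = 60 / 1.001                             # 59.9400599...
video_ppm = (59.94174 - nominal) / nominal * 1e6   # ~28.03 ppm
diff_ppm  = video_ppm - 27.6                       # ~0.43 ppm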

A 0.43ppm audio/video clock difference seems too low compared to what a drop/repeat every 3.17 hours indicates, while 1.43ppm is very close.
So, are you using 59.940 for internal calculations? Does this hold true for 23.976 as well?
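
Going the other way, from the clock difference back to an expected drop interval, shows why 1.43ppm fits and 0.43ppm doesn't (again just my own back-of-the-envelope math, not anything from madVR):

Code:
# One frame is dropped roughly every 1/diff frames, where diff is the relative
# clock difference between the video and audio clocks.
def hours_between_drops(diff_ppm, fps=59.940):
    frames = 1e6 / diff_ppm          # frames until the clocks drift apart by one frame
    return frames / fps / 3600

print(hours_between_drops(1.43))     # ~3.2 hours -> matches madVR's "3.17 hours"
print(hours_between_drops(0.43))     # ~10.8 hours -> clearly doesn't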

Sorry, I know we are reaching the human limits of geekiness here... but we would really love to fully understand the underlying math.

I also can't understand why my clock deviation sits constantly at about 0.00276% at the 59.940 refresh rate while dropping to 0.00242% at the 23.976 refresh rate. It's the audio clock measured against the system clock, so why should the display refresh rate influence the deviation? I have been checking and checking, but the difference between the two refresh rates stays there... It's audio and video through HDMI on a GTX 660, if it matters.
__________________
LG 77C1 - Denon AVC-X3800H - Windows 10 Pro 22H2 - Kodi DSPlayer (LAV Filters, xySubFilter, madVR, Sanear) - RTX 4070 - Ryzen 5 3600 - 16GB RAM

Last edited by ashlar42; 27th May 2016 at 20:36.