Quote:
Originally Posted by Ghitulescu
Why picking up 1h 32m 54s 610ms and not 1h 37m 53s 11f?
Let's say the movie has 140787 frames.
If the speed is 23.976 fps, the runtime is almost 5872 seconds (5871.996996… to be exact).
But if it's 24/1.001, it will already be about 6 milliseconds off. At true 24 fps it will play for 5866.125 seconds, roughly 5.87 seconds less.
To push this philosophical discussion further into the absurd, note that the last frame has no proper duration. Every other frame is replaced by the next image at that image's PTS (Presentation Time Stamp), so its duration is the difference between the two PTSs. There should also be a PTS after the last frame (link-pts), but AFAIK it is not mandatorily used; many players give the last frame the same duration as the preceding one(s), or use pkt-duration (imprecise). The clock is also only required to be within a 4-millisecond tolerance.
Thus it all boils down to what the OP intends to do....
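The arithmetic in the quote can be checked with exact fractions. This is just a sketch of the calculation; the frame count 140787 comes from the quote, and the variable names are my own:

```python
from fractions import Fraction

FRAMES = 140787

def runtime_s(frames, fps):
    """Exact runtime in seconds for a given frame count and frame rate."""
    return frames / fps

# Three candidate frame rates: the rounded value, exact NTSC film, true 24 fps.
fps_rounded = Fraction(23976, 1000)   # 23.976 taken literally
fps_ntsc    = Fraction(24000, 1001)   # 24/1.001
fps_true24  = Fraction(24, 1)

t_rounded = runtime_s(FRAMES, fps_rounded)
t_ntsc    = runtime_s(FRAMES, fps_ntsc)
t_true24  = runtime_s(FRAMES, fps_true24)

print(float(t_rounded))  # ~5871.997 s, i.e. "almost 5872 seconds"
print(float(t_ntsc))     # ~5871.991 s, about 6 ms less than the rounded rate
print(float(t_true24))   # 5866.125 s, roughly 5.87 s less
```

The ~6 ms gap between 23.976 and 24000/1001 is exactly the kind of drift the quote describes: tiny per file, but it shows why tools should carry the frame rate as a rational, not a rounded decimal.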
Thanks for your scientific explanation. As a layman, I seem to have caught the point.