[GOODIE] Mpeg3Plugin.so for Linux

Karl Ramberg karl.ramberg at chello.se
Sun Nov 5 21:10:15 UTC 2000



John M McIntosh wrote:
> 
> >John M McIntosh wrote:
> >
> >I don't know enough of the inner workings of the MPEG format, but isn't
> >there a common timecode reference to both streams so one can sync them up
> >at any point? It's more likely that the video frame rate drops than the
> >audio sample rate, so one could skip the number of frames it lags behind.
> >E.g., right now it skips 3 frames when it lags behind, but shouldn't this
> >number be calculated at every occurrence of going out of sync, so it's a
> >perfect sync again?
> >
> >Karl
> 
> Yes, I need to calculate a proper skip number, versus clamping the
> value to 3. My original thought was to clamp so that other activity
> causing a sudden, temporary performance issue wouldn't trigger a
> bunch of frame skips; rather, we would drop a fixed number of frames
> at a high rate in order to sync. But looking at how other systems
> handle it, we do need to skip a bunch.
> 
> I'll issue some Smalltalk code updates later today (12+ hours).
> 
> But if you want to change things right now look at
> 
> MPEGPlayer >>decideToSkipAFrame: delta averageWait: aWaitTime stream: aStream
>         delta abs > aWaitTime ifTrue:
>                 [external videoDropFrames: 3 stream: aStream].
> 
> In a situation where we aren't able to decode the frames fast enough,
> aWaitTime should be zero or close to zero. delta is a negative
> number which indicates how many milliseconds we are behind. Just
> change the code to skip frames based on the milliseconds needed to
> resync and the frame rate (self frameRate): millisecondsRequired/1000
> = 1/frameRate * framesToDrop.
I'll wait for your code :-)
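Rearranged, John's equation gives framesToDrop = millisecondsRequired / 1000 * frameRate. A minimal sketch of that calculation (in Python for illustration; the names are hypothetical, the actual fix belongs in the Smalltalk method above):

```python
def frames_to_drop(delta_ms: float, frame_rate: float) -> int:
    """How many frames to skip to resynchronize video with audio.

    delta_ms: how far we are from schedule, in milliseconds (delta is
    negative in the Smalltalk code when video lags behind audio).
    frame_rate: video frames per second (self frameRate in Smalltalk).
    """
    if delta_ms >= 0:
        return 0  # not behind; nothing to skip
    # millisecondsRequired / 1000 = framesToDrop / frameRate
    return round(-delta_ms / 1000.0 * frame_rate)

# At 24 fps, being 250 ms behind means dropping 6 frames.
```

This replaces the hard-coded clamp of 3 with a value proportional to how far behind we actually are, which is what the paragraph above asks for.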
> 
> There is some other code lurking
> calculateDelayToSoundGivenFrame: frame stream: aStream
> 
> where I attempted to sync the frame to the current sound sample being
> played. However, I found we can't really figure out which sample is
> currently being played; there is too much buffering going on between
> the layers. One would need a VM call (I think) to get the required
> information. This tackles the problem from the other direction.
> 
> >but isn't
> >there a common timecode reference to both streams so one can sync them up
> >at any point?
> 
> I think so, but right now you know you are at frame 232402 out of
> 5553222, and at 24 frames a second you know which sample of sound you
> should be playing just by doing some math. The trick is how to
> sync in Squeak. Right now we assume we start playing sound at time
> zero, and if we play 1 hour of sound data it takes one hour to play,
> but is this really true? Anyone care to test?

I don't have a sample that long... I guess I could make a loop somehow,
e.g. play a song of a known length a few times.
I'll look into it.
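The frame-to-sample arithmetic John describes can be sketched as follows (Python for illustration; the names and the 44.1 kHz default are assumptions, the actual plugin is Smalltalk plus C):

```python
def expected_sample(frame_number: int, frame_rate: float,
                    sample_rate: int = 44100) -> int:
    """Which audio sample should be playing when a given video frame
    is shown, assuming both streams start together at time zero."""
    seconds_elapsed = frame_number / frame_rate
    return int(seconds_elapsed * sample_rate)

# Frame 232402 at 24 fps is about 9683.4 seconds into playback,
# i.e. sample 427038675 at 44.1 kHz.
```

The whole question in the thread is whether this "time zero plus elapsed math" assumption holds in Squeak, or whether buffering between the layers makes the actually-playing sample drift from this computed one.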
Karl