To save bandwidth in my multiplayer game, I do not update every object on every server tick. Instead, each object has an updateRate that tells the game this object is expected to be updated every X server ticks.
When I receive an update message for an object, I record where it is and where it is heading, and calculate the time I expect the next update to arrive:
origin = serverCurrentPosition
diff = serverNextPosition - origin
arriveTime = now + timeBetweenTicks * updateRate
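In code, the receive step looks roughly like this (a TypeScript-style sketch; Vec2, NetObject, onUpdate and the tick length are just illustrative names and values, and I assume the update message carries both the current and the next server position):

// Sketch of the receive step. Positions are plain {x, y} vectors and
// now() returns the current client time in milliseconds.
interface Vec2 { x: number; y: number; }

interface NetObject {
    updateRate: number;   // object is updated every updateRate server ticks
    origin: Vec2;         // position at the last received update
    diff: Vec2;           // expected movement until the next update
    arriveTime: number;   // time the next update is expected (ms)
}

const timeBetweenTicks = 50;            // ms per server tick (example value)
const now = () => performance.now();    // current client time in ms

function onUpdate(obj: NetObject,
                  serverCurrentPosition: Vec2,
                  serverNextPosition: Vec2): void {
    obj.origin = serverCurrentPosition;
    obj.diff = {
        x: serverNextPosition.x - serverCurrentPosition.x,
        y: serverNextPosition.y - serverCurrentPosition.y,
    };
    // The next update should arrive one full update interval from now.
    obj.arriveTime = now() + timeBetweenTicks * obj.updateRate;
}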
When I draw the object, I calculate the time left until the next update and interpolate the position accordingly:
step = 100 / (timeBetweenTicks * updateRate)
delta = 1 - step * ((arriveTime - now) / 100)
position = origin + diff * delta
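For reference, here is the draw step as a sketch in the same style. The step/100 bookkeeping cancels out algebraically, so delta is computed directly as the elapsed fraction of the update interval; the clamp to [0, 1] is my addition so a late update does not overshoot:

// Sketch of the draw-time interpolation, equivalent to the formulas above.
function interpolatedPosition(obj: NetObject): Vec2 {
    const interval = timeBetweenTicks * obj.updateRate;  // full update interval (ms)
    const remaining = obj.arriveTime - now();            // time left until the next update (ms)
    // delta goes from 0 (update just received) to 1 (next update due).
    const delta = Math.min(Math.max(1 - remaining / interval, 0), 1);
    return {
        x: obj.origin.x + obj.diff.x * delta,
        y: obj.origin.y + obj.diff.y * delta,
    };
}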
It works... but there is still a bit of jitter in the drawing, although in theory everything should work out fine, since the scaling should take care of some amount of lag, shouldn't it?
So the question is: is this the best approach? Should I factor the actual network lag into the computation? If so, how would I do that? I did some experiments, but the jitter only got worse.