Many Java games use Thread.sleep() to control FPS. Since the server does not display graphics, should the server game loop keep running, just calculating the delta time? Like this example:
long lastLoopTime = System.nanoTime();
final int TARGET_FPS = 60;
final long OPTIMAL_TIME = 1000000000 / TARGET_FPS; // nanoseconds per frame

while (gameRunning)
{
    long now = System.nanoTime();
    long updateLength = now - lastLoopTime;
    lastLoopTime = now;

    // elapsed time as a fraction of the target frame time
    double delta = updateLength / ((double) OPTIMAL_TIME);

    doGameUpdates(delta);
}
Answer
There are many reasons I wouldn't suggest that:
Computer resources are valuable. Even when they are free, you shouldn't waste them. Consider the case where two or more servers are running simultaneously on one computer: if a single server instance consumes all the CPU cycles, the other instances will simply starve. The OS will try to handle this kind of greedy behavior, but that comes at a cost, and you won't know when the OS will pause your process and hand the CPU to another one. You might also want to release your server alongside the actual game so that players can host their own LAN games, and it's rather annoying to have to dedicate an entire CPU core just to run a game server.
Most games, especially networked, synchronized games, use a fixed time step, simply because they want to be able to predict what is going to happen on the client side. A dynamic time step is great for many reasons as long as you are running a single instance of your game and it doesn't have to stay synchronized with anything else: it lets you use the full power of whatever system is running the game. But again, there is a price. You won't be able to predict what will happen if you run the same simulation again. Consider a simple physics collision where one box hits another: even the slightest change in the time step results in a big difference in the resulting collision resolution.
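To make the prediction problem concrete, here is a small illustration of my own (not from the answer): the same one-second fall, integrated once with fixed steps and once with uneven steps that add up to exactly the same total time, ends at slightly different positions. That gap keeps growing over a longer simulation and is exactly the kind of divergence that turns into a visibly different collision result.

import java.util.Arrays;

public class TimeStepDivergence {
    // Integrate one second of free fall with semi-implicit Euler,
    // using whatever sequence of time steps we are given.
    static double integrate(double[] steps) {
        double position = 0.0, velocity = 0.0;
        final double gravity = -9.81;
        for (double dt : steps) {
            velocity += gravity * dt;
            position += velocity * dt;
        }
        return position;
    }

    public static void main(String[] args) {
        // 60 equal steps of 1/60 s ...
        double[] fixed = new double[60];
        Arrays.fill(fixed, 1.0 / 60.0);

        // ... versus 60 uneven steps that still add up to exactly 1 s
        // (alternating 1/50 s and 1/75 s, i.e. 1/30 s per pair).
        double[] uneven = new double[60];
        for (int i = 0; i < 60; i++) {
            uneven[i] = (i % 2 == 0) ? 1.0 / 50.0 : 1.0 / 75.0;
        }

        System.out.printf("fixed steps : %.6f m%n", integrate(fixed));
        System.out.printf("uneven steps: %.6f m%n", integrate(uneven));
    }
}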
There is a simple reason people use Thread.sleep() as you suggested: adding more precision does not result in a better game. When you are playing, you usually won't notice that an object is one or two pixels off, so you gain nothing by increasing accuracy. There is also a network bandwidth limit, on both the server side and the client side, which means you usually can't send more than 30-60 updates per second anyway. So why compute states you are simply going to throw away?
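For comparison, the usual Thread.sleep()-based version of the loop from the question looks roughly like this. It is only a sketch; apart from gameRunning, TARGET_FPS, OPTIMAL_TIME and doGameUpdates, which are taken from the question, everything here is an assumption about how such a capped loop might look.

public class CappedGameLoop {
    static volatile boolean gameRunning = true;

    public static void main(String[] args) throws InterruptedException {
        final int TARGET_FPS = 60;
        final long OPTIMAL_TIME = 1_000_000_000L / TARGET_FPS; // nanoseconds per frame
        long lastLoopTime = System.nanoTime();

        while (gameRunning) {
            long now = System.nanoTime();
            long updateLength = now - lastLoopTime;
            lastLoopTime = now;
            double delta = updateLength / (double) OPTIMAL_TIME;

            doGameUpdates(delta);

            // Sleep for whatever is left of this frame's time budget,
            // releasing the CPU instead of computing updates nobody needs.
            long sleepMillis = (lastLoopTime - System.nanoTime() + OPTIMAL_TIME) / 1_000_000;
            if (sleepMillis > 0) {
                Thread.sleep(sleepMillis);
            }
        }
    }

    // Placeholder for the game logic from the question.
    static void doGameUpdates(double delta) {
    }
}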
There are many cases where increasing precision can actually cause trouble. For example, people tend to use vertical synchronization so that the graphics buffer is not updated while it is being sent to the monitor (which would otherwise cause tearing). Also, most games use floats as their primary number type, and floats are not precise. You might not notice it while your numbers stay between 2^-20 and 2^20, but outside that range you will see inaccurate behavior: if you try adding 10^-5 to 10^5, the addition is simply lost. These kinds of errors are usually invisible, but when you don't have full control over your time step, things can get ugly. In your case, since there is no rendering involved, I'll assume your update time would be somewhere around 10^-4 at most, meaning changes applied to numbers above 100 will start losing precision. Using doubles fixes this particular problem, but as I said before, it won't necessarily result in a better gaming experience.
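That loss of precision is easy to reproduce. This small check (my own illustration, not from the answer) shows the tiny addition being rounded away for float but preserved for double:

public class FloatPrecision {
    public static void main(String[] args) {
        float bigF = 1e5f;
        float tinyF = 1e-5f;
        // The gap between adjacent floats near 100000 is larger than 1e-5,
        // so the addition is rounded away entirely.
        System.out.println(bigF + tinyF == bigF);   // true

        double bigD = 1e5;
        double tinyD = 1e-5;
        // A double still has spare precision at this magnitude.
        System.out.println(bigD + tinyD == bigD);   // false
    }
}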
In some games, it's only player commands that make an unpredictable difference, so you might want to use your server only to broadcast each player's commands to the other players. In that case you can simply block until one of the clients sends the server an update. There is nothing to calculate between those updates, so the clients can do everything besides broadcasting: let the server do only its job and leave everything else to the clients.
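A minimal sketch of that idea, assuming commands arrive on a queue fed by per-client network threads (the Client interface and the string commands are made up for illustration):

import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.LinkedBlockingQueue;

public class BroadcastServer {
    // A client connection; send() would write to that player's socket.
    interface Client { void send(String command); }

    private final BlockingQueue<String> incoming = new LinkedBlockingQueue<>();
    private final List<Client> clients = new CopyOnWriteArrayList<>();

    // Called by per-client network threads when a player's command arrives.
    public void submit(String command) { incoming.add(command); }

    public void addClient(Client client) { clients.add(client); }

    // The server loop blocks until there is something to relay,
    // so it consumes no CPU between player commands.
    public void run() throws InterruptedException {
        while (true) {
            String command = incoming.take();   // blocks instead of polling
            for (Client client : clients) {
                client.send(command);           // broadcast to everyone else
            }
        }
    }
}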