What is the recommended way to measure time intervals for a game loop?
Consider the situation in which a developer is writing their own game loop. Using a third-party game engine that provides the game loop for you is beyond the scope of this question.
In general I want the best API from the standard library, unless there is a reason not to use it. The exception being well-established game development libraries that do not force a game loop on the developer.
Ideally the solution does not use system time. We want something unaffected by time zones, leap seconds, daylight saving, the user changing the system clock, etc… plus system time usually has poor resolution for games.
I also found Time a Function - Rosetta Code, but it uses system time in some cases.
This is a community wiki.
Answer
JavaScript
On the browser
For code running in the browser, I would strongly advise using requestAnimationFrame as the game loop. See availability.
Example:
function update(timestamp)
{
// ...
window.requestAnimationFrame(update);
}
window.requestAnimationFrame(update);
requestAnimationFrame takes a callback that will be called once per frame, matching the monitor refresh rate. The callback receives a double representing the milliseconds since the time origin.
The time origin is a standard time which is considered to be the beginning of the current document's lifetime. It's calculated like this:
- If the script's global object is a Window, the time origin is determined as follows:
- If the current Document is the first one loaded in the Window, the time origin is the time at which the browser context was created.
- If during the process of unloading the previous document which was loaded in the window, a confirmation dialog was displayed to let the user confirm whether or not to leave the previous page, the time origin is the time at which the user confirmed that navigating to the new page was acceptable.
- If neither of the above determines the time origin, then the time origin is the time at which the navigation responsible for creating the window's current Document took place.
- If the script's global object is a WorkerGlobalScope (that is, the script is running as a web worker), the time origin is the moment at which the worker was created.
- In all other cases, the time origin is undefined.
-- Mozilla
Aside from the game loop, you can get a milliseconds timestamp since the time origin by calling performance.now(). See availability.
Example:
var start = performance.now();
// ...
var elapsed = performance.now() - start;
I strongly recommend the talk Jake Archibald: In The Loop - JSConf.Asia 2018, which covers how the browser event loop works and when exactly requestAnimationFrame runs.
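If you need a per-frame delta time, you can derive it from the timestamp parameter. A minimal sketch (the previous bookkeeping and the update name are just illustrative):
let previous = null;

function update(timestamp)
{
    // On the first frame there is no previous timestamp, so report a zero delta
    const delta = previous === null ? 0 : timestamp - previous; // milliseconds
    previous = timestamp;

    // ... advance the simulation by `delta` ...

    window.requestAnimationFrame(update);
}
window.requestAnimationFrame(update);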
On Node.js
Node.js does not have requestAnimationFrame. Instead use setImmediate.
It can also be a good idea to use setTimeout with the time until the next tick. To do that effectively you need to measure time; thankfully performance.now continues to work in Node.js.
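A rough sketch of such a loop (the 60 ticks-per-second target and the tick name are placeholders, not anything Node.js mandates):
// On older Node.js versions you may need: const { performance } = require('perf_hooks');
const TICK_MS = 1000 / 60; // target tick length in milliseconds (assumed 60 Hz)

let previous = performance.now();

function tick()
{
    const now = performance.now();
    const delta = now - previous; // milliseconds since the previous tick
    previous = now;

    // ... update the simulation by `delta` ...

    // Schedule the next tick, compensating for how long this one took
    const spent = performance.now() - now;
    setTimeout(tick, Math.max(0, TICK_MS - spent));
}
tick();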
Java
In Java you want to use System.nanoTime:
long start = System.nanoTime();
// ...
long elapsed = System.nanoTime() - start;
I have seen claims that System.nanoTime is not thread safe. That is not true; it is thread safe. System.nanoTime delegates the request to the operating system, and apparently there were some platforms that had bugs.
Python
With Pygame
If you are using Pygame, you want to call clock.tick. It returns the milliseconds since the last call.
Note: It takes a desired frame rate as argument, and will insert delays if it has been called too soon for that frame rate.
In your game loop, you want to call it every iteration, passing the target frame rate:
clock = pygame.time.Clock()
while running:
delta = clock.tick(60)
# ...
To measure elapsed time you want to use get_ticks() instead:
Example:
start = pygame.time.get_ticks()
# ...
elapsed = pygame.time.get_ticks() - start
pygame.time.get_ticks returns milliseconds.
Note: Pygame uses SDL. In fact, pygame.time.get_ticks delegates to SDL_GetTicks, which returns milliseconds (Uint32) since SDL initialization.
Without Pygame
If you are not using Pygame, use time.perf_counter(). It returns a float that represents time in (fractional) seconds.
Example:
start = time.perf_counter()
# ...
elapsed = time.perf_counter() - start
C
C on POSIX
Use clock_gettime. You will need the time.h header. It takes a clock id, which can be CLOCK_REALTIME, CLOCK_MONOTONIC, CLOCK_PROCESS_CPUTIME_ID, or CLOCK_THREAD_CPUTIME_ID, and a pointer to a timespec:
struct timespec {
time_t tv_sec; /* seconds */
long tv_nsec; /* nanoseconds */
};
The following is example usage:
struct timespec start, end;
double elapsed_sec;
clock_gettime(CLOCK_MONOTONIC, &start);
// ...
clock_gettime(CLOCK_MONOTONIC, &end);
elapsed_sec = (end.tv_sec - start.tv_sec) + (end.tv_nsec - start.tv_nsec) / 1000000000.0;
Note: clock_gettime returns 0 on success. On failure it returns -1 and sets errno to EFAULT (invalid pointer) or EINVAL (unsupported clock id).
C on Windows
We will use QueryPerformanceCounter and QueryPerformanceFrequency:
// Once
LARGE_INTEGER frequency;
BOOL available = QueryPerformanceFrequency(&frequency);
LARGE_INTEGER start;
QueryPerformanceCounter(&start);
// ...
LARGE_INTEGER end;
QueryPerformanceCounter(&end);
double elapsed_sec = (double)((end.QuadPart - start.QuadPart)/(double)frequency.QuadPart);
If available is false, you can fall back to GetTickCount, which gives you milliseconds.
The answer to the question "Best way to get elapsed time in miliseconds in windows" has a nice wrapper.
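A minimal sketch of that fallback; here I use GetTickCount64 (an assumption on my part, it requires Windows Vista or later) to avoid the roughly 49-day wrap-around of GetTickCount:
#include <windows.h>

ULONGLONG start_ms = GetTickCount64();
// ...
ULONGLONG end_ms = GetTickCount64();
double elapsed_sec = (double)(end_ms - start_ms) / 1000.0;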
C on OSX, Objective-C
We will use mach_continuous_time from mach_time.h.
// once
mach_timebase_info_data_t timeBase;
mach_timebase_info(&timeBase);
// unit conversion for nanoseconds
double timeConvert = (double)timeBase.numer / (double)timeBase.denom;
double start = (double)mach_continuous_time() * timeConvert;
//...
double elapsed = ((double)mach_continuous_time() * timeConvert) - start;
Note: mach_timebase_info can fail. It should return KERN_SUCCESS; otherwise, you would have to fall back to system time.
C++
Use std::chrono::high_resolution_clock::now. You will need the chrono header.
Example:
using namespace std::chrono;

high_resolution_clock::time_point start = high_resolution_clock::now();
// ...
high_resolution_clock::time_point end = high_resolution_clock::now();
double elapsed_sec = duration<double>(end - start).count();
See also duration_cast.
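For instance, a small sketch of converting the same measurement to whole milliseconds with duration_cast:
#include <chrono>

auto start = std::chrono::high_resolution_clock::now();
// ...
auto end = std::chrono::high_resolution_clock::now();
// duration_cast truncates towards zero to the target unit (milliseconds here)
auto elapsed_ms = std::chrono::duration_cast<std::chrono::milliseconds>(end - start).count();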
SDL
If you are using SDL, you can use SDL_GetPerformanceCounter and SDL_GetPerformanceFrequency. Example:
// Once:
uint64_t PerfCountFrequency = SDL_GetPerformanceFrequency();
// ...
uint64_t start = SDL_GetPerformanceCounter();
// ...
uint64_t end = SDL_GetPerformanceCounter();
double elapsed_sec = (double)((end - start) / (double)PerfCountFrequency);
Note: This method would be equivalent to SDL_GetTicks when no better timer is available.
PHP
Use hrtime. When called with true as parameter, it returns nanoseconds, as an int or a float depending on the platform.
Example:
$start = hrtime(true);
// ...
$end = hrtime(true);
$elapsed = $end - $start;
hrtime is stable across requests and is not susceptible to changes in system time.
Before PHP 7.3.0
You want microtime. It uses Unix time and will return a float in seconds if you pass true as argument.
Example:
$start = microtime(true);
// ...
$elapsed = microtime(true) - $start;
microtime is stable across requests. It is based on system time.
.NET (C#, VB.NET, etc...)
On Mono or .NET Framework, you want to use System.Windows.Forms.Application.Idle
for your game loop. It has also been added to .NET Core 3.0.
And for the elapsed time, use a Stopwatch.
Example:
var stopWatch = new Stopwatch();
stopWatch.Start();
// ...
var timeSpan = stopWatch.Elapsed;
Stopwatch will use high resolution timers if available. Otherwise, it falls back to system time.
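A minimal sketch of wiring Application.Idle and Stopwatch together (MainForm, Update, and Render are placeholder names; this assumes a Windows Forms message loop):
// Assumes: using System.Diagnostics; using System.Windows.Forms;
var stopWatch = Stopwatch.StartNew();

Application.Idle += (sender, e) =>
{
    // Called whenever the message queue is empty
    var elapsed = stopWatch.Elapsed;
    stopWatch.Restart();
    // Update(elapsed);
    // Render();
};

Application.Run(new MainForm());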
Processing
In Processing you can use the millis function, which, as the name suggests, returns milliseconds, in this case since the application started.
Example:
int start = millis();
// ...
int elapsed = millis() - start;
millis is also available in the ports of Processing to Python (millis) and JavaScript (millis).
Ruby
You want Process.clock_gettime.
Example:
starting = Process.clock_gettime(Process::CLOCK_MONOTONIC)
# ...
ending = Process.clock_gettime(Process::CLOCK_MONOTONIC)
elapsed = ending - starting
Note: If you want to measure the time code takes to run, use Benchmark.
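For example, a small sketch using Benchmark from the standard library:
require 'benchmark'

time = Benchmark.measure do
  # ... code to measure ...
end
puts time.real # wall-clock seconds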
Swift
As in Objective-C/C on OSX, we will use the Mach timer, in this case mach_absolute_time:
var info = mach_timebase_info()
mach_timebase_info(&info)
let start = mach_absolute_time()
// ...
let end = mach_absolute_time()
let elapsed_nanoseconds = (end - start) * UInt64(info.numer) / UInt64(info.denom)
Note: mach_timebase_info can fail. It should return KERN_SUCCESS; otherwise, you would have to fall back to system time.
This is a community wiki. Feel free to edit it to add or fix what is in here.