Understanding Delta time in LWJGL

I've started looking into lwjgl and I'm particularly having trouble understanding how Delta works. I have browsed other questions and websites related to this but it is still a confusing topic to wrap my head around. It would be great if someone here can help me out so please bear with me.
I understand that the delta time at 60fps would be about 16 (milliseconds), and around double that if the frame rate is 30. I don't understand how this is calculated. Is it the time it takes between frames? Sorry for the noobish question.
private long getTime() {
    return (Sys.getTime() * 1000) / Sys.getTimerResolution();
}

private int getDelta() {
    long currentTime = getTime();
    int delta = (int) (currentTime - lastTime);
    lastTime = getTime();
    return delta;
}

As opiop65 already said, the delta time is simply the time spent between your last frame's beginning and your current frame's beginning.
How does it work?
Delta time can be measured in any unit: nanoseconds, milliseconds (the most common choice) or seconds. As you said, delta time is about 16 when the game is running at 60 FPS and about 33 when it runs at 30 FPS. As for why, it's simple: in order to run at 60 frames per second, a game has to produce a frame every 1000/60 (= 16.666667) milliseconds, but if it's running at 30 frames per second, it has to produce a frame every 1000/30 (= 33.333333) milliseconds.
But why do we use delta time?
We use delta time because we want movement and all sorts of other things to be time-dependent, not frame-dependent. Let's say you want one of your game's characters to move 1 unit horizontally per second. How do you do that? Obviously, you can't just add 1 to the character's X position every update, because then it would move x units per second, where x equals your FPS (assuming you update the character every frame). That would mean someone running the game at 1 FPS sees the character move 1 unit per second, while someone running it at 5000 FPS sees it move 5000 units per second. Of course that is unacceptable.
One could say: move the character 1/16.6667 units on every update. But then again, the person at 1 FPS moves only 1/16.6667 units per second, while the one running at 5000 FPS moves 5000 * (1/16.6667) units per second.
Yes, you can enable V-Sync but what if somebody has a 120Hz monitor (or even higher) and not 60Hz?
Yes, you can lock the framerate, but your players wouldn't be too happy about that. It also wouldn't stop the character from slowing down whenever the game drops below 60 FPS. So what now?
Delta time to the rescue!
All you have to do is move your character 1 * delta units on every update.
Delta time is low when the game runs at a high FPS and high when it runs at a low FPS. So a character moved by 1 * delta takes smaller steps more frequently at high FPS, and larger steps less frequently at low FPS; in the end, both cover the same distance over the same time.
Please note that it does matter what unit you use when multiplying by the delta time:
If you use millis, then at 60 FPS your delta would be 16.6667, giving 1 * 16.6667 = 16.6667 units of movement every frame. If you measure your delta time in seconds instead, then at 60 FPS your delta would be 0.016667, meaning your character moves 0.016667 units every frame.
This is not something you should worry about, just keep it in mind.
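Putting the above into code: a minimal sketch in plain Java (not LWJGL; System.nanoTime() stands in here for Sys.getTime(), and the class and method names are made up for illustration) of a character that covers 1 unit per second regardless of frame rate.

```java
public class DeltaMovement {
    static final double SPEED = 1.0; // units per second, as in the example above

    private long lastTime = System.nanoTime();
    private double x = 0.0;          // character position in units

    // Seconds elapsed since the previous call: this is the "delta".
    double getDeltaSeconds() {
        long now = System.nanoTime();
        double delta = (now - lastTime) / 1_000_000_000.0;
        lastTime = now;
        return delta;
    }

    // One update step: scaling the movement by the delta means the
    // character moves SPEED units per second at any frame rate.
    void update(double deltaSeconds) {
        x += SPEED * deltaSeconds;
    }

    double position() { return x; }
}
```

Whether you simulate 60 frames of 1/60 s or 30 frames of 1/30 s, position() ends up at the same value, which is exactly the point of multiplying by delta.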

Delta time is simply the time it takes for one frame to end and the next one to be displayed on the screen. It's basically the time between frames, as you put it. From Google:
Mathematics. an incremental change in a variable.
Let's pick apart your code.
return (Sys.getTime() * 1000) / Sys.getTimerResolution();
This line returns the current time in milliseconds: Sys.getTime() returns ticks, and Sys.getTimerResolution() is the number of ticks per second, so multiplying by 1000 before dividing converts ticks to milliseconds.
long currentTime = getTime();
int delta = (int)(currentTime - lastTime);
lastTime = getTime();
return delta;
The first line simply gets the current time. The second line then calculates delta by subtracting the lastTime variable (the time when the last frame was displayed) from currentTime (the time when the current frame is displayed). Then lastTime is set to the current time, ready for the next frame. It's really simple when you think about it: it's just the change in time between frames.
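The same two functions in plain Java, with System.nanoTime() (an assumption on my part; it is not part of LWJGL) standing in for Sys.getTime() and Sys.getTimerResolution(). One small tweak: lastTime is set from currentTime instead of calling getTime() a second time, so the microseconds that pass between the two calls aren't silently dropped.

```java
public class Timer {
    private long lastTime = getTime();

    // Current time in milliseconds. nanoTime has no fixed epoch, but
    // only differences matter here, exactly as with Sys.getTime().
    private long getTime() {
        return System.nanoTime() / 1_000_000L;
    }

    // Milliseconds since the previous call.
    public int getDelta() {
        long currentTime = getTime();
        int delta = (int) (currentTime - lastTime);
        lastTime = currentTime; // reuse currentTime rather than re-reading the clock
        return delta;
    }
}
```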

Related

Recording history for a track in a video

This is a bit of a logical question.
I am tracking an object in a video running at N fps. In a practical system the frames aren't spaced exactly evenly, and there may even be frame drops.
-> I am also provided with a timestamp for every frame.
-> I start my track at frame X and end it at frame Y.
I have divided my video into grids (spatially), and at each instance I place the object into one of the grids.
Now the simplest case of this issue is as follows:
Suppose the object was identified in just one frame; what should be the duration for that track?
Options:
1. Exclusive duration computation
Duration = end time - start time = 0 for this instance
2. Inclusive duration computation
Duration = end time - start time + 1 = 1* for this instance
*let us assume we have the information in milliseconds for now
3. Add the frame gap. Since we know the FPS, we can compute 1/N to be the frame gap.
Duration = end time - start time + 1/N
4. Add the average time spent in a grid, by computing the speed of the person. I am not sure how to compute this, since it depends on the previous three definitions of duration.
Any other metric I can use?
Thank you
If an object appears at frame X and disappears at frame X+1, the end time should be the time of frame X+1, not frame X. The duration is then computed as end time - start time, which in this case equals one frame gap.
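The duration options above can be sketched as a hypothetical helper (the names and the millisecond convention are assumptions for illustration, not from the question):

```java
public class TrackDuration {
    // Option 1, exclusive: end - start; zero for a single-frame track.
    static double exclusiveMs(double startMs, double endMs) {
        return endMs - startMs;
    }

    // Option 2, inclusive: end - start + 1 ms, per the convention above.
    static double inclusiveMs(double startMs, double endMs) {
        return endMs - startMs + 1.0;
    }

    // Option 3, frame gap: end - start + 1000/N ms, so a single-frame
    // track lasts exactly one frame gap, as the answer suggests.
    static double withFrameGapMs(double startMs, double endMs, double fps) {
        return endMs - startMs + 1000.0 / fps;
    }
}
```

For a single-frame track at 25 fps, the three options give 0 ms, 1 ms, and 40 ms respectively.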

Is there any relation between advertising interval, walking speed, and window size of moving average filter?

My beacons have advertisement interval of 330ms. I use an iOS device to scan the advertisement packet whose scanning rate is 1 scan per second on average. I want to use the moving average filter to smooth the fluctuating RSSI values. Considering the walking speed of 1.2 m/s and the advertisement interval of 330 ms, what should be the size of a window in the moving average filter? Is there any mathematical relationship between them?
Thank you.
There is no one correct answer here. It is a trade-off between noise in the distance estimate and lag time.
The larger (and longer) your statistical sample, the more lag there will be in a running average. A 20-second window will tell you where you were on average over the last 20 seconds, and filter out a lot of noise. A 5-second running average will tell you where you were on average over the last 5 seconds, but with much more noise in the calculation.
How much lag you can tolerate and how much noise you can tolerate both depend on your use case. Use cases that are very time-sensitive may sacrifice accuracy for the sake of less lag. Conversely, use cases needing greater accuracy may accept more lag to filter out more noise in the estimate.
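For illustration, a simple moving-average filter over RSSI samples, sketched in Java rather than iOS code (the class is hypothetical). With roughly one scan per second, the window size is approximately the lag in seconds: a window of 20 gives the smooth-but-laggy 20-second behaviour described above, a window of 5 the noisier 5-second one.

```java
import java.util.ArrayDeque;

public class MovingAverage {
    private final int windowSize;
    private final ArrayDeque<Double> window = new ArrayDeque<>();
    private double sum = 0.0;

    public MovingAverage(int windowSize) {
        this.windowSize = windowSize;
    }

    // Add one RSSI sample and return the average of the last
    // windowSize samples (fewer, while the window is still filling).
    public double add(double rssi) {
        window.addLast(rssi);
        sum += rssi;
        if (window.size() > windowSize) {
            sum -= window.removeFirst();
        }
        return sum / window.size();
    }
}
```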

Animation speed decrease gradually till stop

I am trying to decrease a number toward zero slowly and steadily using C#.
Specifically, I have to stop an animated train gradually, not abruptly: the train should become slower and slower and then stop. I can stop the train, but the problem is that it stops with a little jerk.
what I have tried so far
ANIMATION_OBJECT.animation [ClipName].speed -= Time.deltaTime * Speed * .1f;
to lower the speed gradually, but it is not working.
Because
Time.deltaTime * Speed * .1f;
is not the same value each time it is calculated, the deceleration is uneven and the train jerks. You should decrease the speed by an exact, fixed amount each update, clamped at a minimum, like:
ANIMATION_OBJECT.animation [ClipName].speed -= 0.05f;
And why don't you decrease its speed instead?
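A sketch of the fixed-decrement idea, in plain Java rather than Unity C# (TrainBrake and STEP are made-up names): subtract the same constant every update and clamp at zero, so the speed decreases evenly and lands on exactly 0 instead of overshooting into negative values.

```java
public class TrainBrake {
    double speed = 1.0;              // current animation playback speed
    static final double STEP = 0.05; // constant decrement per update

    // One update step: decrease by a fixed amount, never below zero.
    double update() {
        speed = Math.max(0.0, speed - STEP);
        return speed;
    }
}
```

With a fixed step the slowdown per update is identical every frame; the frame-dependent `Time.deltaTime * Speed * .1f` decrement varies from frame to frame, which is the jerk the answer describes.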

Changing the playback speed of an animation entity

I have an entity that plays an animation that runs in my world at speed s = 1.
Now starting with a specific time interval in my world it is possible for the animation to slow down, which means it plays at a speed s where: 0 < s < 1.
This time interval is defined by the start time ta and the end time tb.
So when the time in my world reaches ta, the animation's speed is reduced so that it plays slower (like a slow-motion effect) while everything else continues at its usual speed.
Then, somewhere in the interval between ta and tb, the animation stops playing slowly and instead plays faster, s > 1, so that when the time reaches tb it has caught up with the rest of the world.
My question is now: how fast does the speed after the slowdown have to be so that the animation catches up exactly? Given:
the time interval ta, tb
the speed factor by which the animation is slowed once ta is reached
the time between ta and tb at which the slow effect stops and the fast effect should start.
I hope the question is understandable; if not, please let me know. As an example, imagine a machine that throws a ball in an arc, then moves along the floor at a constant speed and catches the ball. In my case it first moves at a slower speed, but after a certain time it has to increase its speed so that it can still catch the ball. What is that speed?
Let's say tc is the point where the animation should start playing faster and ss is the slow animation speed.
The formula for the new animation speed should be:
sfast = ( (tb - ta) - ((tc - ta) * ss) ) / (tb - tc)
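A numeric check of this formula (plain Java, names made up): the animation content that must play during [ta, tb] is tb - ta at speed 1; the slow phase plays (tc - ta) * ss of it, so the fast phase must cover the remainder in the remaining tb - tc of world time.

```java
public class CatchUpSpeed {
    // sfast = ((tb - ta) - (tc - ta) * ss) / (tb - tc)
    static double fastSpeed(double ta, double tb, double tc, double ss) {
        return ((tb - ta) - (tc - ta) * ss) / (tb - tc);
    }
}
```

For example, with ta = 0, tb = 10, tc = 4 and ss = 0.5: the slow phase plays 4 * 0.5 = 2 units of content, leaving 8 units for the last 6 seconds, so sfast = 8/6 ≈ 1.33, and 2 + 6 * sfast = 10, exactly the content of the full interval.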

Bitrate Calculation

(Excuse my English, I'm from LA.)
I'm trying to finish a transcoding process in VB 6.0, working with ffmpeg, which is a very good transcoder. To finish the project I want a progress bar for the transcoding process, but it is very hard: first I need to understand how a program can calculate the time remaining for the process, given these inputs:
Average bitrate
Frame rate
Start file size
I'm trying: file size (KB) / average bitrate (Kb/s).
In theory this should work, but the calculated time is much smaller than the real processing time. Does anybody have an idea what the formula (snippet) is to calculate the time remaining in a transcoding process? On this wonderful site I've found many answers for my projects.
The bitrate won't help you in calculating progress.
If you have the file length in seconds and the frame rate, and ffmpeg outputs which frame it's processing right now, you can calculate the approximate time.
The general solution for "time remaining," given:
A number total_units that represents the size, number of units, etc. to be processed
A number units_processed that represents how many units have been processed so far
A timestamp start_time for when the operation started
is:
seconds_elapsed = current time - start_time
seconds_per_unit = seconds_elapsed / units_processed
units_left = total_units - units_processed
seconds_remaining = units_left * seconds_per_unit
This algorithm does best when the times to process each unit are nearly the same, or at least when the time/unit has little correlation with elapsed time. It stinks on ice if time/unit varies with elapsed time.
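Those steps as a hypothetical Java helper (note the last step: seconds remaining is units left multiplied by seconds per unit):

```java
public class Progress {
    // Estimate seconds remaining from progress so far, assuming each
    // unit takes roughly the same time to process.
    static double secondsRemaining(double totalUnits, double unitsProcessed,
                                   double secondsElapsed) {
        double secondsPerUnit = secondsElapsed / unitsProcessed;
        double unitsLeft = totalUnits - unitsProcessed;
        return unitsLeft * secondsPerUnit;
    }
}
```

For ffmpeg, the "unit" would naturally be a frame: total frames comes from duration times frame rate, and units processed is the frame number ffmpeg reports.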
