After streaming a still image (with x264) for a long period of time, the transition to live video makes the CPU spike to 100% for a period roughly proportional to how long the still image was streamed. More specifically, transitioning after one minute results in a CPU spike lasting about 15 seconds, while transitioning after 30 minutes results in a spike lasting closer to 3 minutes.
Does this symptom make any sense and is there anything I can do about it?
I have been stuck on this for the past day. I'm not sure how to calculate the CPU utilization percentage for processes scheduled with the round-robin algorithm.
Let's say we have the following data with a time quantum of 1: each job letter is followed by its arrival time and burst time. How would I go about calculating the CPU utilization? I believe the formula is
total burst time / (total burst time + idle time). I know idle time means the time when the CPU is not busy, but I'm not sure how to actually calculate it from the processes. If anyone can walk me through it, it would be greatly appreciated.
Job  Arrival  Burst
A    2        6
B    3        1
C    5        9
D    6        7
E    7        10
Well, the formula is correct, but in order to know the total time you need to know the CPU's idle time. And when does the CPU become idle? It is idle during a context switch, and how long that takes depends on how quickly the short-term scheduler can assign the next process to the CPU.
With time quanta of 10-100 milliseconds, the context-switch time is around 10 microseconds, which is a very small factor; from that you can guess what it costs with a time quantum of 1 millisecond. Each individual switch is negligible, but such a small quantum also results in many more context switches.
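If it helps to see the numbers, here is a minimal round-robin simulation in Java for the jobs above (quantum 1). It is only a sketch under two assumptions the thread does not state: time starts at t = 0 and context-switch overhead is ignored, so the CPU is idle only before the first arrival at t = 2. It counts busy and idle ticks and prints 33 / (33 + 2) ≈ 94.3% utilization; the class and variable names are just for illustration.

import java.util.ArrayDeque;
import java.util.Queue;

public class RoundRobinUtilization {
    public static void main(String[] args) {
        int[] arrival   = {2, 3, 5, 6, 7};   // jobs A, B, C, D, E
        int[] remaining = {6, 1, 9, 7, 10};  // burst times

        Queue<Integer> ready = new ArrayDeque<>();
        boolean[] queued = new boolean[arrival.length];
        int time = 0, busy = 0, idle = 0, finished = 0;

        while (finished < arrival.length) {
            // Admit every job that has arrived by the current time.
            for (int i = 0; i < arrival.length; i++) {
                if (!queued[i] && arrival[i] <= time) {
                    ready.add(i);
                    queued[i] = true;
                }
            }
            if (ready.isEmpty()) {
                idle++;            // ready queue empty: CPU is idle this tick
                time++;
                continue;
            }
            int job = ready.poll();
            remaining[job]--;      // run one quantum
            busy++;
            time++;
            // Admit jobs that arrived during this quantum before re-queueing.
            for (int i = 0; i < arrival.length; i++) {
                if (!queued[i] && arrival[i] <= time) {
                    ready.add(i);
                    queued[i] = true;
                }
            }
            if (remaining[job] > 0) {
                ready.add(job);    // not finished: back of the queue
            } else {
                finished++;
            }
        }
        // busy = 33, idle = 2, so utilization = 33 / 35 ≈ 94.3%
        System.out.printf("busy=%d idle=%d utilization=%.1f%%%n",
                busy, idle, 100.0 * busy / (busy + idle));
    }
}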
Is there some type of conversion for ticks to a unit of real time? I would like my program to simulate a 48 hour experiment. Does anyone have any suggestions on how to do this? Thanks!
How quickly do things change in your experiment? From system dynamics, a good rule of thumb is to have a discrete clock tick 4 times during the smallest interval of real time in which something meaningful happens in the modelled system. For example, if you would expect to see changes every minute, then you would have 4 ticks each minute (and your ABM rules about updating the system would be calculated on the basis of 15 seconds) and then run the simulation for 11,520 (=48x60x4) ticks.
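If it helps, the same arithmetic as a tiny Java sketch (the class name and constants are hypothetical, just to show the conversion):

public class TickClock {
    // Rule of thumb: 4 ticks per smallest interval in which something meaningful happens.
    static final int TICKS_PER_INTERVAL = 4;
    static final double INTERVAL_SECONDS = 60.0; // e.g. changes happen about every minute

    static long ticksFor(double simulatedSeconds) {
        return Math.round(simulatedSeconds / INTERVAL_SECONDS * TICKS_PER_INTERVAL);
    }

    public static void main(String[] args) {
        // A 48-hour experiment: 48 * 60 * 4 = 11,520 ticks
        System.out.println(ticksFor(48 * 60 * 60));
    }
}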
I've started looking into lwjgl and I'm particularly having trouble understanding how Delta works. I have browsed other questions and websites related to this but it is still a confusing topic to wrap my head around. It would be great if someone here can help me out so please bear with me.
I understand that the Delta time for 60fps would be 16, around double that if the frame-rate is 30. I don't understand how this is calculated. Is it the time it takes between frames? Sorry for the noobish question.
private long getTime() {
    // Raw timer ticks * 1000 / (ticks per second) = current time in milliseconds.
    return (Sys.getTime() * 1000) / Sys.getTimerResolution();
}
private int getDelta() {
    long currentTime = getTime();
    // Milliseconds elapsed since the previous frame.
    int delta = (int) (currentTime - lastTime);
    lastTime = getTime();
    return delta;
}
As opiop65 already said, the delta time is simply the time between the beginning of your last frame and the beginning of your current frame.
How does it work?
Delta time can be in any kind of unit: nanoseconds, milliseconds (usually the standard) or seconds. As you said, delta time is 16 when the game is running at 60FPS and 32 when it runs at 30FPS. As for the why, it's simple: in order for a game to run at 60 frames per second it has to produce a frame every 1000/60 (= 16.666667) milliseconds, but if it is running at 30 frames per second it has to produce a frame every 1000/30 (= 33.333333) milliseconds.
But why do we use delta time?
We use delta time because we want movement and all sorts of other things to be time-dependent, not frame-dependent. Let's say that you want one of your game's characters to move 1 unit horizontally per second. How do you do that? Obviously, you can't just add 1 to the character's X position, because it would get moved 1*x times per second, where x is equal to your FPS (assuming you update the character every frame). That would mean that somebody running the game at 1 FPS would see the character move 1 unit per second, while somebody running at 5000 FPS would see it move 5000 units per second. Of course that is unacceptable.
One could say: move the character 1/16.6667 units on every update. But then again, somebody at 1 FPS moves 1/16.6667 units per second, as opposed to the player at 5000 FPS, who moves 5000*(1/16.6667) units per second.
Yes, you can enable V-Sync but what if somebody has a 120Hz monitor (or even higher) and not 60Hz?
Yes, you can lock the framerate but your players wouldn't be too happy about that. Also that wouldn't stop the character from slowing down when the game drops below 60FPS. So what now?
Delta time to the rescue!
All you have to do is move your character 1*delta on every update.
Delta time is low if the game runs at a high FPS and high if it runs at a low FPS. A player at a high FPS therefore moves in smaller steps, but more frequently, while a player at a low FPS moves in larger steps, less frequently; in the end both cover the same distance over the same time.
Please note that it does matter what unit you use when multiplying with the delta time:
If you use millis, then at 60FPS your delta would be 16.6667, ending up with 1*16.6667 = 16.6667 units of movement every frame. However, if you measure your delta time in seconds, then at 60FPS your delta would be 0.016667, meaning your character would move 0.016667 units every frame.
This is not something you should worry about, just keep it in mind.
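As a concrete sketch (not taken from the question's code), here is what "move 1*delta per update" looks like when delta is in milliseconds, as returned by getDelta() above, and the intended speed is 1 unit per second; the field names are made up for illustration:

private float x = 0f;                    // character's horizontal position
private static final float SPEED = 1f;   // intended speed, in units per second

public void update(int delta) {
    // (units per second) * (elapsed milliseconds / 1000) = units to move this frame.
    x += SPEED * (delta / 1000f);
}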
Delta time is simply the time it takes for one frame to "dispose" of itself and the next one to display on the screen. It's basically the time between frames, as you put it. From Google:
Mathematics. an incremental change in a variable.
Let's pick apart your code.
return (Sys.getTime() * 1000) / Sys.getTimerResolution();
This line returns the current time in milliseconds: Sys.getTime() is in raw timer ticks, and multiplying by 1000 before dividing by Sys.getTimerResolution() (ticks per second) converts it to milliseconds.
long currentTime = getTime();
int delta = (int)(currentTime - lastTime);
lastTime = getTime();
return delta;
The first line simply gets the current time. The second line then calculates delta by subtracting lastTime (the time when the last frame was displayed) from currentTime (the time when the current frame was displayed). Then lastTime is set to the current time, which is when the current frame is displayed. It's really simple when you think about it: it's just the change in time between frames.
Does anyone know how to avoid Windows 7 sometimes pausing for 300-600ms, even freezing SystemTime and MultimediaTimer? If you measure the time before and after such a pause with those, they report 0ms elapsed, while PerformanceCounter does measure the pause correctly. CPU load is pretty low (10%). The system uses a new MLC SSD - do these still have stutter issues?
I found this behaviour by measuring timestamps from a camera grabbing at 6 frames per second. I logged when images came in, and the intervals in the grab log looked fine, until I added a warning for intervals more than 20% too fast or 20% too slow. Then I sometimes (once per hour, sometimes only after 4 hours) got 300-600ms warnings, followed by some "too fast" ones (the image buffer suddenly delivers, in a burst, the images that built up during the 300-600ms pause). However, the timestamps in the log entries show that the system time wasn't updated during this time.
Log timestamps are given by GetLocalTime(LPSYSTEMTIME), and the time between grabbed images is measured with PerformanceCounter. When I use the multimedia timer to measure the time between new images, its duration matches what you get by subtracting the times in the log. That is why I found it odd that it reported extra images only 0-30ms apart.
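For reference, the cross-check I am describing looks roughly like this; it is sketched in Java rather than the original Win32 calls, with currentTimeMillis() standing in for the wall-clock/multimedia-timer side and nanoTime() for the performance counter:

public class FrameIntervalWatch {
    private static final long EXPECTED_MS = 166;   // ideal time between images

    private long lastWallMs = System.currentTimeMillis();
    private long lastMonoNs = System.nanoTime();

    public void onNewImage() {
        long wallMs = System.currentTimeMillis();
        long monoNs = System.nanoTime();
        long wallDelta = wallMs - lastWallMs;                 // wall-clock interval
        long monoDelta = (monoNs - lastMonoNs) / 1_000_000;   // monotonic interval, ns -> ms
        // Warn when the monotonic interval is more than 20% off the expected value.
        if (monoDelta > EXPECTED_MS * 1.2 || monoDelta < EXPECTED_MS * 0.8) {
            System.out.printf("Warning: %dms by the counter, %dms by the clock%n",
                    monoDelta, wallDelta);
        }
        lastWallMs = wallMs;
        lastMonoNs = monoNs;
    }
}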
I tried all kinds of tweaks and driver updates for the network interface, and different cameras, with no luck.
166ms is the ideal time between images, but here is an example of "bursts" of missing time slots and the discrepancy between the system time and the performance counter:
[03:06:09:48:22:615]New Image
[03:06:09:48:22:781]New Image
[03:06:09:48:22:949]New Image
[03:06:09:48:22:974]New Image. Warning Time since last: 224ms
[03:06:09:48:23:083]New Image
[03:06:09:48:23:238]New Image. Warning Time since last: 454ms
[03:06:09:48:23:261]New Image. Warning Time since last: 224ms
[03:06:09:48:23:415]New Image. Warning Time since last: 353ms
[03:06:09:48:23:551]New Image
[03:06:09:48:23:583]New Image. Warning Time since last: 330ms
[03:06:09:48:23:734]New Image. Warning Time since last: 451ms
[03:06:09:48:23:754]New Image. Warning Time since last: 119ms
[03:06:09:48:23:854]New Image
[03:06:09:48:24:020]New Image
[03:06:09:48:24:186]New Image
[03:06:09:48:24:354]New Image
[03:06:09:48:24:520]New Image
[03:06:09:48:24:686]New Image
So it all comes down to this question:
What phenomenon can cause the system time and the multimedia timer to lock up with the rest of the system, so the pause is masked in the timings while the performance counter still keeps time, and how can I fix it?
I fixed this by installing a new network driver and disabling hyper-threading, turbo boost and CPU P-states.
There are only two hard things in Computer Science: cache invalidation
and naming things.
-- Phil Karlton
My app is reporting CPU time, and people reasonably want to know how much time this is out of, so they can compute % CPU utilized. My question is, what's the name for the wall clock time times the number of CPUs?
If you add up the total user, system, idle, etc. time for a system, you get the total wall clock time, times the number of CPUs. What's a good name for that? According to Wikipedia, CPU time is:
CPU time (or CPU usage, process time) is the amount of time for which
a central processing unit (CPU) was used for processing instructions
of a computer program, as opposed to, for example, waiting for
input/output (I/O) operations.
"total time" suggests just wall clock time, and doesn't connote that over a 10 second span, a four-cpu system would have 40 seconds of "total time."
Total Wall Clock Time of all CPUs
Naming things is hard; why waste a good 'un once you've found it?
Aggregate time: 15 of 40 seconds.