To clarify, I mean time spent while the system is suspended/hibernated, not the calling thread (GetTickCount() returns the number of milliseconds since system boot).
As far as I know, GetTickCount is unrelated to threads and counts the time since the system started. But it is better to use GetTickCount64 to avoid the 49.7-day rollover.
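If you want to call GetTickCount64 from Python, ctypes works; the sketch below is my own illustration (not from the original post) and falls back to time.monotonic() on non-Windows systems so it stays runnable:

```python
import sys
import time

def tick_count_ms():
    """Milliseconds of uptime-style monotonic time.

    On Windows this calls GetTickCount64, which does not roll over
    at 49.7 days the way the 32-bit GetTickCount does; elsewhere it
    falls back to time.monotonic() as a rough portable analogue.
    """
    if sys.platform == "win32":
        from ctypes import windll, c_ulonglong
        GetTickCount64 = windll.kernel32.GetTickCount64
        GetTickCount64.restype = c_ulonglong
        return int(GetTickCount64())
    return int(time.monotonic() * 1000)

before = tick_count_ms()
time.sleep(0.05)
elapsed = tick_count_ms() - before
print(elapsed)  # roughly 50
```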
By the way, for per-thread timing you need the GetThreadTimes function. It records the creation and exit times and the amount of time the thread has spent in user and kernel mode, so you have a nice way to calculate the time spent.
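Wrapping GetThreadTimes itself needs FILETIME structures, but Python's time.thread_time() exposes the same idea portably (CPU time of the current thread only); a minimal sketch:

```python
import time

start = time.thread_time()  # CPU time consumed by the current thread only
total = 0
for i in range(200_000):
    total += i * i          # burn a little CPU so the counter moves
cpu_used = time.thread_time() - start
print(f"thread CPU time: {cpu_used:.4f} s")
```

Unlike GetTickCount, this counter only advances while the thread is actually running on a CPU, so sleeps and suspends do not contribute.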
Ok, I missed the "system" part of the question. But that is simple: when the machine is in hibernation, GetTickCount keeps counting, precisely because people have suffered from the 49.7-day bug on machines that spent most of their time in hibernate.
Short answer: yes.
Longer answer: read the GetTickCount() docs; it's the elapsed time since system startup, and even MS wouldn't suggest that time stands still while your computer is hibernating...
Yes, GetTickCount does include suspend/hibernate time.
In the following Python script I call the Sleep API to wait 40 seconds, giving me a chance to put the computer into hibernate mode, and I print the time before and after, plus the tick-count difference at the end.
import win32api
import time
print(time.strftime("%H:%M:%S", time.localtime()))
before = win32api.GetTickCount()
print("sleep")
win32api.Sleep(40000)
print(time.strftime("%H:%M:%S", time.localtime()))
print(win32api.GetTickCount() - before)
Output:
17:44:08
sleep
17:51:30
442297
If GetTickCount did not include time spent in hibernation, the difference would be much less than the time I hibernated for; instead it matches the actual time elapsed (7 minutes 22 seconds equals 442 seconds, i.e. 442,000 millisecond "ticks").
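The arithmetic in that last sentence can be checked directly:

```python
ms = 442297                           # tick-count difference from the run above
seconds, _ = divmod(ms, 1000)         # 442 whole seconds, remainder discarded
minutes, seconds = divmod(seconds, 60)
print(f"{minutes} min {seconds} s")   # 7 min 22 s
```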
For anyone looking for the answer on the Windows CE platform, the docs at
http://msdn.microsoft.com/en-us/library/ms885645.aspx
say:
For Release configurations, this function returns the number of
milliseconds since the device booted, excluding any time that the
system was suspended. GetTickCount starts at 0 on boot and then counts
up from there.
GetTickCount() gives you the time in milliseconds since the computer booted. It has nothing to do with the process calling it.
No, GetTickCount() does not include the time the system spent in hibernation.
A simple test proves this.
In Python:
import win32api
win32api.GetTickCount()
-- do hibernate --
win32api.GetTickCount()
and you'll see the result...
I know that you can use the time.Sleep(d Duration) function to make a Go program wait for a certain duration. Now I have the use case that I want to keep the point in time even when the whole OS goes to sleep.
So as an example: I put the Go program to sleep at 1pm for two hours, and I put my computer to sleep at 2pm for half an hour, until 2:30pm. I still want the code to resume at 3pm and not at 3:30pm.
(Of course, when my OS sleeps past 3pm, the code should resume immediately on wake-up.)
How should I do that in Go? Is there a nice function or library for that, or should I use cron as a workaround?
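One language-agnostic approach (sketched here in Python; the same pattern works in Go by polling time.Now() against a target time.Time) is to wait on a wall-clock deadline rather than for a duration, so that time spent with the OS suspended still counts:

```python
import time
from datetime import datetime, timedelta

def sleep_until(deadline, poll=1.0):
    """Wait until a wall-clock deadline instead of for a duration.

    Because the system clock keeps advancing while the OS is asleep,
    the first poll after wake-up sees the deadline passed and returns
    immediately. `poll` bounds how long a wake-up can be delayed.
    """
    while datetime.now() < deadline:
        remaining = (deadline - datetime.now()).total_seconds()
        time.sleep(min(poll, max(remaining, 0)))

# Demo with a short deadline; in the question's scenario it would be
# datetime(...) for 3pm, two hours ahead.
deadline = datetime.now() + timedelta(seconds=0.2)
sleep_until(deadline, poll=0.05)
print(datetime.now() >= deadline)  # True
```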
I'm looping 1000 times with a 1 ms delay and measuring the total time. Interestingly, the total comes to 15.6 seconds instead of 1. When I opened Google Chrome and surfed some websites, it ran correctly, with a 1-second total. It also ran fine on a MacBook.
I'm wondering what kind of solution I need to fix this problem. Please try running the code without Chrome open, and again with Chrome open, to see the difference. It ran normally whenever Quora, Reddit, or Stack Overflow was open on my system.
from timeit import default_timer as timer
import time
start = timer()
for i in range(1000):
    time.sleep(0.001)
end = timer()
print("Total time: ", end - start)
Edit: I didn't change anything in the Python code. I just opened up Chrome and browsed some websites, and that sped up the time delay.
Update: it's about the Windows timer resolution. Basically, Chrome changed the timer resolution from 15.6 ms to 1 ms. This article explains it very well: https://randomascii.wordpress.com/2013/07/08/windows-timer-resolution-megawatts-wasted/
I finally figured it out. Thanks a lot for the comments, which gave me the hints to solve it. To explain why this happened: Windows has a default timer resolution of 15.625 ms (64 Hz), which is good enough for most applications. However, for applications that need a very short sampling interval or time delay, 15.625 ms is not sufficient. So when I ran my program on its own, it was stuck at 15.6 s for 1000 iterations. But when Chrome was open, Chrome had requested a higher timer resolution of 1 ms instead of 15.6 ms, which made my program run as expected.
Therefore, to solve it, I needed to call a Windows function named timeBeginPeriod(period) to change the timer resolution. Fortunately, Python makes this easy via the ctypes library. The final code is below:
from time import perf_counter as timer
import time
from ctypes import windll #new
timeBeginPeriod = windll.winmm.timeBeginPeriod #new
timeBeginPeriod(1) #new
start = timer()
for i in range(1000):
    print(i)
    time.sleep(0.001)
end = timer()
print("Total time: ", end - start)
Warning: I've read that this high timer resolution can affect overall performance and battery life. I haven't seen anything happen yet, and CPU usage in the Windows Task Manager doesn't seem overwhelming either, but keep it in mind if your application starts behaving strangely.
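One detail the code above omits: timeBeginPeriod should be paired with a matching timeEndPeriod call so the system-wide resolution is restored when you are done. A hedged sketch of that pairing (my own addition; it falls back to a plain sleep off Windows so it stays runnable):

```python
import sys
import time

def sleep_precise(seconds):
    """Sleep with ~1 ms timer resolution on Windows.

    Wraps the sleep in timeBeginPeriod/timeEndPeriod with try/finally
    so the system-wide timer resolution is restored even if the sleep
    is interrupted; on other platforms it's just time.sleep.
    """
    if sys.platform == "win32":
        from ctypes import windll
        windll.winmm.timeBeginPeriod(1)
        try:
            time.sleep(seconds)
        finally:
            windll.winmm.timeEndPeriod(1)  # restore the default resolution
    else:
        time.sleep(seconds)

start = time.perf_counter()
sleep_precise(0.01)
elapsed = time.perf_counter() - start
print(f"slept {elapsed:.4f} s")
```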
Let's say I have a task to execute, and I want to know, according to the system clock, when it finished executing. Is there something in Java I can use to pull the current system time and display it? I'm not looking to measure elapsed time in millis or nanos (that would only tell me how many milliseconds passed, not the actual time or date), but to actually print the finish time (like "Finished at 9:42 P.M. 10/14/2012", matching my actual Windows clock).
A quick Google search finds this: http://www.mkyong.com/java/java-how-to-get-current-date-time-date-and-calender/
Just make sure you import java.util and java.text, and you should be good to go.
I have a VB6 program installed on thousands of machines, some XP and some Win7. A Win7 user has called to say that the time the program applies to its events is one hour earlier than the time on the laptop clock, which itself is correct. The machine is set to the correct time zone (Eastern) and to adjust for daylight saving time, which is the way my own Win7 machine is set up (and my own machine does not have this problem).
The line of code that obtains this time in VB6 is:
.IssueDate = Now
to put the current time and date into a member variable.
Does anyone have any ideas why a particular machine would be off by one hour, given that the clock is displaying the correct time and the time zone and DST adjustment appear correct?
I'm going to mark this one answered and move on. I asked my user, diffidently, to reboot, not really expecting it to do anything; he did, and said the test case he ran did not show the error. I asked him to call me the next time he used the system for its full-blown purpose, but he hasn't done so. My current suspicion is that the PC clock was off by an hour for a period this morning and he didn't notice it; he only noticed the time on the documents the application was producing.
I am running a VB .exe through automation. In the exe I have code that fetches data from a database and saves it to a file. The first time I ran the .exe it took 1 minute. To establish a baseline, I called the same .exe 5 times, one after the other, but it took nearly 10 minutes.
My question is: if it takes 1 minute to generate 1 report, it should take 5 minutes to generate 5 reports, so why is it taking 10 minutes (more than double)? Is there a problem with calling an exe repeatedly, one after the other?
No, there isn't a problem with calling an exe multiple times. However, there might be a problem with the data you're fetching or writing. Maybe you didn't close the database connection correctly when the program exited, and that causes a delay when you try to read from the database the second time.
But then again... without a more accurate description of what exactly your program does, and the relevant pieces of source code, it's very hard to be more specific...