What is the duration measured in XEvent Profiler?

I'm using XEvent Profiler against SQL Server 2016. Some articles say the duration is measured in microseconds or milliseconds depending on the SQL Server version.
How do I check whether it is microseconds or milliseconds?
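One way to check, assuming you can query the server, is to read the unit straight from the Extended Events metadata: the `sys.dm_xe_object_columns` DMV carries a description for each event column, and for the completed-batch/statement events that description states the unit of `duration`. A sketch (the event names here are the common ones; adjust to whichever event your session captures):

```sql
-- Look up what the engine itself says the duration column means.
SELECT c.object_name, c.name, c.description
FROM sys.dm_xe_object_columns AS c
WHERE c.object_name IN ('sql_batch_completed', 'sql_statement_completed')
  AND c.name = 'duration';
```

The description column of the result spells out the unit, so you don't have to rely on version-specific articles.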

Related

How to get equivalent CPU time-difference of GPU rendering time

How to convert the time difference given by the GPU timer while rendering into the equivalent CPU timing?
Let's say,
glGetQueryObjectuiv(query, GL_QUERY_RESULT, &elapsed_time) - will return the elapsed time for that query and I presume this elapsed time will correspond to GPU frequency.
How to get the corresponding CPU time which is equivalent to the GPU elapsed time?
It's a timer query - it returns a time in nanoseconds. Time doesn't change with frequency, so there is nothing to convert: the nanosecond value is already a wall-clock duration that the CPU side can use directly.

Jersey, Java 8, Performance, Slow URLConnectionClientHandler.getInputStream

We are currently facing a very strange problem in our enterprise application.
We are using
- Jersey 1.17
- JDK 1.8 (which we have recently migrated)
In our application we make REST HTTP calls to fetch data from different application servers. Speed/performance is normal until the load increases.
When we execute the method (CallXService in the trace below), which internally makes 10 to 15 REST calls to different applications, we capture two readings:
7 seconds - when there is almost no load.
28 seconds - when there is more load.
When we drill into the 28-second case, we get the following trace along with the captured times.
JerseyRestClient.CallXService:469(0 ms self time, 6143 ms total time)
WebResource.post:251(0 ms self time, 6143 ms total time)
WebResource.handle:680(0 ms self time, 6143 ms total time)
Client.handle:648(0 ms self time, 6143 ms total time)
URLConnectionClientHandler.handle:149(0 ms self time, 6143 ms total time)
URLConnectionClientHandler._invoke:249(10 ms self time, 6143 ms total time)
URLConnectionClientHandler.getInputStream:310(6091 ms self time, 6091 ms total time)
Note the time taken by getInputStream here, whereas in the 7-second case getInputStream takes very little time.
What does that time indicate?
1) A slow-responding server
2) Network speed while fetching the resource
3) Or a problem with Java 8 and Jersey 1.17?
Any help is much appreciated. Thanks!

Is QueryPerformanceCounter counter process specific?

https://msdn.microsoft.com/en-us/library/windows/desktop/dn553408(v=vs.85).aspx
https://msdn.microsoft.com/en-us/library/ms644904(VS.85).aspx
Imagine that I measure some part of my code (20 ms).
A context switch happened, and my thread was displaced by another thread which ran for 20 ms.
Then I receive a quantum of time back from the scheduler and perform some calculations for 1 ms.
If I calculate the elapsed time, what will I receive: 41 ms or 21 ms?
QueryPerformanceCounter reports wall-clock time, not per-process or per-thread CPU time. So the answer will be 41 ms.

Delphi timer more precise than milliseconds

I've got a program in Delphi which takes in frames from an external application at 25 hertz (25 frames per second) and converts them to 60 hertz (60 frames per second) by creating 1-2 extra frames. I need to output these extra frames by continuously building a frame buffer and emitting frames from it on a separate thread. The problem is that 1000/60 is 16.66667, which means I can't just send the frames at an interval of 16 or 17 milliseconds; I need it to be more precise. How do I do this in Delphi/Windows?
Use a multimedia timer via the Win32 API timeSetEvent() or CreateTimerQueueTimer() function.
You probably need to make use of both of the following:
A high-resolution timer. This is available through the QueryPerformanceCounter Win32 function, and is also wrapped by the Delphi TStopwatch type.
A waitable timer. This allows you to set a due time in the future and have your thread block, and be woken at that due time.
Both of these have higher resolution than the GUI timer, and should suffice for your needs. Read an overview here: http://msdn.microsoft.com/en-gb/library/windows/desktop/ms644900.aspx

High Resolution GetMessageTime()?

The Win32 API has a function GetMessageTime() that returns the time a message was generated, in system time, which has a resolution of 10 to 16 ms. Is there an effective way to get the time an event occurred in interrupt time (100 ns precision), or in some other format with at least 1 ms precision?
Even with Raw Input, the message timing isn't delivered with < 10 ms resolution. (The timing data comes from the WM_INPUT message.) As far as I can tell from the keyboard driver sources, the timing data simply isn't collected with < 10 ms resolution.
