Visual Studio PerfTips elapsed time is different from the time from Stopwatch

I use Visual Studio 2017 to debug my code and leverage PerfTips to get the rough elapsed time of a function call.
But I just found a big difference between the PerfTips time and the time from Stopwatch.
Example:
var sw = new Stopwatch();
sw.Start();
MyFunction();
sw.Stop();
I set breakpoints before and after the MyFunction() call; PerfTips shows the elapsed time of the MyFunction() call as around 260 ms.
But the sw.Elapsed.TotalMilliseconds value is > 1000 ms. Why is there such a big difference?
Is anything wrong with my Stopwatch usage or with PerfTips?
BTW: I check the stopwatch value in the debugger, that is, I set a breakpoint on sw.Stop() and read the value in the debugger window. Is that an incorrect way to get an accurate Stopwatch value?

Related

Code for a function that just returns after an hour - interview question

I saw this question online from an interview:
Suppose you have this code:
void myFunction() {
    int time = clcTime();
    while (clcTime() - time != 3600);
}
where clcTime() is a method that returns the number of seconds that have passed since 00:00 of the current day.
(1) Find out what this code snippet does.
(2) A QA tester said this code fails in a specific case. What is that case, and how can you solve it?
(3) Another QA tester said that during the day this code worked fine, but when he went to sleep something went wrong. What could the problem be, and how can you solve it?
My attempt:
For (1), I think this function is just supposed to run in a loop for an hour.
For (3), I think the problem occurs when the time variable gets its value while the current time of day is in the range [23:00:00, 23:59:59]. In that case, the value of time will be in the range [23*3600, 23*3600 + 3599], and because clcTime() resets at midnight it can never return a matching value in the range [24*3600, 24*3600 + 3599]. So in that case the condition clcTime() - time != 3600 will never become false and we get an infinite loop.
My suggestion for solving it is to replace the while line with these lines:
int measure = clcTime() - time;
int measureModulo = measure % 3600;
while (measure == 0 || measureModulo != 0) {
    measure = clcTime() - time;
    measureModulo = measure % 3600;
}
The only problem I still have is that I can't figure out (2); I can't find any other problem with this code.
Do you have any idea what else could be problematic with this code?
Also, please feel free to correct me if I was wrong in what I wrote for (1) and (3).
Another problem with this code, and with your fix, is that it checks clcTime() for an exactly matching value. If the system is busy and the loop doesn't get a chance to run for more than a second, it will miss the matching second and continue waiting for at least another hour.
There will also be problems when the user changes the system clock or the system time zone, when daylight saving time comes into or goes out of effect, when the clock is automatically adjusted for leap seconds, and so on.
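A minimal sketch of one way to avoid the exact-match problem, using a stand-in for the clcTime() helper from the question: compare the elapsed time with >= instead of checking for an exact value, and handle the midnight wrap-around explicitly. (Changes to the system clock, the time zone, DST, and leap seconds are still not handled.)
#include <ctime>

// Assumed stand-in for the clcTime() from the question: seconds since 00:00 today.
int clcTime() {
    std::time_t t = std::time(nullptr);
    std::tm local = *std::localtime(&t);
    return local.tm_hour * 3600 + local.tm_min * 60 + local.tm_sec;
}

void myFunction() {
    const int secondsPerDay = 24 * 3600;
    int start = clcTime();
    for (;;) {
        int elapsed = clcTime() - start;
        if (elapsed < 0)             // clcTime() wrapped past midnight
            elapsed += secondsPerDay;
        if (elapsed >= 3600)         // at least one hour has passed
            return;
    }
}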

How to see what value is being calculated in the Pine editor

I have the following script running with the intention of closing a trade after it has been open for a period of 4 days since the trade was taken.
TimeDiff = time - time[1]
MinutesPerBar = TimeDiff / 60000
// Calculates how long one bar is in minutes
BarsSinceSwingLongCondition = barssince(SwingLongCondition)
// Calculates how many bars have passed since the open of the trade
CurrentSwingTradeDuration = BarsSinceSwingLongCondition * MinutesPerBar
// Calculates the duration the trade has been open for (minutes * number of bars)
MaximumSwingTradeDuration = 4*1440
// Sets the maximum trade duration, 4 days in minutes
SwingLongCloseLogic3 = CurrentSwingTradeDuration > MaximumSwingTradeDuration
// True when the trade duration exceeds the maximum duration set (4 days)
The close logic, however, isn't executing when I run the strategy, as I have trades that stay open longer than the maximum duration.
Is there any way to see what value each element of the formula is calculating, so that I can see where the error is (I suspect it could be the time element)? Or can anyone see where I am going wrong in the code?
The fastest way to achieve that is to use the plotchar function, which shows the values in the Data Window when you hover over each bar. The user manual contains several other techniques available for debugging.

Early wakeups in WaitForSingleObject() ...?

Everything I've read, both on the MS docs site (where it's not really addressed) and here on SO, says that Windows WaitForSingleObject() is not subject to spurious wakeups and that it waits at least the provided time, and maybe longer. However, my testing says this is not true, and in fact early wakeups almost always happen. Is the "common wisdom" wrong and I just need to add loops to handle early wakeups, or am I doing something wrong and need to keep banging my head on this to figure it out?
Unfortunately the full code is too complex to post here, but I have two different threads each with their own event, created via:
event = CreateEventA(NULL, false, false, NULL);
(event is a thread-local variable). I have a mutex I use to ensure that both threads start running at about the same time.
In each thread I call WaitForSingleObject(). In this specific test I never call SetEvent(), so the only way to finish is via timeout, and the return code shows that's what happens. However, the actual amount of time spent waiting is massively variable, and 90% of the time it is less than the time I requested. I've instrumented this using QueryPerformanceCounter() to detect how long is spent here, and it's just wrong. Here's the instrumented code:
LARGE_INTEGER freq, ctr1, ctr2;
QueryPerformanceFrequency(&freq);
QueryPerformanceCounter(&ctr1);
DWORD ret = WaitForSingleObject(event, tmoutMs);
QueryPerformanceCounter(&ctr2);
uint64_t elapsed = ((uint64_t)ctr2.QuadPart - (uint64_t)ctr1.QuadPart) * 1000000ULL / (uint64_t)freq.QuadPart;
(here elapsed is kept in microseconds, just to be a bit more specific)
Then I print this info out. In one thread tmoutMs is 2, and in the other thread tmoutMs is 100. Almost every time the returned values are too short: the 2 ms wait can take anywhere from 700 us up, and the 100 ms wait takes from about 93 ms up. Only about once in 7 tries will the elapsed time be > 100 ms. Here are some sample outputs:
event=104: pause(tmoutMs=2) => ret=258 elapsed us=169, ms=0
event=112: pause(tmoutMs=100) => ret=258 elapsed us=93085, ms=93
event=104: pause(tmoutMs=2) => ret=258 elapsed us=427, ms=0
event=112: pause(tmoutMs=100) => ret=258 elapsed us=94002, ms=94
event=104: pause(tmoutMs=2) => ret=258 elapsed us=3317, ms=3
event=112: pause(tmoutMs=100) => ret=258 elapsed us=96840, ms=96
event=104: pause(tmoutMs=2) => ret=258 elapsed us=11461, ms=11
event=112: pause(tmoutMs=100) => ret=258 elapsed us=105189, ms=105
The return code is always WAIT_TIMEOUT as expected.
Is this reasonable, even though it's not documented (or is it documented somewhere that I just can't find?), and do I just have to loop on my own to handle early wakeups?
FWIW, this is a C++ program compiled with Visual Studio 2017 running on Windows 10. It's a unit-test program using Google Test and has no graphical interface: it's command-line only.
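In case it helps, here is a minimal sketch of the kind of loop I have in mind (a hypothetical WaitAtLeast() wrapper around the same event handle): after each early timeout it re-measures the elapsed time with QueryPerformanceCounter() and waits again for the remainder.
#include <windows.h>
#include <cstdint>

// Hypothetical wrapper: waits on 'event' for at least 'tmoutMs' milliseconds,
// looping whenever WaitForSingleObject() times out early.
// Returns WAIT_OBJECT_0 if the event was signaled, WAIT_TIMEOUT otherwise.
DWORD WaitAtLeast(HANDLE event, DWORD tmoutMs)
{
    LARGE_INTEGER freq, start, now;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&start);
    DWORD remaining = tmoutMs;
    for (;;) {
        DWORD ret = WaitForSingleObject(event, remaining);
        if (ret != WAIT_TIMEOUT)
            return ret;                              // signaled, abandoned, or failed
        QueryPerformanceCounter(&now);
        uint64_t elapsedMs = (uint64_t)(now.QuadPart - start.QuadPart) * 1000ULL
                             / (uint64_t)freq.QuadPart;
        if (elapsedMs >= tmoutMs)
            return WAIT_TIMEOUT;                     // the requested time really has passed
        remaining = (DWORD)(tmoutMs - elapsedMs);    // wait out the remainder
    }
}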

Does steady_clock::now return seconds?

I am trying to figure out what the value of t is. Is it in seconds or milliseconds? The steady_clock reference does not mention the unit used.
auto t = std::chrono::steady_clock::now() / 1000;
auto p = t/1000;
I am thinking now() returns seconds, t is in milliseconds, and p is in microseconds. Let me know if I am getting this right.
It's a std::chrono::time_point<std::chrono::steady_clock> (the documentation on CppReference is generally of better quality).
Guessing your next question: to convert from that to seconds you would use time_since_epoch() (the documentation has an example of extracting a dimension-free number of seconds from it), or alternatively compute it as (now - epoch) / 1_second.
The unit of the value returned by std::chrono::steady_clock::now() is not defined by the standard (it is a generic value of type std::chrono::time_point).
The resolution of the std::chrono::time_point (it stores a value of type Duration indicating the time interval from the start of the clock's epoch) is implementation-dependent (platform/compiler), and you shouldn't rely on it.
To get a desired unit, you can easily convert the time_point to a value in seconds, milliseconds, etc. by duration casting:
auto milliseconds = std::chrono::duration_cast<std::chrono::milliseconds>(std::chrono::steady_clock::now().time_since_epoch()).count();
(time_since_epoch() returns a duration representing the amount of time between *this and the clock's epoch).
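For example, a minimal self-contained sketch along those lines, printing the same time_point in two different units (the absolute numbers are only meaningful relative to steady_clock's unspecified epoch):
#include <chrono>
#include <iostream>

int main()
{
    auto now = std::chrono::steady_clock::now();
    // time_since_epoch() yields a duration; duration_cast selects the unit explicitly.
    auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(now.time_since_epoch()).count();
    auto s  = std::chrono::duration_cast<std::chrono::seconds>(now.time_since_epoch()).count();
    std::cout << s << " s / " << ms << " ms since the clock's epoch\n";
}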

A VB progress bar that moves on a timer's interval?

I was making a program in Visual Basic 2010 and was wondering if a progress bar could advance according to a Timer. If so, it would be very helpful for my current task.
Try something like this in VB.NET:
Do
    Threading.Thread.Sleep(100)
    ProgressBar1.PerformStep()
Loop Until ProgressBar1.Value >= ProgressBar1.Maximum
http://checktechno.blogspot.com/2013/03/example-progressbar-in-visual-basic.html
or in C#:
do {
    System.Threading.Thread.Sleep(100);
    ProgressBar1.PerformStep();
} while (!(ProgressBar1.Value >= ProgressBar1.Maximum));
where "sleep" is set as a constant at 100 ms. You just to dynamically set that value to your timer.

Resources