Is there a way to have OptionalStep time out quickly in QTP? - vbscript

In my automated test I have an area that occasionally shows up (and needs to be clicked on when it does show up). This is the perfect place to use an OptionalStep prefix, to prevent the step from failing if the optional area never shows up.
Thing is, I would like the OptionalStep to only wait a second or two before moving on to the rest of the test. Just as I can have object.Exist(2) only wait for 2 seconds, is there a way to have OptionalStep wait for only a couple of seconds?
Some other caveats:
I'd like to keep this as one small line. I know I could create a multi-line logic test that uses object.Exist(2) inside an If/Then statement, but I'd rather have the code be small and trim.
I don't want to change the global 20 second timeout just for this one step.
Since this optional step only shows up in one specific area, it seems like Recovery Scenarios would not be a good choice to have running throughout the entire test.

Vitaly's comment would be a good solution; you are probably overcomplicating your test.
Also, such a long global timeout is not recommended: it should be as low as possible. I usually have it set at around 3 seconds and deal with the synchronisation in the code.
Anything that takes a long period of time should be known about upfront and dealt with in the code. A long global timeout for everything will make your test run unnecessarily slowly whenever an "object cannot be found" error occurs.
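
For what it's worth, here is a minimal sketch of both options (the object hierarchy and names are hypothetical; substitute your own repository objects). VBScript allows a single-statement If...Then on one line, so Exist with an explicit timeout keeps the step small without touching the global setting; alternatively, if I remember the QTP API correctly, Setting("DefaultTimeout") holds the global object-synchronization timeout in milliseconds and can be saved and restored around just this one step:

' One line: click the optional area only if it appears within 2 seconds
If Browser("App").Page("Main").WebElement("OptionalArea").Exist(2) Then Browser("App").Page("Main").WebElement("OptionalArea").Click

' Or: temporarily lower the global timeout for this step only
oldTimeout = Setting("DefaultTimeout")   ' value is in milliseconds
Setting("DefaultTimeout") = 2000
OptionalStep.Browser("App").Page("Main").WebElement("OptionalArea").Click
Setting("DefaultTimeout") = oldTimeout

The second form is four lines rather than one, but it keeps the OptionalStep semantics (the step is simply skipped if the object never shows up) instead of replacing them with your own If/Then logic.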

Related

Cypress - how to describe a multiple step test

This is probably an anti-pattern, but I'm new to e2e testing and I'm not sure there's a way around my requirements.
I need to test a scenario in my system that is several steps long (maybe 50-60 steps). This means visiting about 15-20 pages, clicking and entering various details, and checking the results. The reason the requirement is so long is that the system takes an order through its whole life: creating the order, creating line items with various details, and running it through a long production process. My most valuable test would be to run through the whole process and verify the results.
I can't find a good way of isolating, say, steps 40-42 to verify that this process alone works well, because to get an order in that state I'd have to run through the first 39 steps.
Is there a good way to write tests to cover this scenario?

Performance tuning VBA code in large procedure

I've been asked to tune the performance of a specific function which loads every time a worksheet is opened (so it's important that it doesn't make things slow). One of the things that seems to make this function slow is a long call to a remote database, but there are a bunch of other possibilities too. So far, I've been stepping through the code and, when something seems to take a long time, making a note of it as a candidate for tuning.
I'd like a more objective way to tell which calls are slowing me down. Searching for timing and VBA yields a lot of results which basically amount to "Write a counter, and start and stop it either side of the critical section" (often with the macro explicitly called). I was wondering whether there was a way to (in the debugger) do something like "Step to next line, and tell me the time elapsed".
If not, can someone suggest a reasonable macro that I could use in the Immediate window to get what I'm after? Specifically, I would like to be able to time an arbitrary line of code within a larger procedure (rather than a whole procedure at once, which is what I found through Google).
A keyword for your further search would be "profiler" for VBA. I've heard of VB Watch and the VBA Code Profiler System (VBACP), as well as Stephen Bull's PerfMon; apart from the latter, they're mostly not free.
So much for the official part of my answer; let me toss in some extra, possibly useless, suggestions:
Identifying "slow" code by human measurement (run a line and say: "Whoa, that takes forever") in the debugger is certainly helpful, and you can then start looking into why it's slow. Your remote database call may take quite long if it has to transmit a lot of data, in which case it may be a good idea to timestamp the data on both ends and ask the DB whether the data has been modified before you grab it.
Writing the data into the sheet may be slow depending on the way you write it; this can sometimes be improved by writing an array to a range in one assignment instead of iterating cell by cell.
And I probably don't need to tell you about ScreenUpdating and EnableEvents and so on?
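
If you'd rather not buy a profiler, a lightweight alternative is a pair of stopwatch routines you can drive from the Immediate window. This is a minimal sketch using VBA's Timer function (coarse resolution, and it wraps at midnight, but plenty to spot a slow database call):

' In a standard module. At a breakpoint, type "Tic" in the Immediate
' window, press F8 to step over the suspect line, then type "Toc" to
' print the elapsed time for just that line.
Private tStart As Single

Public Sub Tic()
    tStart = Timer
End Sub

Public Sub Toc()
    Debug.Print "Elapsed: " & Format$(Timer - tStart, "0.000") & " s"
End Sub

You can also time a single statement on one Immediate-window line with colons, e.g. Tic: LoadFromDatabase: Toc (LoadFromDatabase being whatever call you suspect). For sub-millisecond work you'd want a QueryPerformanceCounter declare instead, but database and worksheet calls are well within Timer's reach.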

How to use DoEvents() without being "evil"?

A simple search for DoEvents brings up lots of results that lead, basically, to:
DoEvents is evil. Don't use it. Use threading instead.
The reasons generally cited are:
Re-entrancy issues
Poor performance
Usability issues (e.g. drag/drop over a disabled window)
But some notable Win32 functions such as TrackPopupMenu and DoDragDrop perform their own message processing to keep the UI responsive, just like DoEvents does.
And yet none of these seems to run into those issues (performance, re-entrancy, etc.).
How do they do it? How do they avoid the problems cited with DoEvents? (Or do they?)
DoEvents() is dangerous. But I bet you do lots of dangerous things every day. Just yesterday I set off a few explosive devices (future readers: note the original post date relative to a certain American holiday). With care, we can sometimes account for the dangers. Of course, that means knowing and understanding what the dangers are:
Re-entry issues. There are actually two dangers here:
Part of the problem here has to do with the call stack. If you call .DoEvents() in a loop that itself handles messages that use DoEvents(), and so on, you're building a pretty deep call stack. It's easy to over-use DoEvents() and accidentally fill up your call stack, resulting in a StackOverflowException. If you're only using .DoEvents() in one or two places, you're probably okay. If it's the first tool you reach for whenever you have a long-running process, you can easily find yourself in trouble here. Even one use in the wrong place can make it possible for a user to force a StackOverflowException (sometimes just by holding down the enter key), and that can be a security issue.
It is sometimes possible to find your same method on the call stack twice. If you didn't build the method with this in mind (hint: you probably didn't) then bad things can happen. If everything passed in to the method is a value type, and there is no dependence on things outside of the method, you might be fine. But otherwise, you need to think carefully about what happens if your entire method were to run again before control is returned to you at the point where .DoEvents() is called. What parameters or resources outside of your method might be modified that you did not expect? Does your method change any objects, where both instances on the stack might be acting on the same object? (A module-level guard flag is the usual mitigation; see the sketch after this answer.)
Performance Issues. DoEvents() can give the illusion of multi-threading, but it's not real multithreading. This has at least three real dangers:
When you call DoEvents(), you are giving control of your existing thread back to the message pump. The message pump might in turn give control to something else, and that something else might take a while. The result is that your original operation could take much longer to finish than if it were on a thread by itself that never yields control, and certainly longer than it needs to.
Duplication of work. Since it's possible to find yourself running the same method twice, and we already know this method is expensive/long-running (or you wouldn't need DoEvents() in the first place), even if you accounted for all the external dependencies mentioned above so there are no adverse side effects, you may still end up duplicating a lot of work.
The other issue is the extreme version of the first: a potential to deadlock. If something else in your program depends on your process finishing, and will block until it does, and that thing is called by the message pump from DoEvents(), your app will get stuck and become unresponsive. This may sound far-fetched, but in practice it's surprisingly easy to do accidentally, and the crashes are very hard to find and debug later. This is at the root of some of the hung app situations you may have experienced on your own computer.
Usability Issues. These are side-effects that result from not properly accounting for the other dangers. There's nothing new here, as long as you looked in other places appropriately.
If you can be sure you accounted for all these things, then go ahead. But really, if DoEvents() is the first place you look to solve UI responsiveness/updating issues, you're probably not accounting for all of those issues correctly. If it's not the first place you look, there are enough other options that I would question how you made it to considering DoEvents() at all. Today, DoEvents() exists mainly for compatibility with older code that came into being before other credible options were available, and as a crutch for newer programmers who haven't yet gained enough experience for exposure to the other options.
The reality is that most of the time, at least in the .Net world, a BackgroundWorker component is nearly as easy, at least once you've done it once or twice, and it will do the job in a safe way. More recently, the async/await pattern or the use of a Task can be much more effective and safe, without needing to delve into full-blown multi-threaded code on your own.
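
To make the re-entry danger above concrete, here is a minimal VB/VBA-flavored sketch of the guard-flag mitigation (the control and routine names are made up); without the flag, a second button click arrives through DoEvents() and starts the handler again on top of the first:

Private busy As Boolean   ' module-level re-entry guard

Private Sub cmdProcess_Click()
    If busy Then Exit Sub ' a re-entrant call bails out immediately
    busy = True
    Dim i As Long
    For i = 1 To 100000
        ' ... one unit of long-running work ...
        If i Mod 1000 = 0 Then DoEvents ' a second click would re-enter here
    Next
    busy = False
End Sub

In real code you would also reset the flag in an error handler; otherwise one failure leaves busy stuck at True and the button dead for the rest of the session.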
Back in 16-bit Windows days, when every task shared a single thread, the only way to keep a program responsive within a tight loop was DoEvents. It is this non-modal usage that is discouraged in favor of threads. Here's a typical example:
' Process image
For y = 1 To height
    For x = 1 To width
        ProcessPixel x, y
    Next x
    DoEvents ' <-- DON'T DO THIS -- just put the whole loop in another thread
Next y
For modal things (like tracking a popup), it is likely to still be OK.
I may be wrong, but it seems to me that DoDragDrop and TrackPopupMenu are rather special cases, in that they take over the UI, so don't have the reentrancy problem (which I think is the main reason people describe DoEvents as "Evil").
Personally I don't think it's helpful to dismiss a feature as "Evil"; rather, explain the pitfalls so that people can decide for themselves. With DoEvents there are rare cases where it's still reasonable to use it, for example while a modal progress dialog is displayed, where the user can't interact with the rest of the UI so there is no re-entrancy issue.
Of course, if by "Evil" you mean "something you shouldn't use without fully understanding the pitfalls", then I agree that DoEvents is evil.
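
As an illustration of that "reasonable" case, the classic VB/VBA pattern is a progress form that is the only UI the user is expected to touch while a loop runs, with DoEvents serving only to let it repaint (the form and label names here are invented):

Sub RunLongJob()
    frmProgress.Show vbModeless          ' hypothetical progress UserForm
    Dim i As Long
    For i = 1 To 50000
        ' ... one unit of work ...
        If i Mod 500 = 0 Then
            frmProgress.lblStatus.Caption = i & " / 50000"
            DoEvents                     ' service repaints so the label updates
        End If
    Next
    Unload frmProgress
End Sub

The caveat from the answers above still applies: if the rest of the application remains clickable while this runs, you need a guard flag or a disabled UI, or you are back to the re-entrancy problem.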

Creating Cron Jobs in C#

I am writing a scheduling-type application in C#, allowing the user to store tasks that they want to run at certain times. Right now I give them the option of specifying how often to run it (daily/weekly/monthly) as well as a time, which is then stored in a database.
I am having a little trouble wrapping my head around the pseudocode behind this and am looking for suggestions about how to implement it. I am running a repeating timer every 60 seconds to check each task to see if it needs to run, but I always seem to hit roadblocks when I need to work with date/time, and adding recurring days (daily/weekly/etc.) has complicated it even more.
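
Since the question asks about pseudocode, here is the core of one workable shape, sketched in VB-style pseudocode for consistency with the rest of this page (the task fields are hypothetical): store a concrete NextRun timestamp per task, and when it comes due, advance it by the recurrence interval rather than recomputing from the clock:

' Called by the repeating 60-second timer.
Sub CheckTasks(tasks)
    Dim task
    For Each task In tasks
        If Now >= task.NextRun Then
            RunTask task
            ' Advance from the scheduled time, not from Now, so the
            ' schedule doesn't drift when a run starts a little late.
            Select Case task.Recurrence
                Case "Daily":   task.NextRun = DateAdd("d", 1, task.NextRun)
                Case "Weekly":  task.NextRun = DateAdd("ww", 1, task.NextRun)
                Case "Monthly": task.NextRun = DateAdd("m", 1, task.NextRun)
            End Select
        End If
    Next
End Sub

Comparing Now against a stored due time also means a tick that never fires (machine asleep, app closed) just runs the task on the next check instead of being lost; if several intervals were missed, loop the DateAdd until NextRun is in the future.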

Why do my WP7 settings take so long to load?

I put a stopwatch on it. The first time the app loads (no settings file exists) it takes about 190ms to fail to load four settings. The app runs, three bools and a short string are written as settings, and the next time the app loads, it takes 400ms to read the first setting from the IsolatedStorageSettings.ApplicationSettings collection and about 1ms to get the remainder.
Is there anything I can do to ameliorate this load time?
Use a better serialization method. ;)
XML serialization is okay for more complex graphs, but for simple settings, binary serialization would be much better. Also, when you say "fail to load", I assume you're checking whether the files exist first? If not, exceptions may be thrown internally, which would slow down execution as well.
