Is it okay to call "Navigate()" without awaiting it? - xamarin

I have a question about the MVVMCross async Navigate() call. If one were not to await the Navigate() call, could there be adverse effects? The awaits were removed in our app to solve an issue where the MVVM command was keeping a button pressed when navigating back; removing the await on the Navigate calls solved this, but may have introduced some threading issues.
Navigate in ViewModel with await
await Navigate<MainMenuViewModel>();
Navigate in ViewModel Without await
Navigate<MainMenuViewModel>();

As a general async/await rule, you should always await asynchronous operations. If you don't await, code after the call can run before the navigation has actually completed, and any exception the navigation throws goes unobserved instead of surfacing where you expect it.
In this case, it's going to be your call to decide if the risk is worth it. This seems more of a band-aid than a solution to your problem though.
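If you do decide to drop the await, one common mitigation is to at least observe the returned Task so failures aren't silently swallowed. Below is a minimal, hedged C# sketch of such a fire-and-forget helper; it assumes Navigate<MainMenuViewModel>() returns a Task (as the awaited version in the question implies), and the helper name and logging callback are purely illustrative, not part of MvvmCross.
using System;
using System.Diagnostics;
using System.Threading.Tasks;

public static class TaskExtensions
{
    // Awaits the task off the calling context and surfaces any exception to a callback,
    // instead of letting an un-awaited navigation fail silently.
    public static async void FireAndForget(this Task task, Action<Exception> onError = null)
    {
        try
        {
            await task.ConfigureAwait(false);
        }
        catch (Exception ex)
        {
            onError?.Invoke(ex); // log it somewhere; otherwise the failure simply disappears
        }
    }
}

// In the view model, instead of a bare un-awaited call:
// Navigate<MainMenuViewModel>().FireAndForget(ex => Debug.WriteLine(ex));
This doesn't address the button-stays-pressed issue itself, but it keeps the fire-and-forget call from hiding navigation errors.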

Related

Nativescript - ActivityIndicator not working for a specific chunk of code

I'd like to display an activity indicator while I run a long process.
I set a model busy flag to true.
I then call a method that returns a Promise; when the promise resolves, I set the busy flag to false in my 'then' handler.
While waiting for the promise to resolve, I expect the ActivityIndicator animation to be displayed, but it's not.
I checked and made sure the UI setup is correct and that it works.
The work that's being done is using nativescript-calendar plugin - I'm adding a few calendar entries.
I commented out the work that's being done, just slept for a little bit and then called resolve(), and the animation worked.
So the ActivityIndicator and the Promise mechanism are set up correctly; it's something this plug-in is doing that's causing the AI not to display.
What could cause such a behavior?
I actually edited the Promise work code to sleep for about 1 second and then start the calendar work. I see the AI for 1 second and then it freezes.
So it looks like the calendar writes are causing the AI to freeze.
I was under the impression that Promise work is done in the background and should not affect foreground animations.
I've had a similar issue when using SQLite.
As you haven't explicitly stated that you're running the calendar work in a worker, I am assuming you're keeping it on the UI (main) thread.
Any large amount of work done on the UI thread will cause noticeable lag/delays. For example: you show the activity indicator, then start a process that maxes out the thread; by the time the indicator would actually be drawn, the process has finished and the indicator is hidden again before it ever appears.
The ideal way to solve this is to move the calendar-write code into a worker (multithread your app) and turn the activity indicator off when the worker sends its success message.

Force UI repaint in Webkit (Safari & Chrome) right before Synchronous "Ajax" request

In a simple Ajax-based website we are making some HTTP requests synchronously (I realize "synchronous Ajax" is somewhat of an oxymoron). The primary reason this is being done synchronously rather than asynchronously is to simplify the programming model for some of those involved (long story).
Anyway, we want to be able to make a styling change (specifically, overlay the screen with semi transparent white as Google search does) right before the request is made, and then take it away when results come back. Essentially this looks like:
load: function(url) {
    ....
    busyMask.className = "Shown"; // display=block; absolute positioned full screen semi-transparent
    var dta = $.ajax({type: "GET", dataType: "json", url: url, async: false}).responseText;
    busyMask.className = "Hidden"; // sets display=none;
    ...
    return JSON.parse(dta);
}
It is well known that synchronous requests will lock up the UI. So, not surprisingly, the white overlay never shows up in Safari and Chrome (interestingly, it does in Firefox). I've tried slowing the response way down and using a pink overlay so that it would be painfully obvious, but the browser just won't update the UI until after the request is complete. Leaving the busyMask.className="Hidden" part out will show the mask, but only after the Ajax request is complete.
I've seen many many tricks for forcing the UI to repaint (e.g. Why HourGlass not working with synchronous AJAX request in Google Chrome?, http://ajaxian.com/archives/forcing-a-ui-redraw-from-javascript), but they all seem to be in conjunction with trying to show actual "permanent" DOM or styling updates, not with temporarily showing a style change while a synchronous request is made.
So is there a way to do this or am I fighting a losing battle? It may be that we'll just need to switch to asynchronous requests on a case by case basis for the worst performing requests, which might be a decent way to tackle the learning curve issue... But I'm hoping there is an outside the box answer here.
OK, for the purpose of this question I will ignore the justification for why you require synchronous XHR requests. I understand that sometimes work constraints don't allow the best-practice solution, so we "make do" in order to get the job done. So let's focus on how to get synchronous Ajax with visual updates working for you!
Considering you are already using jQuery for your XHR request, I'm going to assume it's OK to use jQuery to show/hide the loading indicator and to handle any timing issues.
First let's set up a loading indicator in our markup:
<div id="loading" style="display:none;">Loading...</div>
Now let's create some JavaScript:
// Show the loading indicator, then start a SYNCHRONOUS ajax request
$('#loading').show().delay(100).queue(function() {
    var jqxhr = $.ajax({type: "GET", dataType: "json", url: "www.yoururl.com/ajaxHandler", async: false}).done(function() {
        // Do your success handling here
    }).fail(function() {
        // Do your error handling here
    }).always(function() {
        // This happens regardless of success/failure
        $('#loading').hide();
    });
    $(this).dequeue();
});
First, we want to show our loading indicator and then give the browser a brief delay to repaint before our synchronous XHR request starts. By using jQuery's .queue() method we put the .ajax() call in the default fx queue so that it won't execute until after the .delay() completes, which of course doesn't happen until after the .show() completes.
The jQuery .show() method changes the target element's CSS display style to block (or restores its initial value, if one was assigned). This change in CSS will cause the browser to reflow (aka "redraw") as soon as it is able; the delay ensures that it is able to reflow before the Ajax call. The delay is not necessary in all browsers, but it won't hurt any more than the number of milliseconds you specify (as usual, IE is the limiting factor here: the other browsers are happy with a 1 ms delay, while IE wanted something a little more significant before it would repaint).
Here's a jsfiddle for you to test in a few browsers: jsfiddle example
Why do you think:
doSomethingBeforeRequest();
response = synchronousAjax();
doSomethingToTheDataAfterRequest(response);
that much "simpler" than:
doSomethingBeforeRequest();
properAjax(function onSuccess(response) {
    doSomethingToTheDataAfterRequest(response);
});
for your team? I'm not trying to argue, but I'm seriously curious about the justification...
The only benefit of the synchronous code I can think of is that you save a few curly braces, at the cost of freezing the browser.
If the browser doesn't complete the repaint before the request*, the only option I can think of is using a delay (as BenSwayne suggests), which would make the code as complex as the async call, and still make the browser unresponsive during the request.
EDIT (some kind of an answer):
Since JavaScript lacks threads, timeouts and asynchronous Ajax calls (which let the browser do something else before your callback runs, somewhat like how sleep() is used in a threaded language) are fairly fundamental to how you program JavaScript. I know it can be a bit of a learning curve at first (I know I was confused), but there is not really any sensible way to avoid learning it.
One situation where I know people may be tempted to make synchronous calls is when several requests have to be made to the server in sequence; but you can do that asynchronously too, by nesting the calls like this:
doSomethingBeforeRequest1();
ajax(function onSuccess(response1) {
    doSomethingToTheDataAfterRequest1(response1);
    ajax(function onSuccess(response2) {
        doSomethingToTheDataAfterRequest2(response2);
    });
});
But unless each call is fairly slow to finish and you want to indicate progress at each step, I would rather recommend that you create a new service that combines the two operations in one call. (This service could just use the two existing services in sequence, if you still need them separately in other cases.)
(* I'm more surprised that Firefox DOES update the DOM...)
I've made some tests and came up with some points:
http://jsfiddle.net/xVHWs/1/
Change your code to use jQuery's hide(), show() or animate({ opacity: 'show' }, 'fast') and animate({ opacity: 'hide' }, 'fast')
If you leave the functions without a time parameter, or specify a time of 0 ms, Firefox will show the overlay and hide it, but the other browsers execute it too fast for you to see. Put 100 milliseconds in the show, hide, or animate calls and you will see it.
$.ajaxSetup({async:false});
busyMask.className="Shown"; //display=block; absolute positioned full screen semi-transparent
var dta=$.ajax({type:"GET",dataType:"json",url:url}).responseText;
busyMask.className="Hidden"; //sets display=none;
$.ajaxSetup({async:true});

Visual Studio C++ How to get the Form not freezing while calling a time-consuming function?

I am making a C++/CLI Forms application.
In the main window of my app I have a button. When I click that button I call the Load function. Below is the C++/CLI code:
private: System::Void Button1_Click(System::Object^ sender, System::EventArgs^ e) {
    Load();
}
The function Load() is time-consuming: it uses the cURL library to send several HTTP GET requests to a website.
In the Form I also included a ProgressBar and a textLabel showing the current request being sent.
The problem is that when I click the button and call the function, the Form just freezes. I can't see the ProgressBar and textLabel changing their values while Load() is running; the Form is simply frozen. Only when Load() has finished sending requests does the ProgressBar suddenly jump to 100%.
I hope I described my problem clearly enough to understand it.
Move your task to another thread, or call Application.DoEvents() just after you update your progress bar value.
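For the separate-thread route, here is a minimal sketch using BackgroundWorker. It is written in C# for brevity, but the same System.ComponentModel.BackgroundWorker type is usable from C++/CLI; LoadOneRequest, totalRequests, progressBar1 and textLabel1 are placeholder names standing in for the cURL work and controls described in the question.
// Requires using System.ComponentModel; this goes inside the Form class.
private void Button1_Click(object sender, EventArgs e)
{
    var worker = new BackgroundWorker { WorkerReportsProgress = true };

    worker.DoWork += (s, args) =>
    {
        for (int i = 0; i < totalRequests; i++)
        {
            LoadOneRequest(i);                                    // blocking cURL GET, now off the UI thread
            worker.ReportProgress((i + 1) * 100 / totalRequests); // marshalled back to the UI thread
        }
    };

    // ProgressChanged and RunWorkerCompleted are raised on the UI thread,
    // so it is safe to touch the ProgressBar and the Label here.
    worker.ProgressChanged += (s, args) => progressBar1.Value = args.ProgressPercentage;
    worker.RunWorkerCompleted += (s, args) => textLabel1.Text = "Done";

    worker.RunWorkerAsync();
}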
Either break the task into smaller parts (design a finite state machine or use continuations) or use a separate thread.
The first approach takes more getting used to, but it's easier for an experienced programmer to get right. Threading requires synchronization which is very detail-oriented and causes a lot of hidden sporadic bugs which are extremely difficult to debug.
Call Form1.Refresh() every time you update an element of the form (say Form1). It will show the results immediately.
Before any line of code that is likely to take a long time, write this:
System::Windows::Forms::Application::DoEvents();
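Putting the last two suggestions together, the band-aid version looks roughly like this (shown in C# for brevity; LoadOneRequest, totalRequests and progressBar1 are placeholders). Note that the form still blocks between updates, so the worker-thread approach above is usually the better fix.
for (int i = 0; i < totalRequests; i++)
{
    LoadOneRequest(i);                               // blocking cURL GET, still on the UI thread
    progressBar1.Value = (i + 1) * 100 / totalRequests;
    progressBar1.Refresh();                          // repaint the control immediately
    System.Windows.Forms.Application.DoEvents();     // let pending messages be processed
}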

What is the difference between the Control.Enter and Control.GotFocus events?

This may be a basic question, but I have to admit I've never truly understood what the difference between the Control.Enter and Control.GotFocus events is.
http://msdn.microsoft.com/en-us/library/system.windows.forms.control.enter.aspx
http://msdn.microsoft.com/en-us/library/system.windows.forms.control.gotfocus.aspx
Is it a differentiation between capturing keyboard or mouse input or something else?
The GotFocus/LostFocus events are generated by the Windows messages WM_SETFOCUS and WM_KILLFOCUS respectively. They are a bit troublesome, especially WM_KILLFOCUS, which is prone to deadlock. The logic inside Windows Forms that handles validation (the Validating event, for example) can override focus changes. In other words, the focus actually changed but the validation code then moved it back. The logical state of your UI is that it never moved, and you shouldn't be aware that it did.
The Enter/Leave events avoid the kind of trouble these low-level focus-change notifications can cause; they are generated once WinForms has established the true focus. You almost always want to use these.
The Control.Enter event happens when a control becomes the active control and receives focus, while Control.GotFocus happens EVERY time the control gets keyboard focus. For example, if textBox1 already has focus and you call textBox1.Focus(), the GotFocus event will still fire, unlike the Enter event, which only fires when the control doesn't already have focus and is receiving it.
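If it helps to watch the difference at runtime, here is a minimal C# WinForms sketch; it assumes a form with a TextBox named textBox1 plus at least one other focusable control to tab between, and the names are placeholders.
using System.Diagnostics;
using System.Windows.Forms;

public partial class Form1 : Form
{
    public Form1()
    {
        InitializeComponent();

        textBox1.Enter     += (s, e) => Debug.WriteLine("Enter");
        textBox1.Leave     += (s, e) => Debug.WriteLine("Leave");
        textBox1.GotFocus  += (s, e) => Debug.WriteLine("GotFocus");
        textBox1.LostFocus += (s, e) => Debug.WriteLine("LostFocus");

        // With textBox1 focused, Alt-Tab away from the form and back:
        // LostFocus/GotFocus fire again, but Leave/Enter do not, because
        // textBox1 never stopped being the form's active control.
    }
}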

Best UI to be shown to User while his request is still in process behind the scenes?

I am currently involved with an application where I need to design the UI, and at the moment I am implementing the UI that will be displayed to the end user while his or her request is being processed behind the scenes.
So my question is:
What is the best UI approach/symbol/suggestion to show the end user while his or her request is still being processed behind the scenes?
Thanks.
Any sort of throbber is adequate. Here's a nice throbber generator you can use.
And there's nothing wrong with progress bars, unless they're the kind of progress bars that start over without actually indicating progress.
If you don't take your program too seriously, this one is always a crowd pleaser:
This is going to take a while, so to pass the time, here's a dancing bunny:
http://img251.imageshack.us/img251/4828/thdancingbunny.gif
A Loading screen of some sort may work.
It depends on how long your user must wait. If it will be <10 seconds, then just show the spinning pie of death as an animated GIF (or Flash if you prefer to be non-accessible), with a simple jQuery hide/show and CSS.
If it is a longer wait, like >10 seconds, you may want to implement a short but entertaining caption system. Something like the old "Reticulating Splines" type of message gives the users a bit of humor while they wait; see https://stackoverflow.com/questions/182112/what-are-some-funny-loading-statements-to-keep-users-amused for a list of such statements.
If you've got a long running background process, I'd simply display a progress bar with a message below it that states in the least technical terms possible what the current task is. And then, if possible, a cancel button in case the user gets impatient or mistakenly started the process.
I can't provide any specific links to back me up, but I believe it's been shown that just the presence of a progress bar can make a longer task seem shorter than the same task without the progress bar.
The worst thing you can do is nothing. Users have been conditioned to think that no feedback = locked up program.
Note on a typical implementation (that I use):
jQuery has the .ajax function. When I call the function (onClick or whatever), I .remove() the contents of the relevant div (or whatever seems appropriate) and add a CSS class of waiting, which looks like:
.waiting {
    background-color: #eee;
    background-image: url('some-spinner.png');
}
The .ajax function has a success callback where I remove the .waiting class and put in the new data I got back from the Ajax call (or put back the data I took out with .remove()).
Additionally, you may change the default mouse cursor to the wait or progress state with CSS.
Details about changing the cursor with CSS here.
