I am using the cross-fade-delay transition in my Polymer app so that it plays nicely with the hero transition: the text cross-fades after the hero transition is finished.
I currently use the transition-end callback to trigger another operation (a JavaScript function call). The problem is that this call is a bit resource-expensive, so it makes the cross-fade-delay transition lag.
The transition-end callback is triggered after the hero transition completes. Is there a similar event to know when the cross-fade-delay transition has finished?
Best,
Nicolas
I don't think there's a similar event to monitor the end of the cross-fade-delay transition animation.
However, a quick workaround is to call your JS function from inside the on-core-animated-pages-transition-end handler after a simple delay equal to the duration of the cross-fade-delay transition:
setTimeout(function () {
    console.log('animation transition completed!');
},
// convert, for example, "400ms" to 400
parseInt(CoreStyle.g.transitions.xfadeDelay, 10));
You might even want to add an extra 50-100ms on top of the xfadeDelay just to ensure the JS function runs smoothly on the UI.
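Putting it together inside the handler, a minimal sketch (the element lookup and the expensiveFunction name are assumptions, not from the original post):

var pages = document.querySelector('core-animated-pages');
pages.addEventListener('core-animated-pages-transition-end', function () {
    // convert e.g. "400ms" to 400
    var delay = parseInt(CoreStyle.g.transitions.xfadeDelay, 10);
    setTimeout(function () {
        expensiveFunction(); // the resource-expensive call from the question
    }, delay + 50); // small extra buffer so the cross-fade stays smooth
});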
Related
TL;DR: In a Laravel + InertiaJS + Vue 3 application, how can I get a transition between "pages" loaded within a persistent layout <main> section - for example, have that section animate (say, fade out) before loading the next page, then animate (fade) the new page in - when using standard Inertia routing for navigation? I have managed to do it when entering/showing the page, but have found no way to animate before navigation happens.
LONG(ish): The Way I'm trying to do it
Let's assume there is an application (developed with Laravel + InertiaJS + Vue 3).
I have an element in the markup of an Inertia persistent layout that is conditionally rendered when a flag is true (v-if="shouldAnimate"). The flag is initially set to false when declared; when onMounted fires, it sets the flag to true, which in turn triggers the animation to run (it doesn't really matter how the animation works, but just in case: I have the option of using either GSAP or anime.js).
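In code, that setup looks roughly like this (a sketch; only the flag name comes from the description above):

import { ref, onMounted } from 'vue'

const shouldAnimate = ref(false) // the element with v-if="shouldAnimate" starts hidden

onMounted(() => {
    shouldAnimate.value = true // reveal the element, which triggers the enter animation
})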
Up to this point, all is good: every time I navigate to a page (using Inertia-appropriate methods such as the Link component) the animation triggers and I am a happy guy.
BUT: I would very much like to be able to play another animation (the reverse of the previous one) before navigation to the next page occurs. I have tried almost everything I can think of and have not been successful. Here's what got the closest to what I need:
I tried hooking into the InertiaJS event Inertia.on('before', ...). The event does fire right before navigation (checked with some good old console logging), so I tried starting the animation at that point, only to find that Inertia destroys the page immediately, before the animation has had time to play. No problem, I thought: I'll just event.preventDefault(), run the animation, and THEN, using a setTimeout timed to the length of the animation (300ms), resume navigation, say, by using Inertia.visit.
It doesn't work. The default behaviour is prevented (the navigation stops) and the animation plays, but when it comes to the "resume navigation" part I have had mixed results depending on what I use.
Code looks roughly like this:
let removeListener = Inertia.on('before', (event) => {
    event.preventDefault()
    // Play animation here
    setTimeout(() => {
        // SOME INERTIA ACTION DESCRIBED BELOW
    }, 300)
})
Independently of whether I use Inertia.get(event.detail.visit.url) or Inertia.visit(event.detail.visit.url), what happens is this: the animation runs its course, then the timer runs out, and this whole code RUNS AGAIN AND AGAIN at intervals equal to the timer. I also tried triggering the navigation from the animation's complete event instead, but it behaves the same.
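(A sketch of what I suspect I should be doing - using the remover function that Inertia.on() returns, which is assigned to removeListener above but never called; playLeaveAnimation here is a made-up placeholder for the tween:)

let removeListener = Inertia.on('before', (event) => {
    event.preventDefault()
    playLeaveAnimation() // hypothetical wrapper around the GSAP/anime.js tween
    setTimeout(() => {
        removeListener() // detach this handler so the resumed visit doesn't re-trigger it
        Inertia.visit(event.detail.visit.url)
    }, 300)
})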
I know this is related to my own ignorance of how both Inertia and events work, and I am sure there is a proper (correct? right?) way to achieve what I need, but either I have failed to use the correct search terms, or I am approaching this the wrong way. Hopefully this information is enough to explain my issue.
Any help or pointer would be GREATLY appreciated, so thanks in advance.
I'm using fx: 'fade'
It's all working dandy; however, I need to call a function before the actual fade-in.
I've been playing around with the before and after options, but before triggers too early and after too late.
In a simple Ajax-based website we are making some HTTP requests synchronously (I realize "synchronous Ajax" is somewhat of an oxymoron). The primary reason this is being done synchronously rather than asynchronously is to simplify the programming model for some of those involved (long story).
Anyway, we want to make a styling change (specifically, overlay the screen with semi-transparent white, as Google search does) right before the request is made, and then take it away when the results come back. Essentially this looks like:
load: function(url) {
    ....
    busyMask.className = "Shown"; // display=block; absolute positioned full-screen semi-transparent
    var dta = $.ajax({ type: "GET", dataType: "json", url: url, async: false }).responseText;
    busyMask.className = "Hidden"; // sets display=none
    ...
    return JSON.parse(dta);
}
It is well known that synchronous requests lock the UI. So, not surprisingly, the white overlay never shows up in Safari and Chrome (interestingly, it does in Firefox). I've tried slowing the response way down and using a pink overlay so that it would be painfully obvious, but the browser just won't update the UI until after the request is complete. Leaving the busyMask.className="Hidden" part out will show the mask - but only after the Ajax request is complete.
I've seen many many tricks for forcing the UI to repaint (e.g. Why HourGlass not working with synchronous AJAX request in Google Chrome?, http://ajaxian.com/archives/forcing-a-ui-redraw-from-javascript), but they all seem to be in conjunction with trying to show actual "permanent" DOM or styling updates, not with temporarily showing a style change while a synchronous request is made.
So is there a way to do this, or am I fighting a losing battle? It may be that we'll just need to switch to asynchronous requests on a case-by-case basis for the worst-performing requests, which might be a decent way to tackle the learning-curve issue... but I'm hoping there is an outside-the-box answer here.
OK, for the purpose of this question I will ignore the justification for why you require synchronous XHR requests. I understand that work constraints sometimes rule out the best-practice solution, and we "make do" in order to get the job done. So let's focus on getting synchronous Ajax with visual updates working for you!
Considering you are using jQuery for your XHR request, I'm going to assume it's OK to use jQuery to show/hide the loading indicator and to handle any timing issues.
First let's set up a loading indicator in our markup:
<div id="loading" style="display:none;">Loading...</div>
Now let's create some JavaScript:
// Show the loading indicator, then start a SYNCHRONOUS ajax request
$('#loading').show().delay(100).queue(function() {
    var jqxhr = $.ajax({ type: "GET", dataType: "json", url: "www.yoururl.com/ajaxHandler", async: false }).done(function() {
        // Do your success handling here
    }).fail(function() {
        // Do your error handling here
    }).always(function() {
        // This happens regardless of success/failure
        $('#loading').hide();
    });
    $(this).dequeue();
});
First, we show the loading indicator and then give the browser a moment's delay to repaint before the synchronous XHR request starts. By using jQuery's .queue() method we put the .ajax() call in the default fx queue, so it won't execute until after the .delay() completes, which of course doesn't happen until after the .show() completes.
The jQuery .show() method changes the target element's CSS display style to block (or restores its initial value if one was assigned). This change in CSS causes the browser to reflow (aka "redraw") as soon as it is able, and the delay ensures that it can reflow before the Ajax call. The delay is not necessary in all browsers, but it won't hurt any more than the number of milliseconds you specify (as usual, IE is the limiting factor here: the other browsers are happy with a 1ms delay, while IE wanted something a little more significant before repainting).
Here's a jsfiddle for you to test in a few browsers: jsfiddle example
Why do you think:
doSomethingBeforeRequest();
response = synchronousAjax();
doSomethingToTheDataAfterRequest(response);
is that much "simpler" than:
doSomethingBeforeRequest();
properAjax(function onSuccess(response) {
    doSomethingToTheDataAfterRequest(response);
});
for your team? I'm not trying to argue, but I'm seriously curious about the justification...
The only benefit of the synchronous code I can think of is that you save a few curly braces, at the cost of freezing the browser.
If the browser doesn't complete the repaint before the request*, the only option I can think of is using a delay (as BenSwayne suggests), which would make the code as complex as the async call and still leave the browser unresponsive during the request.
EDIT (some kind of an answer):
Since JavaScript lacks threads, timeouts and ajax callbacks (which let the browser do something else before your code runs, somewhat like how sleep() is used in a threaded language) are fairly fundamental to how you program JavaScript. I know it can be a bit of a learning curve at first (I know I was confused), but there is no sensible way to avoid learning it.
One situation where people may be tempted to make synchronous calls is when several requests have to be made to the server in sequence; but you can do that asynchronously too, by nesting the calls like this:
doSomethingBeforeRequest1();
ajax(function onSuccess(response1) {
    doSomethingToTheDataAfterRequest1(response1);
    ajax(function onSuccess(response2) {
        doSomethingToTheDataAfterRequest2(response2);
    });
});
But unless each call is fairly slow to finish and you want to indicate progress at each step, I would rather recommend creating a new service that combines the two operations in one call. (That service could just use the two existing services in sequence, if you still need them separately in other cases.)
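With such a combined endpoint, the client side collapses to a single callback (same pseudocode style as above; the response fields are made up):

doSomethingBeforeRequest1();
ajax(function onSuccess(response) {
    doSomethingToTheDataAfterRequest1(response.part1);
    doSomethingToTheDataAfterRequest2(response.part2);
});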
(* I'm more surprised that Firefox DOES update the DOM...)
I've made some tests and came up with some points:
http://jsfiddle.net/xVHWs/1/
Change your code to use jQuery's hide(), show(), or animate({ opacity: 'show' }, 'fast') and animate({ opacity: 'hide' }, 'fast').
If you call these functions without a time param, or specify a time of 0ms, Firefox will show the overlay and hide it, while the other browsers execute it too fast for you to see. Put 100 milliseconds in the show, hide, or animate calls and you will see it.
$.ajaxSetup({ async: false });
busyMask.className = "Shown"; // display=block; absolute positioned full-screen semi-transparent
var dta = $.ajax({ type: "GET", dataType: "json", url: url }).responseText;
busyMask.className = "Hidden"; // sets display=none
$.ajaxSetup({ async: true });
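Combining both points, a rough sketch (untested; assumes busyMask is the overlay element from the question):

$(busyMask).animate({ opacity: 'show' }, 100, function () {
    // the overlay has finished fading in by the time this callback runs
    $.ajaxSetup({ async: false });
    var dta = $.ajax({ type: "GET", dataType: "json", url: url }).responseText;
    $.ajaxSetup({ async: true });
    $(busyMask).animate({ opacity: 'hide' }, 100);
});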
Is there some way I can disable all events in Dojo until an event's work is completed? For instance, I am fading elements, and the user can trigger the event again before the last fade has completed.
If you control all the events that need to be disabled, you could try using a global variable as a "lock": set it when you start the animation (and have all event handlers abort if they find the flag set) and unset it when the animation ends.
JavaScript is not concurrent (so you don't need to worry about timing issues or an "actual" lock), but the fading probably uses setTimeout behind the scenes (allowing other events to trigger before it is done). If that is the case, just remember that you need to use the onEnd callback to properly detect when the animation is over:
var lock = false;
function my_event_handler(evt) {
    if (lock) return; // someone else is using the lock
    // perhaps cancel event propagation as well?
    lock = true;
    dojo.animateProperty({
        node: "myNode",
        properties: { /* ... */ },
        onEnd: function() {
            lock = false; // release the lock only once the animation is done
        }
    }).play();
}
Caveat: this is pseudocode off the top of my head; I haven't used Dojo animations in a while, if you hadn't noticed already :P
I'm not sure I understand what you mean by events here, but if you want to prevent interaction with elements on a page, you can put up a modal shield: basically a transparent DIV element, positioned over your content with a high z-index, that captures events.
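A minimal sketch of such a shield (the styling values are made up):

var shield = document.createElement('div');
shield.style.cssText = 'position:fixed; top:0; left:0; width:100%; height:100%; ' +
    'z-index:9999; background:transparent;'; // invisible, but swallows clicks
document.body.appendChild(shield); // put the shield up before the animation starts
// ...later, in the animation's onEnd callback:
// document.body.removeChild(shield);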
I noticed a lot of jQuery answers on this, but I'm using MooTools...
I have a Table of Contents which uses fixed CSS positioning to keep it off the left side of the page, except for 20 pixels. The user hovers over those 20 pixels, which fires the DIV's mouseover event, and the ToC slides fully into the page. When the cursor leaves, the ToC slides back to where it was.
$('frameworkBreakdown').addEvents({
    'mouseover': function(event){
        event = new Event(event);
        $('frameworkBreakdown').tween('left', 20);
        event.stop;
    },
    'mouseout': function(event){
        event = new Event(event);
        $('frameworkBreakdown').tween('left', (10 - $('frameworkBreakdown').getStyle('width').toInt()));
        event.stop;
    }
});
This works well (aside from unrelated issues), except that when I move the mouse over the DIV it starts to jitter, presumably because the contents of the DIV are also firing the event, or the event re-fires as the mouse tracks across the DIV.
How can I stop this behaviour from occurring? Is there a standard method, or do I use some sort of nasty global variable that determines whether effects are in action, and thus ignore the event?
Use mouseenter/mouseleave instead of mouseover/mouseout.
Also, you shouldn't be doing this in MooTools 1.2+:
event = new Event(event);
event.stop;
A simple event.stop() will suffice.
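Applied to the question's code, the handlers would look roughly like this (an untested sketch):

$('frameworkBreakdown').addEvents({
    'mouseenter': function(event){
        event.stop(); // fires once on entry; child elements don't re-trigger it
        $('frameworkBreakdown').tween('left', 20);
    },
    'mouseleave': function(event){
        event.stop();
        $('frameworkBreakdown').tween('left', 10 - $('frameworkBreakdown').getStyle('width').toInt());
    }
});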