I have the following example function:
public function backupService()
{
    $job = Job::find($this->job_id);

    sleep(5);
    $job->status = 'in_progress';
    $job->update();
    $this->emitSelf('refreshComponent');

    sleep(10);
    $job->status = 'finished';
    $job->update();
    $this->emitSelf('refreshComponent');
}
When I change the status to 'in_progress' it changes in my database, but the component doesn't update. Apparently $this->emitSelf() only takes effect when the backupService() function finishes, i.e. the status will never appear as 'in_progress', only as 'finished'.
I don't want to use the wire:poll directive because I don't want to keep updating the page all the time, only when I specifically ask for it. How can I resolve this?
The event will only be emitted once the entire backupService() method has finished executing, when the response from that method is sent back to the browser. Livewire events are sent to the browser with the response, and any components listening for those events then trigger actions on the client, issuing secondary requests.
This means that the refresh event you emit will only fire after everything has completed.
If you don't want to use polling, another alternative is websockets. But that too can be a bit much for such a simple task, so a third alternative is to restructure your method into two methods: one that starts the process, with events carrying on from there. Something like this, where the first method is only responsible for setting the new status and emitting a new event that starts the job, and the second method is responsible for the execution.
protected $listeners = [
    'refreshComponent' => '$refresh',
    'runJob'
];

public function backupService()
{
    $job = Job::find($this->job_id);
    $job->status = 'in_progress';
    $job->update();
    $this->emitSelf('runJob', $job);
}

public function runJob(Job $job)
{
    sleep(10);
    $job->status = 'finished';
    $job->update();
    $this->emitSelf('refreshComponent');
}
I have an array of Filters as an Observable, and I'd like to add/remove filters from it. Here is the code I have; currently it only adds a Filter the first time the function runs.
The second time, nothing happens.
private _filters$ = new BehaviorSubject<Filter[]>([]);

addFilter(added: Filter) {
    debugger;
    // adding to the array of filters
    this._filters$.pipe(
        tap(d => { debugger; }),
        first(),
        map(filters => ([...filters, added]))
    ).subscribe(this._filters$);
}
So my question is: why does this happen? Why does it run only once? (By the way, first() is not the reason.)
I know I can make the code work like so:
private _filters$ = new BehaviorSubject<Filter[]>([]);
currentFilters;

init() {
    this._filters$.subscribe(f => this.currentFilters = f);
}

addFilter(added: Filter) {
    this._filters$.next([...this.currentFilters, added]);
}
Actually, it is because of first(). When you run the function the first time, it creates the stream and subscribes to the BehaviorSubject. When it receives the first event, it forwards it to the BehaviorSubject and then completes the BehaviorSubject, because the subject is subscribed as an observer and so receives the complete notification too. The second time you run it, the BehaviorSubject is already stopped, so it immediately unsubscribes any new subscriptions to it.
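You can watch the shutdown happen in isolation. A minimal sketch (assuming RxJS 6-style imports, unrelated to your component code):

import { BehaviorSubject } from 'rxjs';
import { first } from 'rxjs/operators';

const subject = new BehaviorSubject<number>(1);

// first() re-emits the buffered value and then forwards complete()
// into the subject itself, stopping it for good
subject.pipe(first()).subscribe(subject);

subject.next(2); // silently ignored: the subject is already stopped
subject.subscribe({ complete: () => console.log('already complete') });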
Without knowing too much about your actual goal, my suggestion is that instead of putting the BehaviorSubject at the bottom of the pipeline, you put it at the top.
// You don't actually need the caching behavior yet, so just use a `Subject`
private _filters$ = new Subject<Filter>();

// Hook this up to whatever is going to be using these filters
private _pipeline$ = this._filters$.pipe(
    // Use scan instead of mapping back into self
    scan((filters: Filter[], newFilter: Filter) => [...filters, newFilter], [] as Filter[]),
    // Store the latest value for new subscribers
    shareReplay(1)
);
// Now this method is just pushing into the `Subject`, and the pipeline never has to be torn down
addFilter(added: Filter) {
    debugger;
    this._filters$.next(added);
}
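Consumers then subscribe to _pipeline$ instead of the subject. One caveat worth knowing (the filter names below are illustrative): because the source is a plain Subject, anything pushed before the first subscription is lost, so subscribe before calling addFilter.

// e.g. wherever the filters are consumed
this._pipeline$.subscribe(filters => {
    console.log('current filters:', filters);
});

this.addFilter(filterA); // logs [filterA]
this.addFilter(filterB); // logs [filterA, filterB]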
I'm trying to use the waitFor function of Flux's Dispatcher with React.js, but it seems I'm doing something wrong.
What I want to do is basic: wait for one store to be filled before calling it from another store.
1. Register the token in the first store
RipplelinesStore.dispatcherIndex = Dispatcher.register(function(payload) {
    var action = payload.action;
    var result;
    switch(action.actionType) {
        case Constants.ActionTypes.ASK_RIPPLELINES:
            registerAccount(action.result);
            RipplelinesStore.emitChange(action.result);
            break;
    }
});
2. Write the waitFor call in the other store
Dispatcher.register(function(payload) {
    var action = payload.action;
    var result;
    switch(action.actionType) {
        case Constants.ActionTypes.ASK_RIPPLEACCOUNTOVERVIEW:
            console.log("overviewstore", payload);
            Dispatcher.waitFor([
                RipplelinesStore.dispatcherIndex,
            ]);
            RippleaccountoverviewsStore.test = RipplelinesStore.getAll();
            console.log(RippleaccountoverviewsStore.test);
            break;
    }
    return true;
});
Unfortunately, my getAll() method returns an empty object (getAll() itself is well written), so it seems that the waitFor dispatcher function is not working.
Basically, I know that's because the first store is still waiting for the answer from the server, but I thought that waitFor would wait for it to be fetched. I don't get it.
Any clue? Thanks!
Edit: I fire the first store's fetch like this. What I don't understand is that I'm dispatching the load only once my Backbone collection has fetched (I dispatch on success with a promise...):
ripplelinescollection.createLinesList(toresolve.toJSON()).then(function() {
    Dispatcher.handleViewAction({
        actionType: Constants.ActionTypes.ASK_RIPPLELINES,
        result: ripplelinescollection
    });
});
I also tried to bind the waitFor to an action which is never called, but the other store is still not waiting. Weird!
It seems like the problem is the async fetch from the server. waitFor isn't supposed to work this way: it only guarantees that stores are updated in a certain order within a single, synchronous dispatch. You will have to introduce another action that is triggered as soon as the data has been received from the server, as sketched below.
Have a look at this answer: https://stackoverflow.com/a/27797444/1717588
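A sketch of how that could look with your code; RIPPLELINES_LOADED is a made-up action name for illustration:

// 1. Dispatch a new action only once the data has actually arrived:
ripplelinescollection.createLinesList(toresolve.toJSON()).then(function() {
    Dispatcher.handleViewAction({
        actionType: Constants.ActionTypes.RIPPLELINES_LOADED, // hypothetical action
        result: ripplelinescollection
    });
});

// 2. In the overview store's switch, react to the loaded action instead of the
// request action; by then RipplelinesStore is guaranteed to hold the data:
case Constants.ActionTypes.RIPPLELINES_LOADED:
    Dispatcher.waitFor([RipplelinesStore.dispatcherIndex]);
    RippleaccountoverviewsStore.test = RipplelinesStore.getAll();
    break;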
I have an ICEfaces web app which contains a component with a property linked to a backing bean variable. In theory, variable value is programmatically modified, and the component sees the change and updates its appearance/properties accordingly.
However, it seems that the change in variable isn't "noticed" by the component until the end of the JSF cycle (which, from my basic understanding, is the render response phase).
The problem is, I have a long file-copy operation to perform, and I would like the inputText component to show a periodic status update. However, since the component is only updated at the render response phase, it doesn't show any output until the Java methods have finished executing, and then it shows all the changes accumulated at once.
I have tried using FacesContext.getCurrentInstance().renderResponse() and other functions, such as PushRenderer.render(String ID) to force XmlHttpRequest to initialize early, but no matter what, the appearance of the component does not change until the Java code finishes executing.
One possible solution that comes to mind is to have an invisible button somewhere that is automatically "pressed" by the bean when step 1 of the long operation completes, and by clicking it, it calls step 2, and so on and so forth. It seems like it would work, but I don't want to spend time hacking together such an inelegant solution when I would hope that there is a more elegant solution built into JSF/ICEfaces.
Am I missing something, or is resorting to ugly hacks the only way to achieve the desired behavior?
Multithreading was the missing link, in conjunction with PushRenderer and PortableRenderer (see http://wiki.icesoft.org/display/ICE/Ajax+Push+-+APIs).
I now have three threads in my backing bean: one for executing the long operation, one for polling the status, and one "main" thread for spawning the new threads and returning UI control to the client browser.
Once the main thread kicks off both the execution and polling threads, it terminates and completes the original HTTP request. My PortableRenderer is declared as PortableRenderer portableRenderer; and my init() method (called by the class constructor) contains:
PushRenderer.addCurrentSession("fullFormGroup");
portableRenderer = PushRenderer.getPortableRenderer();
For the threading part, I used implements Runnable on my class, and for handling multiple threads in a single class, I followed this StackOverflow post: How to deal with multiple threads in one class?
Here's some source code. I can't reveal the explicit source code I've used, but this is a boiled-down version that doesn't reveal any confidential information. I haven't tested it, and I wrote it in gedit so it might have a syntax error or two, but it should at least get you started in the right direction.
public void init()
{
    // This method is called by the constructor.
    // It doesn't matter where you define the PortableRenderer, as long as it's before it's used.
    PushRenderer.addCurrentSession("fullFormGroup");
    portableRenderer = PushRenderer.getPortableRenderer();
}

public void someBeanMethod(ActionEvent evt)
{
    // This is a backing bean method called by some UI event (e.g. clicking a button)
    // Since it is part of a JSF/HTTP request, you cannot call portableRenderer.render
    copyExecuting = true;

    // Create a status thread and start it
    Thread statusThread = new Thread(new Runnable() {
        public void run() {
            try {
                // message and progress are both linked to components, which change on a portableRenderer.render("fullFormGroup") call
                message = "Copying...";
                // initiates render. Note that this cannot be called from a thread which is already part of an HTTP request
                portableRenderer.render("fullFormGroup");
                do {
                    progress = getProgress();
                    portableRenderer.render("fullFormGroup"); // render the updated progress
                    Thread.sleep(5000); // sleep for a while until it's time to poll again
                } while (copyExecuting);
                progress = getProgress();
                message = "Finished!";
                portableRenderer.render("fullFormGroup"); // push a render one last time
            } catch (InterruptedException e) {
                System.out.println("Child interrupted.");
            }
        }
    });
    statusThread.start();

    // create a thread which initiates the script and triggers the termination of statusThread
    Thread copyThread = new Thread(new Runnable() {
        public void run() {
            File someBigFile = new File("/tmp/foobar/large_file.tar.gz");
            scriptResult = copyFile(someBigFile); // this will take a long time, which is why we spawn a new thread
            copyExecuting = false; // this will cause the statusThread's do..while loop to terminate
        }
    });
    copyThread.start();
}
I suggest looking at our Showcase Demo:
http://icefaces-showcase.icesoft.org/showcase.jsf?grp=aceMenu&exp=progressBarBean
Under the list of Progress Bar examples is one called Push. It uses Ajax Push (a feature provided with ICEfaces) to do what I think you want.
There is also a tutorial on this page called Easy Ajax Push that walks you through a simple example of using Ajax Push.
http://www.icesoft.org/community/tutorials-samples.jsf
I have an application written in NodeJS with Express and am attempting to use EventEmitter to create a kind of plugin architecture, with plugins hooking into the main code by listening to emitted events.
My problem comes when a plugin function makes an async request (to get data from Mongo in this case). This causes the plugin code to finish and return control back to the original emitter, which then completes execution before the async request in the plugin code finishes.
E.g.:
Main App:
// We want to modify the request object in the plugin
self.emit('plugin-listener', request);
Plugin:
// Plugin function listening to 'plugin-listener', 'request' is an arg
console.log(request);
// Call to DB (async)
this.getFromMongo(some_data, function(response) {
    // this may not get called until the plugin function has finished!
});
My reason for avoiding a callback from the 'getFromMongo' function back to the main code is that there may be 0 or many plugins listening to the event. Ideally, I want some way to wait for the DB work to finish before returning control to the main app.
Many Thanks
Using the EventEmitter for plugin/middleware management is not ideal, because you cannot ensure that the listeners are executed sequentially if they contain asynchronous code. This is especially a problem when the listeners interact with each other or with the same data.
That's why, for example, Connect/Express middleware functions are stored in an array and executed one after the other instead of being attached to an EventEmitter; each one needs to call a next() function when it is done with its task.
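To make that concrete, here is a minimal sketch of the middleware-array idea; all names are illustrative, and getFromMongo stands in for the async call from the question:

var plugins = [];

function use(fn) {
    plugins.push(fn);
}

// Runs the plugins strictly one after the other; done() only fires
// after the last plugin has called next()
function run(request, done) {
    var i = 0;
    function next(err) {
        if (err || i >= plugins.length) return done(err, request);
        plugins[i++](request, next);
    }
    next();
}

// An async plugin simply defers next() until its DB call completes
use(function(request, next) {
    getFromMongo(some_data, function(response) {
        request.mongoData = response;
        next();
    });
});

run(request, function(err, modifiedRequest) {
    // every plugin, including the async ones, has finished here
});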
You can't mix asynchronous calls with synchronous behavior. If you're going to stick with the event emitter (which may not be ideal for you, as Klovadis pointed out), you'll need to have your plugin emit an event that triggers a function in the main app containing the code you want to 'wait' to execute. You would also have to keep track of all the plugin calls you made and are still waiting on, so that your main code doesn't run until every plugin has finished its MongoDB callback.
var callList = ['pluginArgs1', 'pluginArgs2', 'pluginArgs3'];
var pending = callList.length;

// register the listener first, then fire off the plugin calls
self.on('plugin-callback', function() {
    pending--; // one plugin has finished its MongoDB callback
    if (pending === 0) {
        // we're done, do something
    }
});

for (var i = 0; i < callList.length; i++) {
    self.emit('plugin-listener', callList[i], i);
}
I had the same kind of decision to make about some events that I sometimes need to wait for before returning the response to the client, and sometimes not (when not in an HTTP request context).
The easiest way for me was to add a callback as the last argument of the event.
Stuff.emit('do_some_stuff', data, data2, callback);
In the event handler, check if there is a callback:
Stuff.on('do_some_stuff', function(data, data2, callback) {
    // stuff to do
    // ...
    if (typeof callback === "function") return callback(err, result);
});
I know that mixing events and callbacks can be messy, but that works fine for what I need.
The other solution I see is the one proposed by @redben: add an emit at the end of the event handler. The problem in an HTTP context is that you need unique keys so your events don't get mixed up if they do different stuff per user.
I haven't tried it myself, but you could use a property on the event's data object as an array of functions to be executed by the code that emitted the event:
Listeners
foo.on('your-event', function(data) {
    console.log(data);
    // Then add the asynchronous code to a callbacks array
    // in the event data object
    data.callbacks.push(function(next) {
        getFromMongo(some_data, function(err, result) { next(err); });
    });
});
Emitter
self.emit('your-event', data);
// listeners have modified the data object,
// some might have added callbacks to data.callbacks
// (this assumes you use the async library)
async.series(data.callbacks);
This seems quite dangerous, but I have to do it anyway...
const ee = new EventEmitter();

if (ee.listeners("async-event").length > 0) {
    await new Promise((resolve) => {
        ee.emit("async-event", data1, data2, resolve);
    });
}
Otherwise (when there are no listeners to resolve the Promise), just emit the event as usual.
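For completeness, the listening side has to invoke the resolver once its async work is done; a sketch, where doSomeAsyncWork is a stand-in for the listener's real task:

ee.on("async-event", (data1, data2, done) => {
    doSomeAsyncWork(data1, data2, () => {
        done(); // resolves the Promise awaited by the emitter
    });
});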
I've built a livesearch with the jQuery.ajax() method. On every keyup event it receives new result data from the server.
The problem is, when I'm typing very fast, e.g. "foobar", and the GET request for "fooba" takes more time than the "foobar" request, the results for "fooba" are shown.
Handling this with the timeout parameter is impossible, I think.
Does anyone have an idea how to solve this?
You can store the last request and .abort() it when starting a new one, like this:
var curSearch;

$("#myInput").keyup(function() {
    if (curSearch) curSearch.abort(); // cancel the previous search
    curSearch = $.ajax({ ...ajax options... }); // start a new one, save a reference
});
The $.ajax() method returns a jqXHR object (a superset of the browser's XMLHttpRequest), so just hang onto it, and when you start the next search, abort the previous one.
Assign a unique, incrementing ID to each request, and only show them in incrementing order. Something like this:
var counter = 0, lastDisplayed = 0;

function doAjax() {
    var requestId = ++counter; // capture this request's ID in the closure
    jQuery.ajax(url, {
        success: function (result) {
            if (requestId < lastDisplayed)
                return; // a newer request has already rendered its results
            lastDisplayed = requestId;
            processResult(result);
        }
    });
}
You should only start the search when the user hasn't typed anything for a while (500ms or so). This would prevent the problem you're having.
An excellent jQuery plugin which does just that is delayedObserver:
http://code.google.com/p/jquery-utils/wiki/DelayedObserver
Make it so each new request cancels the previous one. That might be too much cancellation, but once typing slows down, the request will get through.
That seems like an intense amount of traffic, sending an ajax request for every keyup event. You should wait for the user to stop typing for at least a few hundred milliseconds, which presumably means they are done.
What I would do is this:
var ajaxTimeout;

function doAjax() {
    // Your actual ajax request code
}

function keyUpHandler() {
    if (ajaxTimeout !== undefined)
        clearTimeout(ajaxTimeout);
    ajaxTimeout = setTimeout(doAjax, 200);
}
You may have to play with the actual timeout time, but this way works very well and does not require any other plugins.
Edit:
If you need to pass in parameters, create an inline function (closure).
...
var fun = function() { doAjax(params...) };
ajaxTimeout = setTimeout(fun, 200);
You will want some kind of ajax queue, such as:
http://plugins.jquery.com/project/ajaxqueue
or http://www.protofunc.com/scripts/jquery/ajaxManager/
EDIT: Another option: study the Autocomplete plug-in code and emulate that (there are several Autocomplete plug-ins, as well as the one in jQuery UI), or just implement the Autocomplete if it serves your needs.
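For reference, wiring jQuery UI's Autocomplete to a remote endpoint is nearly a one-liner (the URL here is illustrative), and its delay option gives you the debouncing discussed above for free:

$("#myInput").autocomplete({
    source: "/search", // server-side endpoint returning suggestions
    minLength: 2, // don't fire requests for single characters
    delay: 300 // built-in debounce, in milliseconds
});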