I am new to Parse Cloud Code, but I am trying something that should theoretically be straightforward and I cannot understand why it won't work.
In my User class I have an array of playableFriends. When I create a Game object, I want to remove the player and the opponent from each other's playableFriends array.
The code to remove the opponent from the player's array works, but for some reason unknown to me the code to remove the player from the opponent's array does not, even when it is the only code being run.
I have output the player and opponent objects to the console to make sure they exist, and I have also tried calling fetch on the opponent, but that didn't work either.
Parse.Cloud.afterSave("Game", function (request) {
    // return if existing game
    if (request.object.existed()) {
        return;
    }
    // get player
    var player = request.object.get("playerIdle");
    // get opponent
    var opponent = request.object.get("playerTurn");
    // remove opponent as playable friend
    player.remove("playableFriends", opponent);
    // remove player as opponent's playable friend
    opponent.remove("playableFriends", player);
    player.save();
    opponent.save();
});
Any help would be greatly appreciated.
regards
Byron
The problem is that the two save operations at the end of your function are asynchronous. Basically, this means that when your player.save(); line is reached, execution continues straight on to the next line even though the save operation may not have completed yet (in your case it seems that it's the second save that never completes). There are a couple of ways of overcoming this: callbacks and promises.
A callback is a block of code that you pass into an async function and that gets executed when the function completes. A promise, on the other hand, is an object that lets you be notified when an async operation completes. With promises, you use the then() function to continue execution once the original promise is settled: either resolved, if the async operation was successful, or rejected, if the async operation returned an error. Using promises, the last two lines of your function could be adapted as follows:
player.save().then(function (result) {
    return opponent.save();
}, function (error) {
    // We had an error saving the player
}).then(function (result) {
    console.log("Successfully saved user and opponent");
}, function (error) {
    // We had an error saving the opponent
});
I suggest you read up on both callbacks and promises, since asynchronous functions are a fundamental part of working in the Parse Cloud Code environment. This blog post would be a great place to start. Callbacks may be easier to understand in the short term, but they can lead to code that is harder to understand and maintain as your application grows in complexity.
EDIT
Since the ACLs on your opponent user restrict write operations to only that user, you'll also need to use the master key when saving that object.
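For example (a sketch: the first line assumes the legacy Parse.com Cloud Code runtime, while the commented-out alternative shows the parse-server style of passing the option per save):
// Legacy Parse.com runtime: enable the master key for the rest of
// this Cloud Code invocation so the opponent's ACL can be bypassed.
Parse.Cloud.useMasterKey();

// parse-server alternative: pass the option on the individual save.
// opponent.save(null, { useMasterKey: true });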
I want to be able to call an HTTP endpoint (that I own) from an Azure Function at the end of the Azure Function request.
I do not need to know the result of the request
If there is a problem in the HTTP endpoint that is called I will log it there
I do not want to hold up the return to the client calling the initial Azure Function
Offloading the call to the secondary WebApi onto a background job queue feels like overkill for this requirement.
Do I simply call HttpClient.PutAsync without an await?
I realise that the dependencies I have used up until the point that the call is made may well not be available when the call returns. Is there a safe way to check if they are?
My answer may cause some controversy, but you can always start a background task and execute the call that way.
For anyone reading this answer: this is far from recommended. The OP has been very clear that they don't care about exceptions or about understanding what sort of result the request is returning ...
// Fire-and-forget: the task is never awaited, so any exception it
// throws is swallowed and the result is never observed.
Task.Run(async () =>
{
    using (var httpClient = new HttpClient())
    {
        await httpClient.PutAsync(...);
    }
});
If you want to ensure that the call has fired, it may be worth waiting a second or two after the call is made to ensure it's actually on its way.
await Task.Delay(1000);
If you're worried about dependencies in the call, be sure to construct your payload (i.e. serialise it, etc.) outside of the Task.Run; basically, minimise any work the background task does.
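For instance, a sketch of that separation (assuming Newtonsoft.Json for serialisation; the model variable and the endpoint URL are hypothetical):
// Build everything the task needs up front, before handing off.
var json = JsonConvert.SerializeObject(model); // hypothetical payload object
var content = new StringContent(json, Encoding.UTF8, "application/json");

Task.Run(async () =>
{
    // The task only sends pre-built content; it touches no
    // request-scoped dependencies.
    using (var httpClient = new HttpClient())
    {
        await httpClient.PutAsync("https://example.com/notify", content); // hypothetical endpoint
    }
});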
I'm trying to wrap my head around asyncio. I'm fairly confident with multithreading, and I think that's what confuses me, since I keep mapping asyncio concepts back onto threads.
Say I have the following:
async def get_url_data(url):
    data = await some_api_call()
    return data
As far as I understand, when we encounter await it's like saying "run this function in the background and move on". But some say that await "gives control back to the event loop". Does that mean that await "pauses" the function (or schedules it?), or is it, as I understood, that await makes the function run in the background? I'm struggling to figure out when the function is actually executed and when the values are returned.
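For reference, a minimal runnable sketch (asyncio.sleep stands in for some_api_call) that makes the behaviour visible: await suspends the current coroutine and returns control to the event loop, which then resumes whichever other coroutine is ready.
import asyncio

async def get_url_data(name, delay):
    print(name, "started")
    # await suspends THIS coroutine and hands control back to the event
    # loop; nothing runs on another thread in the background.
    await asyncio.sleep(delay)  # stand-in for some_api_call()
    print(name, "resumed after", delay, "s")
    return delay

async def main():
    # Both coroutines are scheduled; while one is suspended at its
    # await, the event loop runs the other.
    results = await asyncio.gather(get_url_data("a", 2), get_url_data("b", 1))
    print(results)  # [2, 1] -- results arrive in call order

asyncio.run(main())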
I am new to Angular and want to use it to send data to my app's backend. On several occasions, I have to make several HTTP POST calls that should either all succeed or all fail. This is the scenario that's causing me a headache: given two HTTP POST calls, what if one call succeeds but the other fails? This will lead to inconsistencies in the database. I want to know if there's a way to cancel the successful calls if at least one call has failed. Thanks!
Without knowing more about your specific situation, I would urge you to use promise error handling if you are not already doing so. The only situation I know of in which you can cancel a request that has already been sent is by using the timeout option of $http (look at this SO post), but you can definitely prevent future requests. When you make a $http call, it returns a promise object (look at $q here). That promise exposes two methods you can chain onto your $http request, success and error, so it looks like $http(...).success({...stuff...}).error({...more stuff...}). If you have error handling in each of these scenarios and you land in .error, don't make the next call.
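A sketch of that pattern using the legacy success/error helpers described above (the URLs and payload are hypothetical):
$http.post('/first/url', payload)
  .success(function (data) {
    // First call worked, so it is safe to fire the next one.
    $http.post('/second/url', payload)
      .error(function () {
        // Second call failed; the first has already hit the server,
        // so it would have to be reversed on the backend.
      });
  })
  .error(function () {
    // First call failed; don't make the second call at all.
  });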
You can cancel the next requests in the chain, but the previous ones have already been sent. You need to provide the necessary backend functionality to reverse them.
If every step is dependent on the other and causes changes in your database, it might be better to do the whole process in the backend, triggered by a single "POST" request. I think it is easier to model this process synchronously, and that is easier to do in the server than in the client.
However, if you must do the post requests in the client side, you could define each request step as a separate function, and chain them via then(successCallback, errorCallback) (Nice video example here: https://egghead.io/lessons/angularjs-chained-promises).
In your case, at each step you can check whether the previous one failed and take action to reverse it by using the error callback of then:
var firstStep = function (initialData) {
    return $http.post('/some/url', initialData).then(function (dataFromServer) {
        // Do something with the data
        return {
            dataNeededByNextStep: processedData,
            dataNeededToReverseThisStep: moreData
        };
    });
};
var secondStep = function (dataFromPreviousStep) {
    return $http.post('/some/other/url', dataFromPreviousStep.dataNeededByNextStep).then(function (dataFromServer) {
        // Do something with the data
        return {
            dataNeededByNextStep: processedData,
            dataNeededToReverseThisStep: moreData
        };
    }, function () {
        // On error
        reversePreviousStep(dataFromPreviousStep.dataNeededToReverseThisStep);
    });
};
var thirdStep = function () { ... };
...
firstStep(initialData).then(secondStep)
    .then(thirdStep)
    ...
If any of the steps in the chain fails, its promise will be rejected, and the next steps will not be executed.
I have users connecting to a Node.js server, and when they join, I add them into a Lobby (essentially a queue). Any time there are 2 users in the lobby, I want them to pair off and be removed from the lobby. So essentially, it's just a simple queue.
I started off by trying to implement this with a Lobby.run method containing an infinite loop (started within a process.nextTick call); any time there were two or more entries in the queue, I removed them from the queue. However, I found that this was eating all my memory, and infinite loops like this are generally ill-advised anyway.
I'm now assuming that emitting events via EventEmitter is the way to go. However, my concern is with synchronization. Let's assume my Lobby is pretty simple:
Lobby = {
    users: []
  , join: function (user) {
        this.users.push(user);
        emitter.emit('lobby.join', user);
    }
  , leave: function (user) {
        var index = this.users.indexOf(user);
        this.users.splice(index, 1);
        emitter.emit('lobby.leave', user);
    }
};
Now essentially I assume I want to watch for users joining the lobby and pair them up, maybe something like this:
Lobby = {
    ...
  , run: function () {
        emitter.on('lobby.join', function (user) {
            // TODO: determine if this.users contains other users,
            // pair them off, and remove them from the array
        });
    }
}
As I mentioned, this does not account for synchronization. Multiple users can join the lobby at the same time, and so the event listener might pair up a single user with multiple other users instead of just one.
Can someone with more Node.js experience tell me if I am right to be concerned with this event-based approach? Any insight for improvement on this approach would be much appreciated.
You are wrong to be concerned about this, because Node.JS is single-threaded: there is no concurrent execution of your JavaScript at all! Whenever a block of code is running, no other code (including event handlers) can run until that block finishes. In particular, if you define this empty loop in your app:
while(true) { }
then your server is dead: no other code will ever fire and no other request will ever be handled. So be careful with blocks of code and make sure that each block eventually finishes.
Back to the question... So in your case it is impossible for multiple users to be paired with the same user. And let me say one more time: this is simply because there is no concurrency in Node.JS!
On the other hand, this only applies to a single instance of Node.JS. If you want to scale out to many machines, then obviously you will have to implement some locking mechanism (which ensures that no other process can work with the data at the same time).
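To make this concrete, here is a minimal sketch of what the run() handler could look like (pairUp is a hypothetical function that starts a game for the two users). Because the handler always runs to completion before the next 'lobby.join' event is processed, the loop can never race with another join:
emitter.on('lobby.join', function (user) {
    // This whole handler finishes before any other handler fires,
    // so no two joins can interleave here.
    while (Lobby.users.length >= 2) {
        var a = Lobby.users.shift();
        var b = Lobby.users.shift();
        pairUp(a, b); // hypothetical: start a game for this pair
    }
});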
I'm looping through several items and making an ajax request for each of them (using jQuery). I want them to execute independently, but populate into the DOM in the order they were called, not the order they are returned (for some reason some requests are taking longer than others). Any tips on the best practice for this type of thing?
Well, the results can come back in any order; they are asynchronous and subject to the vagaries of the internet and your servers.
What you can do is deal with the problem in the same way TCP does over UDP. You use sequence identifiers.
Keep a sequence identifier going and increment it every time you send out a request. As responses come back, check them off in order and only process them when their turn comes. Keep a list of what has returned, together with its data, in sequence order, and have a routine check that list after each update: once the first expected response is in, process the whole list down to the first gap.
Bear in mind that you could lose a request, so a suitable timeout, after which you give up waiting on a given sequence identifier, would be in order.
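A sketch of that idea, assuming jQuery, a urls array, and a hypothetical #results container:
var buffer = {};       // responses parked until their turn
var nextToRender = 0;  // sequence identifier of the next response to process

urls.forEach(function (url, seq) {
    $.get(url).done(function (html) {
        buffer[seq] = html; // park the response under its sequence id
        flush();
    });
});

function flush() {
    // Process everything contiguous from the first gap onward.
    while (buffer[nextToRender] !== undefined) {
        $('#results').append(buffer[nextToRender]);
        delete buffer[nextToRender];
        nextToRender++;
    }
}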
The answer to this ended up being a jQuery plugin called ajaxManager. This did exactly what I needed:
https://github.com/aFarkas/Ajaxmanager
You could send all the success result objects to a queue, along with an index that was sent with the original request, and continually check that queue for the next index.
But browsers generally only allow a couple of simultaneous ajax requests, so it might be worth it to just send the next ajax request on success of the previous one.
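For example, a minimal sketch of that sequential alternative (the urls array and #results container are hypothetical):
function loadSequentially(urls, i) {
    if (i >= urls.length) {
        return;
    }
    $.get(urls[i]).done(function (html) {
        // DOM order matches call order by construction.
        $('#results').append(html);
        loadSequentially(urls, i + 1);
    });
}
loadSequentially(urls, 0);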
Here's a start at the code for the first (buffered) approach:
var results = {}, lastProcessedIndex = 0;

// Note: jQuery's each callback receives (index, element), not (element, index).
var totalLength = $('a.myselector').each(function (index, el) {
    $.ajax({
        url: $(this).attr('href'),
        success: function (result) {
            results[index] = result; // park the result under its index
        }
    });
}).length;

var intervalId = setInterval(function () {
    // Process every result that is ready, in order.
    while (results[lastProcessedIndex]) {
        // use results[lastProcessedIndex] here
        lastProcessedIndex++;
    }
    if (totalLength == lastProcessedIndex) {
        clearInterval(intervalId);
    }
}, 1000); // every 1 second
I'll be taking a stab in the dark with this one, but it might help. Maybe you could create a global buffer array, and whenever an AJAX call returns, add the result to the buffer. You could then set up a timer that, when triggered, checks the contents of the buffer and, if the results are in order, outputs them accordingly.