How to use await on the getAll method for IndexedDB? - async-await

I am trying to get the data stored in my ObjectStore, and I want to read it in a synchronous style. So instead of using onsuccess I want to use async/await.
I have implemented the code below, but somehow it's not returning the data.
async function viewNotes() {
  const tx = db.transaction("personal_notes", "readonly")
  const pNotes = tx.objectStore("personal_notes")
  const items = await db.transaction("personal_notes").objectStore("personal_notes").getAllKeys()
  console.log("And the Items are ", items.result)
  let NotesHere = await pNotes.getAll().onsuccess
  console.log("Ans this are the logs", NotesHere)
}
I am not getting the data from items.result, nor from NotesHere.
When I inspect it in the debugger, the readyState of items is still pending even after using await.
What am I missing?

The IndexedDB API does not natively support async/await. You need to either manually wrap the event handlers in promises, or (much better solution) use a library like https://github.com/jakearchibald/idb that does it for you.
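For example, here is a minimal sketch of the manual approach, wrapping each IDBRequest in a Promise so it can be awaited (the database and store names are taken from the question; promisifyRequest is just an illustrative helper name):

function promisifyRequest(request) {
  return new Promise((resolve, reject) => {
    // Resolve with the request's result once the success event fires
    request.onsuccess = () => resolve(request.result);
    request.onerror = () => reject(request.error);
  });
}

async function viewNotes() {
  const tx = db.transaction("personal_notes", "readonly");
  const pNotes = tx.objectStore("personal_notes");
  // Await the wrapped requests instead of the raw IDBRequest objects
  const keys = await promisifyRequest(pNotes.getAllKeys());
  const notes = await promisifyRequest(pNotes.getAll());
  console.log("Keys:", keys);
  console.log("Notes:", notes);
}

With the idb library the same read becomes roughly await db.getAll("personal_notes"), since all of its wrapper methods return promises.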

Related

Promise.all(array.map(...)) doesn't work in parallel with page.goto()

I am using the puppeteer library for my bot and I would like to perform some operations in parallel.
In many articles, the advice is to use this syntax:
await Promise.all(array.map(async data => { /* ...some operations */ }))
I've tested this on several operations and it works, but when I embed the code below in my .map callback:
await page.goto(..
the navigations no longer run in parallel; they behave as if they were synchronous, sequential operations.
I would like to know why it reacts like this.
I believe your error comes from the fact that you're using the same page object.
The following should work:
const currentPage = (await browser.pages())[0];
const anotherPage = await browser.newPage();
const bothPages = [currentPage, anotherPage];
await Promise.all(
  bothPages.map(page => page.goto("https://stackoverflow.com"))
);

settings.getSettings() callback is not invoked on connector save

I have a connector configuration page set up to create a sample connector. It looks like I can successfully save the connector, but I'm unable to retrieve the generated webhook URL. Here's a sample of the code:
const getSettingsHandler = (settings) => {
  document.getElementById("url").innerText = settings.webhookUrl;
};
const saveHandler = (event) => {
  microsoftTeams.settings.setSettings(settings);
  microsoftTeams.settings.getSettings(getSettingsHandler);
  event.notifySuccess();
};
Yet, I don't see the URL printed to the screen. It seems like getSettingsHandler() is not being invoked as expected. I thought maybe the issue was that the config screen closes immediately after calling event.notifySuccess(), before the callback is invoked. But all of the examples I've looked at use this pattern, including an official Teams project here. Why isn't this callback being invoked?

Writing an equivalent to Chrome's onBeforeRequest in a Safari extension

Chrome extensions have the ability to intercept all web requests to specified URLs using chrome.webRequest.onBeforeRequest. This includes not only static asset requests, but requests for AJAX, PJAX, favicons, and everything in between.
Apple provides a few close approximations to this functionality, such as the beforeLoad (handles images, CSS, and JS) and beforeNavigate (handles full page loads) event handlers, but neither catches AJAX requests. I've tried overriding XMLHttpRequest in an attempt to catch AJAX loads, to no avail (I might be doing something wrong). Here's a brief example of how I'm doing this:
var originalOpen = window.XMLHttpRequest.prototype.open;
window.XMLHttpRequest.prototype.open = function(method, url, async, username, password) {
  console.log("overridden");
  return originalOpen.apply(this, arguments);
}
How can I catch all web requests (AJAX, CSS, JS, etc.) in a Safari extension?
Update: You can check the entire code flow in the first Safari extension I've written, a TimeCamp tracker: https://github.com/qdevro/timecamp.safariextz
I have succeeded in intercepting all AJAX calls (actually it was the responses that interested me, because that's where all the magic happens). At first I couldn't find a way to send them back to my injected script, but this is now fully working - getting the xhr to the injected script:
I've done it like this:
1) In the injected START script, I've added another script (the one which does the interception) into the DOM:
$(document).ready(function(){
  var script = document.createElement('script');
  script.type = 'text/javascript';
  script.src = safari.extension.baseURI + 'path/to/your/script/below.js';
  document.getElementsByTagName('head')[0].appendChild(script);
})
2) The interception code uses this repository as an override of the XMLHttpRequest object, which I've tweaked a little in order to attach the method, URL and sent data to it, so they are easily available when the response gets back.
Basically, I've overridden the open() method of XMLHttpRequest to attach the values I might need in my script, and added the sentData in the send() method as well:
var RealXHROpen = XMLHttpRequest.prototype.open;
...
// Override the open() method of all XHR requests (inside the wire() method)
XMLHttpRequest.prototype.open = function(method, url, async, user, pass) {
  this.method = method;
  this.url = url;
  RealXHROpen.apply(this, arguments);
}
...
// Override the send() method of all XHR requests
XMLHttpRequest.prototype.send = function(sentData) {
  ...
  this.sentData = sentData;
  ...
}
Then, I've added a callback on the response, which gets the modified XMLHttpRequest object when the data comes back and contains everything: url, method, sentData and responseText with the retrieved data:
AjaxInterceptor.addResponseCallback(function(xhr) {
  console.debug("response", xhr);
  // xhr.method - the method used by the call
  // xhr.url - the URL the request was sent to
  // xhr.sentData - all the sent data (if any)
  // xhr.responseText - the data sent back in the response
  // Send the XHR object back to the injected script using DOM events
  var event = new CustomEvent("ajaxResponse", {
    detail: xhr
  });
  window.dispatchEvent(event);
});
AjaxInterceptor.wire();
To send the XHR object from the interception script back to the injected script, I just had to use DOM events, as #Xan suggested (thanks for that):
window.addEventListener("ajaxResponse", function(evt) {
  console.debug('xhr response', evt.detail);
  // do whatever you want with the XHR details ...
}, false);
Some extra hints / workflow optimisations that I've used in my project:
I've cleaned up the GET URLs and moved all the query parameters (? and &) into the sentData property.
I've merged this sentData property where needed (in the send(data) method).
I've added an identifier (a timestamp) when the request is sent, in order to match it later (see the next point).
I've sent a custom "ajaxRequest" event to the injected script in order to optimise load times: I had to request some other data from an external API using CORS, by passing the call and response back and forth to global.html (which can handle CORS), so I didn't have to wait for the original request to come back before sending my API call; I just matched the responses using the timestamp above, as sketched below.
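Here is a rough sketch of that request/response matching, under the assumption that the intercept script attaches the identifier to the XHR before it is dispatched back; requestId, pending and the ajaxRequest payload shape are my own illustrative naming, not part of the original repository:

// Intercept script: keep a reference to the real send() before overriding it
var RealXHRSend = XMLHttpRequest.prototype.send;
XMLHttpRequest.prototype.send = function(sentData) {
  this.sentData = sentData;
  // Timestamp identifier used to pair this request with its response later
  this.requestId = Date.now();
  // Announce the outgoing request immediately, so the injected script can
  // start its own (CORS) API call without waiting for the response
  window.dispatchEvent(new CustomEvent("ajaxRequest", {
    detail: { requestId: this.requestId, method: this.method, url: this.url, sentData: sentData }
  }));
  return RealXHRSend.apply(this, arguments);
};

// Injected script: remember pending requests and match responses by requestId
var pending = {};
window.addEventListener("ajaxRequest", function(evt) {
  pending[evt.detail.requestId] = evt.detail; // kick off the parallel API call here
});
window.addEventListener("ajaxResponse", function(evt) {
  var match = pending[evt.detail.requestId]; // pair the response with its request
  // ... combine match and evt.detail.responseText as needed ...
  delete pending[evt.detail.requestId];
});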

Do I have to use ContinueWith with HttpClient?

When calling REST services using System.Net.Http.HttpClient I have code like
var response = client.GetAsync("api/MyController").Result;
if(response.IsSuccessStatusCode)
...
Is that proper, or should I be doing
client.GetAsync("api/MyController").ContinueWith(task => { var response = task.Result; ... });
It is much safer to do the second. There are a variety of scenarios where the first option can cause a deadlock; the classic one is blocking on .Result from a thread with a synchronization context (a UI thread, or a classic ASP.NET request), because the continuation needs that same thread to complete while the thread is blocked waiting on .Result. Awaiting the call in an async method gives you the same safety as ContinueWith with more readable code.

can't seem to get progress events from node-formidable to send to the correct client over socket.io

So I'm building a multipart form uploader over AJAX on node.js, and sending progress events back to the client over socket.io to show the status of their upload. Everything works just fine until I have multiple clients trying to upload at the same time. Originally, while one upload was in progress and a second one started, the second client would begin receiving progress events from both of the forms being parsed; the original client was not affected and only received progress updates for itself. I tried creating a new formidable form object and storing it in an array along with the socket's session id to fix this, but now the first form stops receiving events while the second form gets processed. Here is my server code:
var http = require('http'),
    formidable = require('formidable'),
    fs = require('fs'),
    io = require('socket.io'),
    mime = require('mime'),
    forms = {};

var server = http.createServer(function (req, res) {
  if (req.url.split("?")[0] == "/upload") {
    console.log("hit upload");
    if (req.method.toLowerCase() === 'post') {
      socket_id = req.url.split("sid=")[1];
      forms[socket_id] = new formidable.IncomingForm();
      form = forms[socket_id];
      form.addListener('progress', function (bytesReceived, bytesExpected) {
        progress = (bytesReceived / bytesExpected * 100).toFixed(0);
        socket.sockets.socket(socket_id).send(progress);
      });
      form.parse(req, function (err, fields, files) {
        file_name = escape(files.upload.name);
        fs.writeFile(file_name, files.upload, 'utf8', function (err) {
          if (err) throw err;
          console.log(file_name);
        })
      });
    }
  }
});
var socket = io.listen(server);
server.listen(8000);
If anyone could be of any help on this I would greatly appreciate it. I've been banging my head against my desk for a few days trying to figure this one out, and would really just like to get it solved so that I can move on. Thank you so much in advance!
Can you try putting console.log(socket_id);
after form = forms[socket_id]; and
after progress = (bytesReceived / bytesExpected * 100).toFixed(0);, please?
I get the feeling that you might have to wrap that socket_id in a closure, like this:
form.addListener(
  'progress',
  (function(socket_id) {
    return function (bytesReceived, bytesExpected) {
      progress = (bytesReceived / bytesExpected * 100).toFixed(0);
      socket.sockets.socket(socket_id).send(progress);
    };
  })(socket_id)
);
The problem is that you aren't declaring socket_id and form with var, so they're actually global.socket_id and global.form rather than local variables of your request handler. Consequently, separate requests step over each other since the callbacks are referring to the globals rather than being proper closures.
rdrey's solution works because it bypasses that problem (though only for socket_id; if you were to change the code in such a way that one of the callbacks referenced form you'd get in trouble). Normally you only need to use his technique if the variable in question is something that changes in the course of executing the outer function (e.g. if you're creating closures within a loop).
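For completeness, here is a minimal sketch of the fix this answer describes: declare the variables with var inside the request handler so each request gets its own copies (only the relevant part of the handler is shown):

if (req.method.toLowerCase() === 'post') {
  // Declaring these with var makes them local to this request,
  // instead of implicit globals shared by every concurrent upload
  var socket_id = req.url.split("sid=")[1];
  var form = new formidable.IncomingForm();
  forms[socket_id] = form;
  form.addListener('progress', function (bytesReceived, bytesExpected) {
    var progress = (bytesReceived / bytesExpected * 100).toFixed(0);
    socket.sockets.socket(socket_id).send(progress);
  });
  form.parse(req, function (err, fields, files) {
    var file_name = escape(files.upload.name);
    // ... same file handling as before ...
  });
}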
