MooTools AJAX Request on unload

I'm trying to lock a row in a DB table while a user is editing the entry.
So there's a field lock in the table that I set to 1 on page load with PHP.
Then I was trying to unlock the entry (set it to 0) when the page is unloaded.
This is my approach. It works fine in IE but not in Firefox, Chrome, etc.
The window.onbeforeunload handler works in all browsers, I tested that.
They just don't perform the request,
BUT
if I simply put an alert after req.send(); it works in some browsers, but not in Safari or Chrome. So I tried putting something else after it, just so that there's other work to do after the request, but that doesn't help.
function test() {
    var req = new Request({
        url: 'inc/ajax/unlock_table.php?unlock_table=regswimmer&unlock_id='
    });
    req.send();
    alert('bla'); // ONLY WORKS WITH THIS !?!?!?
}
window.onbeforeunload = test;
I've already tried different ways to do the request, but nothing seems to work. The request itself works, just not in this combination.
Any help would be appreciated!
Thanks

The request is asynchronous by default. This means the browser fires it off and moves on without caring about the completion, which may or may not have time to happen before the page unloads. By placing the alert there you ensure that there is sufficient time for the request to complete.
Basically, you may be better off trying one of these things:
Add async: false to the Request object options. This will ensure the request completes before moving away.
Use an image instead, like a tracking pixel.
Move over to method: "get", which is a bit faster as it does not carry extra headers and cookie info, so it may complete more reliably (revert to this if the synchronous request delays the unload too much).
You can do the image like so (it will also arrive as $_GET):
new Element("img", {
    src: "inc/ajax/unlock_table.php?unlock_table=regswimmer&unlock_id=" + someid + "&seed=" + $random(0, 100000),
    styles: {
        display: "none"
    }
}).inject(document.body);
Finally, use window.addEvent("beforeunload", test); instead of assigning window.onbeforeunload directly, or you may mess up MooTools' internal garbage collection.
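For the first option, a minimal sketch of what the synchronous variant could look like (assuming the same unlock_table.php endpoint and that the entry id is available in a variable, here called someid):
function test() {
    // async: false makes send() block until the server has responded,
    // so the unload cannot interrupt the request
    new Request({
        url: 'inc/ajax/unlock_table.php',
        method: 'get',
        async: false,
        data: { unlock_table: 'regswimmer', unlock_id: someid }
    }).send();
}
window.addEvent('beforeunload', test);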


How to wait for all similar requests in Cypress?

When the start page of the app is loading, it makes a lot of requests to the locize API.
When I try to write a test for it, I need to wait until all of them have finished and only then start to get DOM elements.
I do it like this:
cy.intercept({
    method: "GET",
    url: "https://api.locize.app/**",
}).as("languages");
cy.visit("/");
cy.wait("@languages");
but this code waits only for the 1st request (see the test log).
I tried to do it with cy.get("@languages.all"), but it didn't have any effect.
So the question is: how can I wait for all the requests before going further?
P.S. I'm pretty new to Cypress, so I'd really appreciate any help.
One solution that I found is the library cypress-network-idle:
https://www.npmjs.com/package/cypress-network-idle
Basically, it helped me solve the problem.
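A minimal sketch of how it could be used here (assuming the plugin is installed and imported in the Cypress support file; the 2000 ms idle window is an arbitrary choice):
// cypress/support/e2e.js
import 'cypress-network-idle'

// in the test: wait until the page has made no network calls
// for 2 seconds before touching the DOM
cy.visit('/')
cy.waitForNetworkIdle(2000)
// now it should be safe to start querying DOM elements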
If you know the calls for the language, then you can store them in an array and iterate over it to create unique intercepts and wait on them. However, this would be brittle to changes in your app calls.
const enGBItems = ['Objects', 'Locations', ... ]
Cypress._.forEach(enGBItems, function(item) {
    cy.intercept({
        method: "GET",
        url: `https://api.locize.app/**/en-GB/${item}`,
    }).as(item)
})
cy.visit("/")
Cypress._.forEach(enGBItems, function(item) {
    cy.wait(`@${item}`)
})

How and in what way do content scripts share content scoped variables across different web pages?

There are some key parts of the MDN content script documentation I am having trouble understanding regarding variable scope:
There is only one global scope per frame, per extension. This means that variables from one content script can directly be accessed by another content script, regardless of how the content script was loaded.
This paragraph may also be relevant (my italics) given the questions below:
... you can ask the browser to load a content script whenever the browser loads a page whose URL matches a given pattern.
I assume (testing all this has proved difficult, please bear with me) that this means:
The content script code will be run every time a new page is loaded that matches the URLs provided in manifest.json (in my case "matches": ["<all_urls>"]; see the manifest sketch after this list).
The content script code will run every time a page is refreshed/reloaded.
The content script code will not be run when a user switches between already open tabs (requires listening to window focus or tabs.onActivated events).
The variables initialized on one page via the content script share a global scope with those variables initialized by the same content script on another page.
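For reference, a minimal manifest.json sketch of the kind of declaration assumption 1 refers to (the name and version fields are placeholders; the script file names match the ones below):
{
  "manifest_version": 2,
  "name": "example-extension",
  "version": "1.0",
  "background": {
    "scripts": ["background-script.js"]
  },
  "content_scripts": [
    {
      "matches": ["<all_urls>"],
      "js": ["content-script.js"]
    }
  ]
}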
Given the following code,
background-script.js:
let contentPort = null

async function connectToActiveTab () {
    let tab = await browser.tabs.query({ active: true, currentWindow: true })
    tab = tab[0]
    const tabName = `tab-${tab.id}`
    if (contentPort) {
        contentPort.disconnect()
    }
    contentPort = await browser.tabs.connect(tab.id, { name: tabName })
}

// listening to onCreated probably not needed so long as the content script reliably sends the 'connectToActiveTab' message
browser.tabs.onCreated.addListener(connectToActiveTab)
browser.runtime.onMessage.addListener(connectToActiveTab)
content-script.js:
let contentPort = null

function connectionHandler (port, info) {
    if (contentPort && contentPort.name !== port.name) {
        // if content port doesn't match port we have changed tabs/windows
        contentPort.disconnect()
    }
    if (!contentPort) {
        contentPort = port
        contentPort.onMessage.addListener(messageHandler)
    }
}

async function main () {
    // should always be true when a new page opens since content script code is run on each new page, testing has shown inconsistent results
    if (!contentPort) {
        await browser.runtime.sendMessage({ type: 'connectToActiveTab' })
    }
}

browser.runtime.onConnect.addListener(connectionHandler)
main()
And from assumptions 1 and 4, I also assume:
The existing contentPort (if defined) will fire a disconnect event (which I could handle in the background script) and be replaced by a new connection to the currently active tab each time a new tab is opened.
The behaviour I have seen in Firefox while testing has so far been a bit erratic and I think I may be doing some things wrong. So now, here are the specific questions:
Are all of my 5 assumptions true? If not, which ones are not?
Is firing the disconnect() event unnecessary? That is, should I rely on Firefox to properly clean up and close existing connections without manually firing a disconnect event once the original contentPort variable is overwritten? (the code here would suggest otherwise)
Are the connect() methods synchronous, thus negating the need for await and async functions in the example code?
The tabs.connect() examples don't use await, but neither the MDN runtime nor connect docs explicitly say whether the methods are synchronous or not.
Thanks for taking the time to go through these deep-dive questions regarding content script behaviour; my hope is that clear and concise answers to these could perhaps be added to the SO extension FAQ pages/knowledge base.

Safari doesn't cache resources across different domains

Let’s say we have several different websites: website1.com, website2.com, website3.com. We use jQuery on all of them and include it from CDN like googleapis.com. The expected behavior from a browser would be to cache it once and use it for all other websites. Chrome seems to do it, but Safari downloads jQuery for every domain.
Example
With the given JS code below open nytimes.com, bbc.com and dw.de in Chrome.
Append jQuery on the first website and look at the Network tab of your DevTools. It will say that it got jQuery.
Now open any other website and append jQuery again — the answer will be “from cache”.
However, Safari will say it's loading jQuery for every domain, but try to open any webpage on one of the domains and append the script again, and you will see that now it says it got jQuery from cache. So it looks like Safari caches data per domain, even if it has already downloaded a resource from the exact same URL for another domain.
Is this assumption correct and if so, how to fix it?
Code you can copy/paste:
setTimeout(function() {
    var SCRIPT_SRC = '//ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js';
    var s = document.createElement('script');
    s.type = 'text/javascript';
    s.async = true;
    s.src = SCRIPT_SRC;
    var x = document.getElementsByTagName('script')[0];
    x.parentNode.insertBefore(s, x);
}, 0);
UPD: Tested it with a static image.
test.com, test2.com and test3.com have <img src="http://image.com/image.jpg" />. In all browsers except Safari, the access log shows only one (the first) request for the image. Safari gets the image for every new domain (but not for a subdomain).
I've noticed this too, and I suspect it is for privacy reasons.
By default, Safari blocks third-party cookies. A third-party cookie is a cookie set by b.com for a resource that is requested by a.com. This can be used, for example, to track people across domains. You can have a script on b.com that is requested by a.com and by c.com. b.com can insert a unique client ID into this script based on a third-party cookie, so that a.com and c.com can track that this is the same person.
Safari blocks this behavior. If b.com sets a cookie for a resource requested by a.com, Safari will box that cookie so it is only sent to b.com for more requests by a.com. It will not be sent to b.com for requests by c.com.
Now enter caching, and specifically the Etag header. An Etag is an arbitrary string (usually a hash of the file) that can be used to determine whether the requested resource has changed since the person last requested it. This is normally a good thing. It saves re-sending the entire file if it has not changed.
However, because an Etag is an arbitrary string, b.com can set it to include a client ID. This is called Etag tracking. It allows tracking a person across domains in almost exactly the same way as cookies do.
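If the cache were shared across sites, the exchange enabling this could look like the following illustrative sketch (hypothetical domains, paths and values):
GET /tracker.js HTTP/1.1      (request from a page on a.com)
Host: b.com

HTTP/1.1 200 OK
ETag: "user-8f3a2c"           (unique per visitor rather than per file version)

GET /tracker.js HTTP/1.1      (later, from a page on c.com)
Host: b.com
If-None-Match: "user-8f3a2c"  (the browser echoes the Etag back, identifying the visitor)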
Summary: By not sharing the cache across domains, Safari protects you from cross-domain Etag tracking.
This is by design, something the Safari team calls Intelligent Tracking Prevention - https://webkit.org/blog/7675/intelligent-tracking-prevention/ - and the cache is double-keyed based on document origin and third-party origin.
Based on research using HTTP Archive data and the Yahoo / Facebook studies on cache lifetimes, I doubt shared caching of jQuery etc. is effective: not enough sites use the same versions of the libraries, and the libraries don't live in cache for very long. So Safari's behaviour helps prevent tracking while not really affecting performance.
Rather than simply adding a DOM element, you could try using XMLHttpRequest. It lets you define custom headers, one of which is Cache-Control.
Give this a shot; it should override whatever's going on at the browser level:
(function () {
    var newRequest = function () {
        return (window.XMLHttpRequest) ? new XMLHttpRequest() : new ActiveXObject('MsXml2.XmlHttp');
    };

    var loadScript = function (url) {
        var http = newRequest();
        http.onreadystatechange = function () {
            if (http.readyState === 4) {
                if (http.status === 200 || http.status === 304) {
                    appendToPage(http.responseText);
                }
            }
        };
        http.open('GET', url, true);
        // This is where you set your cache
        http.setRequestHeader('Cache-Control', 'max-age=0'); // <-- change this to a value larger than 0
        http.send(null);
    };

    var appendToPage = function (source) {
        if (source === null) return false;
        var head = document.getElementsByTagName('head')[0];
        var script = document.createElement('script');
        script.language = 'javascript';
        script.type = 'text/javascript';
        script.defer = true;
        script.text = source;
        head.appendChild(script);
    };

    loadScript('//ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js');
})();
Note: Safari has had some issues with caching in the past. However, from what I understand it was mostly about serving stale content -- not the other way around.
Here are some suggestions:
Have you checked that the "disable cache" option is disabled?
Are you looking at the HTTP status code in the network panel of the dev tools?
Have you tried capturing the traffic with a tool like Wireshark?
Best regards.

XMLHttpRequest.send(Int8Array) POST fails in Firefox only

I'm trying to POST data (a chunk of a file) using an XMLHttpRequest object with an Int8Array as the data, but it fails in FF18 while working perfectly in IE 10 & Chrome.
Here's my JS:
// dataObj is an Int8Array with approx. 33,000 items
var oReq = new XMLHttpRequest();
oReq.open("POST", "Ajax/PostChunk");
oReq.onload = function (oEvent) {
    //
};
oReq.send(dataObj);
I use Firebug in Firefox to debug my JS, and when I watch the activity under the Net tab, nothing ever shows up for this XHR call, as if it was never made.
Also, prior to this call, I call jQuery's .ajax() method for "Ajax/PostChunkSize" and that works fine in all browsers, although it doesn't use an Int8Array for its data. I can't use .ajax() for this since .ajax() doesn't support Int8Array objects, as far as I know.
Does anyone know why Firefox doesn't even attempt to send this? Any questions, please ask.
Thanks in advance.
The ability to send a typed array (as opposed to an arraybuffer) is a recent addition to the in-flux XMLHttpRequest2 spec. It'll be supported in Firefox 20 in April or so (see https://bugzilla.mozilla.org/show_bug.cgi?id=819741 ) but in the meantime if your Int8Array covers its entire buffer, doing send(dataObj.buffer) should work...
Note that per the old spec the code above should have sent a string that looks something like "[object Int8Array]" instead of throwing; you may want to check to make sure that other browsers really are sending the array data and not that string.
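A minimal sketch of that workaround (assuming, as in the question, that dataObj spans its entire underlying buffer):
var oReq = new XMLHttpRequest();
oReq.open("POST", "Ajax/PostChunk");
oReq.onload = function (oEvent) {
    //
};
// send the underlying ArrayBuffer instead of the Int8Array view;
// older Firefox versions accept an ArrayBuffer but not a typed-array view
oReq.send(dataObj.buffer);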

IE7 not digesting JSON: "parse error"

While trying to GET a JSON, my callback function is NOT firing.
$.ajax({
    type: "GET",
    dataType: 'json',
    url: myLocalURL,
    data: myData,
    success: function(returned_data) {
        alert('success');
    }
});
The strangest part of this is that:
my JSON(s) validates on JSONlint
this ONLY fails in IE7... it works in Safari, Chrome, and all versions of Firefox (and even in IE8). If I use an 'error' handler, it reports "parseError"... even though the JSON validates!
Is there anything I'm missing? Does IE7 not process certain characters or data structures (my data doesn't contain anything non-alphanumeric, but it DOES have nested JSON objects)? I have used tons of other AJAX calls that all work (even in IE7), with the exception of THIS call.
An example data return here is: (this is a structurally-complete example, meaning it is only missing a few second-tier fields, but follows this exact hierarchy)
{"question":{
"question_id":"19",
"question_text":"testing",
"other_crap":"none"
},
"timestamp":{
"response":"answer",
"response_text":"the text here"
}
}
I am completely at a loss. Hopefully someone has some insight into what's going on...thank you!
EDIT
Here's a copy of the SIMPLEST case of dummy data that I'm using...it still doesn't work in IE7.
{
    "question": {
        "question_id": "20",
        "question_text": "testing :",
        "adverse_party": "none",
        "juris": "California",
        "recipients": "Carl Chan"
    }
}
I am starting to doubt that it is a JSON issue... but I have NO idea what else it could be. Here are some other resources I've found that might point to the cause, but their suggestions don't seem to work either:
http://firelitdesign.blogspot.com/2009/07/jquerys-getjson.html (Django uses Unicode by default, so I don't think this is causing it)
Anybody have any other ideas?
The example data you present looks all right, but my strong suspicion is still that there is a trailing comma somewhere, like this:
"timestamp": {
    "response": "answer",
    "response_text": "the text here"
}, <------------
}
IE is the only browser that (correctly) trips over this.
If this is not it, can you show a full data sample (or confirm that the example you show is indeed a full sample)?
Did you already exclude the possibility of a caching issue?
E.g. you tested with IE7 while myLocalURL returned invalid JSON; IE7 still caches that response and thus it doesn't work now. Try adding something like this (e.g. in PHP) to myLocalURL, or make the URL look like myLocalURL?random=123 just for testing, to make sure it isn't a caching thing:
header("Cache-Control: no-cache, must-revalidate");
header("Expires: 0");
Are you returning a correct Content-Type header? E.g.
header("Content-Type: application/json");
I've just encountered exactly the same issue. It turns out that IE7 fails to parse JSON responses that have leading \r\n line feeds in the response body. Your fix of removing {% load customfilter %} works because you removed the newline that was being included after this tag.
An alternative fix would be to just remove the newline, to get:
{% load customfilter %}{
    "question": {
        "question_id": "{{question.id}}",
        "question_text": "{{question.question_text|customfilterhere}}"
    }
}
