Mobile device doesn't send cookie with AJAX request

For some reason, mobile authentication detection stopped working. When I open a page containing a form, the form calls the server (via AJAX) for additional information to fill the fields, and the server's response depends on whether the user is authenticated. This worked well for nine months, but it stopped working for mobile devices; it still works when the page is accessed from a desktop browser. Mobile and desktop use exactly the same access route, arguments, URLs, view function ... everything. The one difference I noticed (after some 8 hours on this issue), when I looked at request.META, is that the desktop sends cookie information and the mobile doesn't, even though cookies are enabled in both the mobile and the desktop browser. I tested this with mobile Chrome and a freshly installed Firefox on mobile; the result is exactly the same.
How is this possible? Why does the form send the cookie information when it initializes on desktop, but not when it initializes on mobile?
Here is a previous version of this question (written before I noticed the cookie difference, when I thought it was an API issue), containing technical details that I don't consider relevant to the specific question of why the same functionality behaves differently on mobile and desktop.
EDIT: I really don't know what code to show, but the AJAX request is probably relevant. I am really lost here. This is the React service I use to load the form initials; it is the request that arrives at the server without cookie information when sent from a mobile device, and with cookie information when sent from a desktop device. I am not aware of having added or changed anything here in the past nine months; it has always worked just like this:
loadFormInitials = (typ, genderRelevant, auth) => {
    $.ajax({
        url: UserConstants.BASE_URL + 'elements/?item=setup&auth=' + auth,
        method: 'GET',
        dataType: 'json',
        success: (result) => {
            // parse the JSON string returned by the server and hand it to the form store
            var data = JSON.parse(result.initial_data);
            FormActions.loadFormInitialsAction(data, typ, genderRelevant);
        }
    });
}

Mobile carriers sometimes alter HTTP requests with their heavy caching.
You'd probably want to try again over HTTPS; I'm not sure it will fix it, though.
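Another thing worth checking (an assumption, since the question doesn't say whether UserConstants.BASE_URL is on the same origin as the page): if the API lives on a different origin, browsers only attach cookies to the XHR when withCredentials is set and the server replies with Access-Control-Allow-Credentials: true. A minimal sketch of the same call with that flag:

loadFormInitials = (typ, genderRelevant, auth) => {
    $.ajax({
        url: UserConstants.BASE_URL + 'elements/?item=setup&auth=' + auth,
        method: 'GET',
        dataType: 'json',
        // only needed for cross-origin requests; has no effect on same-origin calls
        xhrFields: { withCredentials: true },
        success: (result) => {
            var data = JSON.parse(result.initial_data);
            FormActions.loadFormInitialsAction(data, typ, genderRelevant);
        },
        error: (xhr) => {
            console.error('Form initials request failed with status', xhr.status);
        }
    });
};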

Related

Safari on iOS 11.4 not sending cookies on POST requests

I've been using the same code for years and things have been working very well on every browser so far:
$.ajax({
    url: '/test.php',
    data: parameters,
    dataType: 'html',
    type: 'post'
})
However, since my users started to upgrade to iOS 11.4, those who use the Private mode of Safari are having an issue with "being disconnected" from my website.
This is because ajax POST requests aren't sending any Cookies, apparently. A whole new session is created as part of that POST request (new cookies created, etc).
However, that doesn't affect anything on the GET requests themselves. They stay with the same cookies (and therefore same session), even though new cookies were set as part of the ajax POST request. It's like those ajax POST requests are "sandboxed", not affecting anything else.
Any ideas? Is this a bug in Safari on iOS 11.4?
We worked through a similar issue and figured out that it was actually related to Service Workers. From what I can tell, it's a bug in the latest version of Safari's implementation of Service Workers that causes some cookies not to be passed on POST in Private mode. For now, we have disabled our Service Worker and site functionality has returned to normal for those users.
Not sure if this is your problem or not, but it sounds very similar. Hope this helps!
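For reference, "disabling the Service Worker" on the client side usually amounts to no longer registering it and unregistering any workers already installed for returning visitors. A minimal sketch (where and how the worker was registered is an assumption about the affected site):

// Sketch: remove any previously registered service workers so new requests
// go straight to the network instead of through the worker.
if ('serviceWorker' in navigator) {
    navigator.serviceWorker.getRegistrations().then(function (registrations) {
        registrations.forEach(function (registration) {
            registration.unregister();
        });
    });
}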

XHR Get Request not passing Google Chromestore Checks

We've built a very lightweight, simple Chrome extension for our customers. It's private and won't be made publicly available, but due to restrictions by Google it now needs to at least exist privately on their store.
The idea of the extension is to automatically track cashback opportunities on our clients' computers. It does this by checking each new URL they visit, just once, against our API. If it's not a shopping site we know (e.g. Facebook or Google), storage makes sure that URL is never checked again; if it is, it's checked only once in a 24-hour period and the URL returned by the API is visited with an AJAX GET request.
The GET request loads the URL via the cashback site, so it sets the correct cookies for the user automatically, in the background, without disrupting the user's browsing session.
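As a rough sketch of that flow (checkCashbackApi is a placeholder for the API call, and kango.storage.getItem is assumed to mirror the setItem call in the snippet further down):

// Sketch only: check a domain at most once per 24 hours, and never again
// if the API says it isn't a shop we track.
function maybeTrack(url) {
    var domain = new URL(url).hostname;
    var nextCheck = kango.storage.getItem(domain);   // expiry timestamp from a previous check
    if (nextCheck && Date.now() < nextCheck) {
        return;                                      // checked recently (or marked "never check again")
    }
    checkCashbackApi(domain, function (trackingUrl) {
        if (trackingUrl) {
            // visit the tracking URL in the background so the cashback cookies get set
            kango.xhr.send({ method: 'GET', url: trackingUrl, async: true, contentType: 'text' }, function () {});
            kango.storage.setItem(domain, Date.now() + 24 * 60 * 60 * 1000);
        } else {
            kango.storage.setItem(domain, Number.MAX_SAFE_INTEGER);   // not a shop: never check again
        }
    });
}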
The extension, whilst accepted by Apple and Mozilla for Safari and Firefox, is being rejected by Google, and it appears to be due to our XHR request. They sent the same generic rejection message each time, and it looks like it is being rejected by an automated check: what seems to be flagged is that the GET request could be requesting external JavaScript (e.g. malicious code) from third parties, and of course all code (quite rightly) needs to be within the extension itself.
They provide examples of using cross-origin calls on their site here https://developer.chrome.com/extensions/xhr
As we only need to set the cookies from the URL we visit, would there be any way to filter the GET request to abide by their security rules, so that instead of downloading everything from the URL we block external JavaScript libraries and the like?
This is well beyond my realm of coding, so I'm more interested in whether anyone can think of a way we COULD do this that would pass Google's checks. Ultimately the AJAX request just loads the website they are already on, but as it goes via a tracking company to set the cookie it will of course follow a couple of simple redirects first, usually to set the session ID. Whether it's possible for us to still use this data while making it pass Google's checks, I'm not sure.
Here is a snippet of the current XHR-related code:
kango.xhr.send(details, function(data) {
    if (data.status == 200 && data.response != null) {
        var text = data.response;
        console.log(window.location.href);
        kango.console.log("THE RESPONSE URL: " + text);
        var affilDetails = {
            method: 'GET',
            url: text,
            async: true,
            contentType: 'text'
        };
and
kango.xhr.send(affilDetails, function (afilData) {
    console.log(thisDomain + " expire updated to " + new Date(expireshort));
    if (afilData.status == 200 && afilData.response != null) {
        kango.storage.setItem(thisDomain, expireshort);
        console.log("RESPONSE URL HAS BEEN VISITED VIA AJAX.");
    }
});

How do I secure my OPEN APIs?

I have an API endpoint (built with Django Rest Framework), e.g. domain.com/api/fetch_all?start=0&end=50, which fetches results from the database in a paginated manner.
Now I'm presenting this information on a webpage. It's more or less like an open forum where everyone can read the data but only some can write. I render this data on the webpage via an AJAX request to the above endpoint, e.g.:
$.ajax({
    type: 'get',
    contentType: 'application/json',
    url: 'domain.com/api/fetch_all?start=0&end=50',
    cache: true,
    dataType: 'json',
    success: function(data) {
        // presenting the information when the page loads.
    }
});
So, my question is: how can I secure my APIs so that no robots can access the data I'm presenting on my forum? For example, if any code/script tries to access my APIs, it should get a 403 Forbidden error:
import requests
# this should return 403 error
response = requests.get('domain.com/api/fetch_all?start=0&end=50')
However, if I request this data via an AJAX call from the browser, it should return the data. How can I tell whether the request is coming from a browser (driven by a human) or from a robot?
PS: I cannot add OAuth functionality here, since I don't have a login form.
It's not possible to restrict requesters in this way, because a robot could always add headers to spoof being a browser. Anything you do on your client can be copied by an attacker. Without requiring auth, the best you can do is rate limiting - track requests on a per-client basis, and only allow a certain number of requests per time unit.
A partially-functional solution would be to look at the User-Agent header. That should include browser information, and might let you knock out some robots, but not all or even most of them.
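The OP's backend is Django Rest Framework, which ships its own throttling classes, but the counting idea behind rate limiting is simple. A conceptual sketch of a fixed-window limiter (the window size, limit, and client key are all placeholders; in the OP's stack this logic would live server-side):

// Conceptual sketch only: allow each client a fixed number of requests per time window.
var WINDOW_MS = 60 * 1000;    // one-minute window
var MAX_REQUESTS = 30;        // allowed requests per client per window
var counters = new Map();     // clientKey (e.g. IP address) -> { windowStart, count }

function allowRequest(clientKey) {
    var now = Date.now();
    var entry = counters.get(clientKey);
    if (!entry || now - entry.windowStart >= WINDOW_MS) {
        counters.set(clientKey, { windowStart: now, count: 1 });
        return true;
    }
    entry.count += 1;
    return entry.count <= MAX_REQUESTS;   // over the limit -> answer with 429 or 403
}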

Chrome(driver) basic authentication .... again

This query is mainly due to trying to do this with Selenium, but I see exactly the same behaviour if I repeat it manually, so I guess it's a general Chrome question.
So what I'm trying to do is use Chrome with some Selenium tests. Tests happen on a remote machine running 64 bit Ubuntu Linux (running Selenium Server) and are driven from my machine running 64 bit W7 Pro. Scripting is done in Python. Chrome is up to date, Selenium Server is up to date, as is Chromedriver.
The site I'm working on (still in development) uses a lot of AJAX/jQuery calls. It uses basic authentication to log you in.
Using Chrome, if I pass the login credentials in the URL (as you seem to have to with Selenium) it gets me onto the site OK. Page navigation works OK. But AJAX requests fail, as the basic authentication credentials are not added to the request header. If I log in via the standard URL (manually entering ID + PW) the AJAX requests work OK. I see the same behaviour on Linux and Windows if I try it manually. Using Firefox it all works OK: the AJAX requests have the authentication header as they're supposed to, regardless of how you authenticate, and credentials are carried through correctly throughout. I've checked all the requests using Fiddler and can see the missing header on the Chrome AJAX requests when passing the credentials via the URL.
I did try and use the popup login box instead, but that appears to be a non-starter. Selenium hangs on the initial GET, and until you clear the popup, control is not passed back to the script. So I have no way of sending keys to it. I also tried navigating by using window.location.href = "url" directly, instead of the selenium "get". No luck that way either. And finally, if I reduce the page load timeout, wait for it to fail, and then try and pick up the popup, that doesn't work either. When it times out, the popup is removed.
At this point I've just about given up. I can't use user profiles, as it's a daily-changing password (work thing), so there's no point in storing it.
I'm not the developer. I don't know JavaScript terribly well. I've spoken to the lead dev and their response is that this is a Chrome bug and nothing they can fix.
Does anyone concur? Or have a way round this? I'm snookered at the moment because of it ...
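One workaround sometimes used here (a sketch only; it assumes you can inject script into the page under test, for example with execute_script from the Python bindings before the AJAX-heavy steps): set the Authorization header on the page's jQuery AJAX calls explicitly, since the observation above is that Chrome does not carry URL-embedded credentials onto the XHRs.

// Sketch: attach the Basic auth header to every jQuery AJAX request on the page.
// USERNAME and PASSWORD are placeholders for the daily credentials.
$.ajaxSetup({
    beforeSend: function (xhr) {
        xhr.setRequestHeader('Authorization', 'Basic ' + btoa('USERNAME:PASSWORD'));
    }
});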
If you are facing Basic authentication issues, try the authenticateUsing() method.
The Alert method authenticateUsing() lets you get past the HTTP Basic Authentication box.
WebDriverWait wait = new WebDriverWait(driver, 10);
Alert alert = wait.until(ExpectedConditions.alertIsPresent());
alert.authenticateUsing(new UserAndPassword("USERNAME", "PASSWORD"));
PS: Adjust the syntax accordingly for the Python bindings.

Multiple AJAX posts to Web Service that returns value. Should I synchronize?

My website sends AJAX POSTs to my web service; the web service returns JSON, and that JSON is then used in the success callback.
Now, here's my problem: when my website sends multiple AJAX POSTs, the JSON responses the web service returns seem to get jumbled up.
I'm thinking of synchronizing either the AJAX posts or the (REST) web service. I've read that you don't need to synchronize REST services; is that true? If synchronizing is the solution, where should I synchronize? Will async: false in the AJAX calls synchronize the posts?
Thanks.
I thought I should include some code:
Web Service:
@POST
@Override
@Produces("application/json")
@Consumes("application/json")
public Customer create(Customer cust) {
    custManager.save(cust);
    return custManager.getCust(custManager.getCount());
}
AJAX:
$.ajax({
    url: custURL,
    type: 'POST',
    data: JSON.stringify(sdata),
    dataType: 'json',
    contentType: 'application/json; charset=utf-8',
    success: function(json) {
        var cust = JSON.parse(JSON.stringify(json));
        var newId = cust.id;
        updateCustId(oldId, newId);
    }
});
What it does is POST the customer's data to the web service; the web service assigns an id and returns it to the client, which then updates its customer id. That's not all: I'm also updating other tables that reference the old customer id so they point at the new id generated by the server, and after they are updated those tables are also sent to the web service. The tables I'm talking about are Web SQL tables, and their queries also run asynchronously. So the end result is that sometimes the ids get jumbled up and the other tables are sent to the web service still carrying the old customer id (I've used callbacks and made sure the id has to be updated before anything is sent to the server).
Making my AJAX calls async: false, so they run synchronously and don't get jumbled together, works.
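For reference, the ordering described above can usually be kept without blocking the browser by chaining the dependent steps on the promise that $.ajax returns; updateCustId and sendOtherTables below stand in for the OP's own functions and are assumptions:

// Sketch only: run the id update and the follow-up sync strictly after the POST
// resolves, instead of forcing the whole call to be synchronous.
$.ajax({
    url: custURL,
    type: 'POST',
    data: JSON.stringify(sdata),
    dataType: 'json',
    contentType: 'application/json; charset=utf-8'
}).then(function (cust) {
    updateCustId(oldId, cust.id);       // placeholder: the existing id update
    return sendOtherTables(cust.id);    // placeholder: should itself return a promise
}).fail(function (xhr) {
    console.error('Customer create failed with status', xhr.status);
});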
I don't think your problem is asynchrony, specifically. It sounds like you have some bug in your web-service implementation. What you describe should work fine.
Multiple Ajax requests are completely independent. Back end servers should treat each request separately.
You may have a problem further back, with data shared between threads in your web application, causing problems with simultaneous access to the same data. If so, that is the point to fix, and (in the worst case) put a mutex around just the small bit of code that is doing the cross-contamination.
If you find that data is leaving the server okay, but it is corrupted in your application, then the bug lies there.
I would strongly advise against trying to mask what sounds like a significant back-end bug by making a whole chunk of your app synchronous. Fix the underlying corruption issue, if that issue is caused by something that just cannot be made thread safe, then lock just that tiny bit.
Your question about REST is a bit irrelevant. REST services vs other architectures don't change whether code needs to be synchronous or not. When you use Ajax, you are using the HTTP protocol, which is intended to be asynchronous and stateless. A web-application of any style that requires synchronous access by clients is a big red flag of something wrong, in my view.
Beyond that general advice, you'll need to ask specific code-based questions for more detail.
