Laravel pages with multiple AJAX requests sometimes log a user out - laravel

I have built a CMS where pages are built from elements. Each element has fields, etc. When editing a page, all elements (with their corresponding fields) are put in an accordion. When an accordion title is clicked, it opens the element (with fields and values). The data is retrieved using Ajax. When the user edits the fields, he clicks Save, which triggers another Ajax call.
What my clients and I are noticing is that if you work fast (open one element, whoops, wrong one, open another, and so on), the user gets logged out. A 401 'Unauthorized' error is returned.
At first I thought this had to do with the CSRF tokens. I sent the token as a _token field and in the headers with the Ajax calls, and when that did not work I added a CSRF token exception for all Ajax calls, but the problem remained.
It looks like when one Ajax request is still being processed and I fire another, I get logged out.
So my question is: what can I do about this? It's very annoying for the 'fast' users among us.
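For reference, the "token in the headers" setup mentioned above usually looks something like the sketch below. This is a minimal sketch: the /elements/{id} endpoint and the payload shape are assumptions, while the meta tag and the X-CSRF-TOKEN header follow the usual Laravel convention.

```javascript
// Sketch of sending the CSRF token in the headers with every Ajax call.
// Assumes the layout exposes the token as
// <meta name="csrf-token" content="{{ csrf_token() }}">, and that an element
// is saved to a hypothetical /elements/{id} endpoint.
const csrfToken = document.querySelector('meta[name="csrf-token"]').content;

function saveElement(id, fields) {
  return fetch('/elements/' + id, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-CSRF-TOKEN': csrfToken,            // the header Laravel's VerifyCsrfToken middleware checks
      'X-Requested-With': 'XMLHttpRequest'
    },
    credentials: 'same-origin',             // make sure the session cookie is sent
    body: JSON.stringify(fields)
  });
}
```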

After reading this issue: https://github.com/laravel/framework/issues/7549 I have set the session driver to database. Now the 'Unauthorized' errors no longer happen with the multiple Ajax requests we do in our CMS.

Related

What is AJAX? How does it work?

I just started studying Ajax and I have no idea what AJAX actually is. What is the difference between an asynchronous and a synchronous request? I would like a very simple example demonstrating the difference.
AJAX, short for Asynchronous JavaScript And XML, is not a programming language but a technique for exchanging data with a server. It typically involves sending HTTP requests from the client to the server and processing the server's response without reloading the entire page. This process is asynchronous. Compared to a synchronous request, which blocks the client until the operation completes, asynchronous requests are more efficient and user-friendly.
Take a very simple example: when you are signing up on a commercial website, you can find out whether your username is available as soon as you finish typing the name. If the username is already taken, the website shows a reminder on the same page that your username is in use. This is an application of AJAX: you don't need to complete the whole form and click the submit button just to learn that your username is not available.
AJAX uses two components to request and display data:
 A browser built-in XMLHttpRequest object (to request data from a web server)
 JavaScript and HTML DOM (to display or use the data)
It begins with an event occurring in a web page, such as a button being clicked. JavaScript then creates an XMLHttpRequest object and sends a request to a web server. Once the web server receives the request, it processes it and sends a response back to the web page. The web page then uses JavaScript to update itself without reloading the whole page.
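A minimal sketch of that flow, reusing the username check from the earlier example; the /check-username endpoint, its plain-text 'taken' response, and the element IDs are assumptions:

```javascript
// 1. An event occurs: the user leaves the username field.
document.querySelector('#username').addEventListener('blur', function (event) {
  const name = event.target.value;

  // 2. JavaScript creates an XMLHttpRequest object and 3. sends the request.
  const xhr = new XMLHttpRequest();
  xhr.open('GET', '/check-username?name=' + encodeURIComponent(name));
  xhr.onload = function () {
    // 4. The server has processed the request and sent a response back.
    // 5. JavaScript updates part of the page without reloading it.
    document.querySelector('#username-hint').textContent =
      xhr.responseText === 'taken' ? 'That username is already in use.' : 'Username available.';
  };
  xhr.send();
});
```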
AJAX stands for Asynchronous JavaScript And XML.
Ajax's main purpose is loading data from the server without refreshing the web page.
It works in the background without blocking the UI.
AJAX allows web pages to be updated asynchronously by exchanging data with a web server behind the scenes. This means that it is possible to update parts of a web page, without reloading the whole page.
It uses the browser's built-in XMLHttpRequest object to request data from a web server.
Example
When you are filling in almost any kind of online form, notice one thing: there are dropdowns for country, state, and district.
The country dropdown is initially filled with data, but the state and district dropdowns are empty.
When you select a country such as India, an asynchronous call goes to the server and fetches the states for the selected country, and so on.
While the AJAX request is fetching the data for the state dropdown, you are still free to work with the other parts of the form.
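A rough sketch of that cascading behaviour; the /states?country=... endpoint (returning a JSON array of state names) and the element IDs are assumptions:

```javascript
const countrySelect = document.querySelector('#country');
const stateSelect = document.querySelector('#state');

countrySelect.addEventListener('change', async function () {
  // Asynchronous call: the rest of the form stays usable while this is in flight.
  const response = await fetch('/states?country=' + encodeURIComponent(countrySelect.value));
  const states = await response.json();

  // Replace the state options with the ones returned for the selected country.
  stateSelect.innerHTML = '';
  for (const name of states) {
    const option = document.createElement('option');
    option.value = name;
    option.textContent = name;
    stateSelect.appendChild(option);
  }
});
```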

Scraping a website made with ICEfaces (session expired on consecutive ajax POST requests)

I'm trying to scrape a website created with the ICEfaces web framework via a node.js script. I've managed to handle the login just fine, and then get some data from the main page (namely ice.session and ice.view, along with the JSESSIONID cookie returned by the login response).
The problem I've run into is when I try to do an AJAX POST request to the /block/ URLs. If I do the request by itself, it returns some data (just not the data I need), but if I do it after any other request, I get <session-expired/> as a result. It doesn't even matter which of the ICEfaces /block/ URLs I send the request to (I've tried with /send-receive-updates, /dispose-views, and even /ping). I've even tried the same request twice in a row just for kicks, and I always get a <session-expired/> response in return on the second one. I've monitored the requests when I browse the page with Chrome, and as far as I know I'm sending all the correct form data (as well as the correct headers). The page works just fine when I load it in the browser, so there must be something I'm not doing right.
Apparently, the order in which you do the requests matters in ICEfaces (i.e. it's not stateless, which kind of makes sense I guess). I just moved the requests around and finally got the response I desired.
IceWindow, IceView and ViewState need to be passed as parameters whenever you do an Ajax submit. The managed bean picks up the previous instance of the current view using the ViewState value.
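As a very rough Node.js sketch of the request shape being discussed; the host, the exact parameter names, and where their values come from are assumptions based on the question, and an ICEfaces page may expect additional form fields:

```javascript
// Placeholders: in the real script these come from the login response cookie
// and from the hidden fields of the previously loaded page.
const jsessionId = 'JSESSIONID-FROM-LOGIN';
const iceSession = 'ICE-SESSION-FROM-MAIN-PAGE';
const iceView = 'ICE-VIEW-FROM-MAIN-PAGE';
const viewState = 'VIEWSTATE-FROM-PREVIOUS-RESPONSE';

async function sendReceiveUpdates() {
  const body = new URLSearchParams({
    'ice.session': iceSession,
    'ice.view': iceView,
    'javax.faces.ViewState': viewState
  });

  const response = await fetch('https://example.com/block/send-receive-updates', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/x-www-form-urlencoded',
      'Cookie': 'JSESSIONID=' + jsessionId
    },
    body: body.toString()
  });

  // A <session-expired/> body here usually means the request order or the
  // view state no longer matches what the server expects.
  return response.text();
}
```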

Crawling AJAX requests

I have an ASP.NET MVC website with drop-down lists; when the user selects an option in the first drop-down list, the other drop-down lists are populated using an AJAX call. Based on the logs, crawlers try to access these AJAX methods as normal GETs, and because of that my app logs errors. I made those AJAX methods non-crawlable, meaning that I return a 404 when the request is not an AJAX call. Is this the best way to do it?
On the other hand, I have a page with multiple steps, meaning that the user fills in a form and then goes to a second step. Every time the user fills in a form I do a POST AJAX request and save the input data. How should I manage this situation?
Add URLs you don't want crawled to robots.txt.
If you offer a link in GET form crawlers will try to crawl it. Returning a 404 is not technically correct - it does work to deter crawlers from indexing the page though!
Consider returning a 500 Internal Server Error or 501 Not Implemented.
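For context, server-side AJAX detection (ASP.NET MVC's Request.IsAjaxRequest(), for instance) usually keys off the X-Requested-With header, which jQuery adds automatically and a crawler never sends; a hand-rolled client call has to set it explicitly, roughly like this (the action URL is a made-up example):

```javascript
// Mark the request as AJAX so the server can tell it apart from a crawler's plain GET.
fetch('/Home/GetStates?country=India', {
  headers: { 'X-Requested-With': 'XMLHttpRequest' }
})
  .then(response => response.json())
  .then(states => console.log(states));
```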

How does ajax form submission work?

I know how to use Ajax to submit a form and all that. What I am concerned about is what actually happens in the background when a form is submitted via Ajax.
How are the values transferred? Encrypted or not? And what is the need of specifying the submission type, I mean GET or POST, if the URL is not showing the form fields?
Edit: Found this on w3schools:
GET requests can be cached
GET requests remain in the browser history
GET requests can be bookmarked
GET requests should never be used when dealing with sensitive data
GET requests have length restrictions
GET requests should be used only to retrieve data
POST requests are never cached
POST requests do not remain in the browser history
POST requests cannot be bookmarked
POST requests have no restrictions on data length
How do these apply to ajax form submission?
Basically, when you Ajax-submit a form, the browser does exactly the same thing as when you submit the form with GET or POST yourself, except that it happens asynchronously via an XMLHttpRequest.
If you submit a form as a GET request, all of the form values are stitched together as a query string and appended to the form's ACTION URL, prefixed by a ?. The data therefore ends up in the URL, where it shows up in browser history and server logs, and over plain HTTP anyone who can intercept the traffic can read it. The POST method sends the form data as a separate block (the request body), and if the URL is HTTPS the form data is encrypted in transit.
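As a small sketch of the difference; the /subscribe endpoint and form ID are made up, and fetch here plays the role of the XMLHttpRequest the browser uses under the hood:

```javascript
const form = document.querySelector('#signup');
const data = new FormData(form);

// GET: text field values are stitched into the URL as ?email=...&name=...
fetch('/subscribe?' + new URLSearchParams(data).toString());

// POST: the same values travel in the request body, separate from the URL.
fetch('/subscribe', { method: 'POST', body: data });
```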
It looks like you are just starting out in the world of web development - welcome to the world of programming. I would recommend reading up on some good web development/programming books (I don't want to promote any particular book here); Amazon may suggest a few good ones under "Web Development"-type search terms.
Also, I suggest that you read up a little on GET vs. POST by googling for it (I can only include one or two links - Google will show you hundreds).
For a clear understanding of the behind-the-scenes details, please refer to the links given below.
http://www.jabet.com/
How does AJAX work?
Actually, an Ajax request is the same as a normal request at the server end.
GET and POST have their own use cases. For example, GET has a limit on how much data can be transferred, depending on the browser (roughly 1 KB to 10 KB of URL), where POST has no such limit.
To the server, AJAX and normal requests are the same, so it is up to the server code which method you wish to support.
Ajax requests are NOT encrypted by themselves; they are only as secure as the connection they travel over (HTTPS).
http://www.w3schools.com/tags/ref_httpmethods.asp
It looks like you want a very detailed answer, so here is how you can find it yourself:
Google it and read the pages thoroughly (Wikipedia, for example)
Read http://www.w3.org/TR/XMLHttpRequest/
Inspect the packets between your browser and the server

AJAX inside IFRAME not working against same server

I'm using a website, abc.com, that is hosting an iframe of a page on 123.com.
The page inside the iframe is doing an AJAX request to another page on 123.com, but we're seeing that the request is getting cancelled.
Unless I'm wrong — and I haven't found any official information on the internet about this — the call should work fine as it is not a cross-domain request.
Would the fact that the parent frame is on a different domain really hinder the iframe from doing AJAX requests to its own server?
The IFRAME should be able to make an ajax request to its own originating site (same source URL). However, make sure the REQUEST event is FIRED from the IFRAME, not the parent.
My first guess would be that you are loading the IFRAME and then addressing it (firing an event) from the parent's JS to get it to do/get/set something that triggers an Ajax call. In short, this is the most likely reason the IFRAME's same-domain request is getting cancelled: the browser still recognizes it as originating from code outside the target domain.
The REQUEST event needs to be organically generated from the user clicking on something in the IFRAME or from code in the IFRAME itself firing the event.
In other words: even though the IFRAME may be able to fire events/Ajax to and from itself in its own JS/code, the parent would normally still not be allowed to drive that Ajax/JS directly via JS/code. The IFRAME has to already be coded to do it based on its load parameters (URL values, perhaps), or the user has to physically click/take action on something to create a user-generated event on that domain.
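A minimal sketch of what that looks like from inside the iframe's own page on 123.com; the /api/data endpoint and element IDs are made up:

```javascript
// This handler lives in the page that 123.com serves inside the iframe, so the
// click and the resulting request both originate from the iframe itself.
document.querySelector('#load-data').addEventListener('click', function () {
  fetch('/api/data', { credentials: 'same-origin' })   // same-origin call from the iframe's own code
    .then(response => response.json())
    .then(data => {
      document.querySelector('#result').textContent = JSON.stringify(data);
    });
});
```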
Of course, this is going to vary a bit by browser and version in terms of what interactivity you might be able to coax between the parent and iframe. But a strict, up-to-date browser will try to keep you from faking insecure interaction with the iFrame via JS.
To get a better answer, you would need to provide more detail on exactly what you are doing/getting.
