remix-auth-socials - HTTP 400 {"message": "Missing state on session."} in production; on localhost it works - remix.run

I am trying to get Google auth working in my first remix.run app via https://www.npmjs.com/package/remix-auth-socials
On localhost it works nicely; however, on the production server it crashes on the redirect callback:
URL: [my correct redirect URI]
Response HTTP code: 400 Bad Request
Response body: {message: "Missing state on session."}
My configuration is based on the README of the npm package:
// ~/services/session.server
import { createCookieSessionStorage } from "@remix-run/node";

export const sessionStorage = createCookieSessionStorage({
  cookie: {
    ...
    secure: true,
  },
});
...
I found in the framework code that it crashes with this message when it fails to read the session from the Remix sessionStorage.
Does anybody know what can cause this?

https://github.com/sergiodxa/remix-auth/discussions/156#discussioncomment-2536785
This is a common issue if you're coming from one of the Remix templates where the cookie's maxAge is set to 0 (useful for implementing a 'remember me' feature). What this means, however, is that the browser immediately sees the cookie as stale. Either remove that property or update it to a longer time frame. I use maxAge: 60 * 60 * 24 * 30, // 30 days

https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Set-Cookie#attributes
Secure (optional)
Indicates that the cookie is sent to the server only when a request is made with the https: scheme (except on localhost), and is therefore more resistant to man-in-the-middle attacks.
This was it: I am not running on HTTPS, so the solution was to set:
cookie.secure: false
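For reference, here is a minimal sketch of a session storage that ties secure to the deployment scheme instead of hard-coding it, and that avoids the maxAge: 0 pitfall quoted above. The USE_HTTPS flag is hypothetical; use whatever signal your host provides:

// ~/services/session.server — a sketch, not the package's official example
import { createCookieSessionStorage } from "@remix-run/node";

export const sessionStorage = createCookieSessionStorage({
  cookie: {
    name: "_session",
    path: "/",
    httpOnly: true,
    sameSite: "lax",
    secrets: [process.env.SESSION_SECRET],
    // Only mark the cookie Secure when the site is really served over HTTPS;
    // on a plain-HTTP host the browser drops a Secure cookie, so the OAuth
    // state never survives the redirect back from Google.
    secure: process.env.USE_HTTPS === "true", // hypothetical flag
    // Give the cookie a real lifetime; maxAge: 0 makes it immediately stale.
    maxAge: 60 * 60 * 24 * 30, // 30 days
  },
});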

Related

Laravel/Vue - Sessions on firefox not working correctly (CSRF token mismatch, session data is null, etc)

I'm working on a project with Laravel and inline Vue*. For submitting forms we use axios calls.
On Chrome there is no issue, but in Firefox every axios call returns a 419 error 'CSRF token mismatched'.
I googled this issue, and some say to add the CSRF token to the header. I did that (see below), but it does not work.
axios.defaults.headers.common = {
  'X-Requested-With': 'XMLHttpRequest',
  'X-CSRF-TOKEN': document.querySelector('meta[name="csrf-token"]').getAttribute('content'),
  'X-XSRF-TOKEN': decodeURIComponent(this.readCookie('XSRF-TOKEN')),
};
After trying lots of different answers, I added '*' to the $except array in VerifyCsrfToken.php (= all my routes will skip the CSRF token; not good practice, but I did this for debugging).
The form will submit now. But in the form handling (controller.php) I save some data in the session. In Chrome I get the data I need; in Firefox this session data is null. So I guess the CSRF token stored in the user session is null in Firefox. How do I fix this? How do I solve the issue with user sessions? I also found out that when I send a request with Firefox, it creates an entirely new session file in storage/framework/sessions, always with a different _token.
I use this in myController.php, to store the data I need in the session:
$request->session()->put('formData', json_encode($data));
$request->session()->save();
TIA!
In .env
App_url=www.site.be
session_domain=
App_name=site
In session.php:
'driver' => 'file',
'lifetime' => 120,
'encrypt' => false,
'secure' => false,
'http_only' => true,
*: We use inline Vue scripts in the blade.php files to submit data.
I tried:
Adding the CSRF token to the header
Calling $request->session()->save()
Changing .env variables around, and in session.php (values above are current)
Changing app_name so it will not contain a '.' or '_'
Adding _token="{{ csrf_token() }}" to the form data in the axios call
Adding "content-type": "application/json" to the axios call
Running php artisan optimize:clear
Checking permissions on the server
I expected these to make the site work, but alas, they do not.
I asked this question on r/laravel and it is solved:
It looks like Firefox isn't sending the session cookie, which will cause all the problems you described as a result. Use the browser developer tools to inspect requests to confirm this. Then, try to configure Axios to always send cookies, something like withCredentials: true.
So I added axios.defaults.withCredentials = true; and it works! I will keep this here so others might learn from my mistakes.
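For completeness, a minimal sketch of the axios bootstrap with that fix applied (the exact file, e.g. resources/js/app.js, depends on your setup):

import axios from 'axios';

// Send cookies (laravel_session, XSRF-TOKEN) with every request. Without this,
// Firefox may omit them on XHRs, so Laravel starts a fresh session each time,
// which is why the stored form data and the CSRF token kept coming back null.
axios.defaults.withCredentials = true;

axios.defaults.headers.common = {
  'X-Requested-With': 'XMLHttpRequest',
  'X-CSRF-TOKEN': document.querySelector('meta[name="csrf-token"]').getAttribute('content'),
};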

"Not allowed to request resource" in Safari and "Blocked loading mixed active content" in Firefox. Perfect functionality in Chrome

I am working on an app using a React frontend and Express backend, with GraphQL setup through Apollo (I am following and modifying tutorial https://www.youtube.com/playlist?list=PLN3n1USn4xlkdRlq3VZ1sT6SGW0-yajjL)
I am currently attempting deployment with Heroku. Everything functions perfectly on my local machine before deployment, and on Heroku in Google Chrome. However, I get the aforementioned errors in Safari and Firefox, respectively. I am wondering why this is happening in these browsers and how to fix it.
I have spent about 10 hours researching this. Things I tried that made no difference:
I tried adding CORS to my express backend
I tried serving the graphql endpoint as HTTPS
Moving app.use(express.static) in main app.js server file
I couldn't find many other things to try. Everywhere I looked seemed to say that CORS fixed the problem, but mine persists.
Github link: https://github.com/LucaProvencal/thedrumroom
Live Heroku App: https://powerful-shore-83650.herokuapp.com/
App.js (express backend):
const cors = require('cors')
// const fs = require('fs')
// const https = require('https')
// const http = require('http')
app.use(express.static(path.join(__dirname, 'client/build')));
app.use(cors('*')); // NEXT TRY app.use(cors('/login')) etc...
app.use(cors('/*'));
app.use(cors('/'));
app.use(cors('/register'));
app.use(cors('/login'));

app.get('/login', (req, res) => {
  res.sendFile(path.join(__dirname, "client", "build", "index.html"));
});
app.get('/register', (req, res) => {
  res.sendFile(path.join(__dirname, "client", "build", "index.html"));
});

server.applyMiddleware({ app }); // app is the existing express app; lets Apollo Server share the same listen call

const portVar = (process.env.PORT || 3001); // Heroku assigns PORT; fall back to 3001 locally
models.sequelize.sync(/*{ force: true }*/).then(() => { // sync Sequelize models to Postgres, then start the server
  app.listen({ port: portVar }, () =>
    console.log(`🚀 ApolloServer ready at http://localhost:${portVar}${server.graphqlPath}`)
  );
  app.on('error', onError);
  app.on('listening', onListening);
});

app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));
app.use(function(req, res, next) {
  res.header("Access-Control-Allow-Origin", "*");
  res.header("Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept");
  next();
});
The full file is on GitHub; I tried to post only the relevant parts above.
The expected result is that it works in all browsers. It seems from my research that since Heroku serves over HTTPS, Safari and Firefox do not allow requests to HTTP (which is where the GraphQL server is located, http://localhost:3001/graphql). When I tried serving Apollo over HTTPS, Heroku just crashed, giving me H13 and 503 errors.
Thanks for any help...
This may also happen during local development when running the front end using HTTPS, but the back end using HTTP.
This is because CORS treats two URLs as having the same origin "only when the scheme, host, and port all match". Matching scheme means matching protocols e.g. both http, or both https.
One solution for local development is to proxy the back end using a tool such as ngrok.
Suppose the front end uses an environment variable which indicates the back end's URL:
BACK_END_API_URL=http://localhost:3005. Then do the following.
Install ngrok
Identify what port the back end is running on e.g. 3005
Run ngrok http 3005 at the command line, which will establish both http and https endpoints. Both will ultimately proxy the requests to the same back end endpoint: http://localhost:3005
After running ngrok it will display the http and https endpoints you can use. Put the one that matches the front end protocol you're using (e.g. https) into your front end environment variable that indicates the back end's URL e.g.
BACK_END_API_URL=https://1234asdf5678ghjk.ngrok.io
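A minimal sketch of how the front end might consume that variable (the exact environment mechanism depends on your bundler; BACK_END_API_URL is the variable from the example above):

// read the proxied back end URL from the environment at build time
const API_URL = process.env.BACK_END_API_URL; // e.g. https://1234asdf5678ghjk.ngrok.io

// the page and the back end now share the https scheme, so no mixed content
fetch(`${API_URL}/graphql`, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ query: '{ __typename }' }),
})
  .then((res) => res.json())
  .then(console.log);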
I was going to delete this because it is such a silly problem, but maybe it will help someone in the future:
I simply replaced all of my 'http://localhost:PORT' endpoints in development with '/graphql'. I had assumed that localhost meant the local machine running the code, but an app running on Heroku does not point to localhost; the express server is served on the app's URL (https://powerful-shore-83650.herokuapp.com/ in our case).
At any rate, I am so glad I came to a solution. I have a full stack app deployed and connected to a db. Hopefully this post can save someone lots of time.
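In client terms the fix boils down to something like this sketch (assuming apollo-boost, as in the tutorial series linked above; other Apollo clients accept the same relative uri):

import ApolloClient from 'apollo-boost';

// A relative URI resolves against whatever origin served the page:
// http://localhost:PORT in development, the Heroku HTTPS URL in production,
// so the scheme always matches and no host is hard-coded.
const client = new ApolloClient({ uri: '/graphql' });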

Prevent AJAX request caching in IE during POST method

How do I prevent IE from caching the request sent to the server?
I tried setting Cache-Control: no-cache on the HTTP response object, but IE is still caching my request data.
Please find my project details below:
In my application I send a login request to the server. After I log in, if I take a memory dump using the WinHex tool, I am able to find the password details in memory.
I am clearing the dialog reference as well, but the request data is still getting cached.
Please suggest a workaround for this.
You could try adding a parameter with a random value to your URL; this prevents the URL from always being the same.
Example:
Normal URL:
www.test.com/test.php
Fake different URL:
www.test.com/test.php?_dc=12353somerandomval
Make sure the _dc parameter always has a different value; you can (for example) use the JavaScript Date object for this (it returns the current time in milliseconds, which will virtually always be different):
params: {
  _dc: new Date().getTime()
}
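That params block is framework configuration (it looks like ExtJS); in plain JavaScript the same cache-buster is just appended to the query string, for example:

// append a unique _dc value so IE treats every request URL as new
var url = 'https://www.test.com/test.php?_dc=' + new Date().getTime();

var xhr = new XMLHttpRequest();
xhr.open('POST', url, true);
xhr.setRequestHeader('Content-Type', 'application/json');
xhr.send(JSON.stringify({ user: 'name' })); // hypothetical payload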
In a project I did a while back I had the exact same issue. I searched around and saw a few answers recommending adding a timestamp to the request, which works too, but this was the most elegant way that worked for me:
$(document).ready(function () {
  $.ajaxSetup({
    cache: false // jQuery appends a _={timestamp} parameter to every request URL
  });
});

Refused to set unsafe header "Origin" when using xmlHttpRequest of Google Chrome

Got this error message:
Refused to set unsafe header "Origin"
Using this code:
function getResponse() {
  document.getElementById("_receivedMsgLabel").innerHTML += "getResponse() called.<br/>";
  if (receiveReq.readyState == 4 || receiveReq.readyState == 0) {
    receiveReq.open("GET", "http://L45723:1802", true, "server", "server123"); // must use L45723:1802 at work.
    receiveReq.onreadystatechange = handleReceiveMessage;
    receiveReq.setRequestHeader("Origin", "http://localhost/");
    receiveReq.setRequestHeader("Access-Control-Request-Origin", "http://localhost");
    receiveReq.timeout = 0;
    var currentDate = new Date();
    var sendMessage = JSON.stringify({
      SendTimestamp: currentDate,
      Message: "Message 1",
      Browser: navigator.appName
    });
    receiveReq.send(sendMessage);
  }
}
What am I doing wrong? What am I missing in the header to make this CORS request work?
I tried removing the receiveReq.setRequestHeader("Origin", ...) call but then Google Chrome throws an access error on my receiveReq.open() call...
Why?
This is just a guess, as I use jQuery for AJAX requests, including CORS.
I think the browser is supposed to set the header, not you. If you were able to set the header, that would defeat the purpose of the security feature.
Try the request without setting those headers and see if the browser sets them for you.
In CORS, the calling code doesn't have to do any special configuration; everything is handled by the browser. It's the server's job to decide whether a request should be allowed or not. So any time you make a request that breaks the same-origin policy, the browser will attempt a CORS request for you: it adds the Origin header automatically, and possibly makes a preflight request if you are using unsafe headers/methods/content types. If the server supports CORS, it will respond properly and allow or disallow the request by providing CORS-specific response headers such as
Access-Control-Allow-Origin: *
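So the client-side fix is simply to stop setting those headers and let the browser handle them. A sketch of the request from the question with the forbidden headers removed:

function getResponse() {
  if (receiveReq.readyState == 4 || receiveReq.readyState == 0) {
    // No setRequestHeader("Origin", ...) here: the browser adds Origin itself
    // on cross-origin requests, and the Access-Control-Request-* headers belong
    // to the browser-generated preflight, not to application code.
    receiveReq.open("GET", "http://L45723:1802", true, "server", "server123");
    receiveReq.onreadystatechange = handleReceiveMessage;
    receiveReq.send(); // a GET request carries no body
  }
}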
Keep in mind that Chrome is very restrictive about the 'localhost' host name (at least it was when I was working with it). Instead, use your computer name or assign it another alias in your 'hosts' file. So, for example, don't access your site like:
http://localhost:port/myappname
Instead use:
http://mymachinename:port/myappname
or
http://mymachinealias:port/myappname
For more details please check specification.
Are you working cross-domain?
Try Brian S's solution, or try this: instead of setting it to localhost, just pass anything and see what happens.
receiveReq.setRequestHeader("Origin", "*");

Making requests from a userscript to localhost

So, I'm trying to make a cross-site AJAX request from my own script to localhost. In the userscript (running on Firefox's Scriptish engine) I'm loading my script like this:
myscript_include.setAttribute('src', 'http://localhost/myscript.js?' + Math.random());
head.appendChild(myscript_include);
It does work. Then, in myscript.js, I try to read data from localhost (ultimately, I would like to make GET/POST requests to scripts on my localhost to add any needed functionality to the web page without writing an actual Firefox extension).
Following the instructions on making cross-site AJAX requests, I add this to myscript.js:
$.getJSON('http://localhost/ajaxdata.json', function(json) {
alert(json.message);
});
The Firefox JS console shows that the GET request was actually made and the status is 200 OK. It even shows Content-Length: 39, which is indeed correct, but the Response field remains empty and the alert isn't shown!
What's wrong with this construction (apart from it being horrible in itself)? Is there some way to do what I want?
Not sure, but maybe adding Access-Control-Allow-Origin headers to localhost will solve this?
ref: https://developer.mozilla.org/En/HTTP_access_control
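A minimal sketch of that suggestion, assuming the localhost files are served by Node/Express (any server works; the equivalent exists as an Apache Header directive or a PHP header() call):

const express = require('express');
const app = express();

app.use((req, res, next) => {
  // let pages from any origin read these responses; scope this down in real use
  res.header('Access-Control-Allow-Origin', '*');
  next();
});

app.use(express.static(__dirname)); // serves myscript.js and ajaxdata.json
app.listen(80);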
