I see the iframe/P3P trick is the most popular one around, but I personally don't like it because JavaScript + hidden fields + frames really make it look like a hack job. I've also come across a master-slave approach that uses a web service to communicate (http://www.15seconds.com/issue/971108.htm), and it seems better because it's transparent to the user and robust across different browsers.
Are there any better approaches, and what are the pros and cons of each?
My approach designates one domain as the 'central' domain and any others as 'satellite' domains.
When someone clicks a 'sign in' link (or presents a persistent login cookie), the sign in form ultimately sends its data to a URL that is on the central domain, along with a hidden form element saying which domain it came from (just for convenience, so the user is redirected back afterwards).
This page at the central domain then proceeds to set a session cookie (if the login went well) and redirect back to whatever domain the user logged in from, with a specially generated token in the URL which is unique for that session.
The page at the satellite URL then checks that token to see if it does correspond to a token that was generated for a session, and if so, it redirects to itself without the token, and sets a local cookie. Now that satellite domain has a session cookie as well. This redirect clears the token from the URL, so that it is unlikely that the user or any crawler will record the URL containing that token (although if they did, it shouldn't matter, the token can be a single-use token).
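As a rough sketch of that satellite-side step (Node/Express here purely for illustration; the endpoint, helper and cookie names are all assumptions, not anything prescribed above):

```js
// Minimal sketch of the satellite redeeming the single-use token.
// All names (login_token, satellite_session, /redeem-token) are hypothetical.
const express = require('express');
const app = express();

// Hypothetical server-to-server call asking the central domain whether this
// token corresponds to a live session (global fetch assumes Node 18+).
async function redeemTokenWithCentral(token) {
  const resp = await fetch('https://central.example.com/redeem-token?token=' +
                           encodeURIComponent(token));
  return resp.ok ? (await resp.json()).sessionId : null;
}

app.get('*', async (req, res, next) => {
  const token = req.query.login_token;   // single-use token from the central domain
  if (!token) return next();

  const sessionId = await redeemTokenWithCentral(token);
  if (sessionId) {
    // The satellite now has its own session cookie.
    res.cookie('satellite_session', sessionId, { httpOnly: true });
  }
  // Redirect to the same path without the token so it never lingers in the
  // URL, browser history or crawler logs.
  res.redirect(req.path);
});

app.listen(3000);
```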
Now, the user has a session cookie at both the central domain and the satellite domain. But what if they visit another satellite? Well, normally, they would appear to the satellite as unauthenticated.
However, throughout my application, whenever a user is in a valid session, all links to pages on the other satellite domains have a ?s or &s appended to them. I reserve this 's' query string to mean "check with the central server because we reckon this user has a session". That is, no token or session id is shown on any HTML page, only the letter 's' which cannot identify someone.
A URL receiving such an 's' query tag will, if there is no valid session yet, do a redirect to the central domain saying "can you tell me who this is?" by putting something in the query string.
When the user arrives at the central server, if they are authenticated there the central server will simply receive their session cookie. It will then send the user back to the satellite with another single-use token, which the satellite will treat just as it would after a login (see above). That is, the satellite will set up a session cookie on its own domain and redirect to itself to remove the token from the query string.
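A minimal sketch of that '?s' handling on the satellite side, under the same assumptions as above (Express, cookie-parser middleware, and a made-up /whois endpoint on the central domain):

```js
// Sketch of the "?s" check: if the flag is present and there is no local
// session yet, bounce through the central domain, which will redirect back
// with a single-use token if the user is authenticated there.
const express = require('express');
const cookieParser = require('cookie-parser');
const app = express();
app.use(cookieParser());

app.get('*', (req, res, next) => {
  const hasLocalSession = Boolean(req.cookies.satellite_session);
  if ('s' in req.query && !hasLocalSession) {
    const returnTo = 'https://' + req.hostname + req.path;
    return res.redirect('https://central.example.com/whois?return=' +
                        encodeURIComponent(returnTo));   // hypothetical endpoint
  }
  next();
});

app.listen(3000);
```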
My solution works without script, or iframe support. It does require '?s' to be added to any cross-domain URLs where the user may not yet have a cookie at that URL. I did think of a way of getting around this: when the user first logs in, set up a chain of redirects around every single domain, setting a session cookie at each one. The only reason I haven't implemented this is that it would be complicated in that you would need to be able to have a set order that these redirects would happen in and when to stop, and would prevent you from expanding beyond 15 domains or so (too many more and you become dangerously close to the 'redirect limit' of many browsers and proxies).
Follow-up note: this was written 11 years ago, when the web was very different - for example, XMLHttpRequest was not regarded as something you could depend on, much less across domains.
That's a good solution if you have full control of every domain's backend. In my situation I only have client-side (JavaScript/HTML) control on one and full control on another, so I need to use the iframe/P3P method, which sucks :(.
OK, I seem to have found a solution: you can create a script tag whose src points at the domain you want to set/get cookies on... only Safari so far seems unable to SET cookies, but IE6 and Firefox work fine... still, if you only want to GET cookies, this is a very good approach.
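For illustration, this is roughly the shape of that trick (the URL, the callback name and the script the remote server is expected to return are all made up):

```js
// The browser sends other.example.com's cookies along with the script
// request, and the returned script can pass values back by calling a
// global callback (essentially the JSONP pattern).
function readRemoteCookie() {
  window.handleRemoteCookie = function (data) {
    console.log('value from the other domain:', data);
  };
  var s = document.createElement('script');
  s.src = 'https://other.example.com/cookie-reader.js?callback=handleRemoteCookie';
  document.head.appendChild(s);
}
readRemoteCookie();
```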
The example in that article seems suspicious to me because you basically redirect to a url which, in turn, passes variables back to your domain in a querystring.
In the example, that would mean that a malicious user could simply navigate to http://slave.com/return.asp?Return=blah&UID=123 and be logged in on slave.com as user 123.
Am I missing something, or is it well known that this technique is insecure and shouldn't be used for, well, the things that example suggests (passing user IDs around, presumably to make one's identity portable)?
#thomasrutter
You could avoid having to manage all outbound links on satellites (via appending "s" to querystring) by making an ajax call to check the 'central' domain for auth status on page load. You could avoid redundant calls (on subsequent page loads) by making only one per session.
It would be arguably better to make the auth check request server-side prior to page load so that (a) you have more efficient access to session, and (b) you will know upon page render whether or not the user is logged in (and display content accordingly).
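One possible shape of that client-side check, as a sketch only (the /auth-status endpoint, the response format and the sessionStorage key are assumptions, and a credentialed cross-origin request needs the central domain to allow it):

```js
// Ask the central domain once per browser session whether the user is
// authenticated there; if so, kick off the normal token handshake.
async function checkAuthOnce() {
  if (sessionStorage.getItem('authChecked')) return;   // avoid redundant calls
  const resp = await fetch('https://central.example.com/auth-status', {
    credentials: 'include'   // send the central domain's cookie with the request
  });
  const { loggedIn } = await resp.json();
  sessionStorage.setItem('authChecked', '1');
  if (loggedIn) {
    window.location.href = window.location.pathname + '?s';
  }
}
checkAuthOnce();
```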
We use cookie chaining, but it's not a good solution since it breaks when one of the domains doesn't work for the user (due to filtering / firewalls etc.). The newer techniques (including yours) only break when the "master" server that hands out the cookies / manages logins breaks.
Note that your return.asp can be abused to redirect to any site (see this for example).
You should also validate active session information against domains b, c, d, ...; that way you can only log in if the user has already logged in at domain a.
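To guard against the open-redirect abuse mentioned above, the return URL can be checked against a list of domains you actually operate before redirecting; a small sketch (the list and parameter name are made up):

```js
// Only redirect back to hosts we control; anything else is dropped.
const ALLOWED_RETURN_HOSTS = new Set([
  'satellite-a.example.com',
  'satellite-b.example.com'
]);

function safeReturnUrl(raw) {
  try {
    const url = new URL(raw);
    return ALLOWED_RETURN_HOSTS.has(url.hostname) ? url.href : null;
  } catch (e) {
    return null;   // not a valid absolute URL
  }
}
```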
What you do is, on the domain receiving the variables, check the referrer address as well, so you can confirm the link came from your own domain and not from someone simply typing the URL into the address bar. This approach works well.
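A sketch of that referrer check (Express, with made-up hostnames); note that the Referer header is supplied by the client and can be missing or forged, so treat it as a sanity check rather than real authentication:

```js
const express = require('express');
const app = express();

app.get('/return', (req, res) => {
  const referer = req.get('Referer') || '';
  if (!referer.startsWith('https://master.example.com/')) {
    return res.status(403).send('Unexpected referrer');
  }
  // ...continue with the normal sign-in handoff here...
  res.send('ok');
});

app.listen(3000);
```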
Recently someone asked me why our website doesn't work without cookies. My explanation was that we need to save tokens and some references in a cookie so that we can use them later to make requests, and that there are only limited options for saving data in the browser. But he wasn't satisfied with my answer, and I also think there are a few options that could make it work without using cookies/localStorage/sessionStorage.
My question is: why can't most websites work without cookies? Can we make a website work without any storage in the browser?
Using cookies allows your website to remember the user (e.g. last login, avoiding having to log in again) and offer corresponding benefits to them and to you (e.g. tracking usage/interest, advertising). If you don't want these benefits then of course you can deliver a website which doesn't use cookies. If the website needs a login, the user will have to log in on every page viewed.
Apologies if this has been asked, but I'm trying to figure out this kind of stuff for the first time.
I'm developing an app where I want to divide the authenticated content from the web-facing side, completely; therefore I am not using a simple backbone.js-style "keep all views in one file" (unless I'm wrong about this, please illuminate!) but actually divided server files (using PHP).
Current flow: the user logs in client-side (using the Parse.com Todo app as an example) and, if successful, I store a cookie (via POST/AJAX) with the user's email and the returned sessionToken on the client side. I then thought that when the user next visits the site, the server can read the cookie and shuffle the user to the private/locked portion of the site, which, again, is a different set of PHP files.
Here I get lost -- how do I then tell Parse.com that the user is logged in, if I don't have her username/password (only email), and start grabbing data from the classes? Is there a way to do this that I'm not recognizing? I guess I can load different .JS files, read if a session exists, and JS-redirect to a different url, but that seems to me to be a weird way of going about it.
Is there a general philosophy/methodology to my questions that I should read up on, along with concrete advice for dealing with the Parse.com side of things?
I believe the Parse User session management functions should be good for you.
Check out https://parse.com/docs/cloud_code_guide#webapp-users
There is an example at the bottom of their announcement blog post here: http://blog.parse.com/2013/09/04/new-cloud-modules-for-images-and-users/
It gives you user session management with minimal effort.
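For the "how do I tell Parse the user is logged in without a password" part, one client-side option (a sketch only; it assumes the Parse JS SDK is loaded and that the session token was stored in a cookie with the invented name shown) is Parse.User.become(), which restores a session from a token:

```js
// Assumes the Parse JS SDK <script> is already on the page.
Parse.initialize('APP_ID', 'JAVASCRIPT_KEY');

// Tiny helper to read a value out of document.cookie.
function readCookie(name) {
  var match = document.cookie.match(new RegExp('(?:^|; )' + name + '=([^;]*)'));
  return match ? decodeURIComponent(match[1]) : null;
}

var token = readCookie('parseSessionToken');   // hypothetical cookie name
if (token) {
  Parse.User.become(token).then(function (user) {
    // The SDK now acts as this user; queries run with their permissions.
    console.log('restored session for', user.get('email'));
  }, function (err) {
    console.log('token invalid or expired', err);
  });
}
```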
Background story: We run a website with thousands of users and a handful of admins. Some of these admins don't need all-access to the website, so I want to restrict their access by giving them individual permissions.
My plan is to set a Session on user login with the user's permissions, if given any. However, I'm concerned that this might be an unsafe action.
Can a Session be manipulated by a user client side? In this case a regular user could gain access to the admin features if they knew the permission names and set a Session for themselves.
I found some related questions on Stack Overflow, but they didn't give me enough information on the subject.
You are already providing the login for admins and users, so save the type of permission each has and give them rights to modify data accordingly. And as long as your session state is encrypted, it is very hard to manipulate on the client side.
If you are concerned about the security of your existing sessions and cookies, here is a link on making them secure.
Secure your Session
This is a full article on how to make your sessions and cookies secure.
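To illustrate the point about the client not being able to manipulate the session: a sketch of keeping permissions entirely server-side (Node with express-session here purely as an example stack, since the question doesn't say what the site is built on; all names are illustrative):

```js
// The browser only ever holds an opaque session ID cookie; the permission
// list lives in server-side session storage, so a user cannot grant
// themselves admin rights by editing anything client-side.
const express = require('express');
const session = require('express-session');
const app = express();

app.use(session({ secret: 'change-me', resave: false, saveUninitialized: false }));

app.post('/login', (req, res) => {
  // ...verify credentials against your user store first (omitted)...
  req.session.userId = 42;
  req.session.permissions = ['edit-posts'];   // stored on the server, not in the cookie
  res.send('logged in');
});

app.post('/admin/ban-user', (req, res) => {
  const perms = req.session.permissions || [];
  if (!perms.includes('ban-users')) {
    return res.status(403).send('forbidden');
  }
  // ...perform the admin action here...
  res.send('done');
});

app.listen(3000);
```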
You can indeed store server variables such as the user agent, the IP address and so forth (and even JavaScript variables), but they are only good for validating that the persistent cookie data matches the client's new connection. The IP address isn't a good idea except when you know that the client (such as just you) isn't going to change on every page load (a la AOL).
Modern web browsers and 3rd party services like LastPass can store login credentials that only require a key press (and sometimes not even that) to send the data to the login form. Persistent cookies are only good for those people who refuse to use what's available otherwise. In the end, persistent, non-session cookies are not really required anymore.
There is no such thing as a secure cookie UNLESS it's transmitted over SSL only. It can be mitigated somewhat when using a persistent non-session cookie (like "remember me"), by doing exactly what you're doing, but not in the same way you're thinking of doing it.
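As a concrete sketch of those flags on a "remember me" cookie (Express used only as an example; the route, cookie name and lifetime are made up):

```js
// The cookie carries only an opaque random token; `secure` restricts it to
// SSL/TLS, and `httpOnly` keeps page scripts from reading it.
const express = require('express');
const crypto = require('crypto');
const app = express();

app.get('/remember-me', (req, res) => {
  const token = crypto.randomBytes(32).toString('hex');
  // ...persist the token -> user mapping server-side here (omitted)...
  res.cookie('remember_me', token, {
    secure: true,                        // only ever sent over HTTPS
    httpOnly: true,                      // not readable from JavaScript
    maxAge: 30 * 24 * 60 * 60 * 1000     // 30 days
  });
  res.send('cookie set');
});

app.listen(3000);
```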
I've done a bit of reading on working around the cross domain policy, and am now aware of two ways that will work for me, but I am struggling to understand how CORS is safer than having no cross domain restriction at all.
As I understand it, the cross domain restriction was put in place because theoretically a malicious script could be inserted into a page that the user is viewing which could cause the sending of data to a server that is not associated (i.e. not the same domain) to site that the user has specifically loaded.
Now with the CORS feature, it seems like this can be worked around by the malicious guys, because it's the malicious server itself that is allowed to authorise the cross-domain request. So if a malicious script decides to send details to a malicious server that has Access-Control-Allow-Origin: * set, it can now receive that data.
I'm sure I've misunderstood something here, can anybody clarify?
I think #dystroy has a point there, but not all of what I was looking for. This answer also helped. https://stackoverflow.com/a/4851237/830431
I now understand that it's nothing to do with prevention of sending data, and more to do with preventing unauthorised actions.
For example: A site that you are logged in to (e.g. social network or bank) may have a trusted session open with your browser. If you then visit a dodgy site, they will not be able to perform a cross site scripting attack using the sites that you are logged in to (e.g. post spammy status updates, get personal details, or transfer money from your account) because of the cross domain restriction policy. The only way they would be able to perform that cross site scripting attack would be if the browser didn't have the cross site restriction enabled, or if the social network or bank had implemented CORS to include requests from untrusted domains.
If a site (e.g. a bank or social network) decides to implement CORS, then it should be sure that this can't result in unauthorised actions or unauthorised data being retrieved, but something like a news website's content API or Yahoo Pipes has nothing to lose by enabling CORS with *.
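To make that distinction concrete, a sketch (Express, with made-up routes and origins; it's the browser, not the server, that enforces the resulting policy):

```js
const express = require('express');
const app = express();

// Public, read-only content API: no cookies involved, any origin may read it.
app.get('/api/headlines', (req, res) => {
  res.set('Access-Control-Allow-Origin', '*');
  res.json([{ title: 'example headline' }]);
});

// Session-backed account endpoint: echo only a known origin, and never
// combine "*" with credentials.
const TRUSTED_ORIGIN = 'https://app.example.com';
app.get('/api/account', (req, res) => {
  if (req.get('Origin') === TRUSTED_ORIGIN) {
    res.set('Access-Control-Allow-Origin', TRUSTED_ORIGIN);
    res.set('Access-Control-Allow-Credentials', 'true');
  }
  res.json({ balance: 123 });
});

app.listen(3000);
```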
You may set a more precise origin filter than "*".
If you decide to open your specific page to be included in another page, it means you'll handle the consequences.
But the main problem cannot be that a server can receive strange data: that's nothing new, since everything a server receives is suspect. The protection is mainly for the user, who must not be abused by an abnormal composition of sources (the enclosing page being able to read the enclosed data, for example). So if you allow all origins for a page, don't put data in it that you only want to share with your own user.
I am making a bookmarklet which calls a Google App Engine app. The GAE app uses login information, which I want to store in the bookmarklet, so that when the user first clicks the bookmarklet it asks for login info, but from then on it supplies the info automatically.
The difficulty with a bookmarklet storing data directly is that it can only store data in a cookie or in localStorage, both of which "belong" to whatever page it is currently on. That means it won't work again the next time you use it on a different page, and it also means the page you are on can access the data, which is generally very bad for security.
There are two basic ways your situation is generally handled:
1.) The application keeps the user logged in with a cookie. The login information is not stored in the cookie; only a session ID is. This is like when you return to many popular websites and don't have to log in again. Very often these types of bookmarklets open a small popup for the user which contains a page from the app. If the user is not logged in, the app prompts the user to log in first. The bookmarklet in fact knows nothing about being signed in or not.
2.) Each bookmarklet is custom created for each person. So my bookmarklet would be different than yours. The difference is simply that mine will contain my login info in the code, and yours will contain your login information in the code. In fact we would each have to login to the app first before we can get our own personalized bookmarklet.
Generally, option 1 is better and easier and more secure.
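The bookmarklet for option 1 can stay tiny, since all it does is open a popup and let the app's own session cookie decide what to show; a sketch (the URL, window name and dimensions are made up):

```js
javascript:(function () {
  // Open a small popup onto the app; if there is no session cookie the app
  // shows its login form, otherwise it shows the real UI straight away.
  var url = 'https://your-gae-app.appspot.com/bookmarklet?from=' +
            encodeURIComponent(location.href);
  window.open(url, 'myAppPopup', 'width=420,height=560');
})();
```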
If I understand it correctly, this might help you: http://ajaxian.com/archives/whats-in-a-windowname
It allows storing data in window.name in JS, giving access to up to 2 MB of data (a lot more than cookies can hold), and I believe it can be used across tabs...
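A tiny illustration of the window.name idea (two snippets, one per page; the survives-navigation behaviour is the point, and the data shown is invented):

```js
// On page A: stash a value and navigate away. window.name survives the
// navigation within the same window.
window.name = JSON.stringify({ user: 'alice', theme: 'dark' });
location.href = '/next-page.html';

// On page B, loaded in that same window:
var data = window.name ? JSON.parse(window.name) : null;
window.name = '';   // clear it so unrelated sites visited later can't read it
console.log(data);
```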