I'm looking at bringing over a feature from a previous MVC site that serves files based on our own simple custom authentication service.
Authenticated users (who simply have an authentication cookie saved) are able to download 'secure' files, served through a controller which checks their credentials and returns an unsecured media library link with content disposition set to 'attachment'. This is functional, but the media library files remain unprotected if users know the URLs for the files.
Is it possible to use the secure media libraries for our purposes in such a scenario? Our users won't have distinct user roles in the Kentico system; is it possible to spoof user roles in the MVC app when we return URLs from our controller?
I have created an application with Laravel 7 where users can log in.
In parallel, I have created a showcase site for the application (on another domain), essentially HTML/CSS.
On this showcase site, I would like to show login and registration buttons if no user is connected to the Laravel application. Otherwise, I would just like to show a "Dashboard" button if a user is connected.
How can I do that? I confess that I'm a bit lost. Thanks for your help.
You need to create an API on the Laravel app which will be used by the showcase site.
To log in and authorize themselves, users can use JWT.
To keep user data and use it on the showcase site, you can save it in:
localStorage (just be careful not to save any sensitive data there, as people can steal that information through an XSS vulnerability)
IndexedDB
cookies
None of these methods is completely safe; they can all be exploited using XSS, so I advise using JWT and keeping sensitive data out of client-side storage.
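As a minimal sketch of the Laravel side, assuming the firebase/php-jwt package (composer require firebase/php-jwt) and an APP_JWT_SECRET env variable of my own invention, the API could expose a login route that issues a token and a status route the showcase site calls:

<?php
// Sketch only: issues a JWT at login and lets the showcase site check it.
// Requires firebase/php-jwt; APP_JWT_SECRET is a made-up env variable.

use Firebase\JWT\JWT;
use Firebase\JWT\Key;
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Auth;
use Illuminate\Support\Facades\Route;

Route::post('/api/login', function (Request $request) {
    if (!Auth::attempt($request->only('email', 'password'))) {
        return response()->json(['error' => 'invalid credentials'], 401);
    }

    // Short-lived token the showcase site can keep client-side.
    $token = JWT::encode(
        ['sub' => Auth::id(), 'exp' => time() + 3600],
        env('APP_JWT_SECRET'),
        'HS256'
    );

    return response()->json(['token' => $token]);
});

// The showcase site calls this on page load:
// 200 -> show the "Dashboard" button, 401 -> show login/registration.
Route::get('/api/me', function (Request $request) {
    $jwt = str_replace('Bearer ', '', $request->header('Authorization', ''));

    try {
        $claims = JWT::decode($jwt, new Key(env('APP_JWT_SECRET'), 'HS256'));
    } catch (\Exception $e) {
        return response()->json(['authenticated' => false], 401);
    }

    return response()->json(['authenticated' => true, 'user_id' => $claims->sub]);
});

Since the showcase site lives on another domain, you will also need to enable CORS on these routes; the showcase site's JavaScript then shows the Dashboard button on a 200 from /api/me and the login/registration buttons on a 401.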
I'm building an application that has a core hub, say it's called musictickets.com
We'll provide each band with a subdomain (bandname1.musictickets.com) on which only their content will display, and which they can mask using a CNAME record so it appears as part of their own domain - tickets.bandname1.com
There would be multiple bands using the platform so you'll end up with pages at
tickets.bandname1.com
tickets.bandname2.com
etc.
I'd like a user who registers at tickets.bandname1.com to be automatically logged in on every site that uses the service, including the parent, musictickets.com. They should be able to register/login using OAuth or directly via form-based authentication.
I'm looking at SAML (specifically https://github.com/aacotroneo/laravel-saml2) as one option, but want to throw this out to the wider community for comment.
I've also looked at using token-based SSO as described here (single sign on (sso) laravel) and running an auth server (which I may do in any case). Alternatively, I've looked at using iframes to provide the functionality, which feels quick but dirty.
As I understand it, I wouldn't be able to share cookies (for an API key, for instance) because whilst all of the content will be displayed via a subdomain, the CNAME masking makes each site a different domain as far as the browser is concerned.
Does anyone have any thoughts on the best strategy?
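To make the token-based option concrete, here is roughly the handoff flow I have in mind; just a sketch, with all route names hypothetical and details like token signing and lifetimes left out:

<?php
// Rough sketch of a token-handoff flow (one possible strategy, not a
// complete SSO implementation). Route names are hypothetical; Auth, Cache,
// Http, Str are standard Laravel facades.

use Illuminate\Support\Facades\Auth;
use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\Http;
use Illuminate\Support\Facades\Route;
use Illuminate\Support\Str;

// On the hub (musictickets.com): if the user already has a session here,
// mint a single-use token and bounce the browser back to the band site.
Route::get('/sso/authorize', function () {
    if (!Auth::check()) {
        return redirect('/login?return=' . urlencode(request('return')));
    }
    $token = Str::random(40);
    Cache::put('sso:' . $token, Auth::id(), now()->addMinute());
    return redirect(request('return') . '?sso_token=' . $token);
});

// Also on the hub: band sites redeem the token server-to-server, so it
// can't be replayed from a browser.
Route::post('/sso/exchange', function () {
    return response()->json(['user_id' => Cache::pull('sso:' . request('token'))]);
});

// On each band site (tickets.bandname1.com): swap the token for a local
// session cookie scoped to this site's own (masked) domain.
Route::get('/sso/callback', function () {
    $userId = Http::post('https://musictickets.com/sso/exchange', [
        'token' => request('sso_token'),
    ])->json('user_id');

    abort_unless($userId, 401);
    Auth::loginUsingId($userId);
    return redirect('/');
});

The point of the server-to-server exchange is that each band site sets its own session cookie on its own (masked) domain, so the cross-domain cookie problem never comes up.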
I've got a project made of two websites:
The front: a Laravel website without a database or logic (just showing static pages and my JavaScript)
The API: a project using Lumen/Dingo API with my endpoints, my database, my logic and my models
I want to allow my front to request data from my API depending on the user.
E.g. I want to log the user in, retrieve his friends, add some posts to his account, etc. (from the JavaScript)
What is the best solution?
Using identification per user (using OAuth or JWT)
Allowing my front project to query my API, so each JavaScript call goes through my front without knowing my API (in this solution I need to create routes similar to my API's routes)
Identification per user is always the better solution, as the claims can be decided per user. Also, in the future, when you need to grant permission to access the API based on claims or roles, it becomes easy to control, via the claims, what information and how much access you give to a specific user.
So the way it will work is:
There will be an identity server which contains the list of users in the system, as well as the claims and scopes per user.
The API project will trust the identity server, which means any token provided by the identity server can be verified.
Based on each user's token, your API app can decide how much information you want to give to that user.
That way, in the future you can have methods gated on roles and claims, and users will be provided with only the information they have access to.
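As a minimal sketch of the API side, assuming the identity server signs JWTs with RS256 and the firebase/php-jwt package is installed (the middleware class name and the IDP_PUBLIC_KEY variable are my own placeholders), a Lumen middleware could verify tokens like this:

<?php
// Minimal sketch of a Lumen middleware that trusts the identity server.
// Assumes RS256-signed JWTs and firebase/php-jwt; names are placeholders.

use Closure;
use Firebase\JWT\JWT;
use Firebase\JWT\Key;

class VerifyIdentityToken
{
    public function handle($request, Closure $next)
    {
        if (!preg_match('/^Bearer\s+(.+)$/', $request->header('Authorization', ''), $m)) {
            return response()->json(['error' => 'missing token'], 401);
        }

        try {
            // Verifying the signature locally with the identity server's
            // public key is what "trusting" it means in practice.
            $claims = JWT::decode($m[1], new Key(env('IDP_PUBLIC_KEY'), 'RS256'));
        } catch (\Exception $e) {
            return response()->json(['error' => 'invalid token'], 401);
        }

        // Controllers read the verified claims to decide how much data
        // this user may see.
        $request->attributes->set('claims', $claims);

        return $next($request);
    }
}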
I'm currently working on a rather interesting... project. I have a client who wants to allow form uploads (from a page presented on their server) specifically to their own google drive account. The platform being used is essentially LAMP.
Single (pre-authenticated) Google Drive account. Multiple otherwise-anonymous upload sources (users).
They do not want users to be required to have their own Google accounts (which rules out simply using Picker on the user's own Drive files).
They want some degree of backwards browser compatibility, such as IE8 (which rules out forming the POST via XHR and using HTML5's File API to read the file data). They don't want to use Flash/etc. due to potential compatibility issues with certain mobile browsers.
What is working:
Authenticating (getting a refresh token, storing it, and using it to get access tokens as needed; a sketch of this exchange follows the list)
Uploading a file to the account without metadata
Result of file upload being sent to hidden iframe
Catching the iframe load event via jQuery to at least know something has happened
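For reference, the token exchange mentioned above looks roughly like this; a sketch using plain cURL since we're on LAMP, with placeholder credentials (the token endpoint shown is Google's current one):

<?php
// Sketch of the refresh-token -> access-token exchange, plain cURL.
function getAccessToken($clientId, $clientSecret, $refreshToken) {
    $ch = curl_init('https://oauth2.googleapis.com/token');
    curl_setopt_array($ch, array(
        CURLOPT_POST           => true,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_POSTFIELDS     => http_build_query(array(
            'grant_type'    => 'refresh_token',
            'refresh_token' => $refreshToken,
            'client_id'     => $clientId,
            'client_secret' => $clientSecret,
        )),
    ));
    $response = json_decode(curl_exec($ch), true);
    curl_close($ch);

    // Access tokens expire after roughly an hour; cache the token and
    // re-run this exchange as needed.
    return isset($response['access_token']) ? $response['access_token'] : null;
}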
Problems:
The REST API upload endpoint does not support CORS: there is no way to access the result iframe directly. (see: Authorization of Google Drive using JavaScript)
The return from a successful upload is only raw JSON, not JSONP.
There is seemingly no way to host anything with proper headers to open via browser on the googleapis.com domain, so easyXDM and similar multi-iframe, cross-origin-workaround JavaScript approaches are ruled out.
There is no way to embed a callback URL in the POST from the submit, the API does not allow for it.
The Picker displays errors on trying to upload if you pass it an OAuth2 token that is not for a user who is also authenticated in their browser (presumably via cookie). Strangely enough, you can show files from the OAuth2 token's matching account, but unless the browser is already logged in as the token's account, any file upload fails with an ambiguous "Server rejected" message. This happens with all files and file types, including files that work in an authenticated browser instance. I assume it's an authentication flow/scope issue of some sort; I haven't tried diving into the Picker source.
All of the JavaScript Google Drive API upload examples seem to rely on using HTML5 to get the file data, so anything of that nature seems to be ruled out.
While files are uploaded, there's no way other than guesstimating which file came from which user, since we can't get the file object ID from the result in our inaccessible iframe. At best we could make a very rough time based guess, but this is a terrible idea in case of concurrency issues.
We can't set the file name or any other identifier for the file (not even a unique folder) because the REST API relies on that metadata being sent via JSON in the post request body, not via form fields. So we end up with file objects in the drive with no names/etc.
We can't create the file with metadata populated server side (or via jquery/XHR, or the google javascript API client) and then update it with a form based upload because the update API endpoint exclusively works with PUT (tested).
We can't upload the files to our local server and then send them to Google (proxy them) as the PHP ini is locked down to prevent larger file uploads (and, per the HTML5/Flash restrictions above, we can't chunk files/etc.).
All of this has been both researched and to varying degrees tried.
At the moment this is going on hold (at least it was a useful way to learn the API and gain a sense of its limitations) and I'm just going to implement something similar on dropbox, but if anyone has any useful input it would be lovely!
e.g. is there any way to get this working with Drive? Have I overlooked something?
I also realize that this is probably a less-than-intended use case, so I'm not expecting miracles. I realize that the ideal flow would be to simply allow users to upload, if necessary, to their own Google Drives and then have them grant file access to our web app (via Picker or API + our own UI), but this becomes a problem when not all of our own users necessarily already have Google accounts. I know that Google would OBVIOUSLY prefer we get even more people to sign up with them in order to have this happen, but making people sign up for a Google account to use our app was ruled out (not out of any prejudice on our part; it was just too many added steps and potential user hurdles). Even simply having them sign in to Google if they did have accounts was deemed unwanted for the basic lowest-common-denominator functionality, although it's likely to be added as an additional option on top of whatever becomes the base solution.
The biggest problem with the approach you described is that you're introducing a big security issue: allowing an anonymous user to upload directly to Drive from the client requires leaking a shared access token to anyone who comes by. Even with the limited drive.file scope, a malicious or even slightly curious user would be able to list and access (read/update/delete!) any file that was uploaded by that app.
Of course a public drop-box feature is still useful, but you really need to proxy those requests to avoid revealing the access token. If your PHP environment is too restrictive, why not run the proxy elsewhere? You can host a simple proxy to handle the uploading just about anywhere - App Engine, Heroku, etc. - and support whatever features you need to ensure the metadata is set correctly for your app.
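A rough sketch of what such a proxy endpoint could look like, assuming the Drive v3 multipart upload (the v2 API current at the time used "title" where v3 uses "name") and a token helper like the one sketched in the question; the "upload" form field and "uploader" property are placeholders:

<?php
// Rough sketch of a server-side upload proxy: the access token never
// reaches the browser, and metadata can finally be set per upload.
// getAccessToken() is a helper like the one sketched in the question.

$accessToken = getAccessToken($clientId, $clientSecret, $storedRefreshToken);

$boundary = uniqid('drive');
$metadata = json_encode(array(
    'name'          => $_FILES['upload']['name'],         // per-user file naming
    'appProperties' => array('uploader' => session_id()), // track who sent it
));

// Build the multipart/related body: JSON metadata part, then media part.
$body  = "--$boundary\r\nContent-Type: application/json; charset=UTF-8\r\n\r\n";
$body .= $metadata . "\r\n";
$body .= "--$boundary\r\nContent-Type: application/octet-stream\r\n\r\n";
$body .= file_get_contents($_FILES['upload']['tmp_name']) . "\r\n";
$body .= "--$boundary--";

$ch = curl_init('https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart');
curl_setopt_array($ch, array(
    CURLOPT_POST           => true,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPHEADER     => array(
        'Authorization: Bearer ' . $accessToken,
        'Content-Type: multipart/related; boundary=' . $boundary,
    ),
    CURLOPT_POSTFIELDS     => $body,
));
$file = json_decode(curl_exec($ch), true); // includes the new file's "id"
curl_close($ch);

As a bonus, the proxy gets the created file's ID back in the response, which also solves the "which file came from which user" problem from the question.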
I downloaded the WIF Extensions for SAML2 a few days ago and have been experimenting with them. The samples in the download use a WebForms application, and I am trying to figure out how to use them in MVC3.
I am currently able to auth against the sample Identity Provider that comes with the download, using this:
Saml2AuthenticationModule.Current.SignIn(
"~/sign-on/saml2/success", "urn:samples:identityprovider");
I have an action method at the "sign-on/saml2/success" route/URL, and when the application flow reaches it, Thread.CurrentPrincipal.Identity is indeed an instance of IClaimsIdentity. Although Identity.IsAuthenticated equals true, Identity.Name is an empty string. (This will be problematic in our app, which so far has used FormsAuthentication and relies on Identity.Name to resolve to a user account in the db.)
I also see that there are 4 new cookies at this point:
FedId
FedAuth
FedAuth1
[fourth cookie name is a GUID, changes for each SSO]
My inclination at this point is to delete these 4 cookies and use the NameIdentifier claim to create a new account in our app (unless one already exists), and then use FormsAuthentication to write a .ASPXAUTH cookie for the user.
The first affiliate IdP we will be integrating with uses Shibboleth, and they do not yet implement SingleLogOut. So my assumption is that the following would have no effect when we begin testing this integration:
Saml2AuthenticationModule.Current.SignOut("~/sign-off/saml2/success");
So, manually deleting the 4 cookies is the only way we would be able to get Identity.IsAuthenticated back to false.
Am I going about this in an incorrect fashion? Are there any implications of trashing the IClaimsIdentity after it has been consumed and transferred to FormsAuthentication, that I am not considering?