I want to use the Dropbox Chooser API in my Ruby on Rails application. (This is not a web app; it will be installed as a standalone application.)
The issue is in specifying the "Drop-ins domains", which I currently set to "localhost". But on the machines where the app is installed, the machine name will be used instead of localhost, and I can't keep track of every installation and add those domains manually.
Is there some way to solve this problem? Can I use the Chooser API without registering a drop-in domain?
The Dropbox Drop-ins API doesn't have any way of automatically adding domains, or registering any sort of wildcard, but we're tracking this as a feature request.
For reference, one thing that does work, though it sounds like it may not apply to your scenario, is registering just, for example, example.com, which would enable use on any subdomain of example.com, e.g., sub1.example.com, sub2.example.com, webmail.example.com, etc.
Alternatively, you can embed an iframe containing the button, hosted on your own domain, which lets you set just that one domain on the app options page. It's then important to restrict the set of domains that you allow to iframe your button, but since that list is under your control, you can set it programmatically. For example:
How to limit display of iframe from an external site to specific domains only
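Concretely, one way to enforce that restriction is a Content-Security-Policy frame-ancestors header on the page hosting the button. A minimal sketch in PHP (the header can be sent from any stack, and the domain list is illustrative):

```php
<?php
// embed.php - the page on YOUR domain that hosts the Dropbox button.
// The allowlist below is hypothetical; in practice you would load it
// from your own record of known installations.
$allowedParents = [
    'https://machine-one.example.com',
    'https://machine-two.example.com',
];

// Browsers refuse to render this page inside an iframe on any origin
// not listed in frame-ancestors.
header('Content-Security-Policy: frame-ancestors ' . implode(' ', $allowedParents));
?>
<!-- Dropbox Chooser button markup/script goes here -->
```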
I have a website running on Laravel 8 and Vue.js 3. The admin panel's front end is built entirely in Vue, while guest users are served Laravel's Blade templates.
I'm worried about an unauthorized client's ability to inspect the admin panel's code from the login page, since it's part of the Vue bundle.
Of course, the client will not get any information from the server without authentication. All he or she can see is a blank panel with no data at all.
The problem is that the client can analyze the entire functionality of the code and view all the routes used to manage site content. This gives security researchers full information about where to target, what to send, and what to expect.
I also know about Asynchronous Components, but they are not the answer here, as those components have predictable names, so it's possible to recover the whole working code anyway.
If I serve the panel from a separate subdomain, that subdomain can also be scanned and exposed. And since managers work from separate locations, denying the route based on IP address is not a solution either.
How can I control this from Laravel, so that only authenticated users can see the panel's code? Should I try to fix this at all?
The way I do it is by compiling two different bundles with webpack and a conditional test within the Blade file.
So, depending on the user type, the page loads different files, though I don't mind them staying in the public directory. I put every administrator request behind an administrator middleware.
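For example, a minimal sketch of the Blade side (the bundle names and the isAdmin() check are placeholders for whatever your webpack config and role system actually use):

```blade
{{-- resources/views/layouts/app.blade.php --}}
{{-- Guests only ever receive the public bundle; the admin bundle is
     referenced only for authenticated administrators. --}}
@if (auth()->check() && auth()->user()->isAdmin())
    <script src="{{ mix('js/admin.js') }}"></script>
@else
    <script src="{{ mix('js/app.js') }}"></script>
@endif
```

Note that this only controls what the page references; if both compiled files sit in public/, a determined user can still fetch admin.js directly, which is why the middleware (or moving files out of public, as below) matters.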
You could also have Laravel move files between the public directory and another location using some sort of access control, exposing your admin files only while an administrator is requesting a page and until the load is done. I believe this is possible, though I have never tested it myself.
I'm building an application that has a core hub; let's say it's called musictickets.com.
We'll provide a subdomain (bandname1.musictickets.com) to bands, on which only their content will display, and which they can mask using a CNAME record so it becomes part of their own domain - so tickets.bandname1.com.
There would be multiple bands using the platform, so you'd end up with pages at
tickets.bandname1.com
tickets.bandname2.com
etc.
I'd like a user who registers at tickets.bandname1.com to be automatically logged in on every site that uses the service, including the parent, musictickets.com. They should be able to register/log in using OAuth or directly via form-based authentication.
I'm looking at SAML (specifically https://github.com/aacotroneo/laravel-saml2) as one option, but want to throw this out to the wider community for comment.
I've also looked at using token-based SSO as described here (single sign on (sso) laravel) and at running an auth server (which I may do in any case). Alternatively, I've looked at using iframes to provide the functionality, which feels quick but dirty.
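If I went the token route, I imagine the flow would look roughly like this (a sketch only; SHARED_KEY, the URLs, and the route names are placeholders):

```php
<?php
// 1) On musictickets.com, after the user authenticates, redirect to the
//    band site with a short-lived signed token. SHARED_KEY is a secret
//    shared between the hub and each band site (placeholder here).
$payload = rtrim(strtr(base64_encode(json_encode([
    'user_id' => $userId,     // id of the authenticated user
    'expires' => time() + 60, // token valid for 60 seconds
])), '+/', '-_'), '=');       // base64url so the token is URL-safe
$sig = hash_hmac('sha256', $payload, SHARED_KEY);
header("Location: https://tickets.bandname1.com/sso?token={$payload}.{$sig}");

// 2) On tickets.bandname1.com, verify signature and expiry, then start
//    a local session for that user.
$token = $_GET['token'] ?? '';
if (substr_count($token, '.') === 1) {
    [$payload, $sig] = explode('.', $token);
    if (hash_equals(hash_hmac('sha256', $payload, SHARED_KEY), $sig)) {
        $data = json_decode(base64_decode(strtr($payload, '-_', '+/')), true);
        if (is_array($data) && $data['expires'] > time()) {
            Auth::loginUsingId($data['user_id']); // Laravel session on this site
        }
    }
}
```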
As I understand it, I wouldn't be able to use cookies (for an API key, for instance), because whilst all of the content will be displayed via a subdomain, the CNAME makes the browser treat it as a different domain.
Does anyone have any thoughts on the best strategy?
We have a page on our site that uses Google's reCAPTCHA before allowing the user to download a file.
It works great and we totally stopped all the evil bots from spamming our servers.
Now we want to allow a specific entity (user, domain, whatever) to download files automatically without solving the challenge. Or maybe have them solve it once per session (which would last longer than 2 minutes) rather than once per file.
Is there some way we can issue them a multi-use token or have them get a token from Google that will allow them (temporary?) unfettered access to our file downloads? Can we whitelist their domain in the Google admin settings?
Or is this something I need to build myself?
EDIT: It turns out I didn't get all the requirements for this assignment. Whitelisting will not satisfy them, since there are apparently multiple entities, and that list will indubitably change in the future.
reCAPTCHA does not provide specific whitelisting for users or domains.
Instead, you should look at making this dynamic on your side. For example, disable reCAPTCHA for signed-in users, or generate a token with an expiry time on your server, set it as a cookie on the client, and disable reCAPTCHA for requests carrying a valid token.
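A minimal sketch of that token approach (SECRET and the cookie name are placeholders; none of this is part of the reCAPTCHA API itself):

```php
<?php
// After one successful reCAPTCHA verification, issue a signed expiring
// token and set it as a cookie. SECRET is a placeholder for a private
// server-side key.
$expires = time() + 3600; // honor downloads for one hour
$token   = $expires . '.' . hash_hmac('sha256', (string) $expires, SECRET);
setcookie('download_token', $token, $expires, '/', '', true, true);

// On each download request, skip the reCAPTCHA challenge when a valid
// token is presented.
function hasValidDownloadToken(): bool
{
    $token = $_COOKIE['download_token'] ?? '';
    if (substr_count($token, '.') !== 1) {
        return false;
    }
    [$expires, $sig] = explode('.', $token);
    return hash_equals(hash_hmac('sha256', $expires, SECRET), $sig)
        && (int) $expires > time();
}
```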
I set up the Facebook Connect extension on my Magento store, which allows customers to log in to the store with their Facebook accounts. After filling in the API key and API secret in Magento and configuring the site URL in the Facebook app settings, the extension worked perfectly. However, if I switch to another store (with another domain), it no longer works. Is there a way to have Magento connect to Facebook without a matching site URL?
Here is where I got the extension: http://inchoo.net/ecommerce/magento/facebook-connect-magento-extension/
I'm not fully aware of how that Magento extension works internally, but I can say that, strictly speaking, Facebook does not allow apps to work across multiple different URLs. You can add multiple subdomains, however.
There is also some unsupported functionality allowing you to run apps across different domains, detailed in this question, though it's worth remembering that this is unsupported.
The Facebook docs have some more info on "App Domains", and how they should be configured.
I want to create an app applicable to only one or two domains, and I am trying to follow the doc here: https://developers.google.com/apps-marketplace/listing
I have done all the steps except the 7th, as I don't want the app published on the Marketplace.
I also got a URL after publishing the app in the Web Store, but when I click the link it only allows me to add it as a Chrome extension, not as a Marketplace app.
So how can I add it to my domain and to any other specific domain I want?
Please let me know if you need more information.
Thanks,
Ramesh.V
You will actually want to publish, but make sure you configure the visibility options correctly. In the Developer Dashboard you can make a listing available to your domain, to a Google Group of testers, etc. To restrict a listing to a domain, the admin creating it must be a member of that domain, so to restrict to more than one domain, I recommend creating a unique listing per domain. Many system integrators take advantage of this by creating unique branding per domain. Alternatively, you could use the Google Group route if that's all you need.