Too many requests for crossdomain.xml

I have a Zend Framework application whose error controller emails me whenever an error occurs, including page-not-found errors. Can someone tell me why someone would request this file: /crossdomain.xml? Should I permanently block the offending IP address?

crossdomain.xml is a file that grants Flash certain permissions when a Flash object makes a request to your server. IIRC, a Flash file served from your own domain is assumed to have the right to access items on your domain. However, if a Flash piece is served from another domain and makes requests to your server, the Flash runtime has to fetch crossdomain.xml to see whether that Flash piece has the right to access the data.
It's hard to know from what you've posted if there is indeed abuse going on.

crossdomain.xml is used to specify domains YOU trust to access your website's (domain's) data. You can safely ignore or block these IP addresses if you aren't making use of a crossdomain.xml file.
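For reference, a minimal crossdomain.xml policy file looks roughly like this (example.com is a placeholder); you would only serve something like this from your site root if you actually wanted Flash content from that domain to read your data:

    <?xml version="1.0"?>
    <!DOCTYPE cross-domain-policy SYSTEM
      "http://www.adobe.com/xml/dtds/cross-domain-policy.dtd">
    <cross-domain-policy>
      <!-- Allow Flash content served from example.com to read this domain's data -->
      <allow-access-from domain="example.com" />
    </cross-domain-policy>

If no such file exists, the requests for it simply 404, which is presumably what is triggering the error emails.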

Related

How to remove the address bar on non-Secure http page

I'm trying to remove the address bar in Edge on Windows 10.
I can remove the address bar by installing a test page as an Edge app,
but if the page is served over non-secure HTTP, the address bar remains.
As you can see in this test:
test with HTTP and HTTPS
How can I prevent Edge from inserting the non-secure warning / address bar for specific HTTP content?
Please note that the HTTP content consists of web applications on our intranet.
I'd like to try regedit first, and design a Group Policy later.
Could you please help me with this?
You can't hide the address bar in any way, whether through Group Policy or the registry. It's also not recommended: users should see what URL they are browsing; otherwise it may cause security issues.
If a website doesn't have a valid certificate, the information sent to and from it is not secure and can be intercepted by an attacker or seen by others, so there's a risk to your personal data when sending or receiving information from that site. In my opinion, users should know when a site is risky so they can decide whether to continue accessing it. Otherwise, as mentioned above, it may cause security issues.
Therefore I don't think your requirement can be achieved. Maybe you could refer to this doc: Securely browse the web in Microsoft Edge

How to upload/download files directly from Google Drive to the browser without using server bandwidth?

I want to make a web app where the only cost to me is serving the webpage to the user, and all their data is saved to their Google Drive so I don't have to pay for storage or bandwidth.
Is this possible using Google Drive?
I can't see how:
If I want to save something directly from the browser, it needs my application's API key, and I can't put that in the HTML as it is non-secure.
If I try to do anything where the webpage calls my server, the file will have to pass through my server to get to Google.
"If I want to save something directly from the browser, it needs my application's API key, and I can't put that in the HTML as it is non-secure."
You need OAuth client credentials; an API key will only give you access to public data and won't give you the ability to write anything. Web client credentials, if configured properly, are bound to the domain they are intended to be used on, and are therefore considered secure.
"If I try to do anything where the webpage calls my server, the file will have to pass through my server to get to Google."
Well, this is true if your server is running the code. Unless you do it with JavaScript, in which case the code runs client side, in the user's browser, and the file can go directly from the browser to Google without ever touching your server.
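As a rough illustration of that client-side path, here is a hedged sketch, assuming the page has already obtained a per-user OAuth access token in the browser via Google's sign-in flow (accessToken and the #file-input id are placeholders). It uploads straight from the browser to the Drive v3 multipart endpoint, so no app server bandwidth is used:

    // Hedged sketch: browser-to-Drive upload, no app server involved.
    // Assumes `accessToken` was obtained client side via Google's OAuth flow.
    async function uploadToDrive(accessToken) {
      const file = document.querySelector("#file-input").files[0];
      const metadata = { name: file.name };

      // Drive's multipart upload expects multipart/related:
      // a JSON metadata part followed by the media part.
      const boundary = "drive_upload_boundary";
      const body = new Blob([
        `--${boundary}\r\nContent-Type: application/json; charset=UTF-8\r\n\r\n`,
        JSON.stringify(metadata),
        `\r\n--${boundary}\r\nContent-Type: ${file.type || "application/octet-stream"}\r\n\r\n`,
        file,
        `\r\n--${boundary}--`,
      ]);

      const res = await fetch(
        "https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart",
        {
          method: "POST",
          headers: {
            Authorization: `Bearer ${accessToken}`,
            "Content-Type": `multipart/related; boundary=${boundary}`,
          },
          body,
        }
      );
      return res.json(); // contains the new file's id, name, etc.
    }

The key point is that the token belongs to the signed-in user, so nothing secret ships in your HTML.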

Form based (Cross Domain) Google Drive API Upload with caveats

I'm currently working on a rather interesting... project. I have a client who wants to allow form uploads (from a page presented on their server) specifically to their own google drive account. The platform being used is essentially LAMP.
Single (pre-authenticated) google drive account. Multiple otherwise anonymous upload sources (users).
They do not want users to be required to have their own google accounts (rules out simply using Picker on the user's own drive files).
They want some degree of backwards browser compatibility, such as IE8 (rules out using XHR with HTML5's File API to read the file data and form the POST). They don't want to use Flash etc. due to potential compatibility issues with certain mobile browsers.
What is working:
Authenticating (getting a refresh token, storing it, and using it to get access tokens as needed; see the sketch after this list)
Uploading a file to the account without metadata
Result of file upload being sent to hidden iframe
Catching the iframe load event via jquery to at least know something has happened
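For context, the refresh-token exchange in the first bullet looks roughly like this. This is a hedged sketch for Node 18+; the environment variables are placeholders for the client credentials and refresh token the question says are stored server side (the original platform is LAMP, so treat this as illustrative of the HTTP call, not the stack):

    // Hedged sketch: exchange a stored refresh token for a short-lived
    // access token. Credentials must stay server side.
    async function getAccessToken() {
      const res = await fetch("https://oauth2.googleapis.com/token", {
        method: "POST",
        headers: { "Content-Type": "application/x-www-form-urlencoded" },
        body: new URLSearchParams({
          client_id: process.env.CLIENT_ID,
          client_secret: process.env.CLIENT_SECRET,
          refresh_token: process.env.REFRESH_TOKEN,
          grant_type: "refresh_token",
        }),
      });
      const data = await res.json();
      return data.access_token; // use for subsequent Drive API calls
    }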
Problems:
The REST API upload endpoint does not support CORS: there is no way to access the result iframe directly. (see: Authorization of Google Drive using JavaScript)
The return from a successful upload is only raw JSON, not JSONP.
There is seemingly no way to host anything with proper headers on the googleapis.com domain for the browser to open, so easyXDM and similar multi-iframe, cross-origin-workaround JavaScript communication approaches are ruled out.
There is no way to embed a callback URL in the POST from the submit, the API does not allow for it.
The Picker displays errors on trying to upload if you pass it an OAuth2 token that is not for a user who is also authenticated in their browser (presumably via cookie). Strangely enough, you can show files from the OAuth2 token's matching account, but unless the browser is already logged in as the token's account, any file upload fails with an ambiguous "Server rejected" message. This happens with all files and file types, including files that work in an authenticated browser instance. I assume it's an authentication flow/scope issue of some sort; I haven't tried diving into the Picker source.
All of the javascript Google Drive API upload examples seem to rely on using HTML 5 to get the file data, so anything of that nature seems to be ruled out.
While files are uploaded, there's no way other than guesstimating which file came from which user, since we can't get the file object ID from the result in our inaccessible iframe. At best we could make a very rough time based guess, but this is a terrible idea in case of concurrency issues.
We can't set the file name or any other identifier for the file (not even a unique folder) because the REST API relies on that metadata being sent via JSON in the post request body, not via form fields. So we end up with file objects in the drive with no names/etc.
We can't create the file with metadata populated server side (or via jquery/XHR, or the google javascript API client) and then update it with a form based upload because the update API endpoint exclusively works with PUT (tested).
We can't upload the files to our local server and then send them to Google (proxy them), as the php.ini is locked down to prevent larger file uploads (and see the HTML5/Flash restrictions above for why we can't chunk files, etc.).
All of this has been both researched and to varying degrees tried.
At the moment this is going on hold (at least it was a useful way to learn the API and gain a sense of its limitations) and I'm just going to implement something similar on dropbox, but if anyone has any useful input it would be lovely!
e.g. is there any way to get this working with Drive? Have I overlooked something?
I also realize that this is probably a less-than-intended use case, so I'm not expecting miracles. I realize that the ideal flow would be to simply let users upload to their own Google Drives where necessary and then have them grant file access to our web app (via Picker or the API plus our own UI), but this becomes a problem when not all of our users necessarily have Google accounts. I know that Google would OBVIOUSLY prefer we get even more people to sign up with them, but making people sign up for a Google account to use our app was ruled out (not out of any prejudice on our part; it was just too many added steps and potential user hurdles). Even simply having existing account holders sign in to Google was deemed unwanted for the basic lowest-common-denominator functionality, although it's likely to be added as an additional option on top of whatever becomes the base solution.
The biggest problem with the approach you described is you're introducing a big security issue. Allowing an anonymous user to directly upload to Drive from the client requires leaking a shared access token to anyone who comes by. Even with the limited drive.file scope, a malicious or even slightly curious user would be able to list, access (read/update/delete!) any file that was uploaded by that app.
Of course a public drop box feature is still useful, but you really need to proxy those requests to avoid revealing the access token. If your PHP environment is too restrictive, why not run the proxy elsewhere? You can host a simple proxy to handle the uploading just about anywhere (App Engine, Heroku, etc.) and support whatever features you need to ensure the metadata is set correctly for your app.
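A hedged sketch of that proxy, assuming Node 18+ with Express (the route, size limit, and query parameter are placeholders, and getAccessToken() is the hypothetical refresh-token helper sketched earlier): the browser posts the file here, and the server, which is the only party holding the token, forwards it to Drive with the metadata set properly.

    // Hedged sketch: upload proxy so the Drive access token never
    // reaches the anonymous user's browser.
    const express = require("express");
    const app = express();

    app.post("/upload", express.raw({ type: "*/*", limit: "10mb" }), async (req, res) => {
      const name = req.query.name || "upload.bin"; // filename chosen by the client
      const token = await getAccessToken();        // hypothetical server-side helper

      // Drive v3 multipart upload: JSON metadata part + media part.
      const boundary = "proxy_boundary";
      const body = Buffer.concat([
        Buffer.from(
          `--${boundary}\r\nContent-Type: application/json\r\n\r\n` +
          JSON.stringify({ name }) + `\r\n`
        ),
        Buffer.from(`--${boundary}\r\nContent-Type: application/octet-stream\r\n\r\n`),
        req.body, // the raw file bytes
        Buffer.from(`\r\n--${boundary}--`),
      ]);

      const driveRes = await fetch(
        "https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart",
        {
          method: "POST",
          headers: {
            Authorization: `Bearer ${token}`,
            "Content-Type": `multipart/related; boundary=${boundary}`,
          },
          body,
        }
      );
      // The Drive response includes the file id, solving the
      // "which file came from which user" problem in the question.
      res.status(driveRes.status).json(await driveRes.json());
    });

    app.listen(3000);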

How is CORS safer than no cross domain restrictions? It seems to me that it can be used maliciously

I've done a bit of reading on working around the cross-domain policy, and am now aware of two ways that will work for me, but I am struggling to understand how CORS is safer than having no cross-domain restriction at all.
As I understand it, the cross-domain restriction was put in place because, theoretically, a malicious script could be inserted into a page the user is viewing, and it could send data to a server that is not associated with (i.e. not the same domain as) the site the user actually loaded.
Now, with CORS, it seems the malicious guys can work around this, because it's the malicious server itself that authorises the cross-domain request. So if a malicious script decides to send details to a malicious server that has Access-Control-Allow-Origin: * set, that server can now receive the data.
I'm sure I've misunderstood something here; can anybody clarify?
I think @dystroy has a point there, but it's not all of what I was looking for. This answer also helped: https://stackoverflow.com/a/4851237/830431
I now understand that it's nothing to do with prevention of sending data, and more to do with preventing unauthorised actions.
For example: a site you are logged in to (e.g. a social network or bank) may have a trusted session open with your browser. If you then visit a dodgy site, it cannot use those logged-in sessions to perform unauthorised cross-site requests (e.g. post spammy status updates, read personal details, or transfer money from your account), because the cross-domain restriction blocks it from reading those sites' responses. The only way it could pull off such an attack is if the browser didn't enforce the cross-site restriction, or if the social network or bank had implemented CORS to accept requests from untrusted domains.
If a site (e.g. a bank or social network) decides to implement CORS, it should make sure this can't result in unauthorised actions or unauthorised data being retrieved; but something like a news website's content API or Yahoo Pipes has nothing to lose by enabling CORS with *.
You may set a more precise origin filter than "*".
If you decide to open a specific page to be included in another page, it means you accept the consequences.
But the main problem isn't that a server can receive strange data: that's nothing new, and everything a server receives is suspect anyway. The protection is mainly for the user, who must not be abused by an abnormal composition of sources (the embedding page being able to read the embedded page's data, for example). So if you allow all origins for a page, don't put data in it that you want to share only with your user.
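To make the "more precise origin filter" concrete, here is a hedged Express sketch (the routes, origin list, and payloads are all placeholders): an endpoint tied to a user session should echo back only explicitly trusted origins, while a public read-only feed, as in the answer above, has nothing to lose by allowing any origin.

    const express = require("express");
    const app = express();

    // Hypothetical whitelist: only these origins may read responses cross-site.
    const TRUSTED = ["https://app.example.com"];

    app.get("/user/profile", (req, res) => {
      const origin = req.headers.origin;
      if (TRUSTED.includes(origin)) {
        // Echo back the single trusted origin; never "*" for per-user data.
        res.set("Access-Control-Allow-Origin", origin);
      }
      res.json({ name: "example user" });
    });

    // Public, read-only content: any origin may read it.
    app.get("/news/feed", (req, res) => {
      res.set("Access-Control-Allow-Origin", "*");
      res.json([{ headline: "example item" }]);
    });

    app.listen(3000);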

Why is Lotus WCM generating references with myconnect to anonymous users?

Hi, we've been getting an issue in which WCM references to protected content are being generated for anonymous/unauthenticated users. In WCM this basically means that the links have a "/myconnect" (protected) context instead of a "/connect" (unprotected) context in their URL.
Now, WAS has a feature whereby, if an unauthenticated user requests a protected resource, WAS places the URL of that resource in a cookie called WASReqURL, and when the user authenticates it redirects them to that URL. In our case, users are getting redirected to an image resource or a file resource, etc. This changes depending on what the last requested protected item was, what's in the cache, etc.
I've checked permissions on the libraries, sites, content items, components, etc., but that doesn't help. And the worst thing is that the issue is intermittent, almost as if it were tied to some sort of performance issue.
Thank you
Lotus WCM really does that when it thinks the anonymous user does not have sufficient access rights. If you are absolutely sure that your access rights are properly defined, congratulations: you have found something worth opening a PMR for. Attach the PUMA mustgather to it.
