Problem with cross-domain AJAX calls

I have two servers: a main site and a static server.
I want to get a file's content via AJAX at runtime, and that file is stored on the static server.
Obviously a cross-domain problem occurs.
So what I am trying to do is store that AJAX .js on the static server, so that calling a local file won't be a problem.
But after I include that JS file from the static server, the problem still remains...
Any solutions?

Don't use cross-domain Ajax requests. Create a "proxy" on your own server (domain), then forward the call to the other domain, cache it, check for security issues and send it back to the client again...
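A minimal sketch of such a proxy, assuming a Node/Express main server and a placeholder static-server URL (not from the original question):

    // Sketch: an Express route on the main site that fetches the file from the
    // static server, so the browser only ever talks to the main domain.
    // Assumes Node 18+ for the global fetch(); the static-server URL is a placeholder.
    const express = require('express');
    const app = express();

    app.get('/proxy/:file', async (req, res) => {
      try {
        const upstream = await fetch('https://static.example.com/' + encodeURIComponent(req.params.file));
        const body = await upstream.text();
        res.set('Cache-Control', 'public, max-age=300'); // simple caching, as suggested above
        res.type(upstream.headers.get('content-type') || 'text/plain').send(body);
      } catch (err) {
        res.status(502).send('Upstream error');
      }
    });

    app.listen(3000);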

Depending on the information you want your AJAX request to receive, you could always use something like JSONP, which can make the cross-site call.
Try looking here for some examples:
http://remysharp.com/2007/10/08/what-is-jsonp/
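For reference, a bare-bones JSONP sketch (the endpoint and callback names here are made up; the static server has to wrap its response in the callback function):

    // Client side: inject a <script> tag pointing at the other domain.
    // The server must respond with: handleData({...});
    function loadJsonp(url, callbackName) {
      window[callbackName] = function (data) {
        console.log('Got data from the other domain:', data);
        delete window[callbackName];
      };
      var script = document.createElement('script');
      script.src = url + '?callback=' + callbackName;
      document.head.appendChild(script);
    }

    loadJsonp('https://static.example.com/data.js', 'handleData');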

Take a look at EasyXDM. It's a library which wraps cross-browser quirks and provides an easy-to-use API for communicating in client script between different domains using the best available mechanism for that browser (e.g. postMessage if available, other mechanisms if not).
Caveat: you need to have control over both domains in order to make it work (where "control" means you can place static files on both of them). But you don't need any server-side code changes.
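For illustration, a bare-bones sketch of the postMessage mechanism that EasyXDM builds on (domain names are placeholders):

    // On the page that embeds an iframe served from the other domain:
    var frame = document.getElementById('remoteFrame');
    frame.contentWindow.postMessage('hello from the main site', 'https://static.example.com');

    // Inside the page served from the other domain (the iframe):
    window.addEventListener('message', function (event) {
      if (event.origin !== 'https://www.example.com') return; // always verify the sender
      console.log('Received:', event.data);
    });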

Related

How to use http-proxy-middleware / node-http-proxy as a reverse proxy?

I'm investigating the use of http-proxy-middleware / node-http-proxy as a reverse proxy. Does anyone know if this is really possible?
I've already set up http-proxy-middleware so that I can proxy a request through it (the results are displayed in an iframe), and I'm also able to modify the request headers and HTML results. Specifically, I'm setting the host/origin headers and rewriting the result to change embedded links so that they go through the proxy as well.
But, some links are generated by js, and rewriting javascript responses seems to be very difficult to do correctly.
Is there a way to do this without rewriting links? I.e., is there any method to configure the iframe to automatically send all requests through the proxy?
Or maybe this is not really possible, and I'd need to use a full proxy like Squid?
Thanks!
This does seem to be possible, to a limited extent. http-proxy-middleware can be configured to edit response headers and to rewrite the response body, so that links can be rewritten to use the proxy URL. XMLHttpRequest and fetch() requests can also be intercepted to rewrite the requests to use the proxy URL.
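A rough sketch of that setup, assuming http-proxy-middleware v2's responseInterceptor helper and placeholder URLs:

    const express = require('express');
    const { createProxyMiddleware, responseInterceptor } = require('http-proxy-middleware');

    const app = express();
    app.use('/proxy', createProxyMiddleware({
      target: 'https://target-site.example.com',  // placeholder upstream
      changeOrigin: true,                         // send the target's host header upstream
      selfHandleResponse: true,                   // required so the body can be rewritten
      onProxyRes: responseInterceptor(async (responseBuffer, proxyRes, req, res) => {
        const body = responseBuffer.toString('utf8');
        // Rewrite absolute links so they point back through the proxy.
        return body.split('https://target-site.example.com').join('https://myproxy.example.com/proxy');
      }),
    }));

    app.listen(3000);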
Use of a reverse proxy should be 100% transparent to clients and your application code, with zero code changes. So perhaps this is really a design problem, and clarifying the requirements may help.
URL DESIGN
As an example, I might design an API's URLs as follows:
Public URL: https://api.mycompany.com/products
Internal URL: https://productservice.internal.com:3000
Note that the public URL of the API is actually that of a route within the reverse proxy.
An internet client would only ever use the public URL. If the internal API ever returns URLs to internet clients, it needs to be configured to use the public URL.
REVERSE PROXIES
The most mature options are probably the nginx-based ones, which provide both declarative routing and the ability to write any logic you like via plugins. There are plenty of examples in the Curity guides, which may make you aware of some useful patterns.
A mainstream option is to use the proxy_pass directive to route to an internal URL. The same pattern should work for the node reverse proxy you mention, though for simple tasks no custom logic should be needed.
Header configuration is a common thing to do in the reverse proxy, e.g. to ensure that the backend component receives the original client's IP address rather than that of the proxy, but that is often optional.
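A rough node equivalent of that routing with http-proxy-middleware, reusing the URLs from the example above (the options shown are a sketch, not a complete configuration):

    const express = require('express');
    const { createProxyMiddleware } = require('http-proxy-middleware');

    const app = express();
    app.use('/products', createProxyMiddleware({
      target: 'https://productservice.internal.com:3000',  // internal URL
      changeOrigin: true,
      xfwd: true,                          // add X-Forwarded-* headers with the original client details
      pathRewrite: { '^/products': '' },   // public route -> internal root
    }));

    app.listen(8080);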
MISBEHAVING BACKEND COMPONENT
Perhaps this is the root of the problem: if a website returns the internal URL, e.g. in redirects or image URLs, then it is misbehaving. Many tech stacks have a property such as BaseUrl that fixes this.

Can I make AJAX requests to different domains, or is that a cross-site limitation?

I am planning to develop a project that will access different services on different domains using AJAX, so that it can get different types of data from each of them.
At the beginning I thought that, due to cross-site restrictions, this couldn't be done, so I would have to use a different approach or maybe a bridge (make the calls to my server, which would call the others behind the scenes), but the bridge could become a performance issue.
But then I was testing Angular using Google's API and realized that it just works. I mean, I could make AJAX calls to my localhost (though I know localhost may work just because it's localhost) using a script loaded from googleapis.com.
Now I wonder if it is possible or not to have a page with ajax calls to other domains like: mail.mydomain.com, profiles.mydomain.com, media.mydomain.com, and so on. And if so, can that be done just like that or are there any limitations? Because I remember that some years ago I had trouble doing things like that due to the cross-script block.
Just in case it helps, I'm planning to use Angular to get the data and paint it over the views.
Thanks.
Use JSON-P for cross domain AJAX. http://json-p.org/
Yes, there are limitations, but they can be worked around easily.
Setting the HTTP header "Access-Control-Allow-Origin" to "*" on the server does it.
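For example, on a Node/Express server (a sketch; in production you would usually list specific origins rather than "*"):

    const express = require('express');
    const app = express();

    // Allow pages on other origins to call this API with plain XHR/fetch.
    app.use(function (req, res, next) {
      res.setHeader('Access-Control-Allow-Origin', '*'); // or e.g. 'https://mail.mydomain.com'
      next();
    });

    app.get('/data', function (req, res) {
      res.json({ ok: true });
    });

    app.listen(3000);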

How to make search-engine friendly pages which display dynamic cross-domain api driven content?

As part of a product we are deploying, clients need to access a remote API on our servers to get content and data. However, for various reasons and for some clients, a solution where the entire page lives on our servers is not desirable (reasons include control over design, but mostly SEO, and them wanting this content to be available under "their domain")... A script that accesses the API server-side is not desirable due to other issues.
My idea follows (and I will point out its flaws so others can please suggest alternatives):
1) Make a simple script to be hosted on the client's server which will catch all traffic on a certain URI path (a catch-all script, similar to any framework router), i.e. /MyApp/*. This script would always return a single response: a "loader JavaScript and styling"...
2) Through the JavaScript returned by the script above, extract the URL, process the URI after the desired path /MyApp/[*], and send it to an external call with JSONP or regular CORS AJAX; the response is then styled appropriately and displayed.
With this, URLs such as /MyApp/abc and /MyApp/def would have the same HTML/JS in the browser source, but the JS would load different data from the AJAX call, therefore showing different content...
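A sketch of what that loader script might do (the /MyApp/ prefix is from the question; the API URL and element id are placeholders):

    // Loader returned by the catch-all route on the client's server.
    // It takes the part of the URL after /MyApp/, fetches the matching content
    // from the API over CORS, and injects it into the page.
    (function () {
      var slug = window.location.pathname.replace(/^\/MyApp\//, '');
      fetch('https://api.example.com/content/' + encodeURIComponent(slug))
        .then(function (resp) { return resp.text(); })
        .then(function (html) {
          document.getElementById('content').innerHTML = html;
        });
    })();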
This would seem like a good solution; the only drawback is that, from my understanding, Google and other search engines wouldn't ever be able to access the content for abc and def. They would only see the "loader JavaScript and styling" (obviously enough, they aren't going to run the JS)...
So this is better than #! in that it wouldn't mess with the URLs, but it would still depend on JS, so it's not search-engine friendly...
Due to server restrictions, I'd much rather have a simple "catch-all" page and call the API from the client side than impose minimum requirements such as curl, etc. Plus, I'd have access to the end user's IP address more easily this way (although I could make a more elaborate proxy, which would make installing it much harder on clients' servers)...
Is there a way of achieving this without connecting to the API from the server side?
The easiest method of doing this IMO is to have an AJAX controller (assuming MVC design) to handle all remote requests. Have each action in your controller return JSON, and then you have easy access to the data with a server-side call.
Otherwise you are using the #! solution (which you don't like, and rightly so..), or using JSONP (a hassle as well).

Is it still necessary to check if a visitor's browser supports Ajax?

If I have Ajax code on my website:
1) Is it still necessary to check if a user's browser supports Ajax? Don't they all?
2) if so, is there an non-ActiveX approach to check this? I'd really like to avoid ActiveX as some security filters could flag it as potential malware.
It's not a question of whether browsers support 'ajax'; that's just a pithy term used to describe the process of a client retrieving data from a server via JavaScript, typically asynchronously, and typically using the XMLHttpRequest object.
That object is not defined in IE 5-6 (they use an ActiveX equivalent instead), so you have to write code to compensate, or use a library such as jQuery which encapsulates this.
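The classic feature check looks like this; testing for window.XMLHttpRequest first means no ActiveX is touched on any modern browser:

    // Prefer the native object; only fall back to ActiveX on IE5/IE6.
    function createXhr() {
      if (window.XMLHttpRequest) {
        return new XMLHttpRequest();                     // IE7+ and all modern browsers
      }
      if (window.ActiveXObject) {
        return new ActiveXObject('Microsoft.XMLHTTP');   // old IE fallback
      }
      return null;                                       // no Ajax support at all
    }

    var xhr = createXhr();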
So, what you should be asking is whether your site gracefully degrades if Javascript is not available on the client. i.e. can users still get to the content without Javascript?

Cross-site scripting (XSS)

I am loading content from another page and, depending on the content of that page, changing the content of my page, and this is giving me cross-site issues.
When I use an iframe, since the content is from another domain, the content of the iframe becomes inaccessible.
When I use AJAX and try to inject the content as plain HTML, the XMLHttpRequest object throws a permission-denied exception due to the cross-domain restriction.
When I use JSONP, such as getJSON in jQuery, it only supports GET, which is not adequate for further processing.
I wonder what other options I can try. I've heard that Dojo, GWT and Adobe AIR handle some of this, but I don't know which one is best.
Thanks,
Ebe.
Without JSON-P, your only option is to run a proxy script on your own server that fetches the content from the external site and pipes it back to the browser.
The browser fetches the content from the script on your server, hence no cross-domain issues, but the script on your server dynamically fetches it from the external site.
There's an example of such a script in PHP here: http://www.daniweb.com/code/snippet494.html (NB. I haven't personally used it).
If you have control over both domains, take a look at EasyXDM. It's a library which wraps cross-browser quirks and provides an easy-to-use API for communicating in client script between different domains using the best available mechanism for that browser (e.g. postMessage if available, other mechanisms if not).
Caveat: you need to have control over both domains in order to make it work (where "control" means you can place static files on both of them). But you don't need any server-side code changes.
To add to what RichieHindle says, there are some good scripts (Python + cron) that you can plonk on your server to check a POST/GET location for changes and cache the results on your server.
Either keep the polling frequency low (once every 10 minutes / once per day) or you might get blacklisted by the target.
This way, a local cache won't incur the HTTP overhead on every AJAX call from the client.
