Azure and CORS Access-Control-Allow-Origin with ajax and php

First, I'm not on the web side of our world, so be nice to the backend guy.
A quick background: for a personal need I've developed a Google Chrome extension. They are basically a webpage loaded in a Chrome window and... yeah, that's it. Everything is on the client side (scripts, styles, images, etc.); only the data come from a server through ajax calls. A cron job calls a php script every hour to generate two files. One, data.json, contains the "latest" data in json format. The other, hash.json, contains the hash of the data. The client Chrome application uses local storage: if the remote hash differs from the local one, it simply retrieves the data file from the remote server.
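For context, a minimal sketch of what such a generator script could look like; the two file names come from the question, but everything else (the data source, the choice of md5) is an assumption:

```php
<?php
// Hypothetical hourly job. fetch_latest_data() is a stand-in for
// whatever actually produces the data.
$data = json_encode(fetch_latest_data());

// Write the payload and its hash side by side, so the client can poll
// the tiny hash.json and only download data.json when it has changed.
file_put_contents(__DIR__ . '/Core/data.json', $data);
file_put_contents(__DIR__ . '/Core/hash.json', json_encode(['hash' => md5($data)]));
```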
As I have a BizSpark account with Azure, my first idea was: an Azure Web Site with php for the script, a simple homepage and the generated files, and the Azure Scheduler for the jobs.
I've developed everything locally and everything is running fine... but once on the Azure platform I get this error:
XMLHttpRequest cannot load http://tso-mc-ws.azurewebsites.net/Core/hash.json. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost:23415' is therefore not allowed access.
But what I really can't understand is that I'm able (and you'll be too) to get the file with my browser... So I just don't get it... Based on some posts I've found on SO and other sites, I've also tried to manipulate the config and add extra headers; nothing seems to be working...
Any idea?

But what I really can't understand is that I'm able (and you'll be too) to get the file with my browser... So I just don't get it
So when you type http://tso-mc-ws.azurewebsites.net/Core/hash.json into your browser's address bar, it is not a cross-domain request. However, when you make an AJAX request from an application running in a different domain (http://localhost:23415 in your case), that is a cross-domain request, and because CORS is not enabled on your website, you get the error.
As far as enabling CORS is concerned, please take a look at this thread: HTTP OPTIONS request on Azure Websites fails due to CORS. I've never worked with PHP/Azure Websites, so I may be wrong with this link, but hopefully it should point you in the right direction.
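If the json files are served through a small php script rather than as static files, the missing header can be added directly. A minimal sketch (the wildcard origin and the hash.php wrapper are assumptions; in practice you would restrict the origin):

```php
<?php
// Hypothetical hash.php wrapper around the static file. The first
// header below is exactly what the failing response was missing.
header('Access-Control-Allow-Origin: *');
header('Content-Type: application/json');

readfile(__DIR__ . '/hash.json');
```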

OK, this will perhaps be a bit of a troll answer, but that's not my point (I'm a .NET consultant, so... nothing against MS).
I picked a Linux Azure virtual machine, installed Apache and PHP, configured Apache, set some rights, defined the header for CORS and configured a cron job, all in +/- 30 minutes... As my goal was to get it running, the problem is solved: it's running.

Related

Fastly CDN Heroku url redirecting

I recently added the Fastly add-on to my Heroku application, and when Fastly was provisioned I got a test url, which is as follows:
https://felix-homes-herokuapp-com.global.ssl.fastly.net/
Whenever I click on this url it gets redirected to https://felix-homes.herokuapp.com for some unknown reason.
Note my nodejs app uses Heroku-SSL-Redirect. Is it because of this?
I have already followed the setup guide and raised multiple issues with support:
https://support.fastly.com/hc/en-us/requests/323620?page=1
And the nearest question I found on SO is the following:
Adding Fastly to a Heroku app does not forward to proper url
Clearing the browser cache or changing browsers did not help me. Can you please try hitting the Fastly url on your computer and let me know if you also face the same redirect problem?
Yes, very likely the library (Heroku-SSL-Redirect) is the issue.
In the end, you have two separate requests: an encrypted HTTPS/SSL request from the browser, and then an unencrypted request from Fastly to Heroku.
Your node-application and the library only see the unencrypted request and return the redirect.
There are two ways to solve this:
1. You configure Fastly to do encrypted requests to Heroku as its backend.
2. Every routing / proxy layer (Fastly, but also the Heroku routing layer) typically uses the X-Forwarded-Proto HTTP header to tell the backend application that the initial request was already encrypted. So either heroku-ssl-redirect doesn't look at that header, or it got lost somewhere on the way; see the sketch below.
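The poster's app is Node, but the idea is language-independent. Here is a minimal php sketch of the check an SSL-redirect layer has to do behind a proxy (the header name is the de-facto standard; the rest is illustrative):

```php
<?php
// Behind Fastly or the Heroku router the app only ever sees plain
// HTTP, so the client's original scheme must be read from the
// X-Forwarded-Proto header set by the proxy.
$proto = $_SERVER['HTTP_X_FORWARDED_PROTO'] ?? 'http';

if ($proto !== 'https') {
    // Redirect only when the *client's* request was not HTTPS.
    // Ignoring the header makes every proxied request look insecure,
    // which is what produces the unwanted redirect described above.
    header('Location: https://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'], true, 301);
    exit;
}
```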

Proxy the right way to go for external REST Api?

We have a need to consume an external REST Api and dynamically update content on our website, and we have run into the age-old problem of cross-site scripting and Ajax.
I've read up on JSONP; however, I don't want to go down that route in a million years, as it really seems like a rather dirty hack.
As a solution to this issue, is it "right" and "proper" to have a local service that acts as a proxy for any requests to an external Api? So on the client there would be an Ajax call to ../RestProxy/MakeRequest, passing it the details of the request it needs to make to the external api; it performs the request and returns anything passed back.
Any thoughts would be appreciated.
There are three ways to do this:
1. JSONP
This is accepted by many popular APIs and frameworks. jQuery makes it easy. I would recommend this.
2. Proxy
Works pretty much as you described. It adds an extra step, server code and server load for you. However, it does allow you to filter or otherwise manipulate the results before sending them to the client (see the sketch after this list).
3. Rely on Access-Control-Allow-Origin
This is a header that the server can set to allow you to read json directly from their server even though you aren't on the same domain. It eliminates the need for the jsonp hack, but it requires that the server be set up to support it, and it requires a web browser that supports it.
Access-Control-Allow-Origin is supported in:
IE8+
Firefox 3.6+
Safari 4.0+
Chrome 6+
iOS Safari 3.2+
Android browser 2.1+
If you need to support IE7, then this option isn't for you.
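To make option 2 concrete, a minimal php sketch of such a proxy (the endpoint name ../RestProxy/MakeRequest comes from the question; the API base url, the path parameter and the whitelist check are assumptions):

```php
<?php
// Hypothetical RestProxy/MakeRequest endpoint: the client sends the
// path it wants, and the proxy forwards it to the one API we allow.
$api_base = 'https://api.example.com/';  // assumed external API
$path     = $_GET['path'] ?? '';

// Never proxy arbitrary urls -- pin every request to the known host.
if (!preg_match('#^[a-z0-9/_-]+$#i', $path)) {
    http_response_code(400);
    exit;
}

header('Content-Type: application/json');
echo file_get_contents($api_base . $path);
```

The client then only ever talks to its own origin, so the same-origin policy never comes into play.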

Cross Domain request for service using SproutCore

I have been trying to get this resolved, without any success.
I have a webapp residing on my domain, say www.myDomain.com. I need to call a service which is present on another domain, say www.anotherDomain.com/service.do?
I'm using SproutCore's SC.Request.getUrl(www.anotherDomain.com/service.do?) to call that service.
I get an error that says, Origin www.myDomain.com is not allowed by access-control-allow-origin.
When I was in dev stages, and using sc-server, the issue was resolved using proxies. Now that I have deployed the app to an actual server, I replaced all the lines where I had set up the proxy with the actual domain name. I have started getting that error again.
The problem is that I CANNOT MAKE ANY CHANGES to the server on the other domain. All the posts that I have come across state that the server on the other domain ought to provide the Access-Control-Allow-Origin header, and that it ought to support the OPTIONS verb.
My question is, is it possible for me to connect to that service using SproutCore's SC.Request.getUrl() method?
Additionally, the other posts that I have read mentioned that a simple GET request ought not to be preflighted. Why then are my requests going out as OPTIONS instead of GET?
Thanks a ton in advance! :D
This is not a SproutCore issue; it's a javascript Same Origin Policy issue. (As for the preflight: a GET stops being a "simple" request as soon as it carries custom headers, such as a non-form Content-Type or X-Requested-With, which is most likely why your requests go out as OPTIONS first.)
If you can't modify the production server, you have no option but to develop your own proxy server, and have your proxy hit the real service.
This is effectively replacing sc-server in your production environment.
All this server would do is take the incoming request and pass it along to www.anotherDomain.com/service.do.
You would need to make sure you pass along all parameters, cookies, headers, the http verb, etc....
This is far from ideal, because now errors can occur in more places. Did the real service fail? Did the proxy fail? etc.
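A rough sketch of that forwarding step using php's curl extension, to show what "passing everything along" involves (the target url comes from the question; the header and cookie handling here is simplified and assumed):

```php
<?php
// Hypothetical proxy script: replay the incoming request against the
// real service, preserving the verb, query string, body and cookies.
$target = 'http://www.anotherDomain.com/service.do?' . $_SERVER['QUERY_STRING'];

$ch = curl_init($target);
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, $_SERVER['REQUEST_METHOD']); // keep the verb
curl_setopt($ch, CURLOPT_HTTPHEADER, ['Cookie: ' . ($_SERVER['HTTP_COOKIE'] ?? '')]);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

$body = file_get_contents('php://input'); // keep the request body, if any
if ($body !== '') {
    curl_setopt($ch, CURLOPT_POSTFIELDS, $body);
}

echo curl_exec($ch); // hand the raw response back to the SC app
curl_close($ch);
```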
If you could modify the other domain, you could
1) deploy your SC app there.
2) put in the CORS headers so you could make cross-domain requests

Google checkout callback can't seem to reach https server

I am trying to implement Google Checkout (GCO) on a new server; the process seemed to work fine on the old server.
The error from the GCO integration console is the timeout error you might expect if there is load on the server and/or the response takes longer than 3 seconds.
To perform a test (not integrating with my database), I have set up some code to send an email to me instead. If I hit the https url manually, I get the email and I can see output on the screen. If I then leave it at that, Google still returns the Timeout error and I don't get an email. So I have doubts as to whether Google is even able to hit the https url.
I did temporarily attempt to use the insecure url for testing, and indeed I received the email; however that isn't the route we've developed for, so the problem is something to do with the secure url specifically.
I have looked into the certificate, which is a UTN-USERFirst-Hardware certificate listed as accepted on http://checkout.google.com/support/sell/bin/answer.py?answer=57856 . I have also tried temporarily disabling the firewall, with no joy. Does anyone have any suggestions?
Good to hear you figured out the problem.
I'm adding the links below to add a little more context for future readers about how Google Checkout uses HTTP Basic Authentication:
http://code.google.com/apis/checkout/developer/Google_Checkout_XML_API.html#urls_for_posting
http://code.google.com/apis/checkout/developer/Google_Checkout_XML_API.html#https_auth_scheme
http://code.google.com/apis/checkout/developer/Google_Checkout_HTML_API_Notification_API.html#Receiving_and_Processing_Notifications
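For illustration, a hedged php sketch of validating that Basic Authentication on a notification endpoint, assuming the merchant-id:merchant-key scheme those docs describe (the credential values are placeholders):

```php
<?php
// Hypothetical notification handler. Per the docs above, Google
// Checkout authenticated its callbacks with HTTP Basic Auth, using
// the merchant id and key as the credentials.
$merchant_id  = 'YOUR_MERCHANT_ID';   // placeholder
$merchant_key = 'YOUR_MERCHANT_KEY';  // placeholder

if (($_SERVER['PHP_AUTH_USER'] ?? '') !== $merchant_id
    || ($_SERVER['PHP_AUTH_PW'] ?? '') !== $merchant_key) {
    header('WWW-Authenticate: Basic realm="GCO"');
    http_response_code(401);
    exit;
}

// Credentials check out: read and process the notification payload.
$notification = file_get_contents('php://input');
```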

localhost :: cross domain ajax

Is there any way to tell your localhost that it can do cross domain ajax calls?
I need this for my testing.
If it is a browser-specific issue: I am using Google Chrome.
Cheers.
It's very possible. Let's start with a dev browser.
Step 1: Download Chromium
Windows -- http://www.chromium.org/getting-involved/download-chromium
Mac -- http://www.macupdate.com/app/mac/36244/chromium/
There should be a build ready to go, but these locations change over time. So if these end up as 404's, do a Google search for "Windows Chromium download" and you'll find it.
Step 2: Then run the executable with this flag after it: --disable-web-security
Windows -- Create a shortcut to the executable and add the flag in the shortcut's Properties. Or run it from [CMD].
Mac -- Open up a terminal and run this straight from there with the flag.
And you should be good to go. I also set up a quick Apache service and ran through a 127.0.0.1-configured domain, but localhost should be just fine.
I hope this helps you!
No, it's absolutely not possible. If it could be disabled by the user then it would be the main target for anyone with nefarious or dubious intent, and as prone as any other software to exploitation. It's difficult enough making secure software, without painting on even more attractive targets.
The only way to implement cross-domain Ajax is to route requests via a server-side script.
It's worth mentioning that there is, perhaps, a glimmer of hope for you: in the form of cross-window messaging with HTML5 postMessage.
It's probably worth your having a read of some related (though I'm not sure they're duplicate) questions:
Why the cross-domain Ajax is a security concern?
Firefox Cross Domain Request
Edited in response to comment:
So you mean have a script that takes the params, adds them to the request, sends it out, and then echos out the response object?
Essentially yes. In picture format:
client  |----------------->| server side |----------------------->| remote domain
browser |<----- ajax ------|   script    |<------------------------|
Edited to add that this is now sort of possible, using Cross-Origin Resource Sharing (CORS), in which a script from one domain sends an Origin HTTP header stating the URL of the page, and the server can respond (if configured to do so) either with an error (if CORS is disabled or unsupported) or with the requested data.
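To illustrate the server side of that exchange, a small php sketch that answers the Origin header against a whitelist (the allowed origins and the OPTIONS handling are assumptions for illustration, not a spec quote):

```php
<?php
// Hypothetical CORS-aware endpoint: echo the Origin back only if we
// trust it, and answer the preflight before doing any real work.
$allowed = ['http://www.myDomain.com', 'http://localhost:23415'];
$origin  = $_SERVER['HTTP_ORIGIN'] ?? '';

if (in_array($origin, $allowed, true)) {
    header('Access-Control-Allow-Origin: ' . $origin);
    header('Access-Control-Allow-Methods: GET, OPTIONS');
}

if ($_SERVER['REQUEST_METHOD'] === 'OPTIONS') {
    exit; // preflight answered; the browser follows up with the real GET
}

header('Content-Type: application/json');
echo json_encode(['ok' => true]); // stand-in payload
```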
References:
CORS compatibility.
Cross-Origin Resource Sharing, at the W3.org.
Enable Cross-Origin Resource Sharing.
