Do all API calls contain a Referer header on VodaPay Mini-Programs? - vodapay-miniprogram

When inspecting outbound API requests, I see a header called Referer.
In the simulator the value is https://[app-id].hybrid.alipay-eco.com/index.html, where [app-id] is the App ID. I want to confirm whether this is the same in the Sandbox environment and in Production, and whether there is any difference between the simulator and a real device.

The URL https://[app-id].hybrid.alipay-eco.com is the Referer used in the simulator.
The URL https://[app-id].sass.mini-program.com is the Referer used in production. In both cases the App ID is set by VodaPay.
However, if you preview the app on a device (i.e. a dev build from the simulator that is only valid for 15 minutes), no Referer header is sent at all.
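For illustration, a minimal server-side sketch (Node.js/TypeScript) that distinguishes these cases from the Referer value; the App ID and port are hypothetical:

```typescript
// Illustrative check of the Referer header sent by a VodaPay mini-program.
// The App ID below is a hypothetical placeholder.
import * as http from "http";

const APP_ID = "2021000000000001"; // hypothetical App ID assigned by VodaPay

const simulatorReferer = `https://${APP_ID}.hybrid.alipay-eco.com/index.html`;
const productionReferer = `https://${APP_ID}.sass.mini-program.com`;

http
  .createServer((req, res) => {
    const referer = req.headers.referer ?? "";
    if (referer.startsWith(productionReferer)) {
      console.log("request from the production mini-program");
    } else if (referer.startsWith(simulatorReferer)) {
      console.log("request from the simulator");
    } else {
      // Device previews (15-minute dev builds) send no Referer at all,
      // so an empty value is expected and must not be rejected outright.
      console.log("no/unknown Referer (possibly a device preview)");
    }
    res.end("ok");
  })
  .listen(3000);
```

The practical consequence is that any backend logic keyed on the Referer must tolerate its complete absence, or device previews will break.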

Related

Safari and Firefox do not send a cookie when sending an HTTP request to a remote server on the same subdomain, but Chrome does

I have two servers, a.example.com and b.example.com.
A cookie with domain .example.com was set on a.example.com/admin.
When I visit the a.example.com/admin page, an HTTP request is sent from that page to b.example.com.
I ran a packet capture and found that the cookie is not sent when I use Safari or Firefox, but in Chrome the cookie is sent.
So I am wondering why this happens, and whether there is any way to make Safari and Firefox send the cookie.
Check this link, it may help you figure this out: https://discourse.mozilla-community.org/t/webextension-xmlhttprequest-issues-no-cookies-or-referrer-solved/11224/15
It seems that either you need to enable 'third party cookies' or you need to wrap XMLHttpRequest. Also, make sure the website is listed in the permissions section of your manifest file: https://developer.mozilla.org/en-US/Add-ons/WebExtensions/manifest.json/permissions
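If you go the wrapping route, a minimal sketch for a regular page (outside the WebExtension context) might look like this; the endpoint is a placeholder:

```typescript
// Sketch: make a cross-subdomain request carry cookies explicitly.
// The URL is a hypothetical example.
const xhr = new XMLHttpRequest();
xhr.open("GET", "https://b.example.com/api/data");
// Ask the browser to attach cookies (e.g. one scoped to .example.com)
// to this cross-origin request. The server must also respond with
// Access-Control-Allow-Credentials: true and a non-wildcard
// Access-Control-Allow-Origin for the response to be readable.
xhr.withCredentials = true;
xhr.onload = () => console.log(xhr.status, xhr.responseText);
xhr.send();
```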

UIWebView loads https but quickly changes it to http

I use UIWebView to load HTTPS URLs. Most HTTPS URLs work normally in my app, but some URLs cause the UIWebView delegate method webView:shouldStartLoadWithRequest:navigationType: to be called twice.
The first time, the request parameter has the correct HTTPS URL. The second time, the URL in the request parameter has changed to the http scheme, which causes the load to fail with error 1022: "The resource could not be loaded because the App Transport Security policy requires the use of a secure connection."
I used Charles to watch the network traffic and could not see the HTTPS request, which means it is not a 302 redirect. I could not see the HTTP request either, because that request is blocked by the iOS system, so it fails with code 1022.
It's so weird. What could the problem be?
The HTTPS URL loads normally in Safari and Chrome, and when I create a new project containing only a web view that loads the HTTPS URL, it works fine.

CORS with client https certificates

I have a site with two https servers. One (frontend) serves up a UI made of static pages. The other (backend) serves up a microservice. Both of them happen to be using the same (test) X509 certificate to identify themselves. Individually, I can connect to them both over https requiring the client certificate "tester".
We were hiding CORS issues until now by going through an nginx setup that makes the frontend and backend appear to be the same origin. I have implemented the 'Access-Control-Allow-Origin' and 'Access-Control-Allow-Credentials' headers for all requests, with methods and headers for preflight check requests (OPTIONS).
In Chrome, cross-site like this works just fine. I can see that front-end URLs and backend URLs are different sites. I see the OPTIONS requests being made before backend requests are made.
Even though Chrome doesn't seem to need it, I did find the xmlhttprequest object that will be used to perform the request and did a xhr.withCredentials = true on it, because that seems to be what fetch.js does under the hood when it gets "credentials":"include". I noticed that there is an xhr.setRequestHeader function available that I might need to use to make Firefox happy.
Firefox behaves identically for the UI calls. But for all backend calls, I get a 405. When it does this, there is no network connection being made to the server. The browser just decided that this is a 405 without executing any https request. Even though this is different behavior from Chrome, it kind of makes sense. Both the front-end UI and backend service need a client certificate to be chosen. I chose the certificate "tester" when I connected to the UI. When it goes to make a backend request, it could assume that the same client certificate should be used to reach the back-end. But maybe it assumes that it could be different, and there is something else I need to tell Firefox.
Is anybody here using CORS in combination with two-way SSL certificates like this, and have you hit this Firefox problem and fixed it somewhere? I suspect that it's not a server-side fix, but something that the client needs to do.
Edit: see the answer here: https://stackoverflow.com/a/74744206/537554
I haven't actually tested this using client certificates, but I seem to recall that Firefox will not send credentials if Access-Control-Allow-Origin is set to the * wildcard instead of an actual domain. See this page on MDN.
Also there's an issue with Firefox sending a CORS request to a server that expects the client certificate to be presented in the TLS handshake. Basically, Firefox will not send the certificate during the preflight, creating a chicken-and-egg problem. See this bug on Bugzilla.
When using CORS with credentials (basic auth, cookies, client certificates, etc.), all of the following must hold (a server-side sketch follows this list):
Access-Control-Allow-Credentials must be true
Access-Control-Allow-Origin must not be *
Access-Control-Allow-Origin must not be multi-value (neither duplicated nor comma-delimited)
Access-Control-Allow-Origin must be set to exactly the value from the request's Origin header in order for the request to work (either hard-coded that way or if it passes a whitelist of allowed values)
The preflight OPTIONS request must not require credentials (including the client certificate). Part of the purpose of the preflight is to ask what is allowed in a CORS request, and therefore sending credentials before knowing if they are allowed is incorrect.
The preflight OPTIONS request must return a 200-level response, generally 204
Note: For Access-Control-Allow-Origin, you may want to consider allowing the value null, since redirect chains (like the ones typically used for OAuth) can cause the browser to send Origin: null.
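To make those requirements concrete, here is a minimal sketch, assuming a Node.js/TypeScript backend; the allowlist entries and port are hypothetical:

```typescript
// Sketch of CORS-with-credentials handling per the rules above.
import * as http from "http";

const allowedOrigins = new Set([
  "https://frontend.example.com", // hypothetical frontend origin
]);

const server = http.createServer((req, res) => {
  const origin = req.headers.origin;

  // Echo back the exact Origin only if it passes the allowlist;
  // never use "*" and never send multiple values when credentials
  // are involved.
  if (origin && allowedOrigins.has(origin)) {
    res.setHeader("Access-Control-Allow-Origin", origin);
    res.setHeader("Access-Control-Allow-Credentials", "true");
    res.setHeader("Vary", "Origin"); // keep caches from mixing origins
  }

  if (req.method === "OPTIONS") {
    // Preflight: answer without requiring credentials, with a 2xx status.
    res.setHeader("Access-Control-Allow-Methods", "GET, POST, OPTIONS");
    res.setHeader("Access-Control-Allow-Headers", "Content-Type");
    res.writeHead(204);
    res.end();
    return;
  }

  res.writeHead(200, { "Content-Type": "application/json" });
  res.end(JSON.stringify({ ok: true }));
});

server.listen(8443);
```

Note that the "no client certificate required on preflight" rule cannot be expressed in this handler: requesting the certificate happens in the TLS layer (server/nginx configuration), before any HTTP code runs.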

Selective "No 'Access-Control-Allow-Origin' header is present on the requested resource" error

I get this error when I execute a GET request to a remote REST service from a web app running on a local computer. But when I make the same request directly through the browser (Chrome) or through the "Advanced Rest Client Application" (a Chrome extension), everything goes well, even though I can see that there is no 'Access-Control-Allow-Origin' header there either!
Why is that?
You will need the appropriate Access-Control-Allow-Origin header, set to * or to the actual domain from which the client app makes the request, present as part of your response headers. The Chrome browser (for direct navigation) and REST client extensions do not enforce this header; that's why you are able to see the correct output there.
See https://developer.mozilla.org/en-US/docs/Web/HTTP/Access_control_CORS for further info
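For illustration, a sketch of the failing call from the web app's side (the URL is a placeholder); the request may well reach the server, but without the header the browser refuses to expose the response:

```typescript
// Hypothetical cross-origin GET from a page on http://localhost:3000
// to a remote REST service.
fetch("https://api.example.com/items", { method: "GET" })
  .then((res) => res.json())
  .then((data) => console.log(data))
  .catch((err) => {
    // Without Access-Control-Allow-Origin on the response, the browser
    // blocks the read and this rejection fires, even though the request
    // itself may have succeeded at the HTTP level.
    console.error("CORS failure:", err);
  });
```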

Selenium IDE: How to detect secure cookies on page loaded with http://?

I am using Firefox 22 and Selenium IDE 2.2.0.
I have loaded a page in firefox using the HTTP protocol (not HTTPS). I know for sure that the page has set a secure cookie (as a result of an embedded AJAX request). I can verify this using the browser internal url chrome://web-developer/content/generated/view-cookie-information.html - because among other cookies that page shows a cookie like this:
Name: WC_AUTHENTICATION_5122759
Value: 5122759%2cDKppXa7BAqnZ0ERDLb0Wee%2bXqUk%3d
Host: .testserver.dk
Path: /
Expires: At end of session
Secure: Yes
HttpOnly: No
However, when I run assertCookie in the Selenium IDE I can only see the non-secure cookies. I.e. all cookies except the one above are detected by Selenium IDE:
Executing: |assertCookie | glob:WC_AUTHENTICATION_* | | yields this set of visible cookies:
[error] Actual value 'JSESSIONID=0000uCQdh2FZ0ZA8z-O5zcGoUtD:-1;
WC_PERSISTENT=lT8Z5tbkQrvLhNm%2bGyCj%2bh4yPAU%3d%0d%0a%3b2013%2d07%2d05+13%3a18%3a18%2e807%5f1373023098807%2d3048%5f10201%5f5122827%2c%2d100%2cDKK%5f10201;
WC_SESSION_ESTABLISHED=true;
WC_ACTIVEPOINTER=%2d100%2c10201; WC_USERACTIVITY_5122827=5122827%2c10201%2cnull%2cnull%2cnull%2cnull%2cnull%2cnull%2cnull%2cnull%2cy6bjcrZgvCVe5c52BBKvcItxyF5lLravpDq9rd9I0ZmRfRNxcC2oG13Eyug3kKgbtLOHVLxm9T76%0d%0a%2fGJFLp5bOrkPoNqmc38TIr%2fO7eU%2fbd7Mfny2kQg7v6xGweYoRkXYgAEz91rH0QavFhlOjpd12A%3d%3d;'
did not match 'glob:WC_AUTHENTICATION_*'
So does anyone know how I can use the Selenium IDE to verify the presence of secure cookies on a page loaded with http:// (not https://)?
Sadly, what you are trying to do breaks the specification. A secure cookie is supposed to be available only if the connection is secure. Hence, if you are connecting over HTTP, you can't see it.
However, if this is just on your test machine (not for your end users), you can modify the response from the server using Fiddler. With Fiddler you can program something like: if you see this cookie, add another cookie, or strip the Secure flag.
EDIT:
Some background information about Selenium and cookies:
Selenium works through the browser with JavaScript as part of the page. Because it is essentially a part of the page, it has to follow all the same rules as the page. This means that it still has to abide by the security rules on cookies. A secure-only cookie can only be read over a secure connection, so Selenium cannot read a secure cookie if it is not on a secure connection.
The place where HTTP request comes in is that cookies are a part of the HTTP header. Both the request (from the browser) and the response (from the server) have an HTTP header. Cookies are present in both.
You want to verify if the server has set the cookie, so you want to inspect the HTTP response from the server for the presence of the cookie. Because of security restrictions, however, you cannot from Selenium. These security restrictions are enforced by the browser. All reputable browsers enforce these policies, since without these policies, the end user's credentials will be easily compromised.
This is where Fiddler comes in. Fiddler inspects the HTTP data at a lower level, before the browser gets to it. Thus, you can use Fiddler to manipulate the data before it gets to the browser to give some kind of indication that the cookie was present.
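Fiddler scripting aside, the same idea can be sketched as a small Node.js/TypeScript proxy that strips the Secure attribute from Set-Cookie headers before they reach the browser. This is a swapped-in alternative to Fiddler for test machines only; the upstream host, port, and route handling are illustrative:

```typescript
// Tiny HTTP proxy that removes the "Secure" attribute from Set-Cookie
// headers so the cookie stays visible to Selenium over plain HTTP.
// Test-machine use only; host/port are hypothetical examples.
import * as http from "http";

const target = { host: "testserver.dk", port: 80 }; // upstream under test

http
  .createServer((clientReq, clientRes) => {
    const proxyReq = http.request(
      {
        ...target,
        path: clientReq.url,
        method: clientReq.method,
        headers: clientReq.headers,
      },
      (proxyRes) => {
        const headers = { ...proxyRes.headers };
        const cookies = headers["set-cookie"];
        if (cookies) {
          // Drop the Secure flag so the browser keeps the cookie on http://
          headers["set-cookie"] = cookies.map((c) =>
            c.replace(/;\s*Secure/gi, "")
          );
        }
        clientRes.writeHead(proxyRes.statusCode ?? 502, headers);
        proxyRes.pipe(clientRes);
      }
    );
    clientReq.pipe(proxyReq);
  })
  .listen(8080);
```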
