HTTPS Request From a Credential Provider DLL - winlogon

I've been creating a Credential Provider DLL that authenticates via the internet before allowing login. However, this hasn't worked as well as I expected, because my WinHTTP request isn't being sent. I've confirmed this using Wireshark, but I can't figure out why no requests go out. I've checked that my code is actually calling the functions properly, and it is, yet the HTTP request never leaves the machine. I'm slightly confused at this point and want to know whether there is some blanket block on HTTPS requests at login.

If you are writing a C++ CP-V2 (Credential Provider V2), you can easily use the CPPREST SDK (cpprestsdk) library for the request.
Note: the DNS address and the digital certificate you are using must both be valid.
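For illustration, a minimal sketch of such a call with cpprestsdk might look like the code below. The endpoint URL and JSON field names are placeholders, and a real credential provider should avoid blocking the logon UI while the request is in flight.

// Minimal sketch of an HTTPS call from C++ using cpprestsdk.
// The endpoint and payload are hypothetical placeholders.
#include <cpprest/http_client.h>
#include <cpprest/json.h>

bool verify_user_online(const utility::string_t& user, const utility::string_t& password)
{
    using namespace web;
    using namespace web::http;
    using namespace web::http::client;

    http_client client(U("https://auth.example.com"));   // placeholder endpoint

    json::value body;
    body[U("user")] = json::value::string(user);
    body[U("pass")] = json::value::string(password);

    try
    {
        // .get() blocks until the response arrives; acceptable for a sketch.
        http_response response = client.request(methods::POST, U("/login"), body).get();
        return response.status_code() == status_codes::OK;
    }
    catch (const std::exception&)
    {
        // DNS, TLS (e.g. an invalid certificate) and other transport failures land here.
        return false;
    }
}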

Related

On MacOSX, QNetworkAccessManager gets into an infinite loop when invalid auth credentials specified

In my cross-platform app, I use QNetworkAccessManager to send HTTP requests to my HTTP service that requires authentication. I recently upgraded to Qt 5, and to my complete surprise, on Mac OS X my app would send a massive number of requests to my service as fast as possible in some scenarios.
After doing some debugging, it turns out that this would only happen when I specify bad auth credentials in my requests. QNetworkAccessManager would indefinitely resend requests to my service if invalid username/password were specified in my HTTP requests.
My code has worked for a long time in previous Qt versions, so I decided it had to be something in Qt 5.
I stumbled upon the following enhancement that was added in Qt 5: https://bugreports.qt.io/browse/QTBUG-22033
Basically, the idea behind this enhancement is to check the keychain for a username/password when an intermediate proxy requires auth credentials. It turns out this was badly implemented: the code was attached to the QNetworkAccessManager::authenticationRequired() signal instead of the proxyAuthenticationRequired() signal.
The interesting part about this problem is that I don't set a proxy for my application or for the QNetworkAccessManager that I use, which makes this problem so hard to debug!
Because of the bad placement, this "keychain querying" happens on any authenticationRequired signal. The underlying getProxyAuth() method calls "SecKeychainFindInternetPassword" with a blank hostname, which matches the first "Internet Password" entry in my keychain and uses it to resend the request to my service with those credentials. Imagine my surprise when I saw one of my other/personal passwords being sent to my HTTP service!
Not only is this a security issue, it also causes an infinite loop in your app. I opened a bug with Qt about this: https://bugreports.qt.io/browse/QTBUG-30434
Is there a temporary solution? There is! I looked for a workaround to this issue for a while. It is a nasty hack, but it works until the Qt developers get their ducks in a row. The hack works because it ensures that "SecKeychainFindInternetPassword" does not match any entries in the keychain, and therefore that "keychain query" is skipped.
Basically, I am setting the proxy hostname to " " (a single space) instead of "", which prevents the matching that causes the infinite loop in my app.
Workaround:
QNetworkProxy proxy = manager_->proxy();
proxy.setHostName(" ");
manager_->setProxy(proxy);
I hope this is resolved in the next version of Qt, so I can remove this horrible hack.
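For context, here is a minimal sketch of where that workaround could sit, right after the manager is constructed. The function name, the manager_ variable and the placeholder credentials are illustrative assumptions, not part of the original code.

// Sketch: apply the " " proxy-hostname workaround immediately after creating
// the QNetworkAccessManager, before any requests are issued.
#include <QNetworkAccessManager>
#include <QNetworkProxy>
#include <QNetworkReply>
#include <QAuthenticator>

QNetworkAccessManager *createManager(QObject *parent)
{
    auto *manager_ = new QNetworkAccessManager(parent);

    // Workaround: a single-space hostname keeps SecKeychainFindInternetPassword
    // from matching any "Internet Password" entry in the keychain.
    QNetworkProxy proxy = manager_->proxy();
    proxy.setHostName(QStringLiteral(" "));
    manager_->setProxy(proxy);

    // Normal credential handling stays on authenticationRequired.
    QObject::connect(manager_, &QNetworkAccessManager::authenticationRequired,
                     [](QNetworkReply *, QAuthenticator *auth) {
                         auth->setUser(QStringLiteral("user"));       // placeholder
                         auth->setPassword(QStringLiteral("secret")); // placeholder
                     });

    return manager_;
}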

security of sending passwords through Ajax

Is it ok to pass passwords like this or should the method be POST or does it not matter?
xmlhttp.open("GET","pas123",true);
xmlhttp.send();
Additional info: I'm building this using a local virtual web server, so I don't think I'll have HTTPS until I put some money up front for a real web server :-)
EDIT: According to Gumo's link encodeURIComponent should be used. Should I do xmlhttp.send(encodeURIComponent(password)) or would this cause errors in the password matching?
Post them via HTTPS, then you don't need to worry about that ;)
But note that the page which sends that data must be accessed over HTTPS too, due to the same-origin policy.
As for your money limitation, you can use self-signed certificates, or you can get a certificate for free from https://startssl.com/.
All HTTP requests are sent as text, so the particulars of whether it's a GET or POST or PUT... don't really matter. What matters for security in transmission is that you send it via SSL (and handle it safely on the other end, of course).
You can use a self-signed cert until something better is made available. It will be a special hell later if you don't design with https in mind now :)
It shouldn't matter; the main reason for not using GET on conventional web forms is that the details are visible in the address bar, which isn't an issue when using AJAX.
All HTTP requests (GET/POST/etc.) are sent in plain text, so they could be captured using network tracing software (e.g. Wireshark). To protect against this you will need to use HTTPS.
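As a language-agnostic illustration of that advice, here is a rough sketch using the cpprestsdk client mentioned earlier: the password travels in a POST body over HTTPS rather than in the URL, and only the value is percent-encoded (roughly the analogue of encodeURIComponent), so the server can decode it and password matching is not affected. The URL and field name are placeholders.

// Sketch (cpprestsdk): send the password in a POST body over HTTPS, with the
// value percent-encoded. URL and field name are hypothetical.
#include <cpprest/http_client.h>

void send_password(const utility::string_t& password)
{
    using namespace web;
    using namespace web::http;
    using namespace web::http::client;

    http_client client(U("https://example.com"));   // placeholder HTTPS endpoint

    // Encode only the value, not the whole body, similar to encodeURIComponent().
    utility::string_t body = U("password=") + uri::encode_data_string(password);

    client.request(methods::POST, U("/login"), body,
                   U("application/x-www-form-urlencoded"))
          .wait();   // a real client would inspect the response
}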

Cross Domain request for service using SproutCore

I have been trying to get this resolved, without any success.
I have a webapp residing on my domain, say www.myDomain.com. I need to call a service which is present on another domain, say www.anotherDomain.com/service.do?
I'm using SproutCore's SC.Request.getUrl(www.anotherDomain.com/service.do?) to call that service.
I get an error that says, Origin www.myDomain.com is not allowed by access-control-allow-origin.
When I was in the dev stages and using sc-server, the issue was resolved using proxies. Now that I have deployed the app to an actual server, I have replaced all the lines where I had set up the proxy with the actual domain name, and I have started getting that error again.
The problem is that I CANNOT MAKE ANY CHANGES to the server on the other domain. All the posts I have come across state that the server on the other domain ought to provide the Access-Control-Allow-Origin header and that it ought to support the OPTIONS verb.
My question is, is it possible for me to connect to that service using SproutCore's SC.Request.getUrl() method?
Additionally, the other posts that I have read mention that a simple GET request ought not to be preflighted. Why then are my requests going out as OPTIONS instead of GET?
Thanks a ton in advance! :D
This is not a SproutCore issue; it's a JavaScript Same-Origin Policy issue.
If you can't modify the production server, you have no option but to develop your own proxy server, and have your proxy hit the real service.
This is effectively replacing sc-server in your production environment.
All this server would do is take the incoming request and pass it along to www.anotherDomain.com/service.do.
You would need to make sure you passed all parameters, cookies, headers, the http verb, etc....
This is far from ideal, because now errors can occur in more places. Did the real service fail? Did the proxy fail? etc.
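For illustration, a minimal sketch of such a forwarding proxy (here using cpprestsdk's experimental http_listener, with hypothetical addresses, and handling only GET) could look like this:

// Minimal same-origin forwarding proxy sketch using cpprestsdk. Addresses are
// hypothetical; a production proxy would also forward headers, cookies, bodies
// and other HTTP verbs, as described above.
#include <cpprest/http_client.h>
#include <cpprest/http_listener.h>
#include <cstdio>

int main()
{
    using namespace web::http;
    using namespace web::http::client;
    using namespace web::http::experimental::listener;

    http_client upstream(U("https://www.anotherDomain.com"));    // service we cannot modify
    http_listener listener(U("http://localhost:8080/service"));  // same-origin endpoint for the app

    listener.support(methods::GET, [&upstream](http_request incoming)
    {
        // Re-issue the request against the real service, keeping the query string.
        auto path = U("/service.do?") + incoming.request_uri().query();
        upstream.request(methods::GET, path)
                .then([incoming](http_response upstream_response)
                {
                    // Relay status and body back to the browser; .get() blocks for brevity.
                    auto body = upstream_response.extract_utf8string().get();
                    incoming.reply(upstream_response.status_code(), body);
                });
    });

    listener.open().wait();
    std::getchar();   // keep the proxy running
    return 0;
}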
If you could modify the other domain, you could
1) deploy your SC app there.
2) put in the CORS headers so you could make cross domain requests

Authentication protocol for AJAX-based service

We are currently in the process of building a new service. The intent is to use PHP for the backend and, more importantly, to use AJAX rather than regular full-page requests for the frontend, so there will only ever be one initial page request.
While doing this, we'd also like to make sure that it is secure.
So the problem is this:
Login is based on a regular username/password. The AJAX frontend will make requests to the server as necessary, but what should be done to avoid unnecessary security issues? Hashing the password is obviously one measure; it can be further improved by also including a server-generated token in the hash, and so on.
I'm sure there are established protocols for these things, but I really don't know their merits... or even what they're called or where to find them (note: the server itself is trusted).
Would using HTTPS make all this redundant? Or is for instance hashing the password still strictly necessary (theoretical question)? Would using a protocol still be important/useful/pointless over HTTPS?
http://en.wikipedia.org/wiki/Secure_Remote_Password_protocol, is that something I should look into? Does HTTPS make SRP redundant? Are there more suitable protocols, especially over HTTPS?
If you use PHP sessions, you should be fine. When you make an AJAX call, your page will send its session cookie along with the request, telling the server which session it corresponds to. Obviously, do this over HTTPS to make the communication secure.

HTTP digest authentication for AJAX requests

Hey SO, so I've got an API I'm making calls to in a browser application. Said API lives on a server that requires whitelisting and HTTP Digest Authentication.
To meet the whitelisting requirement, I'm running all API calls through a proxy, which is whitelisted. The calls are originating from an iFrame, currently populated by an index.html file.
What I need to know is how I can authenticate via HTTP Digest in the background. Most of the resources I can find online seem to involve the original HTTP Digest Authentication setup, but what I'm looking to do is automate login.
Despite the non-secretive subject matter, it is somehow critical that I keep the digest parameters obfuscated from users. Perhaps I could change the served file to index.php and then somehow set the magic headers? Even then, if the calls are made via XHR, would the index.php headers authenticate the separate request?
Overall, I'm just lost, and the API developers in question are not exactly responsive, so thought I'd turn here.
It appears that in the end, this was not possible. I had to switch to building a thin back-end to route requests through.
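For what it's worth, the thin back-end route can be fairly small. Below is a hedged sketch of the server-side piece using cpprestsdk, with a hypothetical API URL and credentials; it relies on the underlying HTTP stack (WinHTTP on Windows) to answer the Digest challenge with the configured credentials, so they never reach the browser.

// Sketch of a server-side helper that holds the Digest credentials and calls
// the protected API on behalf of the browser. URL and credentials are placeholders.
#include <cpprest/http_client.h>

web::http::http_response call_protected_api()
{
    using namespace web;
    using namespace web::http;
    using namespace web::http::client;

    http_client_config config;
    // Credentials stay on the back-end; the HTTP stack responds to the 401
    // challenge with them (WinHTTP supports Digest on Windows).
    config.set_credentials(credentials(U("api_user"), U("api_secret")));

    http_client client(U("https://api.example.com"), config);
    return client.request(methods::GET, U("/resource")).get();   // blocks for brevity
}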
