Sudden self-signed certificate problem in multiple environments - Laravel

I am using Laravel and am trying to send email using Mailgun and Laravel's native Mailable class. The emails are generated as a result of submitting one of several forms. I have been developing these features for a couple of days and have successfully been receiving emails from my local machine (using Homestead) throughout this time.
I have uploaded my code to a server, tested the forms, and everything is still good. Additionally, a colleague of mine has downloaded the code and tests are still successful. So in short, two local Homestead environments and one Ubuntu server are all working as expected.
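For context, the sending side of a setup like this is the stock Laravel Mailable pattern. A minimal sketch, assuming a hypothetical FormSubmitted Mailable and placeholder names (none of these identifiers are from the post):

    <?php

    namespace App\Http\Controllers;

    use App\Mail\FormSubmitted;           // hypothetical Mailable subclass
    use Illuminate\Http\Request;
    use Illuminate\Support\Facades\Mail;

    class FormController extends Controller
    {
        // Handle one of the form submissions and send the notification email.
        public function submit(Request $request)
        {
            // With MAIL_DRIVER=mailgun, this goes out via the Mailgun HTTP API
            // over TLS -- the hop where cURL error 60 later surfaced.
            Mail::to('recipient@example.com')->send(new FormSubmitted($request->all()));

            return back()->with('status', 'Thanks, we got your message.');
        }
    }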
Suddenly, this functionality has stopped working in all three environments. Upon submitting any form, I get the following error message:
GuzzleHttp \ Exception \ RequestException
cURL error 60: SSL certificate problem: self signed certificate in
certificate chain (see http://curl.haxx.se/libcurl/c/libcurl-errors.html)
This started happening without any changes being made to the code, and it is happening in both local environments and on the server - all at the same time.
I have absolutely no idea what could cause this. Is this mail-related and something to do with Mailgun? Is this really a certificate issue, and maybe something to do with a corporate certificate that allows traffic to leave the network? I am at a loss.
Is anyone able to offer any advice?
Thanks

Well, after a good night's sleep, I returned this morning and found all my forms working again. I have no idea what the cause was, but it seems to have been a problem with an external service, not my setup.
Thanks
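For anyone who hits the same cURL error 60: a quick way to check whether the certificate chain, rather than your application, is at fault is to call the Mailgun API directly with Guzzle and an explicit CA bundle. A minimal diagnostic sketch, assuming a placeholder API key and bundle path:

    <?php
    // Run outside Laravel to see whether TLS verification fails there too.
    require 'vendor/autoload.php';

    use GuzzleHttp\Client;
    use GuzzleHttp\Exception\RequestException;

    $client = new Client();

    try {
        $response = $client->get('https://api.mailgun.net/v3/domains', [
            'auth'   => ['api', 'key-xxxxxxxx'],  // placeholder Mailgun secret
            // Point at a freshly downloaded cacert.pem
            // (https://curl.se/docs/caextract.html) to rule out a stale local
            // bundle; 'verify' => true uses the system bundle instead.
            'verify' => '/path/to/cacert.pem',
        ]);
        echo 'OK: ', $response->getStatusCode(), PHP_EOL;
    } catch (RequestException $e) {
        // cURL error 60 here means the chain presented to this machine is
        // untrusted -- often a TLS-inspecting proxy or an upstream certificate
        // misconfiguration rather than anything in the application code.
        echo $e->getMessage(), PHP_EOL;
    }

If this fails identically from several unrelated networks at the same time, the upstream service's chain is the more likely culprit, which matches how this incident resolved itself.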

Related

Not able to intercept traffic from nike.com login request

I'm using BurpSuite to intercept the HTTP/HTTPS requests sent when logging in on https://www.nike.com/. I'm trying to achieve this with the following steps:
Opening BurpSuite and Firefox
Turning on the proxy intercept
Turning on FoxyProxy in Firefox
Opening the website and trying to log in
These steps usually work for me, but in this case I'm getting a "we are unable to connect to our servers" error, with nothing appearing on the Intercept tab when trying to log in (I have tried turning off the intercept feature, but it still yields the same issue, so I think it might be a proxy and certificate problem).
To clear things up:
I'm running the latest versions of BurpSuite and Firefox.
I have installed and reinstalled the BurpSuite certificate using this guide.
I've tried all of this on my iMac, MacBook, and iPhone; all of these devices yield the same issue.
Below is the error message I'm getting:
Here are my BurpSuite proxy settings:
(in the Certificate tab, I just have "Generate CA-signed per-host certificates" selected)
I have been using BurpSuite for over 2 years now, and this is the first time I'm facing such an issue. Any help is appreciated.
I have shared my question with PortSwigger support (the team behind BurpSuite) and got the following response:
Hi
Thanks for your message.
We have reproduced the issue in our testing environment.
It looks like Nike.com are performing a fairly sophisticated check to stop automated tools from accessing parts of their site.
Please let us know if you need any further assistance.
Cheers
Liam Tai-Hogan
PortSwigger Web Security

LimeSurvey RemoteControl API failing on localhost on macOS Sierra

Currently using version 2.57.1+161205 of LimeSurvey.
I recently got a new MacBook Pro laptop and set up my dev environment. Everything seems to be working except when I try to make JSON-RPC calls to the LimeSurvey RemoteControl API!
It fails at getting the session key from LimeSurvey (let alone any other call).
It just pauses for a bit, then returns an error saying it is unable to connect to the server. Before the Mac, I had a Windows-based PC using Vagrant/Homestead, and I had no problems accessing the API.
If I open a browser tab and type in http://lime.app/admin/remotecontrol (lime.app is my vhost pointing to my LimeSurvey installation), I get the list of available API functions, etc.
But when I try to use the functions through the RPC client (I'm using weberhofer/jsonrpcphp), I get the error.
As a test, I tried to get a session key from an online instance of LimeSurvey (it's on a test server). That works perfectly.
But when I'm trying to do the same call locally, it fails each time.
Is there something that needs to be set on the Mac to allow this type of call?
I didn't work out whether this was just a Mac issue, but I seem to have resolved the problem.
I'm using the JSON RPC client from https://github.com/weberhofer/jsonrpcphp
On a whim I thought I'd try a different client, so I decided to use https://github.com/fguillot/JsonRPC instead.
The strange thing is, it works! I have no idea why the previous client stopped working.
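For reference, the session-key call with the fguillot/JsonRPC client looks roughly like this; the vhost URL and credentials are placeholders, not values from the post:

    <?php
    // Minimal sketch: fetch a LimeSurvey session key over JSON-RPC.
    require 'vendor/autoload.php';

    use JsonRPC\Client;

    $client = new Client('http://lime.app/admin/remotecontrol');

    // get_session_key(username, password) returns the key string on success,
    // or an array containing a 'status' element on failure.
    $sessionKey = $client->execute('get_session_key', ['admin', 'password']);
    var_dump($sessionKey);

    // Release the key when finished.
    $client->execute('release_session_key', [$sessionKey]);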
I guess this is a Mac issue, right?
I am not a Mac user, so I can't help you much.
Nevertheless, I would test whether you are pointing to the right server.
To do that, open the remotecontrol_handle file and add some kind of error_log call in the get_session_key function. That will tell you whether LimeSurvey is ever being contacted or not.
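A sketch of that kind of probe; the exact file path varies across LimeSurvey versions, so treat it as an assumption:

    // In application/helpers/remotecontrol/remotecontrol_handle.php (path is
    // an assumption), at the top of the get_session_key() method:
    error_log('remotecontrol get_session_key called at ' . date('c'));

Then tail the PHP error log while the client makes its call; if the line never appears, the request is not reaching LimeSurvey at all.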

Laravel 5 Mailgun works locally (Mac OS) but nothing happens on the live server

I am developing a web application using Laravel 5; the application needs to send email to members when they register.
I use Mailgun as the mail driver. Everything works fine locally on my MacBook Pro, which suggests everything is configured correctly, but when I moved to an Ubuntu server (14.04), Mailgun didn't send any email out, and there was no error message either.
I tried using the curl command on the server and it works fine, which means the server should not be blocking any ports Mailgun needs.
Can anyone please help with this issue?
As this works in one environment and not in another, it's very likely to be a configuration problem, and mail configuration lives in a few places in Laravel.
You'll need to check that pretend in config/mail.php is set to false. See here.
You'll need to make sure your .env file (if you're using .env) is first renamed from .env.example to .env, and then check that the settings are set up correctly for Mailgun. I wrote a small tutorial for Mailgun in Laravel 5 here if you have any problems with that.
You'll need to check that the third-party services file, config/services.php, is configured correctly. More specifically, you need to ensure that the API key is set up correctly; a typical block is sketched after this list.
If none of these work, then you'll almost certainly have some error message in your storage/logs folder; I recommend checking there for hints.
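For reference, a typical Mailgun setup in Laravel 5 looks roughly like this; the domain and key are placeholders you'd replace with values from your Mailgun dashboard:

    <?php
    // config/services.php -- the Mailgun block Laravel's mail driver reads.
    // Keep the real values in .env, e.g.:
    //   MAIL_DRIVER=mailgun
    //   MAILGUN_DOMAIN=mg.example.com   (placeholder)
    //   MAILGUN_SECRET=key-xxxxxxxx     (placeholder)

    return [

        'mailgun' => [
            'domain' => env('MAILGUN_DOMAIN'),
            'secret' => env('MAILGUN_SECRET'),
        ],

    ];

And, as noted above, config/mail.php needs 'pretend' set to false, otherwise Laravel logs the mail instead of sending it.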
Most of the time this happens due to a wrong .env configuration, so try checking that first. Making sure the settings in config/mail.php and .env agree can clear up the confusion between the two.

Web API on IIS 7.5: Unable to download * from localhost

I'm currently trying to deploy a Web API project directly from VS 2013 to my local dev machine, but when I test one of my HTTP GET controllers, I get the following:
My url looks like this:
http://localhost:8081/api/Location/States?queryTerm=Ark
When I test this locally, this URL works and I get my JSON response.
Pay no attention to the web.config error in the background. If I enter anything invalid, I get an IIS error, so I know I'm hitting the right URL.
Side note: I'm using IE8 for testing.
I found the solution, and it may be a config thing: when I set this up as a virtual application under a website, the error went away, and it gave me a yellow screen of death instead (which is a good thing). I was able to deduce this to be an Oracle issue, where it couldn't find the database connection, and then found out it couldn't resolve TNS names.
So I guess Web API cannot be hosted by itself on IIS? I haven't looked into it, but that seems to be the case, unless more tweaking is involved.
I know I'm too late with this response, but I got the same error when trying to deploy an OData v4 application to an IIS server.
On the server I hit the error above, and I tried every configuration change, but nothing helped.
At last, I tried connecting from a client, and it worked.
I realized that the client downloads a JSON metadata file from the server (with OData this is something like the WSDL file of a web service), and the API still works well.
For some reason it can't be downloaded on the server itself, but don't worry - it still works.
Hope this helps!

Deploy using IIS Web Deploy (WMSvc) with basic auth fails but NTLM works?

I'm trying to set up Web Deploy on IIS 7 so that one-click publishing in Visual Studio works.
Every time I try to publish the app I get a 401 error, which seems to be a failure to authenticate against WMSvc. I have set the build output verbosity to detailed and can see the Web Deploy command being used. When I try to run it from the command prompt I get the same 401 error (ERROR_USER_UNAUTHORIZED); however, when I change the authType parameter in the command from basic to NTLM, it works fine and publishes correctly...
As far as I was aware, WMSvc only worked with basic auth and not NTLM. As far as my server config goes, I have tried setting the management service to accept only Windows users and to allow both Windows users and management service users; neither setting seems to make any odds.
I can connect fine to the remote server using IIS Manager locally, but as soon as I try to use any of the export functionality on the remote server I get permission issues over the remote connection. This all seems most odd; can anyone shed some light on this behaviour?
Just providing the answer that worked for me: after searching in vain, I stumbled upon an article by Phil Haack (whilst looking for something else entirely).
It turned out I had a URL ACL defined which was stopping everything from working.
I followed the instructions in that post and it all just worked like it should :-)
I personally wish Web Deploy were a bit less fragile when it comes to setting it up; it works great once you've gone through the pain.
