Mule4: "Remotely Closed" error for HTTPS POST - https

HTTPS POST requests succeed in Postman both with and without port 443, but they fail in the Mule application with the error Remotely Closed.
I also tried to reach the host server from the Command Prompt using PING, TELNET, and TRACERT; those fail with Request Timed Out.
Can you please let me know where the issue lies?
Workaround:
For Mule 4, remove the Global Element "HTTP Request Configuration" and pass the URL directly.
I am working with the SAP Revsym REST APIs and this worked for me.
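For illustration, a minimal sketch of what the Mule 4 XML could look like with the URL passed directly on the request; the endpoint below is a placeholder, not the actual SAP Revsym URL:
<!-- Hypothetical sketch: no global "HTTP Request Configuration"; the URL is passed directly -->
<!-- (api.example.com/v1/orders is a placeholder endpoint) -->
<http:request method="POST" url="https://api.example.com/v1/orders"/>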

Please share your configuration details. My guess is that the configuration defines a path and the HTTP Request has a path as well; the two paths combined send the request to the wrong place. That is the only explanation I can see for the information provided.
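To illustrate that guess (the host, port, and paths below are placeholders, not values from the question): if the global configuration already carries a base path and the request repeats it, the two are concatenated.
<!-- Base path defined on the global configuration... -->
<http:request-config name="HTTP_Request_configuration" basePath="/v1">
    <http:request-connection host="api.example.com" port="443" protocol="HTTPS"/>
</http:request-config>
<!-- ...and repeated on the request: the effective path becomes /v1/v1/orders -->
<http:request method="POST" config-ref="HTTP_Request_configuration" path="/v1/orders"/>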

Using a proxy that requires authentication with pybliometrics

I am using pybliometrics, a Python interface to the Scopus API, to download the abstracts of some papers.
Unfortunately Scopus only works inside the network of the university that subscribed to it. I am currently at home and whenever I try to download something using pybliometrics it gives me the following error:
pybliometrics.scopus.exception.Scopus401Error: The requestor is not authorized to access the requested view or fields of the resource
I need to use my university's proxy in order to access the internet with my university's IP address. The proxy has a WPAD configuration file available, but I can't figure out how to use it with pybliometrics. The pybliometrics documentation says to add a block like this to the configuration file:
[Proxy]
ftp = socks5://127.0.0.1:1234
http = socks5://127.0.0.1:1234
https = socks5://127.0.0.1:1234
But this proxy requires authentication. How can I specify the proxy username and password?
EDIT: I have tried setting up the block in config.ini like:
[Proxy]
ftp = http://username:password#proxy.thing.it:8080
http = http://username:password#proxy.thing.it:8080
https = http://username:password#proxy.thing.it:8080
but it still fails with the following error message:
requests.exceptions.ProxyError: HTTPSConnectionPool(host='api.elsevier.com', port=443): Max retries exceeded with url: /content/abstract/scopus_id/84983158344?view=META_ABS (Caused by ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 407 Proxy Authentication Required')))
From our perspective the API will work via a proxy as long as the proxy is configured correctly. I would suggest you speak to the provider of the proxy to see if they can help.
We don't have specific instructions on how to use the APIs with a proxy (as there are many different proxies and possible configurations); however, the general instructions are here:
https://service.elsevier.com/app/answers/detail/a_id/29026/supporthub/elsevieraccess/
To me your new proxy block looks suspicious: it funnels ftp and https requests through http as well. Maybe try ftp and https as the protocols in the corresponding entries.
The other solution is to ask Scopus Integration Support for an InstToken, which you use instead of a proxy. You then specify the InstToken in the configuration file as well.
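For illustration, a sketch of what config.ini could look like with protocol-matched proxy entries (using the conventional user:password@host:port form) and, alternatively, an InstToken. The hostnames, ports, and keys are placeholders, and the exact section layout may vary between pybliometrics versions:
[Proxy]
ftp = ftp://username:password@proxy.example.org:8080
http = http://username:password@proxy.example.org:8080
https = https://username:password@proxy.example.org:8080
[Authentication]
APIKey = your-scopus-api-key
InstToken = your-insttoken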
The problem was that my proxy requires DigestAuth rather than BasicAuth.

Micronaut server and httpclient behind corporate proxy

I'm running a Micronaut microservice on Windows 7.
My GET Request looks like : http://localhost:8080/maps/myreq.
The controller uses an HTTP client to send a request to an external website: image.maps.api.here.com
When running without the proxy, everything works and the response is fine (an image).
But when running behind the proxy, the connection times out. The proxy works fine for any other application or browser.
How do I configure the Micronaut server behind the proxy so that it routes requests properly?
Edit: when sending a request, the Netty server responds with the error: unable to connect to image.maps.api.here.com:xx.xx.xx.xx:xxxx, where xx.xx.xx.xx:xxxx is the proxy.
How do I configure the Micronaut server behind the proxy so that it routes requests properly?
You can set the https.proxyHost, https.proxyPort, http.proxyUser and http.proxyPassword system properties. A common place to do that is in the MN_OPTS environment variable. For example, you could set MN_OPTS to have a value like "-Dhttps.proxyHost=127.0.0.1 -Dhttps.proxyPort=3128 -Dhttp.proxyUser=test -Dhttp.proxyPassword=test".
See https://docs.micronaut.io/1.1.0/guide/index.html#proxy for more info.
I hope that helps.
I fixed the problem by setting the proxy for the CLI, but also by setting the proxy in application.yml as described here:
https://github.com/micronaut-projects/micronaut-core/issues/1611
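For reference, a sketch of the application.yml side, assuming Micronaut's standard HTTP client proxy properties; the proxy host, port, and credentials are placeholders:
# Hypothetical proxy settings for the Micronaut HTTP client (values are placeholders)
micronaut:
  http:
    client:
      proxy-type: http
      proxy-address: proxy.mycompany.com:3128
      proxy-username: test
      proxy-password: test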

How to simulate "server down" situation using Burp suite proxy?

I tried using Burp Suite to simulate the above for a particular domain. I am a beginner with it and don't know how to set this up; I couldn't find a built-in option for it.
Also, if there is no option, will I need to forward the request to some random IP address so that the connection gets timed out?
Update
Actually, I doubt whether redirecting to some invalid IP will give a connection timeout. Or will it? I just want to know what response I will get if the server is down.
There isn't a feature to simulate "server down" but you can redirect as you suggest. If you redirect to an unused IP address (perhaps 192.168.99.99) you will normally get a timeout.
You can configure this in Proxy > Options > Proxy Listeners > Edit > Request Handling
You can just edit the response code to be a server error. You can do this automatically using match and replace as well.
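For example, a rule along these lines (under Proxy > Options > Match and Replace; the exact status line depends on the target server) turns successful responses into server errors:
Type:    Response header
Match:   HTTP/1.1 200 OK
Replace: HTTP/1.1 503 Service Unavailable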

proxying through corporate firewall

I'm trying to get some protocols to work through my company's firewall. So far I have been successful in masking either HTTP or HTTPS data by running an HTTP proxy on localhost and another on a remote server I own. The communication is done via $_POSTed and received modified .bmp files that contain a header and the encrypted, serialized request array.
This works fine, but there are a few drawbacks that make me think I might have taken a wrong approach.
Firstly, I do not use Apache's mod_proxy. Instead, I just created a local subdomain (proxy.localhost) and use that in the browser's proxy settings; the subdomain's index.php does all the work. This creates some problems: I cannot use HTTP and HTTPS simultaneously, or the server complains about either "http on a https enabled port" or an incorrect SSL response length.
The second problem is, well, other protocols. I could make use of FTP, SFTP, remote desktop, SSH, you name it... I need them.
There are two solutions I can think of: the first is to run a PHP script from the CLI so that it listens on a predefined port and handles the requests differently; the other is some sort of SSH tunnel. The problem is that I haven't had any success with freeSSHd and PuTTY because of my ignorance.
Thanks in advance for any advice.
I used the free versions of the Bitvise SSH Client and Server and they seem to work just fine.
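For anyone wanting to stick with plain OpenSSH instead, a sketch of the tunnel approach; the user, host, and ports are placeholders, and it assumes the remote server's SSH daemon is reachable through the firewall:
# Dynamic (SOCKS) tunnel on local port 1080, routed through the remote server
ssh -N -D 1080 user@remote-server.example.com
# Then point the browser's SOCKS proxy at localhost:1080.
# Individual protocols can be forwarded explicitly, e.g. remote desktop to an internal host:
ssh -N -L 3389:internal-host:3389 user@remote-server.example.com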

Node.js: Running example of Chat?

Trying to set up the Node.js chat example on Windows x64.
Command line:
D:\Websites\dev\chat>node server.js
Server at http://127.0.0.1:8001/
Now, with the server part running, I try http://dev/chat/index.html.
After submitting a name, it gives me "error connecting to server".
Same error message on http://chat.nodejs.org/
Does the thing actually work? =)
Do I need to set up Apache's mod_proxy to forward /join to port 8001?
Some of the issues are with using http://dev/chat/index.html and also, I suspect, with:
Do I need to set up Apache's mod_proxy to forward /join to port 8001?
Node's http module is more for creating the server than it is for integrating with other servers like Apache. (It's possible, e.g. iisnode, but not the default.)
While node server.js is running, you should be able to access index.html via either:
http://localhost:8001/
http://127.0.0.1:8001/
Then, /join, /recv, /send, etc. should be able to route through the same origin.
Otherwise, using http://dev/ has 2 problems:
Requests will route based on the current address. For example, /join will request http://dev/join rather than http://127.0.0.1:8001/join, likely resulting in a 404 response. And, even if you modified the client script to specify the origin...
Same-origin policy. Pages requested from http://dev/ cannot make Ajax requests to http://127.0.0.1:8001 without exceptions, which this demo does not have established.
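If you later do want Apache in front of the demo, a hedged mod_proxy sketch (assuming mod_proxy and mod_proxy_http are enabled) would forward everything to the Node server so the chat endpoints stay on one origin:
# Hypothetical Apache config: forward all chat traffic to the Node server on port 8001
ProxyPass        / http://127.0.0.1:8001/
ProxyPassReverse / http://127.0.0.1:8001/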

Resources