Connection between JavaFX 2.2 WebEngine and WebScarab fails - proxy

Here is the deal. I want to set WebScarab as the proxy for my JavaFX 2.2 WebEngine browser. I tried a solution as described here and also had a look at the links on the same page, but I get an error in the web view. Here is a sample of my code:
public WebBrowser() {
    // Route HTTP traffic through the local WebScarab proxy (localhost:8008 by default).
    System.setProperty("http.proxyHost", "localhost");
    System.setProperty("http.proxyPort", "8008");
    //ProxySelector.setDefault(new AlwaysProxySelector());
    WebView view = new WebView();
    view.setMinSize(10, 10);
    view.setPrefSize(500, 400);
    final WebEngine eng = view.getEngine();
    eng.load("http://www.google.gr");
}
As you can see, I also tried the alternative solution with an AlwaysProxySelector class (commented out above).
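For reference, since the class body is not shown in the question, here is a minimal sketch of what such an AlwaysProxySelector might look like. The routing to localhost:8008 matches WebScarab's default listener mentioned below; everything else is an assumption, not the asker's actual code:

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Proxy;
import java.net.ProxySelector;
import java.net.SocketAddress;
import java.net.URI;
import java.util.Collections;
import java.util.List;

// Hypothetical sketch: a ProxySelector that routes every request
// through a fixed local proxy (WebScarab's default listener).
public class AlwaysProxySelector extends ProxySelector {
    private static final Proxy PROXY =
            new Proxy(Proxy.Type.HTTP, new InetSocketAddress("localhost", 8008));

    @Override
    public List<Proxy> select(URI uri) {
        return Collections.singletonList(PROXY);
    }

    @Override
    public void connectFailed(URI uri, SocketAddress sa, IOException ioe) {
        // A real implementation might fall back to a direct connection here.
    }
}

It would be installed with ProxySelector.setDefault(new AlwaysProxySelector()), as in the commented-out line in the constructor above.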
WebScarab runs its proxy on localhost at port 8008 by default. First I start WebScarab and then my JavaFX application. And here is the problem: the application does not throw any exception in the console, but in the web view the page I want to load never appears. Instead, a message from WebScarab loads in the web view, as below:
WebScarab encountered an error trying to retrieve
GET http://www.google.gr:80/ HTTP/1.1
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
User-Agent: Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/535.14 (KHTML, like Gecko) JavaFX/2.2 Safari/535.14
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Cache-Control: no-cache
Pragma: no-cache
Host: www.google.gr
Proxy-Connection: keep-alive
The error was :
Connection refused: connect
at java.net.DualStackPlainSocketImpl.waitForConnect(Native Method)
at java.net.DualStackPlainSocketImpl.socketConnect(Unknown Source)
at java.net.AbstractPlainSocketImpl.doConnect(Unknown Source)
at java.net.AbstractPlainSocketImpl.connectToAddress(Unknown Source)
at java.net.AbstractPlainSocketImpl.connect(Unknown Source)
at java.net.PlainSocketImpl.connect(Unknown Source)
at java.net.SocksSocketImpl.connect(Unknown Source)
at java.net.Socket.connect(Unknown Source)
at org.owasp.webscarab.httpclient.URLFetcher.connect(URLFetcher.java:363)
at org.owasp.webscarab.httpclient.URLFetcher.fetchResponse(URLFetcher.java:224)
at org.owasp.webscarab.plugin.saml.SamlHTTPClient.fetchResponse(SamlHTTPClient.java:84)
at org.owasp.webscarab.plugin.proxy.CookieTracker$Plugin.fetchResponse(CookieTracker.java:130)
at org.owasp.webscarab.plugin.proxy.BrowserCache$Plugin.fetchResponse(BrowserCache.java:101)
at org.owasp.webscarab.plugin.proxy.RevealHidden$Plugin.fetchResponse(RevealHidden.java:100)
at org.owasp.webscarab.plugin.proxy.BeanShell$Plugin.fetchResponse(BeanShell.java:229)
at org.owasp.webscarab.plugin.proxy.ManualEdit$Plugin.fetchResponse(ManualEdit.java:243)
at org.owasp.webscarab.plugin.proxy.ConnectionHandler.run(ConnectionHandler.java:228)
at java.lang.Thread.run(Unknown Source)
As far as I can understand from the error, it seems that WebScarab successfully receives my request but cannot retrieve the page and return it to the WebView. The same problem occurs for every page, not only Google.
I do not want to use any other proxy, only WebScarab, so I can take advantage of its plugins. Thanks for any ideas.

For some reason, WebScarab is unable to reach the sites in question. This clearly has nothing to do with WebView, so we can eliminate it from the equation.
The most likely problem is that a proxy is configured in WebScarab itself that WebScarab cannot reach. To check this, go to Tools -> Proxies and make sure that no proxy is configured (unless you need an upstream proxy to reach the sites normally, in which case make sure that it is properly configured).
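If in doubt, a quick standalone check, independent of both WebView and WebScarab's UI, can confirm whether the machine can reach the target site directly and whether WebScarab is actually listening. This is just a sketch; the host and timeouts are arbitrary:

import java.net.InetSocketAddress;
import java.net.Socket;

public class ProxyCheck {
    public static void main(String[] args) throws Exception {
        // Can this machine reach the target site directly (no proxy)?
        try (Socket direct = new Socket()) {
            direct.connect(new InetSocketAddress("www.google.gr", 80), 3000);
            System.out.println("Direct connection to the site: OK");
        }
        // Is WebScarab accepting connections on its default port?
        try (Socket proxy = new Socket()) {
            proxy.connect(new InetSocketAddress("localhost", 8008), 3000);
            System.out.println("WebScarab is listening on localhost:8008");
        }
    }
}

If the first connection fails, the machine itself needs an upstream proxy to get out, which is exactly what would then have to be configured under Tools -> Proxies.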

Related

JMeter: 443 failed to respond

I'm a newbie to JMeter, so be gentle. I have a simple test plan that hits a login page. At this point it doesn't even log in, just loads the page. The problem is that on each run, the request for one of the CSS files on the page results in:
org.apache.http.NoHttpResponseException: example.com:443 failed to respond
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:141)
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:56)
at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:259)
at org.apache.http.impl.AbstractHttpClientConnection.receiveResponseHeader(AbstractHttpClientConnection.java:286)
at org.apache.http.impl.conn.DefaultClientConnection.receiveResponseHeader(DefaultClientConnection.java:257)
at org.apache.jmeter.protocol.http.sampler.hc.ManagedClientConnectionImpl.receiveResponseHeader(ManagedClientConnectionImpl.java:199)
at org.apache.jmeter.protocol.http.sampler.MeasuringConnectionManager$MeasuredConnection.receiveResponseHeader(MeasuringConnectionManager.java:212)
at org.apache.http.protocol.HttpRequestExecutor.doReceiveResponse(HttpRequestExecutor.java:273)
at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:125)
at org.apache.http.impl.client.DefaultRequestDirector.tryExecute(DefaultRequestDirector.java:684)
at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:486)
at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:835)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
at org.apache.jmeter.protocol.http.sampler.HTTPHC4Impl.executeRequest(HTTPHC4Impl.java:697)
at org.apache.jmeter.protocol.http.sampler.HTTPHC4Impl.sample(HTTPHC4Impl.java:455)
at org.apache.jmeter.protocol.http.sampler.HTTPSamplerProxy.sample(HTTPSamplerProxy.java:74)
at org.apache.jmeter.protocol.http.sampler.HTTPSamplerBase$ASyncSample.call(HTTPSamplerBase.java:2034)
at org.apache.jmeter.protocol.http.sampler.HTTPSamplerBase$ASyncSample.call(HTTPSamplerBase.java:2002)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
If I run the test again, all other requests succeed, but this one file fails every time.
Here's the request data:
GET https://example.com/mysite/style/bootstrap.min.css
GET data:
Cookie Data:
JSESSIONID=0753E35583DE3C882E88BE4C37FCFB47; BIGipServerdemo-tomcat=654354624.51526.0000
Request Headers:
Connection: keep-alive
Upgrade-Insecure-Requests: 1
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8
User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/67.0.3396.79 Safari/537.36
Accept-Language: en-US,en;q=0.9
Accept-Encoding: gzip, deflate, br
Host: example.com
When I load this page in Chrome or Firefox, all files load, including this CSS file, without any problem. How can I fix this NoHttpResponseException so that this CSS file (and, I'm assuming, more down the road) returns as expected?
I was having the same issue with both embedded resources and the main HTTP requests.
Disabling the 'Use KeepAlive' option in the HTTP Request Samplers fixed the issue for me.
This file might be in your browser cache. Try a clean session and inspect the request using your browser's developer tools to double-check whether the file is still requested. If not, it's an issue with your application deployment. If it is, I can only think of comparing the requests sent by JMeter and by a real browser, using either the View Results Tree listener plus the browser developer tools, or an external sniffer tool like Wireshark.
As a workaround, you can disable the check for missing embedded resources by adding the following line to the user.properties file:
httpsampler.ignore_failed_embedded_resources=true
A JMeter restart will be required to pick the property up.
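If you would rather not edit the file, the same property can be passed on the command line when starting JMeter (the test plan name here is a placeholder), in which case no separate restart step is needed:
jmeter -n -t testplan.jmx -Jhttpsampler.ignore_failed_embedded_resources=true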
You should check the HTTP Request sampler's "Retrieve All Embedded Resources from HTML Files" checkbox:
Tell JMeter to parse the HTML file and send HTTP/HTTPS requests for all images, Java applets, JavaScript files, CSSs, etc. referenced in the file
That way you don't need to make an explicit request for each CSS file (as a browser would).

Omit display of HTTP requests by URL or other means

When using Fiddler for web debugging with Visual Studio, the vast majority of requests appear to be Visual Studio keepalives, which have nothing to do with development of the website.
I just discovered the "Filters" tab, which includes "Show only if URL contains", but I don't see anything like "Do not show if URL contains".
The traffic in question resembles:
GET /67e56dbd9660475b992bdb4884bf024c/arterySignalR/poll?transport=longPolling&connectionToken=AQAAANCMnd8BFdERjHoAwE%2FCl%2BsBAAAAA9mo0FfMdkuV%2FOrook6XLgAAAAACAAAAAAADZgAAwAAAABAAAACdwdngu4Q3YaxNPSSSB6SaAAAAAASAAACgAAAAEAAAAEpHLB83IL2dS4l5v3LvZ4woAAAAPAHEqYMxK%2Fwwk%2Be%2FEq3MMrbOM4ao8Nhip4toaFxOxM0ARXitnQCueRQAAADELXsi%2FlcBeN%2BcFxQKtcMb7Yvd3A%3D%3D&messageId=d-B39A7C95-E4%2C0%7CE7%2C4%7CE8%2C0&requestUrl=http%3A%2F%2Flocalhost%3A56602%2FReticleDatabase%3Fsubmit%3DSearch%26process%3D%26device%3D%26lev_no%3D999%26xadj%3Dtrue%26xadj%3Dfalse%26xmag1%3Dtrue%26xmag1%3Dfalse&browserName=Internet+Explorer&tid=8&callback=jQuery18206701631324945791_1391540298842&_=1391540397878 HTTP/1.1
Accept: application/javascript, */*;q=0.8
Referer: http://localhost:56602/ReticleDatabase?submit=Search&process=&device=&lev_no=999&xadj=true&xadj=false&xmag1=true&xmag1=false
Accept-Language: en-US
User-Agent: Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)
Accept-Encoding: gzip, deflate
Host: localhost:61010
Connection: Keep-Alive
How can I filter out (i.e., not display) this junk data in Fiddler?
Fiddler offers many ways to filter data. The most powerful mechanism is FiddlerScript. Click Rules > Customize Rules. Scroll to OnBeforeRequest and add:
if (oSession.urlContains("SignalR/poll?")) {
    oSession["ui-hide"] = "FiddlerScript hides signalR";
}
Save the file.
(It's not entirely clear that SignalR's longPolling requests really have "nothing to do with development of the website", but if you don't want to see them, they're easily hidden.)
Incidentally, the next build of Fiddler's Filters tab will include a Hide URLs containing option. Thanks for the suggestion.

XMLHTTP / HTTPRequest returning 404 response on custom error page requests

I have a program that uses an XMLHTTPRequest to gather content from another web page.
The problem is that the web page has cloaking custom errors set up (i.e., /thisurl doesn't literally exist as a file on their web server; it is generated by the custom 404 error file), so it's not returning the page it shows in the browser. Instead, my HTTPRequest response contains the default 404 error response from that custom error page.
By using this website http://web-sniffer.net/ I have narrowed down what the problem may be, but I don't know how to fix it.
Web-sniffer can submit the request with three different HTTP versions:
HTTP/1.1
HTTP/1.0 (with Host header)
HTTP/1.0 (without Host header)
When I use HTTP/1.1 or HTTP/1.0 (with Host header), I get the correct response (HTML) from the page. But when I use HTTP/1.0 (without Host header), it does not return the content; instead it returns a 404 error script (showing the custom error page).
So I have concluded that the problem may be due to the Host header not being present in the request.
But I am using MSXML2.XMLHTTP.3.0 and haven't been able to read the page using HTTP/1.1 or HTTP/1.0 (with Host header). The code looks like this:
Set objXML = Server.CreateObject("MSXML2.XMLHTTP.3.0")
objXML.Open "GET", URL, False
objXML.setRequestHeader "Host", MyDomain ' Doesn't work with or without this line
objXML.Send
Even after adding a Host header to the request, I still get the 404 error template returned by that custom error script in my response, the same as the HTTP/1.0 (without Host header) option on the web-sniffer site. It should return 200 OK, as it does with the first two options on web-sniffer and in a web browser.
So I guess my question is: how is that website (web-sniffer.net) able to get the proper response with its first two HTTP version options, and how can I emulate this in my app? I want to get the right page, but it's only returning the 404 error from their 404 error template.
In response to an answerer, I have provided screenshots from two separate cURL requests below, one from each of my servers.
I executed the same cURL command with the same URL (pointing to a site on the main host): curl -v -I www.site.com/cloakedfile. But it looks like it's not working on the main server, where it needs to work. It can't be an issue with the request itself, because from secondary to secondary it works fine; these are identical applications/sites, just with different IPs/hostnames. It appears to be an internal issue that may not be about the application side of things.
I don't have any experience with MSXML2.XMLHTTP.3.0, but from your problem statement I understand that the issue is certainly due to some HTTP header field that is wrongly set or missing from your request.
By default, HTTP/1.1 clients set the Host header. For example, if you are connecting to google.com, the request will look like this:
GET / HTTP/1.1
Host: google.com
The "Host" header should have the domain name of the server in which the requested resource is residing. Severs that has virtual hosting will get confused if "Host:" header is not present. This is what happens with groups.yahoo.com if you havent specified Host header
$ nc groups.yahoo.com 80
GET / HTTP/1.1
HTTP/1.1 400 Host Header Required
Date: Fri, 06 Dec 2013 05:40:26 GMT
Connection: close
Via: http/1.1 r08.ycpi.inc.yahoo.net (ApacheTrafficServer/4.0.2 [c s f ])
Server: ATS/4.0.2
Cache-Control: no-store
Content-Type: text/html; charset=utf-8
Content-Language: en
Content-Length: 447
This should be the same issue you are facing. Also make sure that you are sending the domain name of the server from which you are trying to fetch the resource, and that the Host header has a colon ":" to delimit the value, as in "Host: www.example.com".
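To make the difference concrete, here is a minimal sketch in plain Java of a raw HTTP/1.1 GET that sends the Host header explicitly, mirroring the nc transcript above (www.example.com is a stand-in host, not from the question):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.net.Socket;

public class RawGet {
    public static void main(String[] args) throws IOException {
        try (Socket socket = new Socket("www.example.com", 80)) {
            Writer out = new OutputStreamWriter(socket.getOutputStream(), "ISO-8859-1");
            out.write("GET / HTTP/1.1\r\n");
            out.write("Host: www.example.com\r\n"); // virtual hosts route requests on this header
            out.write("Connection: close\r\n\r\n");
            out.flush();
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(socket.getInputStream(), "ISO-8859-1"));
            for (String line; (line = in.readLine()) != null; ) {
                System.out.println(line);
            }
        }
    }
}

Dropping the Host line should reproduce the kind of failure shown in the nc transcript on virtually hosted servers.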

Unable to receive PUT data using CodeIgniter, REST, lighttpd url-rewrite and Backbone.js

I am using CodeIgniter with REST APIs and Backbone.js. The web server is lighttpd. I have also changed lighttpd.conf to rewrite URLs so that index.php is removed from them, as follows:
url.rewrite-once = (
"/(.*)\.(.*)" => "$0",
"/(css|files|img|js|stats)/" => "$0",
"^/([^.]+)$" => "/index.php/$1"
)
The call from JS reaches the CodeIgniter PUT function, but the value of $this->put() is empty. Please help me trace why the PUT data is empty.
Note: this works perfectly fine when I don't rewrite the URLs. It also works with the Apache web server.
While debugging the REST_Controller.php file, in the function "protected function _parse_put()", I found that the contents of "php://input" are also empty.
Following are the request headers and payload taken from the Chrome web inspector while making a PUT call:
**Request Headers**
PUT http://10.82.0.160/cav_settings/api/2 HTTP/1.1
Pragma: no-cache
Origin: http://10.82.0.160
Cache-Control: no-cache
User-Agent: Mozilla/5.0 (X11; Linux i686) AppleWebKit/537.17 (KHTML, like Gecko) Chrome/24.0.1312.56 Safari/537.17
Content-Type: application/json
Accept: application/json, text/javascript, */*; q=0.01
Referer: http://10.82.0.160/settings/2
X-Requested-With: XMLHttpRequest
**Request Payload**
{"aggregation":{"enabled":"false","mode":""},"dns":{"auto":"false","servers":["10.0.0.5","10.20.0.20"]},"id":"2","dhcp":"false","ip":"10.82.0.160","netmask":"255.255.255.0","default_gw":"","mtu":"1500"}
You can use Backbone.emulateJSON = true;
With this setting, Backbone serializes the model as an application/x-www-form-urlencoded form parameter instead of a raw JSON request body. This related question covers it best:
Backbone.js PUT/DELETE problems with Codeigniter REST server

403 Forbidden error in Firefox only, works in Chrome and Safari

I have a Firefox quicksearch bookmark that runs a Maxmind query. This worked until recently. I type 'ip 82.176.230.15' (for example) into the URL bar and it queries Maxmind to retrieve the location of the IP:
http://www.maxmind.com/app/locate_demo_ip?ips=82.176.230.15
Within the past week, for reasons unknown, I now get a 403/Forbidden error when I try to access Maxmind.
"You don't have permission to access /app/locate_demo_ip on this server"
Strangely, the same URL is accessible in Chrome and Safari. I can also access the same URL with Firefox, Chrome, or Safari on my Mac.
I've deleted all cookies, disabled all addons, and still can't get it to work. Any idea what could be happening? I know that the 403 has to come from the server, so I don't know why it would work in other browsers. And it's been going on for days, definitely not some glitch on their server.
Get an HTTP debugger like Firebug or Fiddler (not sure Fiddler will work with Firefox, but it probably will if you set it up right).
Look at the difference between using your quick bookmark and just typing the URL. The server could return 403 whenever it feels like it; see if there's any difference, and what it is.
I recently had the same issue and was able to fix it.
In my case the problem was in the headers that Firefox sent.
In particular, it was because of this header:
"User-Agent": "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:100.0) Gecko/20100101 Firefox/100.0"
What makes the website refuse the connection is this part of the string, "(X11; Ubuntu; Linux x86_64; rv:100.0)", and I have no idea why.
I found a nice solution: you can change Firefox's settings to present another browser's User-Agent string in this header (Chrome's or Safari's), and it can make sites with this problem work.
Here is how to do it:
Type about:config into the URL bar. Press Enter.
Create a new string entry with the key general.useragent.override and set its value to: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/102.0.5005.115 Safari/537.36. I found that Google Chrome uses this string as its User-Agent header, probably to prevent such issues.
Now save the setting and reload your page; it should work now.
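As an alternative not mentioned in the original steps, the same override can be placed in a user.js file in the Firefox profile directory; Firefox applies it at startup:
user_pref("general.useragent.override", "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/102.0.5005.115 Safari/537.36");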
