JMeter HTTP request sends no values, and the parameter names are wrong

I created a JMeter HTTP request with parameter names and values.
However, the test result contains no values, only the parameter names, and the parameter names are wrong. :(
Parameters config (screenshot)
Error result:
POST http://localhost:8080/aqnu/loginsys
POST data:
200117003200117003
Cookie Data:
JSESSIONID=438A7FCE7211763AFFF57F89F5A9FCB3
Request Headers:
Connection: keep-alive
Referer: http://localhost:8080/aqnu/login
Accept-Language: zh-CN,zh;q=0.9,en;q=0.8,zh-TW;q=0.7
Origin: http://localhost:8080
DNT: 1
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8
Upgrade-Insecure-Requests: 1
Content-Type: application/x-www-form-urlencoded
Cache-Control: max-age=0
Accept-Encoding: gzip, deflate, br

It might be an issue with the parameters themselves: if you copied and pasted them from some source, they may contain non-printable special characters such as line breaks, so JMeter fails to parse them properly in HTML mode. Try switching to Raw and see how your request looks there.
If the cause is different, unfortunately we won't be able to help without seeing the logs. If you want to check it yourself, or update the question with the logs:
Add the following lines to the log4j2.xml file:
<Logger name="org.apache.jmeter.protocol.http" level="debug" />
<Logger name="org.apache.http" level="debug" />
<Logger name="org.apache.http.wire" level="error" />
These lines increase JMeter's log verbosity for the HTTP protocol.
Restart JMeter to pick up the changes.
Search for your request parameters in the jmeter.log file and see how they look.
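For reference, a minimal sketch of where those Logger entries would sit in log4j2.xml; the Root logger and its jmeter-log appender reference are assumptions based on JMeter's default configuration and should already exist in your file:
<Configuration>
    <Loggers>
        <!-- extra verbosity for the HTTP protocol classes, as suggested above -->
        <Logger name="org.apache.jmeter.protocol.http" level="debug" />
        <Logger name="org.apache.http" level="debug" />
        <Logger name="org.apache.http.wire" level="error" />
        <!-- assumed to match JMeter's default log4j2.xml; keep your existing Root logger -->
        <Root level="info">
            <AppenderRef ref="jmeter-log" />
        </Root>
    </Loggers>
</Configuration>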

Related

Special symbols are missing from the name of a file uploaded using the Google Sites API

The goal: I want to upload a file to my Google Classic Site programmatically using the Google Sites API.
Approach used: to create a file in my Google Classic Site I use a request with the
Content-Type: multipart/related; boundary=END_OF_PART
header and request body like this:
--END_OF_PART
Content-Type: application/atom+xml
<entry xmlns="http://www.w3.org/2005/Atom">
<category scheme="http://schemas.google.com/g/2005#kind"
term="http://schemas.google.com/sites/2008#attachment" label="attachment" />
<link rel="http://schemas.google.com/sites/2008#parent" type="application/atom+xml"
href="https://sites.google.com/feeds/content/domainName/siteName/PARENT_ENTRY_ID"/>
<title>File Title</title>
<summary />
<content type="xhtml">
<div xmlns="http://www.w3.org/1999/xhtml" />
</content>
...
<!-- Other metadata here -->
</entry>
--END_OF_PART
Content-Type: <file mime type here>
... file contents here ...
--END_OF_PART--
The problem: when the file title starts with symbols like '_' (underscore), '%' (percent sign), '*' (asterisk), or '&' (ampersand), the request executes successfully and the file is created on my Google Classic Site with the right content, but with a different title: the symbols listed above are missing from the beginning of the file name.
Is this a known issue in the Google Sites API? What am I doing wrong? How can I handle this problem?

Using UrlRewriteFilter to get Tomcat to return a 301 redirect from http to https

I'm trying to find out if anyone has succeeded in using the UrlRewriteFilter available from http://tuckey.org/urlrewrite/ to do a 301 permanent redirect from HTTP to HTTPS in Apache Tomcat, but I don't seem to be getting anywhere fast.
A number of people have asked the same question and, AFAICS, none have been answered.
If I'm asking in the wrong place then maybe someone would be kind enough to 'redirect' me to the right place.
If it's not possible then perhaps someone could say so.
Thank you.
apache-tomcat-7.0.42
jdk1.8.0_77
CentOS Linux 7.2.1511
urlrewritefilter-4.0.3.jar
The 'standard' configuration as recommended by the Tomcat docs is as follows:
web.xml
<security-constraint>
<web-resource-collection>
<web-resource-name>Secure URLs</web-resource-name>
<url-pattern>/*</url-pattern>
</web-resource-collection>
<user-data-constraint>
<transport-guarantee>CONFIDENTIAL</transport-guarantee>
</user-data-constraint>
</security-constraint>
server.xml
<Connector port="80" protocol="HTTP/1.1"
connectionTimeout="20000"
redirectPort="443" />
<Connector port="443" maxThreads="150" scheme="https" secure="true"
SSLEnabled="true" keystoreFile="/opt/keys/tomcat.keystore"
keystorePass="*********" clientAuth="false" keyAlias="tomcat" sslProtocol="TLS" />
Entering localhost in a browser results in a redirect to HTTPS.
Checking this with curl we can see that it works as expected, but we get a 302 temporary redirect:
root@sandbox:/tmp# curl -D /tmp/headers.txt -s http://localhost
HTTP/1.1 302 Found
Server: Apache-Coyote/1.1
Cache-Control: private
Expires: Thu, 01 Jan 1970 01:00:00 GMT
Location: https://localhost/
Content-Length: 0
Date: Fri, 29 Apr 2016 18:24:47 GMT
However, this is unacceptable to Google, who prefer a 301 permanent redirect.
Is it possible to use UrlRewriteFilter to achieve this?
The following rule still results in a 302 even though I'm using type="permanent-redirect";
everything else stays the same:
<rule>
<name>seo redirect</name>
<condition name="host" operator="notequal">^www_example_com</condition>
<condition name="host" operator="notequal">^localhost</condition>
<from>^/(.*)</from>
<to type="permanent-redirect" last="true">https://www_example_com/$1</to>
</rule>
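For context, this assumes the filter itself is registered in web.xml in the usual way; a minimal sketch based on the standard UrlRewriteFilter setup (the /* mapping is typical but may differ in your deployment):
<filter>
    <filter-name>UrlRewriteFilter</filter-name>
    <filter-class>org.tuckey.web.filters.urlrewrite.UrlRewriteFilter</filter-class>
</filter>
<filter-mapping>
    <filter-name>UrlRewriteFilter</filter-name>
    <url-pattern>/*</url-pattern>
</filter-mapping>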
I have tried various combinations with no luck, presumably because Tomcat issues its own redirect before the filter is applied.
Has anyone actually got this to work so that we get a 301 instead of a 302?
Thank you.

Custom headers removed when submitting request via proxy

I'm using WebSphere Portal and trying to make an AJAX request via the proxy, but I'm getting a 404 status code.
When I issue the request below with my custom headers set, I get 404:
GET http://proxy.com:10039/wps/proxy/https/server.com/cart/#self
But if I issue the same request without routing it through the proxy, I get 200:
GET https://server.com/cart/#self
Why is the proxy removing my headers?
I logged all headers on the target server and can see that my custom headers are removed when the request passes through the proxy:
{accept=*/*, accept-encoding=gzip,deflate,sdch, accept-language=en-US,en;q=0.8, cache-control=no-cache, connection=keep-alive, host=server.com, pragma=no-cache, user-agent=Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/36.0.1985.143 Safari/537.36}
Here is my proxy config:
<policy url="{$digital_data_connector_policy}" basic-auth-support="true" name="digital_data_connector">
<actions>
<method>GET</method>
<method>HEAD</method>
<method>POST</method>
<method>DELETE</method>
<method>PUT</method>
</actions>
<headers>
<header>User-Agent</header>
<header>Accept*</header>
<header>Content*</header>
<header>Authorization*</header>
<header>Set-Cookie</header>
<header>If-Modified-Since</header>
<header>If-None-Match</header>
<header>If-Unmodified-Since</header>
<header>X-Method-Override</header>
<header>Set-Cookie</header>
<header>MyCustomToken</header>
<header>MyCustomTokenPart2</header>
</headers>
<meta-data>
<name>forward-http-errors</name>
<value>true</value>
</meta-data>
<meta-data>
<name>forward-credentials-from-vault</name>
<value>true</value>
</meta-data>
</policy>
My changes to proxy-config.xml were not picked up until the update-outbound-http-connection-config task was run.
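For anyone else hitting this, the task is run through the Portal ConfigEngine tool; the invocation below is only a sketch, and the profile path and password parameters are assumptions that may differ in your environment:
cd /opt/IBM/WebSphere/wp_profile/ConfigEngine   # assumed path; use your own profile's ConfigEngine directory
./ConfigEngine.sh update-outbound-http-connection-config -DWasPassword=<was_password> -DPortalAdminPwd=<portal_admin_password>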

GZip Compression in tomcat7 not working in IE9

I am using the configuration below in my server.xml file. It works fine in Firefox, where the response size is reduced considerably (200 KB -> 25 KB), but it is not working in IE9. Any help or suggestions would be appreciated.
<Connector connectionTimeout="20000" port="8080" protocol="HTTP/1.1" redirectPort="8443"
maxHttpHeaderSize="8192"
maxThreads="150" minSpareThreads="25" maxSpareThreads="75"
enableLookups="false" acceptCount="100"
disableUploadTimeout="true"
compression="on" compressionMinSize="2048"
noCompressionUserAgents="gozilla, traviata"
compressableMimeType="text/html,text/xml,text/plain,text/css,text/javascript,text/json,application/x-javascript,application/javascript,application/json"
/>
Compression only happens if the user agent asks for it.
When a user agent makes a request, it needs to state its preference for compression through the Accept-Encoding header, for example:
accept-encoding: gzip,deflate,sdch
Only when this header is sent as part of the request will the web server compress the content it returns. You can test it using curl:
curl -L http://localhost/index.jsp -H 'Accept-Encoding: gzip' -o - | gzip -d
The above requests compressed content and pipes it through gzip -d; if the server really returned gzip data, this prints the uncompressed page.
In your case it looks like IE is not asking for the page content to be compressed, and it's not clear why that happens.
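As an additional check (assuming the Tomcat connector above is reachable on localhost:8080; index.jsp is just an example path), you can look at the response headers instead of the body to see whether compression was applied:
curl -s -o /dev/null -D - -H 'Accept-Encoding: gzip' http://localhost:8080/index.jsp
# a "Content-Encoding: gzip" line in the output confirms the server compressed the response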

How do I accept CORS AJAX requests on AppHarbor?

I'm using the Thinktecture.IdentityModel NuGet package to enable CORS for WebAPI controllers in an MVC web application. At the moment I'm only worried about POSTs, but let me know about any problems with other verbs. This works when running through the IIS Express server.
On the AppHarbor deployment, however, it doesn't work: nginx doesn't seem to pass the OPTIONS request through to my code. What else is needed to get it running on AppHarbor?
Request
OPTIONS $path HTTP/1.1
Host: $servername
Connection: keep-alive
Access-Control-Request-Method: POST
Origin: http://www.local
User-Agent: Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.17 (KHTML, like Gecko) Chrome/24.0.1312.56 Safari/537.17
Access-Control-Request-Headers: accept, origin, content-type
Accept: */*
Referer: http://www.local/wordpress/2013/01/request-url-test/
Accept-Encoding: gzip,deflate,sdch
Accept-Language: en-GB,en;q=0.8,en-US;q=0.6
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3
Response
HTTP/1.1 200 OK
Server: nginx
Date: Wed, 30 Jan 2013 02:34:14 GMT
Content-Length: 0
Connection: keep-alive
Allow: OPTIONS, TRACE, GET, HEAD, POST
Public: OPTIONS, TRACE, GET, HEAD, POST
My web.config is set up with the following default handlers:
<handlers>
<remove name="ExtensionlessUrlHandler-ISAPI-4.0_32bit" />
<remove name="ExtensionlessUrlHandler-ISAPI-4.0_64bit" />
<remove name="ExtensionlessUrlHandler-Integrated-4.0" />
<add name="ExtensionlessUrlHandler-ISAPI-4.0_32bit" path="*." verb="GET,HEAD,POST,DEBUG,PUT,DELETE,PATCH,OPTIONS" modules="IsapiModule" scriptProcessor="%windir%\Microsoft.NET\Framework\v4.0.30319\aspnet_isapi.dll" preCondition="classicMode,runtimeVersionv4.0,bitness32" responseBufferLimit="0" />
<add name="ExtensionlessUrlHandler-ISAPI-4.0_64bit" path="*." verb="GET,HEAD,POST,DEBUG,PUT,DELETE,PATCH,OPTIONS" modules="IsapiModule" scriptProcessor="%windir%\Microsoft.NET\Framework64\v4.0.30319\aspnet_isapi.dll" preCondition="classicMode,runtimeVersionv4.0,bitness64" responseBufferLimit="0" />
<add name="ExtensionlessUrlHandler-Integrated-4.0" path="*." verb="GET,HEAD,POST,DEBUG,PUT,DELETE,PATCH,OPTIONS" type="System.Web.Handlers.TransferRequestHandler" preCondition="integratedMode,runtimeVersionv4.0" />
</handlers>
Is WebDAV to blame?
Following Jeffery To's suggestion, I disabled WebDAV by following these instructions. I didn't even know WebDAV was installed on AppHarbor, but doing this did change my results.
Relevant Web.config sections
<handlers>
<remove name="ExtensionlessUrlHandler-ISAPI-4.0_32bit" />
<remove name="ExtensionlessUrlHandler-ISAPI-4.0_64bit" />
<remove name="ExtensionlessUrlHandler-Integrated-4.0" />
<remove name="WebDAV" />
<add name="ExtensionlessUrlHandler-ISAPI-4.0_32bit" path="*." verb="GET,HEAD,POST,DEBUG,PUT,DELETE,PATCH,OPTIONS" modules="IsapiModule" scriptProcessor="%windir%\Microsoft.NET\Framework\v4.0.30319\aspnet_isapi.dll" preCondition="classicMode,runtimeVersionv4.0,bitness32" responseBufferLimit="0" />
<add name="ExtensionlessUrlHandler-ISAPI-4.0_64bit" path="*." verb="GET,HEAD,POST,DEBUG,PUT,DELETE,PATCH,OPTIONS" modules="IsapiModule" scriptProcessor="%windir%\Microsoft.NET\Framework64\v4.0.30319\aspnet_isapi.dll" preCondition="classicMode,runtimeVersionv4.0,bitness64" responseBufferLimit="0" />
<add name="ExtensionlessUrlHandler-Integrated-4.0" path="*." verb="GET,HEAD,POST,DEBUG,PUT,DELETE,PATCH,OPTIONS" type="System.Web.Handlers.TransferRequestHandler" preCondition="integratedMode,runtimeVersionv4.0" />
</handlers>
<modules runAllManagedModulesForAllRequests="true">
<remove name="WebDAVModule"/>
<remove name="AspNetAppHarborIntegration" />
<add name="AspNetAppHarborIntegration" type="Premotion.AspNet.AppHarbor.Integration.AppHarborModule, Premotion.AspNet.AppHarbor.Integration" />
</modules>
Request
OPTIONS $path HTTP/1.1
Host: $servername
Connection: keep-alive
Access-Control-Request-Method: POST
Origin: http://www.local
User-Agent: Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.17 (KHTML, like Gecko) Chrome/24.0.1312.57 Safari/537.17
Access-Control-Request-Headers: accept, origin, content-type
Accept: */*
Referer: http://www.local/wordpress/2013/01/request-url-test/
Accept-Encoding: gzip,deflate,sdch
Accept-Language: en-GB,en;q=0.8,en-US;q=0.6
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3
Response
HTTP/1.1 405 Method Not Allowed
Server: nginx
Date: Mon, 04 Feb 2013 17:09:19 GMT
Content-Type: application/json; charset=utf-8
Content-Length: 76
Connection: keep-alive
Cache-Control: no-cache
Pragma: no-cache
Expires: -1
WebDAV was to blame for the initial problem of not getting the full CORS headers in the response. The 405 error was a problem somewhere in the configuration of my app.
After some digging into the internals, it appears that the CORSMessageHandler for use with WebAPI (provided by Thinktecture) wasn't correctly identifying preflight requests, so those requests were routed through to WebAPI itself.
I worked around the issue by moving to the IIS module instead of the WebAPI one. This may make life more difficult in the future, but at least it works.
I might be missing something here, but based on the example response you've included, it seems the OPTIONS request actually reaches your application: the Allow and Public headers are included in the response.
