I have inherited a PL/SQL application. The application works fine as long as the values passed to the submit procedure do not exceed a certain number. I have checked the variables through Fiddler when an HTTP 400 occurs, and the number of variables passed is quite large. The procedure takes 12 IN parameters. In the case that causes the error, the user is passing 283 sets of 12. If I manually limit the number to 195 sets or fewer, no error is returned. I'm not running out of array memory, but I think I am hitting some kind of server-side limit on processing the POST. I have tried talking with the server team based on the information in this link
What is the maximum length of a URL in different browsers?
but they don't seem to know how to do this.
I am looking for alternate ways to pass the values, maybe to a temp table, or storing all the variables in their respective arrays before passing, instead of this one long URL string of 3,396 variables and values.
If anyone understands what I am talking about, any advice would be greatly appreciated.
In JMeter I need to perform a large search and count the number of rows which are returned. The maximum is 50,000 rows.
The number of rows which are returned are shown on the website after a search. "Number of returned rows: xx".
Or I can count the rows inside the HTTP response.
I have tried to use a regex post-processor to count the number of rows which are returned; the problem is that JMeter freezes, since the HTTP response is so large.
I have also tried to extract the text directly from the website, unsuccessfully. I guess one can't do that, since the information is not in the HTTP response?
So:
Is there some faster and less demanding way to count all the returned rows inside an HTTP response body?
Or is there some way to get the text directly from the website?
Thank you.
It looks like your application is buggy. I don't think that returning 50,000 entries in a single shot is something people should be doing, as it creates extra network traffic and consumes a lot of resources on both the server and client (browser) side. I would rather expect some form of pagination when it comes to operating on large amounts of data.
If you're totally sure that your application works as expected, you can try using the Boundary Extractor, which has been available since JMeter 4.0.
Due to the specifics of its internal implementation, it consumes fewer resources and runs faster than the Regular Expression Extractor, so the load you will be able to generate from a single machine will be higher.
Check out the article The Boundary Extractor vs. the Regular Expression Extractor in JMeter for more information.
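To see why the boundary approach is cheaper, here is a minimal sketch of the idea outside JMeter (plain Python; the boundary strings are assumptions based on the "Number of returned rows: xx" text from the question): extraction between fixed boundaries is just two substring searches, with no regex backtracking over a huge body.

```python
def boundary_extract(body: str, left: str, right: str):
    """Find the text between two fixed boundaries, the way JMeter's
    Boundary Extractor does: two substring searches, no backtracking."""
    start = body.find(left)
    if start == -1:
        return None  # left boundary not present in the response
    start += len(left)
    end = body.find(right, start)
    if end == -1:
        return None  # right boundary missing after the left one
    return body[start:end]


# Hypothetical response fragment matching the question's search page:
html = "<p>Number of returned rows: 50000</p>"
count = boundary_extract(html, "Number of returned rows: ", "</p>")
```

With "Number of returned rows: " as the left boundary and the closing tag as the right boundary, this pulls the count straight out of the page without scanning all 50,000 rows.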
Yes, you can get that count from matchNr, which is appended to the search variable name. Use a Regular Expression Extractor to match any name or id,
and set Match No. to -1.
For example, if the regex variable name is totalcount, then you can fetch the count by using ${totalcount_matchNr}.
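The "Match No. -1" idea translates directly to collecting every match and taking the length; a minimal Python sketch (the row pattern is an assumption, adjust it to the actual markup of your results table):

```python
import re

def count_rows(body: str, row_pattern: str) -> int:
    """JMeter's Match No. -1 stores every match, and the total then shows
    up as ${varname_matchNr}; here the equivalent is just the match count."""
    return len(re.findall(row_pattern, body))


# Hypothetical table rows; the real pattern depends on your page's HTML.
body = '<tr id="row1">...</tr><tr id="row2">...</tr><tr id="row3">...</tr>'
```

Note that, as the accepted advice above points out, running any pattern over a very large body is still the expensive part; the Boundary Extractor avoids that cost when the page already prints the total.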
I have recorded a script from login through the opening of an Oracle form.
I then split the script into two parts: one for login, the other for navigating to the form and opening it.
Login executes successfully, but the navigation script gives me an HTTP 500 error:
T03_Amar_Navigation.c(95): Error -26612: HTTP Status-Code=500 (Internal Server Error) for the URL [MsgId: MERR-26612].
There is no problem when logging in and opening the Oracle form manually.
Can someone help me with what I may be missing?
I tried copying all the correlation parameters into the navigation script as well; there is no error or mismatch with the correlation parameters.
My best guess, based on seeing this 500 condition hundreds of times in my career, is that you need to check your script for the following:
Explicit checking for success on each step, or expected results. This is more than just accepting an HTTP 200. It involves actually processing the content that is returned and objectively looking at the page for elements you expect to be present. If they are not present, then you will want to branch your code and elegantly exit your iteration. A majority of 500-level events are simply the result of poor testing practices and not checking for expected results.
Very carefully examine your code for unhandled dynamic elements. These could be related to session, state, time, or a variable related to the user or business process. A mishandled or unhandled dynamic element cascading for just a few pages results in an application where the data being submitted does not match the actual state of the business process. As this condition is something that would not be possible with the actual website, you wind up with an unhandled exception in the code and a 500 pushed back to the user. There are roughly half a dozen methods for examining your requests for dynamic elements. I find the most powerful to be the oldest: simply record the application twice with the same data, then compare the scripts. Once you have addressed the items related to session, state, and time, record with a different data set (user, account, etc...) and look at the dynamic elements related to your actual data in use.
Address the two items above and your 500 will quite likely go away.
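The first point, checking actual content rather than just the status code, can be sketched like this (plain Python rather than LoadRunner C, and the marker strings are hypothetical):

```python
def step_succeeded(status_code: int, body: str, expected_markers: list) -> bool:
    """A step passes only if the status is 200 AND the page contains the
    elements we expect; an HTTP 200 error page still fails the marker check."""
    if status_code != 200:
        return False
    return all(marker in body for marker in expected_markers)


# If the check fails, branch and exit the iteration cleanly instead of
# submitting the next request against a page that never actually arrived.
ok = step_succeeded(200, "<title>Order Summary</title>", ["Order Summary"])
```

In LoadRunner itself the same idea is usually expressed with a content check before the request plus a branch on the result, but the logic is identical: verify the page, then decide whether to continue the iteration.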
Using SonarQube 5.1, I have been attempting to use the API search feature to gather all of the issues pertaining to my current project, to display them on a radiator. On the web interface, SonarQube indicates there are 71 major issues and 161 minor issues.
Using this search string
https://sonarqube.url.com/api/issues/search?projectKeys=myproject'skey
I get back a response with exactly 100 results. When I process those results for only OPEN items, I get back a total of 55 issues: 36 major, 19 minor.
This is achieved through a PowerShell script that authenticates to the SonarQube server, passes in the query, and then deserializes the response into an array I can process (counting major/minor issues).
With the background out of the way, the meat of my question is: does anyone know why the responses I am receiving are capped at 100? In my research I saw others indicating that a response to an issue search would be capped at 500 due to an outstanding bug; however, the expected number of issues I am looking for is far below that number. The API's documentation indicates that it would return the first 10,000 issues. Is there a server-side setting that restricts the output returned for a search query?
Thanks in advance,
The web service docs show that 100 is the default value of the ps parameter. You can set the value higher, but it will still max out at 500.
You might have noticed a "paging" element in the JSON response. You can use it to calculate how many pages of results there are and loop through them using the p parameter to specify page number.
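As a sketch of that loop (Python rather than the asker's PowerShell; the host name is a placeholder, and the 500-per-page cap is the figure discussed in the thread), you can read the total from the paging element once and then walk the pages with p:

```python
import math

BASE = "https://sonarqube.example.com/api/issues/search"  # hypothetical host
PAGE_SIZE = 500  # the cap mentioned in the thread; the default ps is 100

def page_count(total_issues: int, page_size: int = PAGE_SIZE) -> int:
    """How many pages the paging element implies for this many issues."""
    return math.ceil(total_issues / page_size)

def page_urls(project_key: str, total_issues: int) -> list:
    """One search URL per page, driven by the ps and p parameters."""
    return [
        f"{BASE}?projectKeys={project_key}&ps={PAGE_SIZE}&p={p}"
        for p in range(1, page_count(total_issues) + 1)
    ]
```

Fetching each URL in turn and concatenating the issues arrays gives you the full result set instead of just the first page.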
I want to correlate this 181-418-5889 in the following statement: regSend&transferNumber=181-418-5889".
I used the regular web_reg_save_param, but it failed... any suggestions?
1. You are using the statement in the wrong location, such as placing it just before the request that submits the correlated value rather than just before the request whose response contains the value.
2. You are not receiving the correct page response, and as a result you may not be able to collect the value. The page may come back as an HTTP 200, but the content could be completely off. Always check for an appropriate expected result.
3. Your left boundary, right boundary, or other parameters are incorrect for collecting the value you need.
4. You have not been through training and you are being forced by your management to learn this tool via trial and error.
1- I am not using the statement in the wrong location, since I found the needed value I want to correlate via the Tree function and put the statement just before the request whose response holds this value.
2- The page is not an HTTP 200.
3- The left and right boundaries are correct, since I checked that the text does exist twice in the response body.
4- I know the tool (LoadRunner), but the application is developed on the ZK platform and I am not sure whether ZK and LoadRunner are compatible, knowing that I did implement the dtid function in my script to get a static desktop id each time I replay the process.
My URL is as follows:
/fcgi-bin/clireports.fcgi?sfPageId=param1&sfBoxId=param2&sfPagecId=param3&sfUsername=param4&sfSession=param5&sfSubmit=param6&showSampleReport=param7&saveAndAdd=param8
My question is this:
Is there any performance issue when sending so many parameters in the query string? In my case I am sending 8 parameters.
Will my website become slow because of this?
Please enlighten me on this.
The performance cost is not about the number of parameters (or only slightly), but about the number of bytes sent.
The more data you send, the more costly it is to transfer and parse.
Remember that every browser has a hard limit on the maximum number of characters a GET request can handle. I recommend not using more than 255 characters to be on the safe side, but you can go higher (IE's limit is around 2,000 characters).
If you need more data, use POST.
Finally, your website will not become slow because of this: many other factors take much more time than parsing a GET request (something like a DB connection, or just PHP warmup).
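One way to act on that advice is to measure the encoded query before choosing the method; a minimal sketch (the 2,000-character threshold echoes the IE figure above and is deliberately conservative):

```python
from urllib.parse import urlencode

MAX_SAFE_URL = 2000  # conservative, roughly the old IE ceiling

def choose_method(base_url: str, params: dict) -> str:
    """Send small queries as GET; fall back to POST when the full URL
    would exceed the conservative length limit."""
    full_url = f"{base_url}?{urlencode(params)}"
    return "GET" if len(full_url) <= MAX_SAFE_URL else "POST"


# The question's 8 parameters, with placeholder values:
params = {f"param{i}": f"value{i}" for i in range(1, 9)}
```

With 8 short parameters like the ones in the question, the URL stays far under the limit and GET is fine; only much larger payloads would justify switching to POST.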
There shouldn't be any issues. It might be worth using a POST request to clean up your request a bit, but other than that you shouldn't notice any performance issues; just don't exceed the "limits", which vary depending on the browser.