Running the URL to clear cache gives blank page in Pentaho - caching

I am using Pentaho CDE and am trying to clear the cache using the following URL. When I run it in the browser it doesn't give an error, but it shows a blank page. What could be the cause of this?
http://localhost:8080/pentaho/ServiceAction?solution=admin&path=&action=clear_mondrian_schema_cache.xaction
"admin" in the URL refers to the folder in which the clear_mondrian_schema_cache.xaction file is found. What does "&action" in the URL refer to?

OK, everyone knows that we can flush the Mondrian cache manually.
How: inside the Pentaho User Console, go to Tools -> Refresh -> Mondrian Schema Cache. After a few moments, if everything is right, we get the message 'Mondrian Schema Cache Flushed Successfully'.
So what happened under the hood? 'Mondrian Schema Cache Flushed Successfully' is the predefined string for the English locale, shown when the request returns code 200 (OK). From this we know that clicking the menu item issues an HTTP request.
Which request, and where does it go? Using the Live HTTP Headers extension in Google Chrome, I can see that PUC calls:
GET /pentaho/api/mantle/isAuthenticated (and then, if the response is OK:)
GET /pentaho/api/system/refresh/mondrianSchemaCache
If the second request returns code 200 (OK), PUC displays the message.
You can try it in your browser: hostname:port/pentaho/api/system/refresh/mondrianSchemaCache
It will clear your schema cache but leave a blank page. If you want a message, you can write a script that collects the response of the HTTP request.
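Such a script can be a few lines of Python using only the standard library. This is a minimal sketch: the hostname, port, and admin credentials are placeholders you must adapt, and it assumes your Pentaho server accepts HTTP Basic authentication on its REST API (adjust if yours uses a different scheme):

```python
import base64
import urllib.request

def build_refresh_request(host, port, user, password):
    """Build the same authenticated request that PUC issues."""
    url = f"http://{host}:{port}/pentaho/api/system/refresh/mondrianSchemaCache"
    req = urllib.request.Request(url)
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    return req

req = build_refresh_request("localhost", 8080, "admin", "password")
print(req.full_url)
# urllib.request.urlopen(req) would perform the call; an HTTP 200
# response means the Mondrian schema cache was flushed.
```

Checking the status code of the response gives you the confirmation message that the blank browser page does not.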

Related

HTTP request handled by apache but not by Laravel?

I have a Laravel 7 app to which I send large data files (mp3/wav) and a CSV file via XHR to a POST method.
Generally, it works like a charm, but for one of my testers (on a slower connection) it only works intermittently.
The thing is, I can see in my Apache logs that the request is accepted and answered with a 200 status code, but the Laravel log doesn't show anything (I put some Log::debug calls at the start of my controller action).
If I test it myself, the log shows what it should.
In my /etc/php.ini, I set higher values for max_execution_time (3600), max_input_time (3600), post_max_size (4G), and upload_max_filesize (4G).
In my virtual host config, I set a TimeOut value of 3600.
Also, it seems that when my tester sends the CSV file as a raw text file (.txt), the app responds, but I cannot figure out why, as we have no restrictions concerning the non-audio file...
Do you have any ideas?
Thanks in advance.

jmeter not sending the parameters mentioned in the http request

I am having some problems with JMeter. When I try to submit values to a form, they don't appear in the updated list. I am testing the add-restaurant feature of a web app running on a local server. In the View Results Tree the request appears successful. I took the parameter's id from the page source and entered its value in the value column. Can someone please guide me through the process?
JMeter does not actually care about what page it gets back as a response; any response with status 200 counts as successful. It is the user's responsibility to make sure the correct response is retrieved. So when you submit a form through JMeter to add a restaurant, make sure the response contains a confirmation that the restaurant was added.
Following SSujesh's answer: if, for example, you perform a failed login in Drupal, you still get 200 OK...
Try to find out whether there is a key generated per user that you need to extract. For example:
Drupal issues a form_build_id that you need to extract (every time you initiate a login) and then pass along when you do the POST request...
The same goes for ASPX.
gl,
Refael
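To make the extract-then-pass idea concrete, here is a Python sketch of roughly what JMeter's Regular Expression Extractor does for you. The HTML sample and the token value are made up for illustration; in JMeter itself you would attach an extractor to the first request and reference the captured variable in the POST:

```python
import re
from urllib.parse import urlencode

# A snippet of the kind of login form Drupal renders; the token value
# here is invented for illustration.
html = '''
<form action="/user/login" method="post">
  <input type="hidden" name="form_build_id" value="form-AbC123xYz" />
  <input type="text" name="name" />
  <input type="password" name="pass" />
</form>
'''

def extract_form_build_id(page):
    """Roughly what JMeter's Regular Expression Extractor does."""
    match = re.search(r'name="form_build_id"\s+value="([^"]+)"', page)
    return match.group(1) if match else None

token = extract_form_build_id(html)
# The freshly extracted token then has to travel with the POST body:
body = urlencode({"name": "alice", "pass": "secret", "form_build_id": token})
print(body)
```

If the token is missing or stale, the server may still answer 200 OK while silently rejecting the submission, which is exactly the symptom described in the question.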

How can I scrape an image that doesn't have an extension?

Sometimes I come across an image that I can't scrape and save. An example of this is:
https://s3.amazonaws.com/plumdistrict.com-production/perks/12321/image/original.?1325898487
When I hit the URL from Internet Explorer I see the image, but when I try to fetch it with the code below, GetResponse throws "System.Net.WebException: The remote server returned an error: (403) Forbidden":
string url = "https://s3.amazonaws.com/plumdistrict.com-production/perks/12321/image/original.?1325898487";
WebRequest request = WebRequest.Create(url);
WebResponse response = request.GetResponse();
Any ideas on how to get this image?
Edit:
I am able to save images that do have extensions. For example, I can scrape the following image just fine:
https://s3.amazonaws.com/plumdistrict.com-production/perks/12659/image/original.jpg?1326828951
Although HTTP is originally supposed to be stateless, there are a lot of implementations that rely on it not being stateless. I could configure my web server to only accept requests for "http://mydomain.com/sexy_avatar.jpg" if you provide a cookie proving you are logged in; if not, I send you a 303 redirect to "http://mydomain.com/avatar_for_public_use.jpg".
Amazon could be doing the same. Try to load the web page using Chrome, and look at the Network view in developer mode (CTRL+SHIFT+J) to see all headers supplied to the website. Maybe you even need to do a full navigation in the same session before you are allowed to see the image. This is certainly the case in many web applications I have developed :-)
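A sketch of that advice, here in Python with only the standard library: copy the headers the Network view shows for the working browser request into your own request. The header values below are illustrative guesses, not the exact set S3 requires:

```python
import urllib.request

url = ("https://s3.amazonaws.com/plumdistrict.com-production"
       "/perks/12321/image/original.?1325898487")

# Browser-like headers; which ones the server actually checks is a guess --
# compare against what Chrome's Network view shows for the working request.
req = urllib.request.Request(url, headers={
    "User-Agent": "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/535.2",
    "Accept": "image/*,*/*;q=0.8",
    "Referer": "https://plumdistrict.com/",
})
# data = urllib.request.urlopen(req).read()  # performs the actual download
```

The same idea carries over to the C# code in the question: set UserAgent, Accept, and any cookies on the HttpWebRequest before calling GetResponse.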
Well, it looks like the image is being generated by a script (possibly retrieved from a database). The server should be sending a content type along with it... but it doesn't seem to be, which I believe is a violation of the standard.
My Linux box knows full well that that's a JPEG image once it's on my hard drive, because it examines file headers rather than relying on extensions. Perhaps there is a tool to do the same in Windows?
Edit: Actually, on further contemplation, it seems odd that you'd get a 403 for that. Perhaps the server is actually blocking you from retrieving the file in that manner.
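The header-sniffing trick mentioned above is easy to replicate in code on any platform. A small Python sketch; the magic-byte table only covers a few common formats:

```python
# Identify the image type from the file's first bytes instead of trusting
# an extension -- the same idea as the `file` tool on Linux.
MAGIC_NUMBERS = {
    b"\xff\xd8\xff": "jpeg",
    b"\x89PNG\r\n\x1a\n": "png",
    b"GIF87a": "gif",
    b"GIF89a": "gif",
}

def sniff_image_type(data: bytes):
    """Return the image format matching the leading magic bytes, or None."""
    for magic, kind in MAGIC_NUMBERS.items():
        if data.startswith(magic):
            return kind
    return None

print(sniff_image_type(b"\xff\xd8\xff\xe0\x00\x10JFIF"))  # jpeg
```

Once the bytes are downloaded, this tells you which extension to save the file with, regardless of what the URL looked like.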

How can I validate http response headers?

It's the first time I am doing something with headers. I am mainly concerned with Cache-Control but there may be others I will need to check as well. For example, I try to send the following header to the browser (based on tutorials I just read):
Cache-Control:private, max-age=2011-12-30 11:40:56
Google Chrome displays it this way in Network -> Headers -> Response headers, but how do I know if it's correct, that there aren't any typos, syntax errors, and such? Will it really work? Will the browser behave as I want it to, or will it treat it as gibberish (something like "unknown header/value")? I've tried sending nonsensical headers on purpose, but they were displayed along with the rest. Is there any Chrome tool or addon for this, or any other way? Thank you in advance!
I'm afraid you won't be able to check if the resource has been cached by proxies en route, but you can check if your browser has cached it.
While in the Network panel of Chrome DevTools, hit F5 to reload your page. You should see something like "304 Not Modified" in the status field for the resource in question (which means the resource has not been modified and its contents were loaded from the browser's cache rather than received from the server).
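As a quick syntax sanity check you can also parse the header value yourself, as in this Python sketch. Note that per the HTTP caching specification, max-age takes a number of seconds, not a date, so a value like the one in the question would not be understood by browsers:

```python
# Split a Cache-Control value into its directives for a quick sanity check.
# Per the HTTP caching spec, max-age takes delta-seconds, not a date.
def parse_cache_control(value):
    directives = {}
    for part in value.split(","):
        part = part.strip()
        if not part:
            continue
        name, _, arg = part.partition("=")
        directives[name.strip().lower()] = arg.strip() or None
    return directives

print(parse_cache_control("private, max-age=3600"))
# {'private': None, 'max-age': '3600'}
```

You can then check each directive name and argument against the spec, e.g. that max-age parses as an integer.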

How to fix "Failed to load resource" bug when submit form to download file?

I am trying to create a script that sends a lot of required data to the server, after which the server sends back a generated file. I know it's possible to put the data in a link like the following to send it to the server and get the file back:
<a href="domain.com/getExcel.aspx?id=ABC&title=Report1" target="_blank">Get report</a>
But the maximum length of a URL is around 2000 characters, and I cannot guarantee the data will not exceed this limit. So I tried to use the "jQuery Plugin for Requesting Ajax-like File Downloads" to send the data to the server by dynamically creating a form with hidden inputs.
Everything works great in IE9, but I get a "Failed to load resource" error when I use it in Google Chrome 9. I think this problem only occurs when the response's Content-Disposition header is set to attachment.
Demo: http://fiddle.jshell.net/SP3Tx/3/show/
Source: http://jsfiddle.net/SP3Tx/3/
Do you have any workaround for this bug?
Update
I just submitted this bug to the Chromium issue tracker:
http://code.google.com/p/chromium/issues/detail?id=75384
