On Xamarin.iOS, I'm using HttpClient to get a JSON string. The problem is that it ignores updates and gives me the same JSON response when I query the same URL. I want it to not cache anything and always actually query the URL and give me the new response.
This sounds trivial; there must be a simple way to do it. I'm using a Forms shared project.
I assume you are setting the cache-control header to no-cache?
client.DefaultRequestHeaders.CacheControl = new CacheControlHeaderValue { NoCache = true };
If so, but it still doesn't work, maybe the server is caching the response? If it comes down to it, you can usually defeat something like that by adding a cache-buster to the query string. Just append a bogus parameter and pass it a unique value each time. For example, if your URL is http://my.url.com/resource/someid then you can defeat the caching by using http://my.url.com/resource/someid?b=1 and then incrementing that "b" param with each call.
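For illustration, here is a minimal C# sketch that combines both ideas: the no-cache header plus a unique cache-buster value. The class and method names and the URL handling are placeholders, not anything your project has to use.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

// Hypothetical helper, sketch only.
static class FreshJsonFetcher
{
    static readonly HttpClient client = new HttpClient();

    public static async Task<string> GetFreshJsonAsync(string baseUrl)
    {
        // Ask caches not to serve a stored copy.
        client.DefaultRequestHeaders.CacheControl =
            new CacheControlHeaderValue { NoCache = true };

        // Cache-buster: a unique "b" value per call defeats URL-keyed caches.
        string url = baseUrl + (baseUrl.Contains("?") ? "&" : "?")
                     + "b=" + DateTime.UtcNow.Ticks;

        return await client.GetStringAsync(url);
    }
}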
This is the code for the Goliath JSONP middleware:
https://github.com/postrank-labs/goliath/blob/master/lib/goliath/rack/jsonp.rb
Everything is fine except that the headers contain a Content-Length that is less than the actual body length.
I'm not sure where or why it's setting a Content-Length less than the actual length. Perhaps it's because of this:
"#{env.params['callback']}(#{response})" (the extra callback method name that is included in the returned body wasn't accounted for).
The solution I can think of is to modify the headers before the post_process method is called, so that the Content-Length would be correct.
I'm unsure where to do that, though.
I'm not sure why you'd be seeing that issue, but it sounds like a bug. Can you please make a test server that shows the problem and create a bug on GitHub?
The Content-Length should be set by an auto-injected middleware that sits at the head of the chain. It will run after the JSONP middleware has executed, so it should take the new size into account.
Scenario:
I have a Board model on my Rails server side, and an Android device is trying to post some content to a specific board via a POST request. Finally, the server needs to send back a response to the Android device.
How do I parse the POST manually (or do I need to)? I am not sure how to handle this kind of external request. I looked into Metal, Middleware, and HTTParty, but none of them seems to fit what I am trying to do. The reason I want to parse it manually is that some of the information I want will not be part of the parameters.
Does anyone know a way to approach this problem?
I am also thinking about using SSL later on, how might this affect the problem?
Thank you in advance!! :)
I was trying to make a cross-domain request from IE9 to my Rails app, and I needed to parse the body of a POST manually because IE9's XDR object restricts the contentType that we can send to text/plain, rather than application/x-www-form-urlencoded (see this post). Originally I had just been using the params hash provided by the controller, but once I restricted the contentType and dataType in my Ajax request, that hash no longer contained the right information.
Following the URL in the comment above (link), I learned how to recover that information. The author mentions that in a Rails controller we always have access to a request variable that gives us an instance of the ActionDispatch::Request object. I tried to use request.query_string to get at the request body, but that just returned an empty string. A bit of snooping in the API, though, uncovered the raw_post method. That method returned exactly what I needed!
To "parse it manually" you could iterate over the string returned by request.raw_post and do whatever you want, but I don't recommend it. I used Rack::Utils.parse_nested_query, as suggested in Arthur Gunn's answer to this question, to parse the raw_post into a hash. Once it is in hash form, you can shove whatever else you need in there, and then merge it with the params hash. Doing this meant I didn't have to change much else in my controller!
params.merge!(Rack::Utils.parse_nested_query(request.raw_post))
Hope that helps someone!
Not sure exactly what you mean by "manually"; posts are normally handled by the "create" or "update" methods in the controller. Check out the controller for your Board model, and you can add code to the appropriate method. You can access the params via the params hash.
You should be more specific about what you are trying to do. :)
I have a set of services hosted with WCF Web API, and I communicate with them in JSON from JavaScript. In most cases I'm okay modifying the Accept header to require a JSON response, but there are some cases arising where I can't do this. This is due to the JavaScript framework that I'm using (Ext JS). For some things it only lets me specify a URL and not the proxy defaults such as headers.
This isn't an Ext JS question, however. Web API seems to default to returning XML, and I'd like to know whether it's possible to change this default so that it returns JSON instead. Thanks in advance!
A bit of experimentation seems to indicate that the order of the configured formatters matters (which is quite intuitive).
By default, when you create an instance of HttpConfiguration, its Formatters collection contains these formatters:
XmlMediaTypeFormatter
JsonValueMediaTypeFormatter
JsonMediaTypeFormatter
FormUrlEncodedMediaTypeFormatter
The reason XML is the default format is that the XmlMediaTypeFormatter is the first formatter. To make JSON the default, you can reorder the collection to look like this:
JsonValueMediaTypeFormatter
JsonMediaTypeFormatter
XmlMediaTypeFormatter
FormUrlEncodedMediaTypeFormatter
Given an instance config of HttpConfiguration, here's one way to reorder the collection:
var jsonIndex = Math.Max(
    config.Formatters.IndexOf(config.Formatters.JsonFormatter),
    config.Formatters.IndexOf(config.Formatters.JsonValueFormatter));
var xmlIndex = config.Formatters.IndexOf(config.Formatters.XmlFormatter);
config.Formatters.Insert(jsonIndex + 1, config.Formatters.XmlFormatter);
config.Formatters.RemoveAt(xmlIndex);
Whether or not this is supported I don't know, but it seems to work on WebApi 0.6.0.
I actually found a simple way of dealing with this. First make sure that the default JSON formatter is first, and then add text/html to its supported media types. This ensures that the browser gets JSON even if it does not set the header. The nice aspect of the code below is that you never have to remember to set the Accept header in client code. It just works and always defaults to JSON.
var jsonformatter = config.Formatters.Where(t => t.GetType() == typeof(JsonMediaTypeFormatter)).FirstOrDefault();
config.Formatters.Remove(jsonformatter);
config.Formatters.Insert(0, jsonformatter);
config.Formatters[0].SupportedMediaTypes.Add(new MediaTypeHeaderValue("text/html"));
You could use a delegating channel as described here: http://blog.alexonasp.net/post/2011/07/26/Look-Ma-I-can-handle-JSONP-(aka-Cross-Domain-JSON)-with-WCF-Web-API-and-jQuery!.aspx. It maps URIs like http://myserver/myresource/1/json to http://myserver/myresource/1 and sets the Accept header to application/json.
The delegating channel is part of the ContactManager_Advanced sample when you're downloading WCF Web API from http://wcf.codeplex.com.
It is contained in the UriFormatExtensionMessageChannel.cs file.
Look at the global.asax.cs of the sample to see how to get it running.
According to the code, WCF Web API will always default to the XmlFormatter if it is in the collection of usable formatters. If it isn't, the JsonFormatter is used instead, if present. There is also a DefaultFormatter property, but that is internal, so you can't set it. Maybe a useful feature request to add?
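If you don't need XML responses at all, a simpler sketch that follows from that observation (using the same config instance of HttpConfiguration as in the answer above) is to drop the XmlFormatter entirely, so the JsonFormatter is the one that gets used:

// Sketch only: with the XmlFormatter removed, JSON is served by default.
config.Formatters.Remove(config.Formatters.XmlFormatter);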
Web programmer here - using AJAX (HTML, CSS, JavaScript, AJAX, PHP, MySQL), but for some reason Internet Explorer is acting up (surprise surprise).
AJAX is updating query results on the HTML page, via a PHP script that queries a MySQL Database.
Everything is working fine, except when I use Internet Explorer 8.0.
There are several PHP scripts, which allow the data to be ordered according to certain criteria, and for testing purposes I have attached the mktime field (the current time, in the format HH:MM:SS) to the beginning of the results for each query.
When I use IE, these times appear to remain constant, whereas with ALL other browsers these times are correct and display the current time.
I think the issue has something to do with caching or something along those lines anyway.
Any thoughts or suggestions welcome...
Here is an article on the caching issue.
If your request is a GET, change it to a POST; this will prevent the results from being cached.
GET requests are cached in IE; switch it to a POST request and it won't be cached anymore.
Instead of switching to POST, which can be ugly if you're not really using it to update or create content, you should append a random number to the query string, as in http://domain.com/ajax/some-request?r=123456. If this number is unique for every request you won't have caching problems.
What I have done is keep the GET and add a new dummy query parameter to the query string, as follows:
./BaseServlet?sname=3d_motor&calcdir=20110514&dummyParam=datetime
I set dummyParam to the value of a Date object in the JavaScript so that every time the URL is generated the browser will treat it as a new URL and fetch fresh results.
var d = new Date();
url = url + '&dummyParam='+d.valueOf();
So instead of generating random numbers, this is an easy way!
We are using the Dynamic Script Tag with JsonP mechanism to achieve cross-domain Ajax calls. The front end widget is very simple. It just calls a search web service, passing search criteria supplied by the user and receiving and dynamically rendering the results.
Note - For those that aren't familiar with the Dynamic Script Tag with JsonP method of performing Ajax-like requests to a service that returns JSON-formatted data, I can explain how to utilise it if you think it could be relevant to the problem.
The service is WCF hosted on IIS. It is RESTful, so the first thing we do when the user clicks search is generate a URL containing the criteria. It looks like this...
https://.../service.svc?criteria=john+smith
We then use a dynamically created HTML script tag with the source attribute set to the above URL to make the request to our service. The result is returned and we process it to show the results.
This all works fine, but we noticed that when using IE the service receives the request from the client twice. I used Fiddler to monitor the traffic leaving the browser and, sure enough, I see two requests with the following URLs...
Request 1: https://.../service.svc?criteria=john+smith
Request 2: https://.../service.svc?criteria=john+smith&_=123456789
The second request has had some kind of ID appended. This ID is different for every request.
My immediate thought was that it was something to do with caching. Adding a random number to the end of the URL is one of the classic approaches to disabling browser caching. To prove this I adjusted the cache settings in IE.
I set “Check for newer versions of stored pages” to “Never” – this resulted in only one request being made every time: the one with the random number on the end.
I set this back to the default of “Automatic” and the requests immediately began to be sent twice again.
Interestingly I don’t receive both requests on the client. I found this reference where someone is suggesting this could be a bug with IE. The fact that this doesn’t happen for me on Firefox supports this theory.
Can anyone confirm if this is a bug with IE? It could be by design.
Does anyone know of a way I can stop it happening?
Some of the more vague searches that my users will run take up enough processing resource to make doubling up anything a very bad idea. I really want to avoid this if at all possible :-)
I just wrote an article on how to avoid caching of Ajax requests :-)
It basically involves adding the no-cache headers to any Ajax request that comes in:
public abstract class MyWebApplication : HttpApplication
{
    protected MyWebApplication()
    {
        this.BeginRequest += new EventHandler(MyWebApplication_BeginRequest);
    }

    void MyWebApplication_BeginRequest(object sender, EventArgs e)
    {
        // Only touch requests made via XMLHttpRequest (i.e. Ajax calls).
        string requestedWith = this.Request.Headers["x-requested-with"];
        if (!string.IsNullOrEmpty(requestedWith) && requestedWith.Equals("XMLHttpRequest", StringComparison.InvariantCultureIgnoreCase))
        {
            // Mark the response as already expired and non-cacheable.
            this.Response.Expires = 0;
            this.Response.ExpiresAbsolute = DateTime.Now.AddDays(-1);
            this.Response.AddHeader("pragma", "no-cache");
            this.Response.AddHeader("cache-control", "private");
            this.Response.CacheControl = "no-cache";
        }
    }
}
I eventually established the reason for the duplicate requests. As I said, the mechanism I chose for making Ajax calls was Dynamic Script Tags. I built the request URL, created a new script element, and assigned the URL to the src property...
var script = document.createElement("script");
script.src = "https://...";
Then I executed the script by appending it to the document head. Crucially, I was using the jQuery append function...
$("head").append(script);
Inside the append function, jQuery was anticipating that I was trying to make an Ajax call. If the type of element being appended is a script, it executes a special routine that makes an Ajax request using the XMLHttpRequest object. But the script was still being appended to the document head, and executed there by the browser too. Hence the double request.
The first came direct from the script – the one I intended to happen.
The second came from inside the jQuery append function. This was the request suffixed with the randomly generated query-string argument in the form "&_=123456789".
I simplified things by avoiding the jQuery library side effect and using the native append function...
document.getElementsByTagName("head")[0].appendChild(script);
One request now happens in the way I intended. I had no idea that the jQuery append function had such a significant side effect built in.
See www.enhanceie.com/redir/?id=httpperf for further discussion.