It seems that HttpWebRequest caching is enabled by default on WP7. How do I turn it off?
Adding a random parameter to the URL (url + "?param=" + RND.Next(10000)) works, but it's quite hacky and I'm not sure it will work with all servers.
For future reference, this worked for me (I could not use an additional query parameter due to project requirements):
HttpWebRequest request = HttpWebRequest.CreateHttp(url);
if (request.Headers == null)
{
    request.Headers = new WebHeaderCollection();
}
// A current If-Modified-Since value forces revalidation instead of a cached reply;
// "R" produces the RFC1123 date format HTTP expects
request.Headers[HttpRequestHeader.IfModifiedSince] = DateTime.UtcNow.ToString("R");
With HttpClient (the portable version for Windows Phone), "Cache-Control: no-cache" on the server side only works sometimes, and I couldn't add a random query-string value to the RESTful API calls either.
The solution from @frno works well; for HttpClient it looks like this:
client.DefaultRequestHeaders.IfModifiedSince = DateTime.UtcNow;
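For completeness, a minimal sketch of a full request using this header (inside an async method; the endpoint URL is a placeholder):
HttpClient client = new HttpClient();
// A current If-Modified-Since timestamp makes every request look stale to caches
client.DefaultRequestHeaders.IfModifiedSince = DateTimeOffset.UtcNow;
HttpResponseMessage response = await client.GetAsync("http://example.com/api/items");
string body = await response.Content.ReadAsStringAsync();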
Thank you.
How do you know it's the phone, and not the server (or a proxy somewhere in between), that is caching?
Have you checked this with Fiddler2 (or equivalent)?
Have you tried setting headers to disable caching?
Something like:
myRequest = (HttpWebRequest)WebRequest.Create(myUri);
// Ask the client and any intermediaries not to serve a cached copy
myRequest.Headers["Cache-Control"] = "no-cache";
myRequest.Headers["Pragma"] = "no-cache";
We've seen the same behaviour with Silverlight hosted in Chrome.
We add a "?nocache=" + DateTime.Now.Ticks.ToString() to our request URLs if we want to prevent caching.
I found three ways:
Add a random query string to the end of your URI (think Guid.NewGuid()); this avoids caching on the client because the query string differs each time:
string uri = "http://host.com/path?cache="+Guid.NewGuid().ToString();
Add a Cache-Control: no-cache header to the outgoing request:
var __request = (HttpWebRequest)WebRequest.Create(url.ToString());
if (__request.Headers == null)
__request.Headers = new WebHeaderCollection();
__request.Headers.Add("Cache-Control", "no-cache");
Mark up your service operation with the AspNetCacheProfile attribute:
[AspNetCacheProfile("GetContent")]
public ResultABC GetContent(string abc)
{
    // Build and return the result; BuildResult is a placeholder for your own logic
    return BuildResult(abc);
}
And update your web.config:
<system.web>
  <caching>
    <outputCache enableOutputCache="true" />
    <outputCacheSettings>
      <outputCacheProfiles>
        <add name="GetContent" duration="0" noStore="true" location="Client" varyByParam="" enabled="true"/>
      </outputCacheProfiles>
    </outputCacheSettings>
  </caching>
  ...
</system.web>
Adding a random number is not a bad approach and it will work. I have used the current time (in an AJAX call), placed in the URL like a folder segment.
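A minimal C# sketch of the same idea (the endpoint is hypothetical, and the server must route the extra path segment):
// The timestamp is embedded as a path segment, so every request URL is unique
string url = "http://example.com/api/" + DateTime.Now.Ticks + "/items";
var request = (HttpWebRequest)WebRequest.Create(new Uri(url));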
Yes, it is possible. :) I spent a week experimenting and the answer is really simple:
HttpWebRequest _webRequest = WebRequest.CreateHttp(_currentUrl);
// Disabling read-stream buffering keeps the response from being served from cache
_webRequest.AllowReadStreamBuffering = false;
_webRequest.BeginGetResponse(_onDownload, userState);
Related
I have created a C# WPF ClickOnce application which uses XML-RPC for communication. My users get their proxy settings in different ways: some use a PAC file, others get them from IE or DHCP, etc. I want to automate the whole process of picking up the proxy details in any kind of environment. I have tried a LOT of different code snippets, but I want to hear if something like this already exists.
I see the XML-RPC documentation has a setProxy method, but I'm not sure how to specify the username or password if one is used. This whole process is still a little confusing to me.
I have also tried many different pieces of code, including the WebProxy class and DefaultCredentials, DefaultProxy, GetSystemWebProxy, etc.
At the moment I'm going to try a DllImport of winhttp to get the proxy settings. I'm not sure whether that works in a ClickOnce deployment. Is DllImport the same as P/Invoke?
As you can see, I would appreciate some advice on how to go about getting ANY type of proxy setting.
Appreciate any feedback.
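(For reference: DllImport is the attribute you use to declare a P/Invoke function, so they are the same mechanism. Below is a minimal, untested sketch of reading the current user's IE proxy settings via winhttp.dll; it returns only the static proxy list and ignores PAC files and auto-detection:)
using System;
using System.Runtime.InteropServices;

static class IeProxyConfig
{
    [StructLayout(LayoutKind.Sequential)]
    struct WINHTTP_CURRENT_USER_IE_PROXY_CONFIG
    {
        public bool fAutoDetect;          // true if "automatically detect settings" is on
        public IntPtr lpszAutoConfigUrl;  // PAC file URL, or NULL
        public IntPtr lpszProxy;          // static proxy list, or NULL
        public IntPtr lpszProxyBypass;    // bypass list, or NULL
    }

    [DllImport("winhttp.dll", SetLastError = true)]
    static extern bool WinHttpGetIEProxyConfigForCurrentUser(
        ref WINHTTP_CURRENT_USER_IE_PROXY_CONFIG pProxyConfig);

    [DllImport("kernel32.dll")]
    static extern IntPtr GlobalFree(IntPtr hMem);

    // Returns the static proxy list from the IE config, or null if none is set.
    public static string GetStaticProxy()
    {
        var config = new WINHTTP_CURRENT_USER_IE_PROXY_CONFIG();
        if (!WinHttpGetIEProxyConfigForCurrentUser(ref config))
            return null;
        string proxy = Marshal.PtrToStringUni(config.lpszProxy);
        // The strings are allocated by WinHTTP and must be freed by the caller
        if (config.lpszAutoConfigUrl != IntPtr.Zero) GlobalFree(config.lpszAutoConfigUrl);
        if (config.lpszProxy != IntPtr.Zero) GlobalFree(config.lpszProxy);
        if (config.lpszProxyBypass != IntPtr.Zero) GlobalFree(config.lpszProxyBypass);
        return proxy;
    }
}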
ClickOnce installation/update doesn't really support proxy authentication. It will use the information in IE, and sometimes the machine.config file. The definitive thread with all known information about this is here.
I haven't had problems with proxy authentication from the standpoint of installing applications. In our application, which called backend WCF services, we let the user provide their proxy authentication information, and we applied the settings programmatically when making the service calls. This has nothing to do with ClickOnce.
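For example, a minimal sketch of applying user-supplied proxy details (host, port, and credentials are placeholders):
// Apply user-supplied proxy details process-wide before making service calls
var proxy = new WebProxy("http://proxyhost:8080")
{
    Credentials = new NetworkCredential("username", "password", "DOMAIN")
};
WebRequest.DefaultWebProxy = proxy;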
This worked for me:
public static IExample ProxyAndCredentials { get; set; }
public static string ProxyUrl { get; set; }

public static void SetupProxyAndCredentials()
{
    // Insert the website your XmlRpc calls should go to
    var url = new Uri("http://www.example.com/");
    try
    {
        ProxyUrl = WebRequest.DefaultWebProxy.GetProxy(url).ToString();
        Log.Debug(url + " is using proxy " + ProxyUrl);
        if (ProxyUrl == url.ToString() || ProxyUrl == url + "/")
        {
            // GetProxy returns the target URL itself when no proxy is in use
            ProxyUrl = "";
            Log.Debug("No proxy is used for " + url);
        }
        else if (!String.IsNullOrEmpty(ProxyUrl))
        {
            // A proxy is in use
            ProxyAndCredentials.Proxy = new WebProxy(ProxyUrl);
            Log.Debug("A proxy is used for " + url);
        }
        // Set credentials; in my experience it is better to always set these
        ProxyAndCredentials.Credentials = CredentialCache.DefaultNetworkCredentials;
        if (ProxyAndCredentials.Proxy != null)
        {
            ProxyAndCredentials.Proxy.Credentials = CredentialCache.DefaultNetworkCredentials;
        }
    }
    catch (Exception p)
    {
        // Handle or log the exception
    }
}
We have a WebBrowser embedded in our Windows Phone 7.x application, pointed at our web servers. We need to be able to differentiate between a request coming from the app and a request coming from the native browser (or a WebBrowser embedded in another app, for instance). To do this we'd like to modify the User-Agent of all HTTP requests coming from said WebBrowser.
However, I can't find a way to do this. My initial thought was simply to override the Navigate functions adding "additionalHeaders." Unfortunately the WebBrowser class is sealed, so that option wasn't an option at all. I've searched high and low for a property or handler that's exposed that I might be able to take advantage of to no avail.
So, in short, is there a way to modify the User-Agent for a WebBrowser for all outbound HTTP requests?
I know this question is old, but in case it's of use to anyone, you could always use this handler for the WebBrowser's Navigating event:
void wb_Navigating(object sender, NavigatingEventArgs e)
{
    // URLs tagged with "!!!" were issued by us and already carry the custom user agent
    if (!e.Uri.ToString().Contains("!!!"))
    {
        e.Cancel = true;
        string url = e.Uri.ToString();
        if (url.Contains("?"))
            url = url + "&!!!";
        else
            url = url + "?!!!";
        wb.Navigate(new Uri(url), null, "User-Agent: " + "Your User Agent");
    }
}
You just add "!!!" to the URLs of all navigations that should carry your custom user agent. If a URL doesn't contain "!!!", it is a request from a clicked link, so the WebBrowser cancels the navigation and re-navigates with your custom user agent and "!!!" in the query string.
I tried an approach similar to msbg's, storing the URL in memory to avoid double-checking it and to avoid modifying it with "!!!". However, that approach doesn't preserve POST data, so it won't help me.
List<string> recentlyRequestedUrls = new List<string>();

void wb_Navigating(object sender, NavigatingEventArgs e)
{
    if (!recentlyRequestedUrls.Contains(e.Uri.ToString()))
    {
        // New request: cancel it and reissue it ourselves with the custom user
        // agent, remembering the URL so the reissued request isn't cancelled again
        e.Cancel = true;
        string url = e.Uri.ToString();
        recentlyRequestedUrls.Add(url);
        webBrowser1.Navigate(new Uri(url), null, "User-Agent: Your_User_Agent");
    }
}
Set the user agent through the additional headers parameter when invoking the Navigate method.
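A minimal sketch (the user-agent value is a placeholder):
webBrowser.Navigate(new Uri("http://example.com/"), null, "User-Agent: MyApp/1.0");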
It's very easy to change the referer by simply setting the appropriate header; however, I cannot find a way to change the user agent ("ZDM/4.0; Windows Mobile 7.0;") to any other value. I have tried the following code so far:
var request = new BackgroundTransferRequest(new Uri("http://www.somedomain.net"));
request.Headers[Convert.ToString(HttpRequestHeader.UserAgent)] = "AgentSmith";
request.Headers[Convert.ToString(HttpRequestHeader.Referer)] = "MyReferer";
Any thoughts? Your help will be very much appreciated.
Convert.ToString(HttpRequestHeader.UserAgent) returns "UserAgent", but the actual HTTP header name is "User-Agent"; try it like this:
var request = new BackgroundTransferRequest(new Uri("http://www.somedomain.net"));
request.Headers["User-Agent"] = "AgentSmith";
request.Headers["Referer"] = "MyReferer";
I have a simple implementation of a custom protocol. The documentation says the newURI method takes three arguments (spec, charset and baseURI) and that "if the protocol has no concept of relative URIs, the third parameter is ignored".
So I open a page like tada://domain/samplepage, whose XML starts with this:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE Product SYSTEM "product.dtd">
But I don't see any request for product.dtd reaching my protocol (newURI is not even called). Am I missing something in my implementation?
BTW: the page itself opens correctly; there's just no request for the DTD file.
const
Cc = Components.classes,
Ci = Components.interfaces,
Cr = Components.results,
Cu = Components.utils,
nsIProtocolHandler = Ci.nsIProtocolHandler;
Cu.import("resource://gre/modules/XPCOMUtils.jsm");
function TadaProtocol() {
}
TadaProtocol.prototype = {
scheme: "tada",
protocolFlags: nsIProtocolHandler.URI_DANGEROUS_TO_LOAD,
newURI: function(aSpec, aOriginCharset, aBaseURI) {
let uri = Cc["@mozilla.org/network/simple-uri;1"].createInstance(Ci.nsIURI);
uri.spec = (aBaseURI === null)
? aSpec
: aBaseURI.resolve(aSpec);
return uri;
},
newChannel: function(aURI) {
let
ioService = Cc["@mozilla.org/network/io-service;1"].getService(Ci.nsIIOService),
uri = ioService.newURI("chrome://my-extension/content/about/product.xml", null, null);
return ioService.newChannelFromURI(uri);
},
classDescription: "Sample Protocol Handler",
contractID: "#mozilla.org/network/protocol;1?name=tada",
classID: Components.ID('{1BC90DA3-5450-4FAF-B6FF-F110BB73A5EB}'),
QueryInterface: XPCOMUtils.generateQI([Ci.nsIProtocolHandler])
}
let NSGetFactory = XPCOMUtils.generateNSGetFactory([TadaProtocol]);
The channel you return from newChannel has the chrome:// URI you passed to newChannelFromURI as its URI. That is therefore the URI the page gets as its URI and as its base URI, so the DTD load happens directly from chrome://my-extension/content/about/product.dtd.
What you probably want to do is to set aURI as the originalURI on the channel you return from newChannel.
As Boris mentioned in his answer, your protocol implementation doesn't set nsIChannel.originalURI property so that URLs will be resolved relative to the chrome: URL and not relative to your tada: URL. There is a second issue with your code however: in Firefox loading external DTDs only works with chrome: URLs, this check is hardcoded. There is a limited number of supported DTDs that are mapped to local files (various HTML doctypes) but that's it - Gecko doesn't support random URLs in <!DOCTYPE>. You can see the current logic in the source code. The relevant bug is bug 22942 which isn't going to be fixed.
Boris and Wladimir, thank you!
After some time I found a solution. The problem was that the DTD file could not be loaded via my custom protocol. The idea is to use the Proxy API to override the schemeIs() method, which is called on the object returned from newURI of nsIProtocolHandler.
So now I have this snippet of code in the newURI method:
let standardUrl = Cc["@mozilla.org/network/standard-url;1"].createInstance(Ci.nsIStandardURL);
standardUrl.init(standardUrl.URLTYPE_STANDARD, -1, spec, charset, baseURI);
standardUrl.QueryInterface(Ci.nsIURL);
return Proxy.create(proxyHandlerMaker(standardUrl));
proxyHandlerMaker just implements the Proxy API and overrides the needed schemeIs() method. This solved the problem, and now all requests reach newChannel, where we can handle them.
Important notes:
The request for the DTD reaches the newURI() method but, by default, never reaches newChannel(). This happens because schemeIs("chrome") is called on the object returned by newURI(); that method must return true for DTD requests if you want them to reach newChannel().
newChannel() is invoked with an {nsIURI} object that is not the same object as the one returned by newURI.
If you want your protocol to handle both protocol:page and protocol://domain/page URLs, you need both {nsIURI} and {nsIStandardURL} objects.
You can pass the created {nsIStandardURL} object (standardUrl in the snippet above) as a second argument to Proxy.create(). This makes your baseURI (the third argument of newURI) pass the "baseURI instanceof nsIStandardURL" check, and schemeIs() on the proxied object will also return true for DTD-file requests. Unfortunately, those requests still never reach newChannel(); this could have been a nice solution to the DTD problem, but I could not get it to work.
I'm having a problem using Winnovative's PdfConverter on pages that are protected by extranet security (which is based on ASP.NET Membership).
I've tried several different approaches; the following works on my local machine, but not anywhere else.
Code for the login page; this should bypass the login process for the converter's requests:
// Check that the current "user" isn't logged in and that this is the converter's user agent
if (!Sitecore.Context.IsLoggedIn && Request.UserAgent.Contains(".NET CLR"))
{
    // Log in with a dummy user created for this purpose
    Sitecore.Security.Authentication.AuthenticationManager.Login("extranet\\pdf", "pdf", true);
    // redirect to the former page
}
The page that generates the PDF uses this code:
private void PDFPrint(string url)
{
    PdfConverter pdfConverter = new PdfConverter();
    pdfConverter.LicenseKey = "our license";
    url = Request.Url.Scheme + "://" + Request.Url.Host + url;
    byte[] downloadBytes = pdfConverter.GetPdfFromUrlBytes(url);
    HttpResponse response = HttpContext.Current.Response;
    response.Clear();
    response.AddHeader("Content-Type", "binary/octet-stream");
    response.AddHeader("Content-Disposition", "attachment; filename=" + Sitecore.Context.Item.Name + ".pdf" + "; size=" + downloadBytes.Length.ToString());
    response.Flush();
    response.BinaryWrite(downloadBytes);
    response.Flush();
    response.End();
}
The Exception I'm getting is this:
"Could not get the metafile from url. Could not get image from url.The URL is not accessible.."
I've also tried this trick from the Winnovative FAQ to no avail:
http://www.winnovative-software.com/FAQ.aspx#authenticationQ
I've also tried using WebClient and HttpWebRequest to retrieve the content, but nothing I do seems to work anywhere other than locally.
Basically I want a way of either getting Winnovative's converter to use the currently logged-in user, to use my custom "pdf" user, or some other way of getting the HTML from the response.
I hope this question isn't too vague; I find it kind of hard to ask. Essentially I want to get some HTML content from a page in a Sitecore solution I control, protected by Sitecore's normal extranet security, as a string or byte[].
Help me Stackoverflowers, you're my only hope! :P
I contacted Sitecore to ask if they had a solution.
Their solution was to create a Processor that would set an active user based on some criteria.
This is the code I made for my site (it's probably not the best solution, as the UserAgent can be spoofed):
public class MyResolver : HttpRequestProcessor
{
    public override void Process(HttpRequestArgs args)
    {
        var userAgent = args.Context.Request.UserAgent ?? "";
        SiteContext site = Sitecore.Context.Site;
        if (site.Name == "site_name_in_webconfig" && userAgent.Contains("this_should_only_be_in_thepdfcreators_userAgent"))
        {
            // Impersonate the dummy extranet user for the converter's request
            Sitecore.Security.Accounts.User pdfuser = Sitecore.Security.Accounts.User.FromName("extranet\\theUser", true);
            AuthenticationManager.SetActiveUser(pdfuser);
        }
    }
}
and then add the following to the web.config, before the UserResolver:
<processor type="Namespace.MyResolver, Assembly" />
I hope this will help some others out there.
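If you also need the raw HTML as a string or byte[] (the question mentions trying WebClient), here is a minimal sketch that pairs with the resolver above; the URL is a placeholder, and the user-agent token must match whatever MyResolver checks for:
var client = new WebClient();
// Send the token MyResolver looks for, so the request runs as the dummy user
client.Headers["User-Agent"] = "this_should_only_be_in_thepdfcreators_userAgent";
byte[] bytes = client.DownloadData("http://host/protected-page");
string html = client.Encoding.GetString(bytes);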
I found a similar issue on the ASP.NET forums, and the answer there was to use a newer version of the PDF tool (thread: "SessionState Problems?").