On my site all text is served as UTF-8.
Since nowadays every browser supports Unicode characters, I would like to use them as-is.
The ASP.NET framework is "very helpful" and replaces any Unicode character with a Numeric Character Reference, like &#225;. For reference, see: http://en.wikipedia.org/wiki/Unicode_and_HTML#HTML_document_characters
Sure, this way the webpage renders correctly in the oldest Netscape imaginable, but the Google Analytics ecommerce module, for example, has trouble understanding these specially coded characters.
Is there a way to globally disable the Numeric Character Reference encoding?
For example I want to write in razor:
<span class="title">#ViewBag.Title</span>
I would want this to show on the output:
<span class="title">Számítástechnika</span>
Not this:
<span class="title">Számítástechnika</span>
I'm not trying to disable HTML encoding, so Html.Raw is not a solution; for example, I cannot ensure that @ViewBag.Title will not contain something like this:
<span class="title"><script>alert('injected hahahah');</script></span>
So I'm fine with the automatic encoding of special HTML characters; that is not what I want to disable.
I don't want to restructure all the code, and I would expect there to be a "global switch" to disable this kind of behavior when string values are used in Razor. Is there a way to do this?
Alternatively, can I explicitly forbid numeric character references, for example with something like new MvcHtmlString(myString, some parameters)?
I'm afraid that you cannot turn this encoding feature off. This "nice" feature is provided by WebUtility.HtmlEncode, and you cannot influence how it encodes.
However, starting with .NET 4.0 you can customize the encoding behavior by creating a class that inherits from HttpEncoder and registering it in web.config via HttpRuntimeSection.EncoderType. The catch is that you need to implement the custom encoding logic yourself; a sketch follows below.
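A minimal sketch of what such a custom encoder could look like (the class name is made up, and it only overrides the HTML-element case; it is not a complete, production-ready encoder):

using System.IO;
using System.Web.Util;

// Hypothetical encoder: HTML-encodes the dangerous ASCII characters,
// but writes everything else (including accented characters) through unchanged.
public class UnicodeFriendlyEncoder : HttpEncoder
{
    protected override void HtmlEncode(string value, TextWriter output)
    {
        if (string.IsNullOrEmpty(value))
            return;

        foreach (char c in value)
        {
            switch (c)
            {
                case '<':  output.Write("&lt;");   break;
                case '>':  output.Write("&gt;");   break;
                case '&':  output.Write("&amp;");  break;
                case '"':  output.Write("&quot;"); break;
                case '\'': output.Write("&#39;");  break;
                default:   output.Write(c);        break; // no numeric character references
            }
        }
    }
}

You would then point the httpRuntime encoderType setting (shown below for AntiXssEncoder) at your own type and assembly instead.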
Luckily, .NET 4.5 ships with a new HttpEncoder called AntiXssEncoder, which still encodes the dangerous markup (like <script>) but handles Unicode characters correctly.
So you just need to add this in your web.config:
<system.web>
  <httpRuntime encoderType="System.Web.Security.AntiXss.AntiXssEncoder, System.Web, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />
</system.web>
If you are not yet on .NET 4.5, you can implement your own AntiXssEncoder with the help of the Microsoft Web Protection Library.
Here is an article on how to set it up: Using AntiXss As The Default Encoder For ASP.NET (although it might be outdated).
You can also use MVC's @Html.Raw method. This is useful when you don't want to change the behavior at a global level, for example on an already built project.
@Html.Raw(ViewBag.Title)
For a .NET Core web application you can configure the default encoding behaviour in your ConfigureServices method:
using System.Text.Encodings.Web;
using System.Text.Unicode;
using Microsoft.Extensions.WebEncoders;

public void ConfigureServices(IServiceCollection services)
{
    services.Configure<WebEncoderOptions>(options =>
    {
        // Allow every Unicode range, so non-ASCII characters are emitted as-is.
        options.TextEncoderSettings = new TextEncoderSettings(UnicodeRanges.All);
    });
}
This will render Unicode characters on the HTML page without numeric character references.
Source https://github.com/aspnet/HttpAbstractions/issues/315
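If you would rather not allow every Unicode range, the same mechanism accepts a specific list of ranges. A sketch (the ranges are only an example, chosen to cover Hungarian text):

using System.Text.Encodings.Web;
using System.Text.Unicode;
using Microsoft.Extensions.WebEncoders;

public void ConfigureServices(IServiceCollection services)
{
    services.Configure<WebEncoderOptions>(options =>
    {
        // Characters inside these blocks are emitted as-is;
        // everything outside them is still escaped.
        options.TextEncoderSettings = new TextEncoderSettings(
            UnicodeRanges.BasicLatin,
            UnicodeRanges.Latin1Supplement,
            UnicodeRanges.LatinExtendedA);
    });
}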
Related
I'm not clear on how to use the AntiXSS library in my .NET Web API 2 project.
I have installed the AntiXSS NuGet package (which gives me v4.3), and have set the encoderType property of httpRuntime in web.config.
Which class should I now use to take advantage of AntiXSS? Does it override the behaviour of System.Web.HttpUtility?
If using the AntiXSS NuGet package, the correct approach is to use Microsoft.Security.Application.Encoder.
According to bdorrans of the MS Web Protection Library (AntiXSS) project, you can indeed use HttpUtility once encoderType is set.
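A short sketch of both options (the input string is only an example):

using System.Web;
using Microsoft.Security.Application;

string input = "<script>alert('xss')</script> Számítástechnika";

// Call the AntiXSS encoder directly...
string direct = Encoder.HtmlEncode(input);

// ...or keep using HttpUtility; once httpRuntime's encoderType points at the
// AntiXSS encoder, this call is routed through it as well.
string viaHttpUtility = HttpUtility.HtmlEncode(input);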
However, encoding output from an API shouldn't normally be a concern. My scenario is that I'm serving a Single Page Application and was worried I was injecting unsafe values into the page via my model. But unless values are injected via .html() or eval(), malicious values will not be executed on the client.
My Firefox addon should add a search engine that
provides suggestions
gets its search template URL specified at runtime (i.e. the template URL depends on the user's preferences)
And I don't see a way to do both at the same time.
I see two options to add a search engine:
addEngineWithDetails
addEngine
addEngineWithDetails() allows me to add a search engine with the template URL. But it does not (apparently?) allow providing a suggestions URL.
addEngine() allows me to add a search engine that is specified in an XML file. But if I have that file saved locally in my addon directory (e.g. chrome://example-engine/content/search.xml), how can I change the template URL at runtime? And using an online XML file is an unsafe option, since the internet connection could be broken or unreliable during the addon install.
First of all, you're right: addEngineWithDetails does not support suggestions.
The way to go would be to use addEngine (and removeEngine).
As for the "dynamic" part of your question: While I didn't test it, the implementation seems to happily accept data: URIs. So you could:
Construct a data URI using whatever methods you like (even constructing a full XML DOM and serializing it).
Call addEngine with the data URI.
When the user changes a pref, remove the old engine, and construct a new one.
Is there any built-in support for validating malicious input within the Web API, similar to forms with MVC?
If not, could anyone suggest a "global" filter/message inspector/whatever to validate against malicious input? I'm trying to avoid validating all of my models/parameters individually...
No, I don't believe there is such support. Here's why. The input validation support with Web Forms/MVC was a stopgap measure. But encoding output is the better XSS fix; validating input doesn't work perfectly, as what input is "bad" depends on how you'll be outputting it (as part of HTML element source, as part of JS source, in an HTML attribute value, as part of a SQL query, etc.).
So I'd recommend against generic, global input validation as the solution to XSS concerns. Instead, make sure you're always encoding input correctly before outputting it (or passing it on to another layer, such as a SQL DB). For output, if you're using the normal Web API mechanisms for returning data (model classes with content negotiation/formatters), the formatters should handle the content type-specific encoding for you.
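As a small illustration of why the output context matters (a sketch using the framework's built-in helpers; the input value is only an example):

using System.Web;

string input = "<script>alert('hi')</script>";

// The same value needs a different encoding depending on where it ends up:
string forHtmlBody   = HttpUtility.HtmlEncode(input);             // HTML element content
string forAttribute  = HttpUtility.HtmlAttributeEncode(input);    // HTML attribute value
string forJavaScript = HttpUtility.JavaScriptStringEncode(input); // JS string literal
string forUrl        = HttpUtility.UrlEncode(input);              // URL query string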
I believe XSS is not relevant to ASP.NET Web API. Here is why I think so. Suppose, in the request body, I get JSON like "input": "<script>alert('hello');</script>", and the Web API stores the "input" value, bound to some property, as-is in a database, retrieves it as-is in a subsequent GET request, and sends it off to a client. That is still okay: it is the responsibility of the client to ensure this data is escaped correctly. So when this input property is serialized to, say, a web application, the client web app must HTML-encode it before writing it to the browser. Having the Web API do this generally does not make sense, because a web API can be consumed by other clients, say a WPF application, where XSS may not be applicable. Or am I missing a specific case you have in mind?
Why don't you use HttpUtility.HtmlEncode?
Input should always be validated; it doesn't matter where it is going. A name field should return a name string, not a JPEG file or, depending on your environment, a SQL injection attack.
We have an internal server running an MVC3 application, which has been made available on our external server by using an ISA server.
However, it is not applying the CSS, as the references for the external site are not mapped correctly.
The main difference between the two, as far as I can see, is that internally it runs as an application on the internal server (http://InternalServer/MVCSite).
The external server is seen as running an application within a subsite (http://ExternalDomain/SubSite/MVCSite); this is what the world gets.
Looking at the source, I can see that the generated URL behaves like the internal site, returning @Url.Content("~/Content/Site.css") as /MVCSite/Content/Site.css.
And I have been looking at this article to see if it sheds any light: http://support.microsoft.com/kb/885186
I thought that it might be that the redirection cannot handle differences in site structure, so modifying the internal site to reflect the external layout would fix the issue: internally, InternalServer/MVCSite becomes ExternalDomain/SubSite/MVCSite.
I will add more as I find things out.
This answer is not related to ISA, and there is probably a proper way to do this by configuring it. However, if you cannot find the right solution, HttpRuntime.AppDomainAppVirtualPath may help you.
Instead of using @Url.Content("~/Content/Site.css"), you can try @Url.Content(HttpRuntime.AppDomainAppVirtualPath + "/Content/Site.css").
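Note that AppDomainAppVirtualPath has no trailing slash (except when the application runs at the site root), so a tiny helper can normalize the slashes. This is only a sketch; the ContentAbsolute name is made up:

using System.Web;
using System.Web.Mvc;

public static class UrlHelperExtensions
{
    // Hypothetical helper: prefixes the application's virtual path and normalizes slashes.
    public static string ContentAbsolute(this UrlHelper url, string relativePath)
    {
        string root = (HttpRuntime.AppDomainAppVirtualPath ?? string.Empty).TrimEnd('/');
        return root + "/" + relativePath.TrimStart('~', '/');
    }
}

It could then be used in the view as @Url.ContentAbsolute("Content/Site.css").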
If this doesn't work, you can override the value of AppDomainAppVirtualPath using reflection. Have a look here: http://www.experts-exchange.com/Programming/Languages/.NET/ASP.NET/Q_24475811.html#a24591595.
' Create an instance of the internal System.Web.VirtualPath type via reflection
Dim vpathTypeName As String = "System.Web.VirtualPath, System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"
Dim virtualPath As Object = ReflectionHelper.Instantiate(vpathTypeName, New Type() {GetType(String)}, New Object() {"/"})
' Overwrite the private field backing HttpRuntime.AppDomainAppVirtualPath;
' "runtime" is the HttpRuntime instance this field lives on (not defined in this snippet; see the linked article)
ReflectionHelper.SetPrivateInstanceFieldValue("_appDomainAppVPath", runtime, virtualPath)
It basically updates a static field with whatever value you need. You can validate the host header with HttpContext.Current.Request.Url.Host and set the value to / or /SubSite/ accordingly.
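A sketch of that host check in C# (SetAppVirtualPath is a hypothetical wrapper around the reflection code above, and the host/path values are just the ones from this question):

using System;
using System.Web;

string host = HttpContext.Current.Request.Url.Host;

string virtualPath = host.Equals("ExternalDomain", StringComparison.OrdinalIgnoreCase)
    ? "/SubSite/"   // request came in through the external domain
    : "/";          // internal request

SetAppVirtualPath(virtualPath); // hypothetical wrapper applying the _appDomainAppVPath override shown above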
I am developing a web app using Struts2 and the JBoss URL rewrite valve, as you can see from the title. What I want to know is: which JBoss configuration files do I need to touch to configure the rewrite valve settings?
Besides, I couldn't find any information on how to fix my non-English character problem during URL rewriting. Flags like [NE] don't help.
Let me make the case clear:
There is a link on my JSP page; its value is:
http://localhost:8080/struts2Sample/redirectLogin/text/blahblah
And in my rewrite.properties file I added this rule:
RewriteCond %{SERVLET_PATH} ^/redirectLogin/text/(.*)
RewriteRule ^/redirectLogin/text/(.*) /redirectLogin.do?text=$1 [NE,PT,L]
If the 'text' variable contains non-English characters like 'şçğüıö' instead of 'blahblah', then the action gets a differently encoded value of 'şçğüıö'; I mean it gets a string like '%C4%5F%' or one containing other weird characters.
If you have any idea how I can fix this issue, say with a rewriteMap, another flag, a piece of Perl code, or (in my opinion the more effective solution) by configuring a charset or encoding in a JBoss XML file like server.xml, I would be glad to hear it.
Thanks a lot,
Baris
I used URLEncoder.encode/decode with UTF-8 in order to prevent character problems.