I am in NTLM hell here; hope you can help identify what I am missing.
I am ultimately trying to deliver SSRS reports to a frame in a browser, and only the images within the reports are giving me grief. They don't appear unless the user is on Firefox and enters their credentials twice: first for the report, then again for the images in the report.
I am using HttpWebRequest to obtain the SSRS reports.
After I receive the HTML stream, I send the web server (IIS 7.5) a credential cache with "NTLM" and valid credentials to obtain the images from SSRS, so that I can store them locally and refer to those copies instead, which would spare the users from having to re-enter credentials again and again.
I see in Fiddler that the Type 1, 2, and 3 messages are properly exchanged during the NTLM handshake; however, the final response is 500 Internal Server Error. The response text also indicates rsStreamNotFound, but I find next to no info on what that means, and I think it's misleading as to what the real problem may be.
When I use Firefox, it prompts me for my network credentials for the report and then again for the images, and it gets through and brings the images back. My HttpWebRequest fails with 500 Internal Server Error and rsStreamNotFound.
The only difference I can see in the request headers between the Firefox requests and mine is that the "keep-alive" header gets dropped from my programmatic request, while the Firefox requests have it.
Why does my "keep-alive" header get dropped?
At this point, that is the only difference between my request and the request from Firefox, so I would like to eliminate that difference before jumping to any other conclusion.
I tried variations of:
req.KeepAlive = true;
req.PreAuthenticate = true;
and this gem:
var sp = req.ServicePoint;
// Reflection hack (requires System.Reflection): overwrite the ServicePoint's
// non-public HttpBehaviour in an attempt to stop the framework from managing
// the Keep-Alive header itself.
var prop = sp.GetType().GetProperty( "HttpBehaviour", BindingFlags.Instance | BindingFlags.NonPublic );
prop.SetValue( sp, (byte)0, null );
Here is the CredentialCache:
CredentialCache credentialCache = new CredentialCache();
credentialCache.Add( new Uri( path ), "NTLM", NetCredentials );
... and "keep-alive" is not present in the request for my HttpWebRequest, and Firefox has it - why does mine get dropped?
Update:
I tried:
using (WebClient client = new WebClient()) {
    client.DownloadFile(url, filePath);
}
...and I got 401 Unauthorized, so I tried again with credentials and got 500 Internal Server Error.
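The with-credentials attempt looked roughly like this (NetCredentials as above):
var credentialCache = new CredentialCache();
credentialCache.Add( new Uri( url ), "NTLM", NetCredentials );

using (WebClient client = new WebClient()) {
    client.Credentials = credentialCache;
    client.DownloadFile(url, filePath);  // this variant returned 500 instead of 401
}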
It is not clear from your question where you are running the code. Is it running on the desktop as an executable, or inside Firefox as some sort of ActiveX control or something similar?
Anyway, I suggest using the .NET trace log facility to get a log of your transaction, and looking at the log file.
<?xml version="1.0" encoding="UTF-8" ?>
<configuration>
<system.diagnostics>
<trace autoflush="true" />
<sources>
<source name="System.Net">
<listeners>
<add name="System.Net"/>
</listeners>
</source>
<source name="System.Net.Sockets">
<listeners>
<add name="System.Net"/>
</listeners>
</source>
<source name="System.Net.Cache">
<listeners>
<add name="System.Net"/>
</listeners>
</source>
</sources>
<sharedListeners>
<add
name="System.Net"
type="System.Diagnostics.TextWriterTraceListener"
initializeData="System.Net.trace.log"
/>
</sharedListeners>
<switches>
<add name="System.Net" value="Verbose" />
<add name="System.Net.Sockets" value="Verbose" />
<add name="System.Net.Cache" value="Verbose" />
</switches>
</system.diagnostics>
</configuration>
If your app name is app.exe, create a file called app.exe.config in the same directory as the exe, and put the above contents into it. Then run the app, and a logfile should be created.
This link should have more information on getting logfiles, in case you have a problem.
creating a system.net trace log
You can put the log file on pastebin after deleting personal information like hostnames, IP addresses, etc.
Also, give us a snippet of code that reproduces the problem. Then it will be easier to help.
I have a website created under IIS 8.0 on Windows 2012. A URL rewrite rule with ARR has been created under this site, pointing to a Linux machine (basically to a webservice deployed in Tomcat), and through IIS there are GET requests whose URLs, including the query string, exceed 5000 characters. When such a URL is hit from a program or a browser, IIS throws "Bad Request" with status code 400 and no substatus code. The same request works when the webservice (Tomcat on Linux) is hit directly. I suspect the issue is the excess characters in the URL, because when I decrease the URL length to 3500 characters it works without any error. Below are the configuration and settings I have tried in the IIS web.config as well as the http.sys registry, but nothing seems to work.
<configuration>
<system.webServer>
<security>
<requestFiltering allowDoubleEscaping="true">
<requestLimits maxAllowedContentLength="4294967295" maxUrl="10999" maxQueryString="2097151">
<headerLimits>
<add header="Content-Type" sizeLimit="100000000" />
</headerLimits>
</requestLimits>
</requestFiltering>
</security>
<urlCompression doDynamicCompression="false" />
</system.webServer>
<system.web>
<httpRuntime maxRequestLength="8192" maxUrlLength="8192" maxQueryStringLength="8192" requestPathInvalidCharacters="" />
</system.web>
</configuration>
http.sys Registry Settings
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\HTTP\Parameters\MaxFieldLength - DWORD - 65534
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\HTTP\Parameters\MaxRequestBytes - DWORD - 16777216
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\HTTP\Parameters\UrlSegmentMaxCount - DWORD - 16383
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\HTTP\Parameters\UrlSegmentMaxLength - DWORD - 32766
I rebooted the Windows server after each of the above settings and configuration changes.
These settings are honored by IIS without URL rewrite: if I try another URL of 5000 characters on the same website, it works without any error. The failure appears only for URLs that fall under the URL rewrite config. Is there any specific configuration that needs to be done with respect to URL length in ARR, apart from the above? Please suggest and help.
Thanks in Advance.
Answering my own question.
The issue was on the Linux Tomcat side, not on IIS; the catalina.out log showed the following:
INFO: Error parsing HTTP request header
Note: further occurrences of HTTP header parsing errors will be logged at DEBUG level.
java.lang.IllegalArgumentException: Request header is too large
I added maxHttpHeaderSize="65536" to server.xml in tomcat/conf. After modifying this, the issue was resolved.
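For reference, the attribute goes on the HTTP Connector element in server.xml; a minimal sketch (the port, timeout, and protocol values here are just whatever your install already uses):
<Connector port="8080" protocol="HTTP/1.1"
           connectionTimeout="20000"
           maxHttpHeaderSize="65536"
           redirectPort="8443" />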
I am trying to upload an mp4 file to Web API from RestSharp, but every time it gives me a "maximum file size exceeded" exception. I put
<security>
<requestFiltering>
<requestLimits maxAllowedContentLength="1073741824" />
</requestFiltering>
</security>
this in my Web API application's web.config file, but no result.
I am calling my Web API from another application, using the RestSharp NuGet package. Please help me.
Code to call the Web API:
var request = new RestRequest("Uploads", Method.POST);
// "filename" is the multipart form field name; the third argument is the content type
request.AddFile("filename", Server.MapPath("/Images/videoplayback.mp4"), "multipart/form-data");
request.AddQueryParameter("participantsId", "2");
request.AddQueryParameter("taskId", "77");
request.AddQueryParameter("EnteredAnswerOptionId", "235");
request.AddQueryParameter("lat", "12.15");
request.AddQueryParameter("lon", "12.56");
request.AlwaysMultipartFormData = true;
IRestResponse response = createClient().Execute(request);
Also add this entry to your web.config, or add the attributes if you already have the entry. executionTimeout is in seconds and maxRequestLength is in KB; the example below allows 2 hours and 1 GB:
<system.web>
<httpRuntime executionTimeout="7200" maxRequestLength="1048576" />
</system.web>
I think you should refer to these links; you may find the required stuff here:
http://www.strathweb.com/2012/09/dealing-with-large-files-in-asp-net-web-api/
https://social.msdn.microsoft.com/Forums/sqlserver/en-US/b5dbab38-9741-45ca-a791-1847b2935bb0/azure-api-app-rest-web-api-upload-limit?forum=AzureAPIApps
Perhaps I'm missing something by not wording my Google searches correctly, but I've run into an issue with IIS 8.5 and caching. I have a server set up that by all standards should be serving only static files. Obviously, when a file is changed, the new file should be served up. The issue is that even after a server restart, setting files to immediately expire, disabling caching, disabling compression, and turning off every other caching feature, the old file with its old timestamp is still being served.
I have the following settings:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<system.webServer>
<security>
<requestFiltering allowHighBitCharacters="false">
<verbs allowUnlisted="false">
<add verb="GET" allowed="true" />
</verbs>
</requestFiltering>
</security>
<caching enabled="false" enableKernelCache="false" />
<urlCompression doStaticCompression="false" />
</system.webServer>
<location path="" overrideMode="Deny">
<system.webServer>
</system.webServer>
</location>
<location path="" overrideMode="Allow">
<system.webServer>
</system.webServer>
</location>
</configuration>
The folder in which the files are located has read-only permissions. The interesting fact is that if I go to mydomain.com, the old version shows up, but going to newmydomain.com loads the new file (even though they both point to the same IP address).
An HTTP client can use the old version of a file if the cache control header(s) sent with the response indicate that the content will not change for a given period of time. It does not matter whether the content changed on the server or not.
For example, if the file is sent with the header:
Cache-Control: max-age=86400
then for 24 hours the client can use the file without contacting the server. If the file changes on the server, the client won't know that the file changed because it won't even make a request to the server.
You can add the must-revalidate cache control directive to force the client to check back with the server once its cached copy is stale, rather than silently reusing it.
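In IIS you can set this declaratively for static content via the clientCache element; a minimal sketch (the one-day max-age is illustrative):
<system.webServer>
  <staticContent>
    <clientCache cacheControlMode="UseMaxAge"
                 cacheControlMaxAge="1.00:00:00"
                 cacheControlCustom="must-revalidate" />
  </staticContent>
</system.webServer>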
As noted in my reply to storsoc, our issue was that our load balancer, an F5 server, was trying to offload as much as possible from our web servers by caching our site. See K13255: Displaying and deleting HTTP cache entries from the command line (11.x and later) for how to forcefully remove cached entries.
I'm working on a web page that calls a REST webservice via ajax to get and insert data.
The problem is that we need to send a base64 image in a JSON payload. You know, the base64 image is the image converted to that large text: base64/fjhd7879djkdadys7d9adsdkjasjdshk...
When we try with a 1 KB image, it works.
But with a bigger file (55 KB), it doesn't.
So I assume it has something to do with the maximum request size, but the error says No 'Access-Control-Allow-Origin'. We haven't found any way to configure it. Please help.
By default, browsers block JSON requests to domains other than the page's own unless the response carries the Access-Control-Allow-Origin header, so you'll need to add that header to the service's responses or serve both from the same domain.
More info here: https://developer.mozilla.org/en-US/docs/HTTP/Access_control_CORS
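If the service runs under IIS, one way to add the header is with customHeaders in web.config; a minimal sketch (the wildcard "*" is just for illustration; prefer listing the calling page's actual origin):
<system.webServer>
  <httpProtocol>
    <customHeaders>
      <add name="Access-Control-Allow-Origin" value="*" />
    </customHeaders>
  </httpProtocol>
</system.webServer>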
You can try setting maxJsonLength to its maximum value in the web.config file.
<system.web.extensions>
<scripting>
<webServices>
<jsonSerialization maxJsonLength="2147483647"/>
</webServices>
</scripting>
</system.web.extensions>
I know this is an old post, but for anyone who still might be having this problem, I solved it by adding two settings to Web.config as described here: https://west-wind.com/webconnection/docs/_4lp0zgm9d.htm
<system.webServer>
<security>
<requestFiltering>
<requestLimits maxAllowedContentLength="2147483647"></requestLimits>
</requestFiltering>
</security>
<!--snip-->
</system.webServer>
and
<system.web>
<httpRuntime maxRequestLength="2147483647" />
<!--snip-->
</system.web>
I currently have an ASP.NET MVC web application that needs to upload large files using ajax. I am currently using this jQuery plugin - http://valums.com/ajax-upload/. I have also used this plugin - http://jquery.malsup.com but get the same result.
The issue I am having with large files is that the iframe generated to make the request asynchronous is not loading in time.
It always seems to point to this code:
var doc = iframe.contentDocument ? iframe.contentDocument : iframe.contentWindow.document, response;
For smaller files the script works great, but for larger files the iframe never seems to get initialized properly.
This has been driving me crazy. Can someone please HELP.
thanks in advance
You might need to increase the maximum allowed request size on the server as well as the execution timeout of the request using the <httpRuntime> section in your web.config
<system.web>
<httpRuntime
maxRequestLength="size in kbytes"
executionTimeout="seconds"
/>
...
</system.web>
And if you are deploying your application in IIS 7.0+ you might also need to increase the maximum allowed request size using the <requestLimits> node of the <system.webServer> section:
<system.webServer>
<security>
<requestFiltering>
<requestLimits maxAllowedContentLength="size in bytes" />
</requestFiltering>
</security>
...
</system.webServer>
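For example, to allow uploads of roughly 1 GB with a one-hour timeout (illustrative values; note that maxRequestLength is in KB while maxAllowedContentLength is in bytes):
<system.web>
  <httpRuntime maxRequestLength="1048576" executionTimeout="3600" />
</system.web>
<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="1073741824" />
    </requestFiltering>
  </security>
</system.webServer>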