HttpCookie causes the page not to load - visual-studio-2010

I'm using VS 2010, VB.NET, and ASP.NET 3.5. I have a simple default.aspx page that contains:
Dim ctx As HttpContext = HttpContext.Current
Dim cookie As HttpCookie = ctx.Request.Cookies("SessionGUID")
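' Note: if the request carries no "SessionGUID" cookie, cookie is Nothing here,
' and reading cookie.Value on the next line throws a NullReferenceException.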
Me.lbl1.Text = cookie.Value.ToString
The page loads fine when running it from within VS, but when I build the site and run the page, it doesn't load. It doesn't give me an error, but nothing shows up.
This is what the view source looks like:
<HTML><HEAD>
<META content="text/html; charset=windows-1252" http-equiv=Content-Type></HEAD>
<BODY></BODY></HTML>
If I take out the Me.lbl1.Text = cookie.Value.ToString line, the page loads fine. All I'm putting on the page is some text and the label control.
Does anyone have any ideas?

Well, I didn't figure it out, but I did something different that does work; not sure if it's better or worse.
I took out all the plumbing for the session module and instead created a session value in Global.asax in Session_Start... maybe that is where it should have been all along. From that point I was able to change the spots where I was using the cookie to use the session instead.
It works as far as I can tell; more testing will tell.

Related

Swashbuckle.AspNetCore .NET 6.0 blank swagger site

I am trying to set up Swagger for the product I'm developing and cannot wrap my head around it.
I started with the most basic config as described here. The swagger.json was generated correctly under https://localhost/MyWebAPI/swagger/v1/swagger.json, but when navigating to https://localhost/MyWebAPI/swagger/index.html I get a blank site. I did some digging, and most of the answers revolved around setting up SwaggerEndpoint, RoutePrefix, or some URI templates, but none of them worked for me, so I finally did what I should have done in the first place and checked the code of the site itself.
It is there... The URLs seem correct:
var configObject = JSON.parse('{"urls":[{"url":"v1/swagger.json","name":"MyApp v1"}],"deepLinking":false,"persistAuthorization":false,"displayOperationId":false,"defaultModelsExpandDepth":1,"defaultModelExpandDepth":1,"defaultModelRendering":"example","displayRequestDuration":false,"docExpansion":"list","showExtensions":false,"showCommonExtensions":false,"supportedSubmitMethods":["get","put","post","delete","options","head","patch","trace"],"tryItOutEnabled":false}');
var oauthConfigObject = JSON.parse('{"scopeSeparator":" ","scopes":[],"useBasicAuthenticationWithAccessCodeGrant":false,"usePkceWithAuthorizationCodeGrant":false}');
// Workaround for https://github.com/swagger-api/swagger-ui/issues/5945
configObject.urls.forEach(function (item) {
    if (item.url.startsWith("http") || item.url.startsWith("/")) return;
    item.url = window.location.href.replace("index.html", item.url).split('#')[0];
});
The issue is, and I kid you not, the line with interceptors: it is actually split into several lines, so the browser doesn't recognise it as a correct string.
Obviously I tried to pass null as the entire section, but that just breaks everything two lines later. I am in shambles...
I tried several versions of Swashbuckle (currently using 6.5.0, but I tried some previous ones starting from 6.1.5). Any ideas how to fix this? I guess it must generally work, and there's just something weird/wrong that I'm missing.
Right... one of the most stupid things I've encountered lately. I started reading the Swashbuckle source code, and the only class that, when serialised, doesn't get the JsonSerializerOptions defined in the Swashbuckle project is InterceptorFunctions, so it used mine... and mine had WriteIndented set to true...
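To illustrate the failure mode (a sketch, not Swashbuckle's literal output; the property name is made up): the generated index.html embeds the serialized options inside a single-quoted JavaScript string, and a raw line break inside such a literal is a syntax error, which kills the whole script block and leaves the page blank.
// Fine: the serialized JSON stays on one line inside the string literal.
var ok = JSON.parse('{"interceptors":null}');
// What WriteIndented = true effectively injects (kept commented out,
// because a raw newline inside a '...' literal is a SyntaxError):
//   var bad = JSON.parse('{
//     "interceptors": null
//   }');
// The whole script block then fails to parse, so Swagger UI never initializes.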

Prevent external files in src of iframe from being cached (with CloudFlare)

I am trying to make a playground like Plunker. I just noticed very odd behavior in production (with active mode in Cloudflare), whereas it works well on localhost.
Through an iframe, the playground previews index_internal.html, which may contain links to other files (e.g., .html, .js, .css). The iframe is able to interpret external links such as <script src="script.js"></script>.
So each time a user modifies their file (e.g., script.js) in the editor, my program saves the new file into a temporary folder on the server, then refreshes the iframe with iframe.src = iframe.src. It works well on localhost.
However, I realized that, in production, the browser always keeps loading the initial script.js, even though users modify it in the editor and a new version is written to the folder on the server. For example, what I see in Dev Tools ==> Network is always the initial version of script.js, whereas I can check the new version of script.js saved on the server with less.
Does anyone know why this happens, and how to fix it?
Edit 1:
I tried the following, which did not work with script.js:
var iframe = document.getElementById('myiframe');
iframe.contentWindow.location.reload(true);
iframe.contentDocument.location.reload(true);
iframe.contentWindow.location.href = iframe.contentWindow.location.href
iframe.contentWindow.src = iframe.contentWindow.src
iframe.contentWindow.location.replace(iframe.contentWindow.location.href)
I tried adding a version; it worked with index_internal.html but did not reload script.js:
var newSrc = iframe.src.split('?')[0];
iframe.src = newSrc + "?uid=" + Math.floor((Math.random() * 100000) + 1);
If I turn Cloudflare to development mode, script.js is reloaded, but I want to keep Cloudflare in active mode.
I found it.
We can create a custom page rule for caching in Cloudflare:
https://support.cloudflare.com/hc/en-us/articles/200168306-Is-there-a-tutorial-for-Page-Rules-#cache
For example, I could set Cache Level to Bypass for the folder www.mysite.com/tmp/*.
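Alternatively, the same goal can be attacked from the versioning side. Here is a sketch, assuming the server-side save also rewrites index_internal.html so its <script src> / <link href> references carry the same stamp (that rewrite step is hypothetical): each save then produces brand-new URLs that neither the browser nor Cloudflare has cached.
// Sketch: refresh the preview with a fresh version stamp on every save.
// Assumes the server, when writing index_internal.html, appends the same
// ?v= stamp to the script/css references it emits, so script.js itself
// becomes a new URL each time (that server-side step is hypothetical).
function refreshPreview(iframe) {
    var version = Date.now();
    var base = iframe.src.split('?')[0];
    iframe.src = base + '?v=' + version;
}
refreshPreview(document.getElementById('myiframe'));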

Wicket - Internet Explorer double submit

I have a big problem with Internet Explorer 7 and 8.
SITUATION:
I have a form that builds a medical prescription. When I hit the save button, the script saves the domain object to the DB, sets a boolean property (on the panel the form is added to) called "saved" to true, and fills a byte[] property called PDF with the byte stream.
In renderHead of the panel, I read this boolean and, if it is true, I force a click of a hidden button with this code:
String js = "$('#" + printPDF.getMarkupId() + "').click();";
response.renderOnDomReadyJavaScript(js);
The button executes this code:
ResourceStreamRequestHandler handler = new ResourceStreamRequestHandler(new ByteArrayResourceStream(pdf, "application/pdf"));
handler.setFileName("foo.pdf");
RequestCycle.get().scheduleRequestHandlerAfterCurrent(handler);
This code works perfectly in FF and Chrome: the browser download window appears and the user can save the PDF to disk.
Unfortunately, Internet Explorer has that damn security behavior that is triggered when a site asks to download something. The warning requires user validation: a yellow bar appears and the user is forced to hit "Download".
screenshot http://imageshack.us/a/img198/1438/securityg.jpg
When I hit "Download File", the form is submitted again with the exact state it had when I hit save the first time: the previous INSERT on the DB is not yet committed, the session is reset to the previous state, etc.
The result is a double INSERT of the domain object on the DB.
Any clue how to resolve this?
The problem is that you click the download link programmatically instead of redirecting the browser to a URL or opening a URL with JS window.open(url). Clicking a link programmatically looks like an unwanted operation that is sometimes restricted by the browser.
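A minimal sketch of that suggestion (pdfUrl is hypothetical, e.g. the URL of a mounted resource that serves the PDF byte stream):
// Navigate to the PDF URL instead of programmatically clicking a link;
// 'download/foo.pdf' stands in for a real mounted-resource URL.
var pdfUrl = 'download/foo.pdf';
window.open(pdfUrl);
// or, staying in the same window:
// window.location.href = pdfUrl;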

jQuery load() not working in Internet Explorer

I am trying to use the jQuery load() function to get content from another page via AJAX. It works in Firefox and Google Chrome, but not in Internet Explorer 7 & 8.
Here is the page I am developing: http://139.82.74.22/70anos/no-tempo
All the jQuery code works normally in Internet Explorer, but the specific part that should bring in the destination page doesn't. To see the problem, click the "Há 80 anos" or "Há 70 anos" block, then click any of the links inside it. It should open a panel underneath the timeline with the content of the block.
Here is the code that pulls the external content:
jQuery('a.link-evento').click(function() {
    var strUrl = jQuery(this).attr('href');
    var objBlocoConteudo = jQuery(this).parents('div.view-content').next().find('div.conteudo-evento');
    objBlocoConteudo.css('display', 'block').animate({ opacity: 1 }, { duration: 350 }).load(strUrl + ' #area-conteudo-evento');
    return false;
});
With this code I grab the URL of the destination page and tell the browser not to do a normal request but to load it with the jQuery load() function.
Any help fixing this in IE is appreciated. Thank you.
I'm pretty sure AJAX requests have to be made to a domain name in IE as a security precaution. If you map a domain to your 139.82.74.22 address, your problem should go away.
You can't do a .Load(http://139.82.74.22/..); it would have to be .Load("http://mysite.com/mypage").
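A small sketch of that change, with load()'s completion callback added so IE reports the failure instead of staying silent (mysite.com is a placeholder, and strUrl is assumed to be a relative path as in the question):
// Same call as in the question, but against a hostname rather than the raw
// IP, plus a completion callback to surface the HTTP status in IE.
objBlocoConteudo.load('http://mysite.com' + strUrl + ' #area-conteudo-evento',
    function (responseText, status, xhr) {
        if (status === 'error') {
            alert('load failed: ' + xhr.status + ' ' + xhr.statusText);
        }
    });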

AJAX UpdateProgress not working on server?

I am trying to show an animated image while data loads into a GridView after a button click. It works great on localhost, but when I deploy it, it doesn't. I have searched through posts, and I have not made any of what seem to be the most common mistakes (i.e., putting the UpdateProgress inside the UpdatePanel, etc.). However, I am using a master page, though the master page doesn't have a ScriptManager on it. I noticed the following difference when I compare the view source in production to localhost. Can anyone help me understand why the JavaScript to make this work might not show up in production?
On localhost (where it works) I see this at the bottom of the page:
//<![CDATA[
Sys.Application.initialize();
Sys.Application.add_init(function() {
    $create(Sys.UI._UpdateProgress, {"associatedUpdatePanelId":null,"displayAfter":500,"dynamicLayout":true}, null, null, $get("ctl00_ContentPlaceHolder1_UpdateProgress1"));
});
//]]>
In production (where it does NOT work), this is all I see:
Sys.Application.initialize();
I was having a really hard time after converting my project from VS 2008 to VS 2010. The UpdateProgress suddenly stopped working, though it had been fine in VS 2008. After spending a whole afternoon searching for the answer and experimenting with this and that, I finally found what went wrong in Scott Gu's post.
It was an automatically generated web.config entry, <xhtmlConformance mode="Legacy" />.
After disabling this, it started to work again. It may not be the case for you, but this is for anyone struggling with the same problem.
Happy coding
This may not be your ideal solution, but you could show or hide your animated image with plain JavaScript. Using the following JavaScript functions (and getting rid of the UpdateProgress control) should do the trick.
Sys.WebForms.PageRequestManager.getInstance().add_beginRequest(beginRequest);
Sys.WebForms.PageRequestManager.getInstance().add_endRequest(endRequest);

function beginRequest(sender, args) {
    document.getElementById('myImageElement').style.display = 'block';
}

function endRequest(sender, args) {
    document.getElementById('myImageElement').style.display = 'none';
}
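(This assumes an element with id myImageElement in the page markup, hidden by default, and that the snippet runs after the ScriptManager has loaded the Microsoft Ajax scripts, since Sys.WebForms is not defined before then.)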
Keep in mind this will happen on every postback; you may need to use the sender parameter to work out which element triggered the postback and only show the image when the correct UpdatePanel is hit. These events fire at the beginning and end (respectively) of each UpdatePanel postback. Good luck.
