AJAX UpdateProgress not working on server?

I am trying to show an animated image while data loads into a GridView after a button click. It works great on localhost, but when I deploy it, it doesn't. I have searched through posts and haven't made any of what seem to be the most common mistakes, e.g. putting the UpdateProgress inside the UpdatePanel. I am using a master page, but the master page doesn't have a ScriptManager on it. I noticed the following difference in the view source when I compare production to localhost. Can anyone help me understand why the JavaScript that makes this work might not be showing up in production?
On localhost (where it works) I see this at the bottom of the page:
//<![CDATA[
Sys.Application.initialize();
Sys.Application.add_init(function() {
    $create(Sys.UI._UpdateProgress, {"associatedUpdatePanelId":null,"displayAfter":500,"dynamicLayout":true}, null, null, $get("ctl00_ContentPlaceHolder1_UpdateProgress1"));
});
In production (where it does NOT work), this is all I see:
Sys.Application.initialize();
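For context, the markup being described is roughly this shape; this is only a sketch with illustrative control IDs, assuming the ScriptManager sits on the content page rather than the master:

<asp:ScriptManager ID="ScriptManager1" runat="server" />
<asp:UpdatePanel ID="UpdatePanel1" runat="server">
    <ContentTemplate>
        <!-- hypothetical button and grid; the button click loads the GridView -->
        <asp:Button ID="btnLoad" runat="server" Text="Load" OnClick="btnLoad_Click" />
        <asp:GridView ID="GridView1" runat="server" />
    </ContentTemplate>
</asp:UpdatePanel>
<asp:UpdateProgress ID="UpdateProgress1" runat="server" DisplayAfter="500">
    <ProgressTemplate>
        <img src="loading.gif" alt="Loading..." />
    </ProgressTemplate>
</asp:UpdateProgress>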

I was having a really hard time after converting my project from VS2008 to VS2010. The UpdateProgress, which had been fine in VS2008, suddenly stopped working. After spending a whole afternoon searching for the answer and experimenting with this and that, I finally found what went wrong from Scott Gu's posting.
It was an automatically generated web.config entry: xhtmlConformance mode="Legacy".
After disabling this, it started to work again. It may not be the case for you, but I'm leaving it here for anyone struggling with the same problem.
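For reference, the entry sits under system.web in web.config and looks like this; removing it (or switching the mode away from Legacy) is what got the client script emitted again for me:

<system.web>
    <!-- the auto-generated entry that broke UpdateProgress -->
    <xhtmlConformance mode="Legacy" />
</system.web>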
Happy coding

This may not be your ideal solution, but you could show or hide your animated image using plain JavaScript. Using the following functions (and getting rid of the UpdateProgress control) should do the trick.
Sys.WebForms.PageRequestManager.getInstance().add_beginRequest(beginRequest);
Sys.WebForms.PageRequestManager.getInstance().add_endRequest(endRequest);

function beginRequest(sender, args) {
    document.getElementById('myImageElement').style.display = 'block';
}

function endRequest(sender, args) {
    document.getElementById('myImageElement').style.display = 'none';
}
Keep in mind this will happen for every postback; you may need to use the event arguments to deduce which element triggered the postback and only show the image when the correct UpdatePanel is hit. These events fire at the beginning and end (respectively) of each UpdatePanel postback. Good luck.
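For example, a rough sketch that only shows the image when a specific control started the async postback; 'btnLoadGrid' is a hypothetical server control ID, so substitute the real ClientID:

function beginRequest(sender, args) {
    // get_postBackElement() returns the DOM element that triggered the async postback
    var source = args.get_postBackElement();
    if (source && source.id.indexOf('btnLoadGrid') !== -1) {
        document.getElementById('myImageElement').style.display = 'block';
    }
}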

Related

History issue combining WP7.5, phonegap and jqm

I have a PhoneGap app that uses jQuery Mobile and works fine on Android and iOS.
Porting to WP7, I have an issue with history, specifically history.back() (but also .go(-1), etc.). This refers to going back in history where the previous 'page' was in the same physical HTML file, just a different data-role="page" div.
Using a jQM site in a regular browser is fine (with separate 'pages' in the same HTML file). Also, using history.back() when we go from one HTML file to another in the app is fine. It's the specific combination of WP7.5, jQM and PhoneGap.
Has anyone come across a solution for this? It's driving me crazy, and has been an issue since PhoneGap 1.4.1 and jQM 1.0.
EDIT 1: It's possible that the PhoneGap process of initialising the WebView on WP7.5 somehow overrides the jQM history overrides after they've loaded.
EDIT 2: It's definitely something to do with jQM not being able to modify the history. Each time there is a 'page' change, history.length is still 0.
EDIT 3: When I inspect the history object, I found there is no function for replaceState or pushState. I know jQM uses these for history navigation; maybe that's the problem.
OK - this isn't perfect, but here's a solution (read: hack) that works for me. It only works for page hash changes, not actual URL changes (but you could add a regex check for that). Put this somewhere in the code that runs on deviceready:
if (device.platform == 'WinCE') {
    window.history.back = function () {
        var p = $.mobile.urlHistory.getPrev();
        if (p) {
            $.mobile.changePage("#" + p.pageUrl, { reverse: true });
            $.mobile.urlHistory.stack.splice(-2, 2);
            $.mobile.urlHistory.activeIndex -= 2;
        }
    };
}

WatiN driving the IE "Are you sure you want to leave this page?" popup

I'd like to extend my WatiN automated tests to drive a page that guards against the user accidentally leaving the page without saving changes.
The page uses the "beforeunload" technique to seek confirmation from the user:
$(window).bind('beforeunload', function (event) {
    if (confirmationRequired) {
        return "Sure??";
    }
});
My WatiN test is driving the page using IE. I cannot find a way to get WatiN to attach to the popup dialog so I can control it from my test.
All the following have failed (where the hard-coded strings refer to strings that I can see on the popup):
Browser.AttachTo<IE>(Find.ByTitle("Windows Internet Explorer"));
browser.HtmlDialog(Find.FindByTitle("Windows Internet Explorer"));
browser.HtmlDialog(Find.FindByTitle("Are you sure you want to leave this page?"));
browser.HtmlDialog(Find.FindFirst());
Thanks!
You'll need to create and add the dialog handler.
Example: go to the sample page, click the link, then click "Leave this page" on the confirmation dialog:
IE browser = new IE();
browser.GoTo("http://samples.msdn.microsoft.com/workshop/samples/author/dhtml/refs/onbeforeunload.htm");
WatiN.Core.DialogHandlers.ReturnDialogHandlerIe9 myHandler = new WatiN.Core.DialogHandlers.ReturnDialogHandlerIe9();
browser.AddDialogHandler(myHandler);
browser.Link(Find.ByUrl("http://www.microsoft.com")).ClickNoWait();
myHandler.WaitUntilExists();
myHandler.OKButton.Click();
browser.RemoveDialogHandler(myHandler);
The above works with WatiN 2.1, IE9, Win7. If you're using IE8 or earlier, you will likely need to use the ReturnDialogHandler object instead of the IE9-specific handler.
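In that case only the handler type changes; an untested sketch based on the note above:

WatiN.Core.DialogHandlers.ReturnDialogHandler myHandler = new WatiN.Core.DialogHandlers.ReturnDialogHandler();
browser.AddDialogHandler(myHandler);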

Flash + fancy uploader - have to click twice on Firefox

http://digitarald.de/project/fancyupload/3-0/showcase/attach-a-file/
That's the uploader plugin I'm using.
If you go there in Firefox, you'll notice you have to click "attach a file" twice before it works. It seems to work fine in every other browser that I've tested.
It's creating a Flash object, and I'm not sure how to go about making it so you only have to click once in FF.
I'm not familiar with MooTools, but have you tried something like this? (I attempted to write it in MooTools, but have no idea what I'm doing.)
$('uploadLink').addEvent('click', function(){
    if (Browser.firefox) $('uploadLink').fireEvent('click');
});
Or, I suppose, if it has to wait for the Flash to be created, something like this:
$('uploadLink').addEvent('click', function(){
    if (Browser.firefox) {
        var flashTimer = setTimeout(function(){
            clearTimeout(flashTimer);
            // or however you make sure the flash has successfully been added to the page
            if ($('flashContainer').getElements().length) $('uploadLink').fireEvent('click');
        }, 100);
    }
});
There's always the possibility that FF's security measures won't let you do something like this (mouse interactions with flash can be potentially harmful, as flash has FS access and stuff).
Depending on what your backend is, I'm highly in favor of skipping flash for file uploads when possible. One very well written plugin for such a task is available here:
http://valums.com/ajax-upload/
Good luck!

How to stop the page loading in Firefox programmatically?

I am running several tests with WebDriver and Firefox.
I'm running into a problem with the following command:
WebDriver.get("http://www.google.com");
With this command, WebDriver blocks until the onload event is fired. While this normally takes seconds, it can take hours on websites that never finish loading.
What I'd like to do is stop loading the page after a certain timeout, somehow simulating Firefox's stop button.
I first tried executing the following JS code every time I tried loading a page:
var loadTimeout = setTimeout("window.stop();", 10000);
Unfortunately this doesn't work, probably because:
"Because of the order in which scripts are loaded, the stop() method cannot stop the document in which it is contained from loading."
UPDATE 1: I tried to use a Squid proxy in order to add connect and request timeouts, but the problem persisted.
One weird thing that I found today is that one web site that never stopped loading on my machine (FF 3.6 - 4.0 and Mac OS 10.6.7) loaded normally on other browsers and/or computers.
UPDATE 2: The problem apparently can be solved by telling Firefox not to load images. Hopefully, everything will work after that...
I wish WebDriver had a better Chrome driver so I could use that instead. Firefox is disappointing me every day!
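Regarding UPDATE 2, a rough sketch of how blocking images might look with the same FirefoxProfile API used below; as far as I know, permissions.default.image is the relevant Firefox preference (2 = block all images):

FirefoxProfile profile = new FirefoxProfile();
// 2 blocks all images; 1 allows them (Firefox's permissions.default.image preference)
profile.setPreference("permissions.default.image", 2);
WebDriver driver = new FirefoxDriver(profile);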
UPDATE 3: Selenium 2.9 added a new feature to handle cases where the driver appears to hang. This can be used with FirefoxProfile as follows:
FirefoxProfile firefoxProfile = new ProfilesIni().getProfile("web");
firefoxProfile.setPreference("webdriver.load.strategy", "fast");
I'll post whether this works after I try it.
UPDATE 4: In the end, none of the above methods worked. I ended up "killing" the threads that were taking too long to finish. I am planning to try GhostDriver, which is a remote WebDriver that uses PhantomJS as its back end. PhantomJS is a scriptable headless WebKit, so I expect not to have the problems of a real browser such as Firefox. For people who are not obligated to use Firefox (crawling purposes), I will update with the results.
UPDATE 5: Time for an update. After using GhostDriver 1.1 instead of FirefoxDriver for 5 months, I can say that I am really happy with its performance and stability. There are some cases where we don't get the appropriate behaviour, but it looks like GhostDriver is in general stable enough. So if you need, like me, a browser for crawling/web-scraping purposes, I recommend you use GhostDriver instead of Firefox plus Xvfb, which will give you several headaches...
I was able to get around this by doing a few things.
First, set a timeout for the webdriver. E.g.,
WebDriver wd;
... initialize wd ...
wd.manage().timeouts().pageLoadTimeout(5000, TimeUnit.MILLISECONDS);
Second, wrap your get in a try/catch for TimeoutException. (I added an UnhandledAlertException catch there just for good measure.) E.g.,
for (int i = 0; i < 10; i++) {
    try {
        wd.get(url);
        break;
    } catch (org.openqa.selenium.TimeoutException te) {
        ((JavascriptExecutor) wd).executeScript("window.stop();");
    } catch (UnhandledAlertException uae) {
        Alert alert = wd.switchTo().alert();
        alert.accept();
    }
}
This basically tries to load the page, but if it times out, it forces the page to stop loading via JavaScript, then tries the get again. It might not help in your case, but it definitely helped in mine, particularly when doing WebDriver's getCurrentUrl() command, which can also take too long, hit an alert, and require the page to stop loading before you get the URL.
I've run into the same problem, and there's no general solution it seems. There is, however, a bug about it in their bug tracking system which you could 'star' to vote for it.
http://code.google.com/p/selenium/issues/detail?id=687
One of the comments on that bug has a workaround which may work for you. Basically, it creates a separate thread which waits for the required time and then tries to simulate pressing Escape in the browser, but that requires the browser window to be frontmost, which may be a problem.
http://code.google.com/p/selenium/issues/detail?id=687#c4
My solution is to use the WebDriverBackedSelenium class:
// When creating a new browser:
WebDriver driver = _initBrowser(); // Just returns a Firefox WebDriver
WebDriverBackedSelenium backedSelenium =
    new WebDriverBackedSelenium(driver, "about:blank");

// This code has to be put where a timeout is detected
// I use an ExecutorService and a Future<?> object
void onTimeOut()
{
    backedSelenium.runScript("window.stop();");
}
It was a really tedious issue to solve. However, I am wondering why people are complicating it. I just did the following and the problem got resolved (perhaps support for this was added recently):
driver= webdriver.Firefox()
driver.set_page_load_timeout(5)
driver.get('somewebpage')
It worked for me using the Firefox driver (and the Chrome driver as well).
One weird thing that I found today is that one web site that never stops loading on my machine (FF 3.6 - 4.0 and Mac OS 10.6.7) stops loading normally in Chrome on my machine, and also on other Mac OS and Windows machines belonging to colleagues of mine!
I think the problem is closely related to Firefox bugs. See this blog post for details. Maybe upgrading Firefox to the latest version will solve your problem. Anyway, I'd like to see a Selenium update that simulates the "stop" button...
Basically, I set the browser timeout lower than my Selenium hub's and then catch the error. Then I stop the browser from loading and continue the test.
webdriver.manage().timeouts().pageLoadTimeout(55000);

function handleError(err) {
    console.log(err.stack);
}

return webdriver.get(url).then(null, handleError).then(function () {
    return webdriver.executeScript("return window.stop()");
});
Well, the following concept worked for me on Chrome; try the same:
1) Navigate to "about:blank"
2) Get the "body" element
3) Send the Esc key to that element
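A minimal Java sketch of those three steps, assuming driver is an already-created WebDriver instance:

import org.openqa.selenium.By;
import org.openqa.selenium.Keys;

// 1) navigate to about:blank, 2) find the body, 3) send Esc to it
driver.navigate().to("about:blank");
driver.findElement(By.tagName("body")).sendKeys(Keys.ESCAPE);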
Just in case someone else is stuck with the same forever-loading annoyance, you can use a simple add-on such as Killspinners for Firefox to do the job effortlessly.
Edit: This solution doesn't work if JavaScript is the problem. Then you could go for a Greasemonkey script such as:
// ==UserScript==
// @name          auto kill
// @namespace     default
// @description   auto kill
// @include       *
// @version       1
// @grant         none
// ==/UserScript==
function sleep1() {
    window.stop();
    setTimeout(sleep1, 1500);
}
setTimeout(sleep1, 5000);

HttpCookie causes the page not to load

I'm using VS 2010, VB.NET and ASP.NET 3.5. I have a simple default.aspx page that has:
Dim ctx As HttpContext = HttpContext.Current
Dim cookie As HttpCookie = ctx.Request.Cookies("SessionGUID")
Me.lbl1.Text = cookie.Value.ToString
The page loads fine when running it from within VS, but when I build the site and run the page, it doesn't load. It doesn't give me an error, but nothing shows up.
This is what the view source looks like:
<HTML><HEAD>
<META content="text/html; charset=windows-1252" http-equiv=Content-Type></HEAD>
<BODY></BODY></HTML>
If I take out the Me.lbl1.Text = cookie.Value.ToString line, the page loads fine. All I'm putting on the page is some text and the label control.
Anyone have any ideas?
Well, I didn't figure it out, but I did something different that does work; I'm not sure if it's better or worse. I took out all the plumbing for the session module and instead created the value in Session_Start in Global.asax... maybe that is where it should have been all along. From that point I was able to change the spots where I was using the cookie to use the session instead.
It works as far as I can tell; more testing will tell.
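A minimal sketch of that approach, with illustrative names (the SessionGUID key comes from the original question; everything else is hypothetical):

' In Global.asax:
Sub Session_Start(ByVal sender As Object, ByVal e As EventArgs)
    ' Store the value once per session instead of relying on the request cookie
    Session("SessionGUID") = Guid.NewGuid().ToString()
End Sub

' In default.aspx, replacing the cookie read:
Dim value As Object = HttpContext.Current.Session("SessionGUID")
If value IsNot Nothing Then
    Me.lbl1.Text = value.ToString()
End If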
