Protractor - Firefox not reloading pages on browser.get(url)

I'm writing some Protractor tests, but I'm running into trouble. When running the tests on Chrome, my tests all pass. When I run the tests on Firefox though, I see that the page is not re-loading between tests. (This messes up the state of the page and causes tests to fail.)
A workaround I'm using right now is to refresh after I 'get' the page, but this is dumb.
Here's my beforeEach:
beforeEach(function() {
  ptor = protractor.getInstance();
  ptor.ignoreSynchronization = true;
  browser.get('#/page');
  // I need this to get my Firefox tests to work
  browser.navigate().refresh();
});
What can I do to handle this better?
If I have to refresh the page each time, is there a clause like if(firefox) I can use?
I'm running Chrome 34.0.1847 (Mac OS X 10.9.1) successfully, and Firefox 28.0.0 (Mac OS X 10.9) is giving me trouble.

With synchronization off, all Protractor does on a browser.get(url) is call driver.get(destination), so it's entirely up to the browser to decide how to load the URL.
With synchronization on, Protractor first loads a blank page before navigating to the intended URL.
In addition, in Firefox, reloading the same Angular URL on localhost does not reload the page, whereas it does in Chrome. Try it: go to an Angular app on localhost, enter something, and load the URL again. Chrome will clear what you entered, but Firefox will not.
Turn synchronization on, as @AndresD said, if your app is Angular.
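To answer the if(firefox) part: you can branch on the browser name reported by the session's capabilities. A minimal sketch, assuming a standard Protractor/WebDriverJS setup (getCapabilities() is the WebDriverJS API; adjust to your own page and locators):
beforeEach(function() {
  ptor = protractor.getInstance();
  ptor.ignoreSynchronization = true;
  browser.get('#/page');

  // Refresh only when the session reports Firefox.
  browser.getCapabilities().then(function(caps) {
    if (caps.get('browserName') === 'firefox') {
      browser.navigate().refresh();
    }
  });
});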

Related

Abp Template Problem for MVC Core - JQuery 4.6.0

Using the latest stable MVC Core + jQuery template (4.6.0).
Downloaded it and ran the DB migration; all good.
Logged in using admin/123qwe and put a breakpoint on AccountController line 104:
var loginResult = await GetLoginResultAsync(loginModel.UsernameOrEmailAddress, loginModel.Password, GetTenancyNameOrNull());
I'm seeing a success result returned from the line above.
Problem: the page stays on the login page (as though it were not authenticated) and I cannot navigate away to another page (e.g. /About).
No changes were made to the template code. What am I missing, or should I report a bug in the repo?
This turned out to be, possibly, a local browser issue with the Chrome install.
To fix it, the following steps were taken:
1) reinstall Chrome
2) clear all browser cookies, cache, and history
3) restart Chrome, then close Chrome
4) re-run the ABP app using Chrome

How does React deal with pre-compiled HTML from PhantomJS?

I compiled my React app using webpack and got a bundle file, bundles.js. My bundles.js contains a component that makes API calls to get its data.
I include this file in my HTML and pass the URL to PhantomJS to pre-render static HTML for SEO reasons.
I am seeing something strange here: the AJAX calls to the APIs are not being fired at all.
For example, I have a component called Home which is rendered when I request the URL /home. My Home component makes an AJAX request to the backend (django-rest) to get some data. When I load the home page in PhantomJS, this API call is not fired.
Am I missing something here?
I have been rendering React-based apps in PhantomJS since 2014. Make sure you use the latest PhantomJS version, v2.x. Problems with PhantomJS occur because it uses an older WebKit engine, so if you use some CSS3 features, make sure they are prefixed correctly (for example, the flexbox layout).
On the JS side, PhantomJS does not support many newer APIs (fetch, for example); to fix this, add the polyfills and you're fine. The most complicated thing is tracking down errors: use console.log and evaluate code inside PhantomJS. There is also a debugging mode, which is actually quite difficult to use, but it can help you track down complex errors. I used the WebKit-based browser Aurora to track down some of the issues.
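As a rough illustration of the polyfill point, you can inject one before the page's own scripts run. A sketch (the polyfill file path is just an example of a bundle you might have locally):
var page = require('webpage').create();

// Inject a fetch polyfill into each new document before the app's scripts run.
// The path below is illustrative; point it at whatever polyfill bundle you use.
page.onInitialized = function() {
  page.injectJs('polyfills/fetch.js');
};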
For debugging the network traffic, try logging the requested and received events:
var page = require('webpage').create();
page.onResourceRequested = function(request) {
  console.log('Request ' + JSON.stringify(request, undefined, 4));
};
page.onResourceReceived = function(response) {
  console.log('Receive ' + JSON.stringify(response, undefined, 4));
};
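To surface errors from inside the page as well, the same page object can log console output and script errors (a sketch using the standard onConsoleMessage and onError callbacks):
page.onConsoleMessage = function(msg) {
  console.log('Page console: ' + msg);
};
page.onError = function(msg, trace) {
  console.log('Page error: ' + msg);
  (trace || []).forEach(function(t) {
    console.log('  at ' + t.file + ':' + t.line);
  });
};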

Best way to control firefox via webdriver

I need to control the Firefox browser via webdriver. Note, I'm not trying to control page elements (i.e. find element, click, get text, etc.); rather, I need access to Firefox's profiler and to force GC (i.e. I need Firefox's Chrome Authority and SDK). For context, I'm creating a micro-benchmark framework, not running a normal webdriver test.
Obviously raw webdriver won't work, so what I've been trying to do is:
1) Create a Firefox extension/add-on that does what I need, i.e.:
var customActions = function() {
  console.log('calling customActions.');
  // I need to access chrome authority:
  var {Cc, Ci, Cu} = require("chrome");
  Cc["@mozilla.org/tools/profiler;1"].getService(Ci.nsIProfiler);
  Cu.forceGC();
  var file = require('sdk/io/file');
  // And do some writes:
  var textWriter = file.open('a/local/path.txt', 'w');
  textWriter.write('sample data');
  textWriter.close();
  console.log('called customActions.');
};
2) Expose my customActions function to a page:
var mod = require("sdk/page-mod");
var data = require("sdk/self").data;
mod.PageMod({
  include: ['*'],
  contentScriptFile: data.url("myscript.js"),
  onAttach: function(worker) {
    worker.port.on('callCustomActions', function() {
      customActions();
    });
  }
});
and in myscript.js:
exportFunction(function() {
  self.port.emit('callCustomActions');
}, unsafeWindow, {defineAs: "callCustomActions"});
3) Load the .xpi during my webdriver test and call out to the global function callCustomActions (see the sketch below).
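For reference, the webdriver side of step 3 might look roughly like this. A sketch using the selenium-webdriver Node bindings; the .xpi path and page URL are placeholders, and callCustomActions is the function exported by the page-mod above:
var webdriver = require('selenium-webdriver');
var firefox = require('selenium-webdriver/firefox');

// Load the packaged add-on into a fresh profile (path is illustrative).
var profile = new firefox.Profile();
profile.addExtension('/path/to/my-addon.xpi');

var driver = new webdriver.Builder()
  .forBrowser('firefox')
  .setFirefoxOptions(new firefox.Options().setProfile(profile))
  .build();

driver.get('http://localhost:8000/benchmark.html');
// Call the function the page-mod exported onto the page's window.
driver.executeScript('callCustomActions();');
driver.quit();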
So two questions about this process.
1) This entire process is very roundabout. Is there a better practice for talking to a Firefox extension via webdriver?
2) My current solution isn't working well. If I run my extension via cfx run directly (without webdriver), it works as expected. However, neither the SDK nor the Chrome Authority calls do anything when running via webdriver.
By the way, I know my function is being called, because the log lines "calling customActions." and "called customActions." both print.
Maybe there are some Firefox preferences that I need to set but haven't?
It may be that you do not need the add-on at all. Mozilla uses Marionette for test automation of Firefox OS amongst other things:
Marionette is an automation driver for Mozilla's Gecko engine. It can remotely control either the UI or the internal JavaScript of a Gecko platform, such as Firefox or Firefox OS. It can control both the chrome (i.e. menus and functions) or the content (the webpage loaded inside the browsing context), giving a high level of control and ability to replicate user actions. In addition to performing actions on the browser, Marionette can also read the properties and attributes of the DOM.
If this sounds similar to Selenium/WebDriver then you're correct! Marionette shares much of the same ethos and API as Selenium/WebDriver, with additional commands to interact with Gecko's chrome interface. Its goal is to replicate what Selenium does for web content: to enable the tester to have the ability to send commands to remotely control a user agent.

How do we effectively click on a link using Selenium + php_webdriver + Firefox?

I was having lots of glitches with my tests using Codeception + WebDriver + Selenium + Firefox, so I decided to remove Codeception from the equation and run some tests directly with php_webdriver, only to discover the problem wasn't in Codeception.
I tried to go to a page using 3 different methods:
// Click a link
$driver->findElement(WebDriverBy::id('inbox-navigation'))->click();
// Directly go to the address via URL
$driver->get('http://testing.mysite.net/#/messages');
// Click via javascript
$driver->executeScript("jQuery('#inbox-navigation').click();");
Of course, none of them works, yet doing it manually in my browser works fine. The link appears as clicked and the URL changes, but the related page (loaded via AJAX) never gets loaded.
Here's the full source code I'm using to test this:
<?php
require __DIR__."/vendor/autoload.php";
$host = 'http://localhost:4444/wd/hub';
$driver = RemoteWebDriver::create($host, DesiredCapabilities::firefox());
$driver->manage()->window()->setSize(new \WebDriverDimension(1280, 720));
$driver->get('http://testing.mysite.net/');
$driver->findElement(WebDriverBy::id('email'))->sendKeys('john@doe.com');
$driver->findElement(WebDriverBy::id('password'))->sendKeys('secret');
$driver->findElement(WebDriverBy::id('sign-in-button'))->click();
$driver->wait(10)->until(WebDriverExpectedCondition::presenceOfElementLocated(WebDriverBy::id('inbox-navigation')));
$driver->findElement(WebDriverBy::id('inbox-navigation'))->click();
sleep(15); /// to make sure it wasn't too fast
$driver->takeScreenshot('screen.png');
var_dump($driver->getCurrentURL());
And this is the link changed to Inbox:
Manually executing jQuery('#inbox-navigation').click(); in the console also works fine.

How to stop a page from loading in Firefox programmatically?

I am running several tests with WebDriver and Firefox.
I'm running into a problem with the following command:
driver.get("http://www.google.com");
With this command, WebDriver blocks until the onload event is fired. While this normally takes seconds, it can take hours on websites that never finish loading.
What I'd like to do is stop loading the page after a certain timeout, somehow simulating Firefox's stop button.
I first tried executing the following JS code every time I loaded a page:
var loadTimeout = setTimeout("window.stop();", 10000);
Unfortunately this doesn't work, probably because:
Because of the order in which scripts are loaded, the stop() method cannot stop the document in which it is contained from loading.
UPDATE 1: I tried to use SquidProxy in order to add connect and request timeouts, but the problem persisted.
One weird thing that I found today is that one web site that never stopped loading on my machine (FF 3.6 - 4.0 and Mac OS 10.6.7) loaded normally on other browsers and/or computers.
UPDATE 2: The problem can apparently be solved by telling Firefox not to load images. Hopefully, everything will work after that...
I wish WebDriver had a better Chrome driver so I could use it. Firefox is disappointing me every day!
UPDATE 3: Selenium 2.9 added a new feature to handle cases where the driver appears to hang. This can be used with FirefoxProfile as follows:
FirefoxProfile firefoxProfile = new ProfilesIni().getProfile("web");
firefoxProfile.setPreference("webdriver.load.strategy", "fast");
I'll post whether this works after I try it.
UPDATE 4: In the end, none of the above methods worked. I ended up "killing" the threads that were taking too long to finish. I am planning to try GhostDriver, which is a remote WebDriver that uses PhantomJS as its back-end. PhantomJS is a headless, scriptable WebKit, so I expect not to have the problems of a real browser such as Firefox. For people who are not obliged to use Firefox (crawling purposes), I will update with the results.
UPDATE 5: Time for an update. After using GhostDriver 1.1 instead of FirefoxDriver for 5 months, I can say that I am really happy with its performance and stability. I had some cases where the behaviour was not quite right, but in general GhostDriver seems stable enough. So if, like me, you need a browser for crawling/web scraping purposes, I recommend you use GhostDriver instead of Firefox plus Xvfb, which will give you several headaches...
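For the record, switching a test over to GhostDriver can be as small as changing the requested browser. A sketch with the selenium-webdriver Node bindings (assumes PhantomJS is installed and on the PATH; your own tests may use the Java or Python bindings instead):
var webdriver = require('selenium-webdriver');

// Ask the Selenium builder for a PhantomJS-backed (GhostDriver) session
// instead of Firefox; everything else in the test stays the same.
var driver = new webdriver.Builder()
  .forBrowser('phantomjs')
  .build();

driver.get('http://www.example.com');
driver.getTitle().then(function(title) {
  console.log('Loaded: ' + title);
});
driver.quit();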
I was able to get around this by doing a few things.
First, set a timeout for the webdriver. E.g.,
WebDriver wd;
... initialize wd ...
wd.manage().timeouts().pageLoadTimeout(5000, TimeUnit.MILLISECONDS);
Second, wrap your get in a try/catch for TimeoutException. (I added an UnhandledAlertException catch there just for good measure.) E.g.,
for (int i = 0; i < 10; i++) {
    try {
        wd.get(url);
        break;
    } catch (org.openqa.selenium.TimeoutException te) {
        ((JavascriptExecutor) wd).executeScript("window.stop();");
    } catch (UnhandledAlertException uae) {
        Alert alert = wd.switchTo().alert();
        alert.accept();
    }
}
This basically tries to load the page, but if it times out, it forces the page to stop loading via JavaScript and then tries to get the page again. It might not help in your case, but it definitely helped in mine, particularly when doing a webdriver getCurrentUrl() command, which can also take too long, hit an alert, and require the page to stop loading before you get the URL.
I've run into the same problem, and there's no general solution it seems. There is, however, a bug about it in their bug tracking system which you could 'star' to vote for it.
http://code.google.com/p/selenium/issues/detail?id=687
One of the comments on that bug has a workaround which may work for you - Basically, it creates a separate thread which waits for the required time, and then tries to simulate pressing escape in the browser, but that requires the browser window to be frontmost, which may be a problem.
http://code.google.com/p/selenium/issues/detail?id=687#c4
My solution is to use this class:
WebDriverBackedSelenium;
//When creating a new browser:
WebDriver driver = _initBrowser(); //Just returns firefox WebDriver
WebDriverBackedSelenium backedSelenium =
    new WebDriverBackedSelenium(driver, "about:blank");

//This code has to be put where a TimeOut is detected
//I use ExecutorService and Future<?> Object
void onTimeOut()
{
    backedSelenium.runScript("window.stop();");
}
It was a really tedious issue to solve. However, I am wondering why people are over-complicating it. I just did the following and the problem was resolved (perhaps this got supported recently):
driver= webdriver.Firefox()
driver.set_page_load_timeout(5)
driver.get('somewebpage')
It worked for me using Firefox driver (and Chrome driver as well).
One weird thing that I found today is that one web site that never stops loading on my machine (FF 3.6 - 4.0 and Mac OS 10.6.7) stops loading normally in Chrome on my machine, and also on other Mac OS and Windows machines belonging to colleagues of mine!
I think the problem is closely related to Firefox bugs. See this blog post for details. Maybe upgrading Firefox to the latest version will solve your problem. Anyway, I would like to see a Selenium update that simulates the "stop" button...
Basically, I set the browser's page-load timeout lower than my Selenium hub's, then catch the error, stop the browser from loading, and continue the test.
webdriver.manage().timeouts().pageLoadTimeout(55000);

function handleError(err) {
  console.log(err.stack);
}

return webdriver.get(url).then(null, handleError).then(function () {
  return webdriver.executeScript("return window.stop()");
});
Well, the following concept worked for me on Chrome; try the same (see the sketch below):
1) Navigate to "about:blank"
2) Get the element "body"
3) On that element, just send the Escape key
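A rough translation of those steps into the selenium-webdriver Node bindings (a sketch only; the original answer is for Chrome, so treat it as an experiment on Firefox):
var webdriver = require('selenium-webdriver');
var By = webdriver.By;
var Key = webdriver.Key;

var driver = new webdriver.Builder().forBrowser('firefox').build();

// 1) Navigate to about:blank, 2) grab the body element,
// 3) send the Escape key to it to stop any pending loading.
driver.get('about:blank');
driver.findElement(By.tagName('body')).sendKeys(Key.ESCAPE);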
Just in case someone else is stuck with the same forever-loading annoyance, you can use simple add-ons such as Killspinners for Firefox to do the job effortlessly.
Edit: this solution doesn't work if JavaScript is the problem. Then you could go for a Greasemonkey script such as:
// ==UserScript==
// @name         auto kill
// @namespace    default
// @description  auto kill
// @include      *
// @version      1
// @grant        none
// ==/UserScript==
function sleep1() {
  window.stop();
  setTimeout(sleep1, 1500);
}
setTimeout(sleep1, 5000);
