How to make Behat wait for Angular ajax calls?

I have a reporting page that is basically a table you can add and remove columns from. When you add a column, the data for that column is fetched and loaded with ajax, using angular.
Consider this Behat scenario:
Given I have a user named "Dillinger Four"
And I am on "/reports"
When I add the "User's Name" column
Then I should see "Dillinger Four"
How can I make Behat wait until angular's ajax call completes? I would like to avoid using a sleep, since sleeps add unnecessary delay and will fail if the call takes too long.
I used the following to wait for jquery code:
$this->getSession()->wait($duration, '(0 === jQuery.active)');
I haven't found a similar value to check with angular.

Your link above was helpful; just to expand on it and save someone else a little time:
/**
 * @Then /^I should see "([^"]*)" if I wait "([^"]*)"$/
 */
public function iShouldSeeIfIWait($text, $time)
{
    $this->spin(function ($context) use ($text) {
        $context->assertPageContainsText($text);
        return true;
    }, intval($time));
}
/**
 * Special function to wait until Angular has rendered the page fully. It keeps trying until
 * either the condition is met or the time runs out.
 *
 * @param callable $lambda An anonymous function returning true once the condition is met
 * @param integer  $wait   How many seconds to keep trying
 */
public function spin($lambda, $wait = 60)
{
    for ($i = 0; $i < $wait; $i++) {
        try {
            if ($lambda($this)) {
                return true;
            }
        } catch (Exception $e) {
            // do nothing, try again on the next iteration
        }
        sleep(1);
    }

    $backtrace = debug_backtrace();
    throw new Exception(
        "Timeout thrown by " . $backtrace[1]['class'] . "::" . $backtrace[1]['function'] . "()\n" .
        $backtrace[1]['file'] . ", line " . $backtrace[1]['line']
    );
}
Then in your Scenario use:
Then I should see "Something on the page." if I wait "5"
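Note that spin() only polls once per second, which is exactly the kind of coarse sleep the original question wants to avoid. If you need finer granularity, a variant along these lines polls more often (a sketch, not part of the original answer; the name spinQuick and the 250ms default are arbitrary):
public function spinQuick($lambda, $wait = 60, $intervalMs = 250)
{
    // Sketch: same idea as spin(), but polls every $intervalMs milliseconds
    // instead of sleeping a full second between attempts.
    $deadline = microtime(true) + $wait;
    while (microtime(true) < $deadline) {
        try {
            if ($lambda($this)) {
                return true;
            }
        } catch (Exception $e) {
            // ignore and retry until the deadline
        }
        usleep($intervalMs * 1000);
    }
    throw new Exception("Timeout after {$wait} seconds waiting for condition");
}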

You can use code from Angular's Protractor library to wait for loading. Protractor has a function waitForAngular(), which simply waits for a client-side function of the same name to report that Angular has finished its pending work.
Here's working PHP code.
class WebContext implements Context
{
    /**
     * @Then the list of products should be:
     */
    public function theListOfProductsShouldBe(TableNode $table)
    {
        $this->waitForAngular();
        // ...
    }

    private function waitForAngular()
    {
        // Wait for angular to load
        $this->getSession()->wait(1000, "typeof angular != 'undefined'");

        // Wait for angular to be testable
        $this->getSession()->evaluateScript(
            'angular.getTestability(document.body).whenStable(function() {
                window.__testable = true;
            })'
        );
        $this->getSession()->wait(1000, 'window.__testable == true');
    }
}
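To tie this back to a scenario like the one in the question, a step definition in the same context class can wait for Angular before asserting. This is only a sketch: it assumes the context also has Mink's assertPageContainsText() available (for example by extending MinkContext).
/**
 * @Then /^I should see "([^"]*)" once Angular settles$/
 */
public function iShouldSeeOnceAngularSettles($text)
{
    $this->waitForAngular();              // wait for Angular's pending work first
    $this->assertPageContainsText($text); // assumes MinkContext provides this assertion
}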

Related

How to download a (large) file with a typo3 extbase controller action

I have a controller with a download action in TYPO3. For some time I have implemented it like this, and it works:
function downloadAction() {
    // ...
    // send headers ...
    // ...
    if ($fh = fopen($this->file, 'r')) {
        while (!feof($fh)) {
            echo fread($fh, $chunkSize); // send the file to the output buffer in small chunks
            flush();
        }
        fclose($fh);
    }
    exit; // stop middlewares and so on
}
I am wondering whether I should/could return an object of type ResponseInterface in TYPO3 11. It is obvious that exit stops the middleware pipeline (and other things), and I don't really know whether there are any side effects.
I tried the following to return a ResponseInterface:
function downloadAction(): ResponseInterface {
    // ...
    return $this->responseFactory->createResponse()
        ->withAddedHeader(...)
        // ...
        ->withBody($this->streamFactory->createStreamFromFile($this->file))
        ->withStatus(200, 'OK');
}
The problem is that the ResponseInterface solution only works with small files. The cause seems to be in Bootstrap::handleFrontendRequest():
protected function handleFrontendRequest(ServerRequestInterface $request): string
{
    // ...
    if (headers_sent() === false) {
        // send headers
    }
    $body = $response->getBody(); // get the stream
    $body->rewind();
    $content = $body->getContents(); // Problem: reads the whole stream into RAM instead of
                                     // sending it to the output buffer in chunks
    // ...
    return $content;
}
TYPO3 tries to read the whole stream/file into RAM, which crashes the application.
So how should I trigger a file download these days with TYPO3?

Laravel 8 - Conditionally remember a value in cache [duplicate]

I'm developing one of my first applications with the Laravel 4 framework (which, by the way, is a joy to design with). For one component, there is an AJAX request to query an external server. The issue is, I want to cache these responses for a certain period of time only if they are successful.
Laravel has the Cache::remember() function, but there seems to be no "failed" mode (at least, none described in the documentation) in which the value would not be cached.
For example, take this simplified function:
try {
    $server->query();
} catch (Exception $e) {
    return Response::json('error', 400);
}
I would like to use Cache::remember on the output of this, but only if no Exception was thrown. I can think of some less-than-elegant ways to do this, but I would think that Laravel, being such an... eloquent... framework, would have a better way. Any help? Thanks!
This is what worked for me:
if (Cache::has($key)) {
$data = Cache::get($key);
} else {
try {
$data = longQueryOrProcess($key);
Cache::forever($key, $data); // only stored when no error
} catch (Exception $e) {
// deal with error, nothing cached
}
}
And of course you could use Cache::put($key, $data, $minutes); instead of Cache::forever().
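If you want something with a signature closer to Cache::remember(), a small helper along these lines only caches on success (a sketch; rememberOnSuccess is a made-up name, not a Laravel method):
function rememberOnSuccess($key, $minutes, Closure $callback)
{
    if (Cache::has($key)) {
        return Cache::get($key);
    }

    // If the callback throws, nothing is cached and the exception bubbles up
    // to the caller, which can then return its error response.
    $data = $callback();

    Cache::put($key, $data, $minutes);

    return $data;
}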
I found this question because I was looking for an answer on this topic.
In the meantime I found a solution and would like to share it here (also check out example 2 further on in the code):
<?php

/**
 * Caching the query - Example 1
 */
function cacheQuery_v1($server)
{
    // Set the time in minutes for the cache
    $minutes = 10;

    // Check if the query is cached
    if (Cache::has('db_query')) {
        return Cache::get('db_query');
    }

    // Else run the query and cache the data, or return the
    // error response if an exception was caught
    try {
        $query = $server->query(...);
    } catch (Exception $e) {
        return Response::json('error', 400);
    }

    // Cache the query output
    Cache::put('db_query', $query, $minutes);

    return $query;
}

/**
 * Caching the query - Example 2
 */
function cacheQuery_v2($server)
{
    // Set the time in minutes for the cache
    $minutes = 10;

    // Try to get the cached data, else run the query and cache the output.
    // If the query throws, nothing is cached and the error response is returned.
    try {
        $query = Cache::remember('db_query', $minutes, function () use ($server) {
            return $server->query(...);
        });
    } catch (Exception $e) {
        return Response::json('error', 400);
    }

    // Check if the query returned any output
    if (empty($query)) {
        return Response::json('error', 400);
    }

    return $query;
}
I recommend using Laravel's Eloquent ORM and/or the Query Builder to work with the database.
Happy coding!
We're working around this by storing the last good value in Cache::forever(). If there's an error in the cache update callback, we just pull the last value out of the forever key. If it's successful, we update the forever key.
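A minimal sketch of that pattern (fetchFromServer is a placeholder for the external query; the ".last_good" key suffix is arbitrary):
function rememberWithFallback($key, $minutes)
{
    return Cache::remember($key, $minutes, function () use ($key) {
        try {
            $data = fetchFromServer();                  // placeholder for the real query
            Cache::forever($key . '.last_good', $data); // update the last known good value
            return $data;
        } catch (Exception $e) {
            // The update failed: fall back to the last good value
            // (may be null if nothing has ever succeeded).
            return Cache::get($key . '.last_good');
        }
    });
}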

How to 'unset' session save handler?

For some reason I have to initialize the session with the default save handler.
Earlier code explicitly sets a custom handler with session_set_save_handler().
Changing that code is not a realistic option in my situation, so does anyone know how to restore the handler to the default? E.g. is there a session_restore_save_handler or session_unset_save_handler function, or an equivalent?
As of PHP 5.4 you can revert to the default session handler by instantiating the SessionHandler class directly:
session_set_save_handler(new SessionHandler(), true);
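A short usage sketch, assuming PHP 5.4+ and that the custom handler was registered earlier in the request (handlers cannot be swapped after the session has been started):
// Close any session the old custom handler may have opened, then revert.
if (session_status() === PHP_SESSION_ACTIVE) {
    session_write_close();
}
session_set_save_handler(new SessionHandler(), true); // back to the built-in handler
session_start();                                       // this session now uses the default handler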
Since no one else has answered, I have to answer my own question here:
First, PHP provides no session_restore_save_handler or session_unset_save_handler and (so far) no native way to get things back as they were before. For some reason, the PHP team didn't give us the option to juggle session handlers this way.
Second, the native session mechanism can be emulated with the following code:
class FileSessionHandler
{
    private $savePath;

    function open($savePath, $sessionName)
    {
        $this->savePath = $savePath;
        if (!is_dir($this->savePath)) {
            mkdir($this->savePath, 0777);
        }
        return true;
    }

    function close()
    {
        return true;
    }

    function read($id)
    {
        return (string)@file_get_contents("$this->savePath/sess_$id");
    }

    function write($id, $data)
    {
        return file_put_contents("$this->savePath/sess_$id", $data) === false ? false : true;
    }

    function destroy($id)
    {
        $file = "$this->savePath/sess_$id";
        if (file_exists($file)) {
            unlink($file);
        }
        return true;
    }

    function gc($maxlifetime)
    {
        foreach (glob("$this->savePath/sess_*") as $file) {
            if (filemtime($file) + $maxlifetime < time() && file_exists($file)) {
                unlink($file);
            }
        }
        return true;
    }
}

$handler = new FileSessionHandler();
session_set_save_handler(
    array($handler, 'open'),
    array($handler, 'close'),
    array($handler, 'read'),
    array($handler, 'write'),
    array($handler, 'destroy'),
    array($handler, 'gc')
);

register_shutdown_function('session_write_close');
This logic is the closest to PHP's native session handling, although its behavior may of course differ in some circumstances. All I can conclude right now is that the basic session operations are fully covered by it.

Why is AngularFire so much slower than plain Firebase API

In testing out Firebase with AngularFire, I was surprised at how slow it is. After further testing, I discovered that it isn't Firebase that is slow, but AngularFire that is slow (incredibly slow in Firefox v26.0).
My use case is where I need to access a number of children for a given parent. The total number of children will potentially be in the thousands, so fetching them all at once is not an option. In addition, they will need to be accessed from grandparents, so querying by priority is not always an option.
Is there something I'm doing wrong in this sample with AngularFire (slow):
http://plnkr.co/edit/eML3HF3RtchIU26EGVaw?p=preview
Gist of accessing children with AngularFire:
function getChild(childID) {
    recordCount++;
    myC.children[childID] = $firebase(new Firebase(childrenUrl + childID));
    myC.children[childID].$on('loaded', function () {
        returnCount++;
        checkReturnCount();
    });
}

function checkReturnCount() {
    if (recordCount != 0 && recordCount == returnCount) {
        var diff = (new Date).getTime() - start;
        myC.log.push("Loaded " + parent.FirstName + "'s children in " + diff + "ms.");
        $scope.$apply();
    }
}
For comparison, see this sample which isn't using any Angular plugin (fast):
http://plnkr.co/edit/GA17FEnHu7p8wAiDXA5b?p=preview
Gist of accessing children without AngularFire
function getChild(childID) {
    recordCount++;
    var tempRef = new Firebase(childrenUrl + childID);
    tempRef.on('value', function (data) {
        myC.children[childID] = data.val();
        returnCount++;
        checkReturnCount();
    });
}

function checkReturnCount() {
    if (recordCount != 0 && recordCount == returnCount) {
        var diff = (new Date).getTime() - start;
        myC.log.push("Loaded " + parent.FirstName + "'s children in " + diff + "ms.");
        $scope.$apply();
    }
}
OK, I may have found a solution. Apparently Firefox used to add random times to its setTimeouts, but it no longer does (see https://developer.mozilla.org/en-US/docs/Web/API/Window.setTimeout). However, Firefox (like other browsers) apparently still has a minimum timeout delay (which in FF is apparently 4ms).
This page proposes a solution: http://dbaron.org/log/20100309-faster-timeouts
Here is the setZeroTimeout method from that blog post:
// Only add setZeroTimeout to the window object, and hide everything
// else in a closure.
(function() {
    var timeouts = [];
    var messageName = "zero-timeout-message";

    // Like setTimeout, but only takes a function argument. There's
    // no time argument (always zero) and no arguments (you have to
    // use a closure).
    function setZeroTimeout(fn) {
        timeouts.push(fn);
        window.postMessage(messageName, "*");
    }

    function handleMessage(event) {
        if (event.source == window && event.data == messageName) {
            event.stopPropagation();
            if (timeouts.length > 0) {
                var fn = timeouts.shift();
                fn();
            }
        }
    }

    window.addEventListener("message", handleMessage, true);

    // Add the one thing we want added to the window object.
    window.setZeroTimeout = setZeroTimeout;
})();
When I use this setZeroTimeout method, using AngularFire doesn't seem to be noticeably slower than using the base API.
For comparison, I've created a new Plnkr using it instead of the $timeout service.
AngularFire with setZeroTimeout - http://plnkr.co/edit/nywEJpLcPwEJjXzipS4n?p=preview
AngularFire - http://plnkr.co/edit/nywEJpLcPwEJjXzipS4n?p=preview
Base Firebase API - http://plnkr.co/edit/GA17FEnHu7p8wAiDXA5b?p=preview
Could this be included in AngularFire? Or should I just modify my version for now?
OK, I think I've found a further improvement on the solution above, one that also triggers the Angular digest cycle as needed:
I overrode the _timeout function in the AngularFire function as follows:
this._timeout = function (fn) {
    fn();
    throttledApply();
};
throttledApply is defined in the $firebase factory as:
var throttledApply = _.throttle(apply, 100);

function apply() {
    $rootScope.$apply();
}
and is then passed to the AngularFire function instead of the $timeout service. It is making use of underscore's throttle function to call $apply immediately, and then at most once every 100ms thereafter. For my purposes, this is sufficient. It could easily be reduced to something more like 50ms, or 25ms though.
Are there any repercussions of these modifications that I'm not seeing?

requesting two Ajax

I'm trying to make two Ajax calls to get data to populate different bits of a web page, and as you'll already know, only the second happens.
So I thought I'd do this:
callAjax1('a');
callAjax2('b');

function callAjax1(data) {
    ajax(data);
}

function callAjax2(data) {
    ajax(data);
}

function ajax(data) {
    // calls XMLHttpRequestObject etc
}
The idea was that instead of calling ajax() twice, I'd now have two independent instances of ajax that would run independently.
It works... but only if I put an alert at the top of ajax() to let me know I've arrived.
So I'm thinking the alert gives the first request time to finish before the second is called; therefore I haven't managed to separate them properly into separate instances. Is that not possible?
What am I missing?
All the best
J
UPDATE:
I'm thinking of this; do I stand a chance?
tParams = new Array (2); // we intend to call ajax twice
tParams[0] = new Array('ajaxGetDataController.php', 'PROJECT', 'id');
tParams[1] = new Array('ajaxGetFileController.php', 'FILE', 'projectId');
<select name='projectSelector' onchange="saveData(tParams, this.value);">
// gets called, twice
function saveData(pParams, pData) // pParams are: PageToRun, Table, Field
{
    if (XMLHttpRequestObject)
    {
        tPage = pParams[0][0] + '?table=' + pParams[0][1] + '&pField=' + pParams[0][2] + '&pData=' + pData;
        XMLHttpRequestObject.open('GET', tPage);
        XMLHttpRequestObject.onreadystatechange = function () { callAjax(pParams, pData); }; // wrap in a function so it runs on each state change
        XMLHttpRequestObject.send(null);
    }
}
function callAjax(pParams, pData)
{
    if (XMLHttpRequestObject.readyState == 4 && XMLHttpRequestObject.status == 200)
    {
        var tReceived = XMLHttpRequestObject.responseXML;
        options = tReceived.getElementsByTagName('option'); // fields and their values stored in simplest XML as options
        popForm(options, pParams[0][1]); // goes off to use the DOM to populate the onscreen form
        pParams.shift(); // cuts off pParams[0] and moves all elements up one
        if (pParams.length > 0)
        {
            saveData(pParams, pData);
        }
    }
}
I would create a ready-state flag for the AJAX function:
function ajax(data) {
    readyState = false; // mark as busy while the request is in flight
    // calls XMLHttpRequestObject etc.; in the request's completion callback, set readyState = true again
}
And then check the flag before executing the second call:
function callAjax2(data) {
    if (readyState == true) {
        ajax(data);
    }
}
And make sure to set readyState back to true once the AJAX call has finished executing. This ensures the first AJAX call has finished before the second one tries to fire.
