How to include external library in testing library/storybook - react-redux

I'm using Storybook to create stories. I want to write tests for the stories using @storybook/testing-react. However, I'm also using materialize-react, which requires materialize.min.js to be loaded.
I am loading it in .storybook/preview-head.html, and that works for Storybook.
But when I run a test (created as described at https://storybook.js.org/addons/@storybook/testing-react), I get Error: Uncaught [ReferenceError: M is not defined], which I'm pretty sure means that materialize.min.js is not loaded.
How do I load an external JS file in testing library?

I can achieve the above by appending the following to package.json:
"scripts": {
"test": "react-scripts test --setupFiles ./public/materialize.min.js",
},
I'm not sure if this is the "best" way.
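If the tests were run with plain Jest rather than react-scripts, the same thing could go into the Jest config instead of the CLI flag. A minimal sketch (react-scripts only allows a limited set of Jest options to be overridden in package.json, which is presumably why the flag is used above):

// jest.config.js - a sketch for a plain Jest setup, not react-scripts
module.exports = {
  // Load materialize.min.js before the test framework so the global M exists.
  setupFiles: ['<rootDir>/public/materialize.min.js'],
};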

Related

How to handle .gql file imports in Jest tests

I'm trying to test a component that imports a .gql file. When I try to build the component in a Jest file, I receive this error:
({"Object.<anonymous>":function(module,exports,require,__dirname,__filename,global,jest){
query getUser {
^^^^^^^

<script>
import GET_USER from 'PATH';
^
Does anyone have any idea of how to ignore the import? Because I don't need to test the GraphQL call.
GraphQL documents (which typically have a .gql extension) can be imported directly if you use webpack and utilize the loader that comes with graphql-tag. Jest does not work with webpack out of the box and needs to be configured to handle any imports of asset files like stylesheets, images, etc. This process is outlined in the docs.
According to the graphql-tag documentation:
Testing environments that don't support Webpack require additional configuration. For Jest use jest-transform-graphql.
So you can utilize jest-transform-graphql along with the babel-jest plugin, which you're probably already using:
"jest": {
"transform": {
"\\.(gql|graphql)$": "jest-transform-graphql",
".*": "babel-jest"
}
}
Mocking the file is technically possible by adding a moduleNameMapper entry as shown in the docs; however, doing so is likely to break your components.
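For completeness, the moduleNameMapper approach would look roughly like this (a sketch only; the mock path and file name are hypothetical, and as noted above this can break components that expect a real parsed query):

"jest": {
  "moduleNameMapper": {
    "\\.(gql|graphql)$": "<rootDir>/__mocks__/graphqlMock.js"
  }
}

where __mocks__/graphqlMock.js simply exports an empty object:

module.exports = {};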

"window is not defined" with nuxtjs. Trying to build pages with generate

I get the error message "window is not defined" if I use window in my default.vue template inside the layouts folder. Building static pages with "npm run generate" produces the error. Mode is set to "universal".
...
created() {
  window.addEventListener('scroll', e => {
    console.log('scrollleeed')
  })
}
...
What am I doing wrong? I thought generate would create static pages and everything would be executed on the client side.
Solution
I have to use the mounted hook, not created, for anything window- or document-related.
generate does render the pages on the server, but only once, at build time. To create static pages they still have to be rendered on the server, hence the error.
To avoid it, use window inside mounted, or wrap the code with if (process.client) {}.
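A minimal sketch of both options, using the scroll handler from the question:

// Option 1: move the listener into mounted(), which only runs on the client
mounted() {
  window.addEventListener('scroll', e => {
    console.log('scrolled')
  })
}

// Option 2: keep it in created(), but guard it so it is skipped on the server
created() {
  if (process.client) {
    window.addEventListener('scroll', e => {
      console.log('scrolled')
    })
  }
}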

Laravel, Datatable, Composer and Webpack: Good practices to allow developers to customize my library in their projects

To set the context: I am creating a CRUD library for Laravel. It is installed via Composer, so its sources live in the vendor/organization/package directory.
In my project I use Datatable, so I use Laravel Mix to compile my sources, and a command copies the compiled JS and CSS files into the public directory of the host Laravel application.
However, I would like the developers who use my library to be able to customize how some Datatable cells are displayed. To do this, Datatable's createdCell configuration must be used.
$('#example').dataTable({
  "columnDefs": [{
    "targets": 3,
    "createdCell": function (td, cellData, rowData, row, col) {
      if (cellData < 1) {
        $(td).css('color', 'red')
      }
    }
  }]
});
The problem is that the JS sources of my project are already compiled...
For the moment I have a temporary solution: leave the JS sources in vendor/organization/package but copy the webpack.mix.js configuration into the host application and ask developers to compile it themselves. The problem is that all the JS dependencies must also be installed, and it doesn't look very professional to force developers to compile the sources before they can use my library.
What are good practices to achieve this objective?
The following source may help, but I confess I don't know how to apply it to Laravel:
How to bundle vendor scripts separately and require them as needed with Webpack?
Thank you for your help.
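One pattern worth considering (a sketch with hypothetical names, not an answer from this thread): have the compiled bundle look up an optional callback on a global configuration object, so the host application can define it in its own JS without recompiling the package:

// In the package's compiled bundle: use the host-provided callback if it exists.
var hooks = window.MyPackageConfig || {};

$('#example').dataTable({
  "columnDefs": [{
    "targets": 3,
    "createdCell": hooks.createdCell || function (td, cellData, rowData, row, col) {
      // default behaviour when the host app defines nothing
    }
  }]
});

// In the host application (a plain script loaded before the bundle):
window.MyPackageConfig = {
  createdCell: function (td, cellData, rowData, row, col) {
    if (cellData < 1) {
      $(td).css('color', 'red')
    }
  }
};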

Vuejs 2 + laravel elixir + lodash Uncaught ReferenceError: _ is not defined

I'm using the tools specified in the title. I import lodash in my bootstrap.js
window._ = require('lodash');
However, when I try to use something like this (similar example to here), I get the error described in the title:
created() {
  this.test();
},
methods: {
  test: _.debounce(function () {
    console.log('calculating', true);
    setTimeout(function () {
      console.log('calculating', false);
    }.bind(this), 1000)
  }, 500),
}
However, if I remove the window._ = require('lodash'); line and instead include lodash manually in the page, it works fine, like:
<script src="https://cdn.jsdelivr.net/lodash/4.13.1/lodash.js"></script>
What am I missing?
Also, what is the advantage of importing libraries with require instead of using Gulp to merge and uglify everything?
I am not sure why you are getting that error related to _.
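One thing worth checking (my own assumption, not part of the original answer): if the compiled bundle containing the component is evaluated before bootstrap.js assigns window._, then _ does not exist yet when the methods object is built. Requiring lodash (or just debounce) directly in the component avoids depending on that load order. A sketch:

// A sketch: pull in debounce directly instead of relying on a global _
import debounce from 'lodash/debounce';

export default {
  created() {
    this.test();
  },
  methods: {
    test: debounce(function () {
      console.log('calculating', true);
    }, 500),
  },
};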
On your other question, I can think of the following advantages of using require over Gulp or any other build tool:
All build tools come with their own set of dependencies and other accessories, which bloats your overall setup.
You may start to rely on their plugins, and a time can come when you need a Grunt plugin for one task while you are using Gulp for the others. Quoting from gulp-grunt:
What if your favorite grunt plugin isn't available for gulp yet? Don't fret, there is nothing to worry about! Why don't you just hook in your grunt configuration?
You will have to manage updates to the various Gulp plugins and make sure they all keep working with newer versions.
These are some of the things I have experienced or read about. These tools do remove the pain of building, hot-reloading, compressing and obfuscating your code, but there are certainly more manual ways to do the same things; what these tools really provide is an abstraction.

Unable to get jasmine-jquery fixtures to load in Visual Studio with Chutzpah, or even in browser

I'm prototyping an MVC.NET 4.0 application and am defining our JavaScript test configuration. I managed to get Jasmine working in VS2012 with the Chutzpah extensions, and I am able to run pure JavaScript tests successfully.
However, I am unable to load test fixture (DOM) code and access it from my tests.
Here is the code I'm attempting to run:
test.js
/// various reference paths...
jasmine.getFixtures().fixturesPath = "./";

describe("jasmine tests:", function () {
  it("Copies data correctly", function () {
    loadFixtures('testfixture.html');
    //setFixtures('<div id="wrapper"><div></div></div>');
    var widget = $("#wrapper");
    expect(widget).toExist();
  });
});
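For reference, the fixture file the test expects (inferred from the commented-out setFixtures line above) would contain just the wrapper markup, e.g. a testfixture.html like:

<div id="wrapper"><div></div></div>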
The fixture is in the same folder as the test file. The setFixtures call works, but when I attempt to load the HTML from a file, it doesn't. Initially I tried the most recent version of jasmine-jquery from the repository, but then fell back to the year-old download, version 1.3.1, because it looked like there was a bug in the newer one. Here is the message I get with 1.3.1:
Test 'jasmine tests::Copies data correctly' failed
Error: Fixture could not be loaded: ./testfixture.html (status: error, message: undefined) in file:///C:/Users/db66162/SvnProjects/MvcPrototype/MvcPrototype.Tests/Scripts/jasmine/jasmine-jquery-1.3.1.js (line 103)
When I examine the source, it is doing an AJAX call, yet I'm not running in a browser. Instead, I'm using Chutzpah, which runs a headless browser (PhantomJS). When I run this in the browser with a test harness, it does work.
Is there someone out there who has a solution to this problem? I need to be able to run these tests automatically both in Visual Studio and TeamCity (which is why I am using Chutzpah). I am open to solutions that include using another test runner in place of Chutzpah. I am also going to evaluate the qUnit testing framework in this effort, so if you know that qUnit doesn't have this problem in my configuration, I will find that useful.
I fixed the issue by adding the following setting to chutzpah.json:
"TestHarnessLocationMode": "SettingsFileAdjacent",
where chutzpah.json is in my test app root
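For context, a minimal chutzpah.json with just that setting, placed next to the tests, would look like this (a sketch; adjust to your own layout):

{
  "TestHarnessLocationMode": "SettingsFileAdjacent"
}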
I eventually got my problem resolved. Thank you, Ian, for replying. I am able to use PhantomJS in TeamCity to run the tests through the test runner. I contacted the author of Chutzpah and he deployed an update that solved my problem in Visual Studio. I can now run the Jasmine tests using Chutzpah conventions to reference libraries and include fixtures while in VS, and use the PhantomJS runner in TeamCity against the test harness (HTML).
My solution on TeamCity was to run a batch file that launches tests. So, the batch:
@echo off
REM -- Uses the PhantomJS headless browser packaged with Chutzpah to run
REM -- Jasmine tests. Does not use Chutzpah.
setlocal
set path=..\packages\Chutzpah.2.2.1\tools;%path%;
echo ##teamcity[message text='Starting Jasmine Tests']
phantomjs.exe phantom.run.js %1
echo ##teamcity[message text='Finished Jasmine Tests']
And the Javascript (phantom.run.js):
// This code lifted from https://gist.github.com/3497509.
// It takes the test harness HTML file URL as the parameter. It launches PhantomJS,
// and waits a specific amount of time before exit. Tests must complete before that
// timer ends.
(function () {
  "use strict";

  var system = require("system");
  var url = system.args[1];

  phantom.viewportSize = { width: 800, height: 600 };

  console.log("Opening " + url);

  var page = new WebPage();

  // This is required because PhantomJS sandboxes the website and does not
  // show the console messages from that page by default.
  page.onConsoleMessage = function (msg) {
    console.log(msg);

    // Exit as soon as the last test finishes.
    if (msg && msg.indexOf("Dixi.") !== -1) {
      phantom.exit();
    }
  };

  page.open(url, function (status) {
    if (status !== 'success') {
      console.log('Unable to load the address!');
      phantom.exit(-1);
    } else {
      // Timeout - kill PhantomJS if the tests are still not done after 10 seconds.
      window.setTimeout(function () {
        phantom.exit();
      }, 10 * 1000); // NB: tune this to your needs
    }
  });
}());
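The batch file is then invoked from the TeamCity build step with the path (or URL) of the Jasmine test harness as its only argument, for example (the file names here are hypothetical):

run-jasmine-tests.bat SpecRunner.html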
I've got exactly the same problem. AFAIK it's to do with jasmine-jquery trying to load the fixtures via Ajax when the tests are run via the file:// URI scheme.
Apparently Chrome doesn't allow this (see https://stackoverflow.com/a/5469527/1904 and http://code.google.com/p/chromium/issues/detail?id=40787) and support amongst other browsers may vary.
Edit
You might have some joy by trying to set some PhantomJS command-line options such as --web-security=false. YMMV though: I haven't tried this myself yet, but thought I'd mention it in case it's helpful (or in case anyone else knows more about this option and whether it will help).
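If PhantomJS is launched directly, as in the batch file earlier in this thread, the option would be passed on the command line before the script (a sketch only; whether Chutzpah forwards such options is a separate question):

phantomjs.exe --web-security=false phantom.run.js path\to\testharness.html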
Update
I did manage to get some joy loading HTML fixtures by adding a /// <reference path="relative/path/to/fixtures" /> comment at the top of my Jasmine spec. But I still have trouble loading JSON fixtures.
Further Update
Loading HTML fixtures by adding a /// <reference path="relative/path/to/fixtures" /> comment merely loads your HTML fixtures into the Jasmine test runner, which may or may not suit your needs. It doesn't load the fixtures into the jasmine-fixtures element, and consequently your fixtures don't get cleaned up after each test.
