I'm just starting with require.js. Here's the simple code that I have in my main.js
require.config({
baseUrl: '/js',
});
require(['lib/jquery-min'], function(jq){
console.log(jq); // always logs undefined
});
In the require callback above, the argument jq is always undefined. Can you help me identify the issue here?
Answer: RequireJS only resolves jQuery once you configure the paths option. jQuery registers itself as a named AMD module called 'jquery', so you need to map that name in paths and then require it by that name:
require.config({
baseUrl: '/js',
paths: {
jquery: 'lib/jquery-min'
}
});
require(['jquery'], function(jq){
console.log(jq); // works now!
});
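Once the alias is configured, any other module can depend on it by the same name. A minimal sketch (the file name and selector here are made up):

// js/app.js
define(['jquery'], function ($) {
    return {
        init: function () {
            $('#status').text('jQuery ' + $.fn.jquery + ' loaded via RequireJS');
        }
    };
});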
I am trying to display results in my page. When I log the response I can see it, and I have tried passing the response to my Vue data, but each time it displays an empty array.
var app = new Vue({
el: '#app',
data: {
results: []
},
mounted(){
var url = "{{ url('fetch/messages') }}";
axios.get(url)
.then(function (response) {
console.log(response.data);
this.results = response.data;
})
.catch(function (error) {
console.log(error);
});
}
});
I am trying to display the response in the page using #{{ results }}.
The reason your data is not getting added to the Vue instance is that this does not refer to the Vue instance inside a traditional function () {} callback. Please have a look at this article for more information on scope and the this keyword in JavaScript.
Below I refer to ES quite a bit, which stands for ECMAScript. Here is an article showing the difference between it and vanilla JavaScript.
I can see from your post that you're using ES6/ES2015 syntax for your method definitions, so I'm assuming you're happy with just targeting modern browsers for now.
If this is the case then you can use Arrow Functions to get past the scope issue:
.then(response => {
console.log(response.data);
this.results = response.data;
})
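In context, the mounted hook then reads like this (the same code from the question, with only the callbacks switched to arrow functions):

mounted() {
    var url = "{{ url('fetch/messages') }}";
    axios.get(url)
        .then(response => {
            this.results = response.data; // 'this' is now the Vue instance
        })
        .catch(error => {
            console.log(error);
        });
}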
If not, then as mentioned in the scope article above, you would need to assign this to a variable:
var self = this;
axios.get(url)
.then(function (response) {
console.log(response.data);
self.results = response.data;
});
If this is the case, then I would also suggest not using the ES6 method definitions either, so change mounted(){ to mounted: function () {.
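Put together, that pre-ES6 version of the hook would look like this (a sketch combining the two changes above):

mounted: function () {
    var self = this; // capture the Vue instance
    var url = "{{ url('fetch/messages') }}";
    axios.get(url)
        .then(function (response) {
            self.results = response.data; // 'self' still points at the Vue instance
        })
        .catch(function (error) {
            console.log(error);
        });
}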
Finally, if you want to be able to use modern JavaScript (ES6/ES2015 and above) but have your code work consistently with older browsers, then I would suggest using something like Laravel Mix to compile your JavaScript for you.
Laravel Mix Documentation
Laravel Mix tutorial series
You cannot have the following line of code in a plain .js file (Blade only renders {{ ... }} expressions inside Blade templates); it will break there.
var url = "{{ url('fetch/messages') }}";
I am trying to make an API call, and the following code errors with ajax not being a function:
import $ from 'jquery';
const apiCall = function () {
var url = 'https://images.nasa.gov/#/search-results';
var request = {
q: 'sun',
media_type: 'image'
};
var result = $.ajax({
url: url,
dataType: 'json',
data: request,
type: 'GET'
})
.done(function() {
alert(JSON.stringify(result));
})
.fail(function() {
alert('failed');
})
}
module.exports = apiCall;
I am importing the above into another module and calling it on a button click from React's render() function, like this:
import apiCall from './../api/apiCall';
class Gallery extends React.Component {
render () {
return (
<section id="gallery" className="g-site-container gallery">
<div className="grid grid--full">
<div className="gallery__intro">
<Button extraClass=""
type="button"
handleButtonClick={apiCall} />
</div>
</div>
</section>
)
}
}
module.exports = Gallery;
Any thoughts on what I am doing wrong?
In my experience, this type of issue is most often because your transpilation is not working as you might expect, or because you are transpiling your code while also including jquery (or any other lib) via a CDN link. If this is the case, here's some info that might help you sort it out:
First, check that your transpiler is actually pulling jquery in. Just having it on the page won't necessarily allow this code to work - because when your transpiler operates on:
import $ from 'jquery'
It's going to expect to first load the jquery package from node_modules and then create an internal name for it, such as $_1 which will be used inside your bundle. If you intend to include jquery on the page via CDN, rather than bundling it in this fashion, you need to mark it as external in your webpack or rollup config. If using webpack, it would look something like:
{
entry: '/path/to/your/application.js',
externals: {
'jquery': '$',
}
}
This essentially tells webpack: "when I import from 'jquery', don't look in node_modules - instead, just assume jquery already exists on the page as window.$." Now webpack won't attempt to include and bundle the whole jquery lib, and instead of creating $_1 it will actually honor what $ is.
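If you happen to be using rollup instead of webpack, the equivalent configuration looks roughly like this (a sketch; the entry and output paths are assumptions):

// rollup.config.js
export default {
    input: '/path/to/your/application.js',
    external: ['jquery'], // don't try to bundle jquery from node_modules
    output: {
        file: 'bundle.js',
        format: 'iife',
        globals: {
            jquery: '$' // `import $ from 'jquery'` resolves to window.$
        }
    }
};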
If you do intend to load and bundle jquery as part of your build (not recommended, due to the incredible size-bloat it will entail) - I suggest ensuring that it's installed in node_modules. You can do this with:
npm install -S jquery
or, if you're using yarn:
yarn add jquery
Now, your import statement should load the lib correctly when you re-transpile.
First, ensure you're not using a lite/slim build of jQuery (e.g. jquery-lite), as those exclude the ajax features.
By the way, it's not recommended to use module.exports together with ES6's import / export; try to stick to one of them. I'm not completely sure, but mixing them may cause module problems that are hard to understand.
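For example, sticking to ES module syntax on both sides would look like this (a sketch based on the question's files):

// api/apiCall.js
const apiCall = function () {
    // ... same body as in the question
};
export default apiCall; // instead of module.exports = apiCall

// Gallery component file
import apiCall from './../api/apiCall';
// ... class Gallery extends React.Component { ... }
export default Gallery; // instead of module.exports = Gallery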
Additionally, per the official $.ajax documentation, you have to process the data passed to the callback:
$.ajax({
url: url,
dataType: 'json',
data: request,
type: 'GET'
})
.done(function(data) {
// Process data provided from callback function
alert(data);
})
.fail(function() {
alert('failed');
})
Personally, I prefer isomorphic-fetch for making ajax calls in a React application.
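For reference, the same request with fetch would look roughly like this (a sketch; isomorphic-fetch exposes the same fetch API, and the endpoint and query parameters simply mirror the question's $.ajax call):

import 'isomorphic-fetch';

const apiCall = function () {
    // URL and query copied from the question's request object
    const url = 'https://images.nasa.gov/#/search-results';
    const params = new URLSearchParams({ q: 'sun', media_type: 'image' });

    return fetch(url + '?' + params)
        .then(function (response) {
            if (!response.ok) {
                throw new Error('HTTP ' + response.status);
            }
            return response.json();
        })
        .then(function (data) {
            alert(JSON.stringify(data));
        })
        .catch(function () {
            alert('failed');
        });
};

export default apiCall;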
I'm new to Gulp and I wanted to make use of its automatic scss compiling and browser sync. But I can't get it to work.
I stripped everything down to leave only the contents of the example on the Browsersync website:
http://www.browsersync.io/docs/gulp/#gulp-sass-css
var gulp = require('gulp');
var browserSync = require('browser-sync').create();
var sass = require('gulp-sass');
// Static Server + watching scss/html files
gulp.task('serve', ['sass'], function() {
browserSync.init({
server: "./app"
});
gulp.watch("app/scss/*.scss", ['sass']);
gulp.watch("app/*.html").on('change', browserSync.reload);
});
// Compile sass into CSS & auto-inject into browsers
gulp.task('sass', function() {
return gulp.src("app/scss/*.scss")
.pipe(sass())
.pipe(gulp.dest("app/css"))
.pipe(browserSync.stream());
});
gulp.task('default', ['serve']);
I can call gulp serve. The site shows up and I get a message from Browsersync. When I modify the HTML, the page is reloaded. However, when I modify the scss, I see this:
[BS] 1 file changed (test.css)
[15:59:13] Finished 'sass' after 18 ms
but I have to reload manually. What am I missing?
I also faced a similar problem when I was new to browser-sync. The command line was saying "reloading browsers" but the browser was not refreshed at all. The problem was that I had not included a body tag in my HTML page, which is where browser-sync injects the script it needs to work, so make sure your HTML page has a body tag.
You can just inject the changes instead of having to force a full browser refresh on SASS compile if you like.
browserSync.init({
injectChanges: true,
server: "./app"
});
gulp.task('sass', function() {
return gulp.src("app/scss/*.scss")
.pipe(sass())
.pipe(gulp.dest("app/css"))
.pipe(browserSync.stream({match: '**/*.css'}));
});
This is because you're calling browserSync.reload on the html watch and not on the scss watch.
Try this:
gulp.watch("app/scss/*.scss", ['sass']).on('change', browserSync.reload);
gulp.watch("app/*.html").on('change', browserSync.reload);
This is what I use, and it works fine with sass or any other files:
gulp.task('browser-sync', function () {
var files = [
'*.html',
'css/**/*.css',
'js/**/*.js',
'sass/**/*.scss'
];
browserSync.init(files, {
server: {
baseDir: './'
}
});
});
I include this in my HTML, right below the body tag. It works.
<script type='text/javascript' id="__bs_script__">//<![CDATA[
document.write("<script async src='http://HOST:3000/browser-sync/browser-sync-client.2.11.1.js'><\/script>".replace("HOST", location.hostname));//]]>
</script>
Ran into this same problem trying to reload php and js files and stream css files. I was only able to get streaming to work by using a pipe, which makes sense. Anyway, here's what worked for me:
gulp.watch(['./**/*.css']).on('change', function (e) {
return gulp.src( e.path )
.pipe( browserSync.stream() );
});
But, I actually prefer #pixie's answer modified:
gulp.task('default', function() {
var files = [
'./**/*'
];
browserSync.init({
files : files,
proxy : 'localhost',
watchOptions : {
ignored : 'node_modules/*',
ignoreInitial : true
}
});
});
I also had the same issue. It worked when I called the reload method as a separate task.
gulp.task('browserSync', function() {
browserSync.init(null, {
server: {
baseDir: './'
},
});
})
gulp.task('reload', function(){
browserSync.reload()
})
gulp.task('watch', ['sass', 'css', 'browserSync'], function(){
gulp.watch('*.html', ['reload']);
})
Sometimes when using the CLI the script is not inserted into your main HTML files, so you have to add it manually or use gulp:
<!-- START: BrowserSync Reloading -->
<script type='text/javascript' id="__bs_script__">
//<![CDATA[
document.write("<script async src='/browser-sync/browser-sync-client.js'><\/script>".replace("HOST", location.hostname));
//]]>
</script>
<!-- END: BrowserSync Reloading -->
Does anyone have any idea how you would handle errors in gulp-ruby-sass? I have noticed that it outputs directly to the console when there is an error, but I would like to handle it myself.
gulp.task('styles', function () {
return gulp.src(sources.sass.files)
.pipe(plumber({errorHandler: notify.onError("Error: <%= error.message %>")}))
.pipe(
plugins.rubySass({
lineNumbers: true,
style: 'expanded',
sourcemap: true,
sourcemapPath: '../../dev/sass'
})
)
.on("error", function(err) {
console.log(err);
})
.pipe(gulp.dest(sources.sass.dest));
});
As you can see, I am trying two different ways to handle my error here, but neither of them works. If I try .on("error", console.log('error')), the string 'error' gets logged, but it then tells me the 'listener must be a function' (console.log('error') runs immediately and passes its undefined return value as the listener).
Thanks in advance.
You can also use a package I published, pipe-error-stop, as follows:
var pipeErrorStop = require('pipe-error-stop');
// in the task
.pipe(pipeErrorStop(rubySass()))
If rubySass emits an error, no data will be passed onward further down the pipe. This prevents a syntax error in your sass from ending up as a stack trace on your home page.
pipeErrorStop also has callbacks that you can hook into.
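Put into the context of the question's task, that would look roughly like this (a sketch reusing the question's plugin names and options):

var pipeErrorStop = require('pipe-error-stop');

gulp.task('styles', function () {
    return gulp.src(sources.sass.files)
        // if rubySass errors, nothing is written further down the pipe
        .pipe(pipeErrorStop(plugins.rubySass({
            lineNumbers: true,
            style: 'expanded',
            sourcemap: true,
            sourcemapPath: '../../dev/sass'
        })))
        .pipe(gulp.dest(sources.sass.dest));
});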
Using gulp-plumber might help you; have a look at this blogpost!
It would allow you to do:
var onError = function (err) {
// whatever you need to do
};
gulp.task('styles', function () {
gulp.src('scss/style.scss')
.pipe(plumber({
errorHandler: onError
}))
.pipe(rubysass())
.pipe(gulp.dest('dist/'));
});
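A common way to fill in onError (a sketch, not from the linked post) is to log the error and end the stream so that a watch task keeps running:

var onError = function (err) {
    console.log(err.toString()); // print the sass error
    this.emit('end');            // end this stream so gulp.watch keeps running
};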
This has also been fixed by another GitHub user; see https://github.com/mikaelbr/gulp-notify/issues/37#issuecomment-48457930.
describe('my homepage', function() {
var ptor = protractor.getInstance();
beforeEach(function(){
// ptor.ignoreSynchronization = true;
ptor.get('http://localhost/myApp/home.html');
// ptor.sleep(5000);
})
describe('login', function(){
var email = element.all(protractor.By.id('email'))
, pass = ptor.findElement(protractor.By.id('password'))
, loginBtn = ptor.findElement(protractor.By.css('#login button'))
;
it('should input and login', function(){
// email.then(function(obj){
// console.log('email', obj)
// })
email.sendKeys('josephine#hotmail.com');
pass.sendKeys('shakalakabam');
loginBtn.click();
})
})
});
the above code returns
Error: Error while waiting for Protractor to sync with the page: {}
and I have no idea why. ptor loads the page correctly; it seems to be the selection of the elements that fails.
TO SSHMSH:
Thanks, you're almost right, and you gave me the right philosophy. The key is to call ptor.sleep(3000) so that each page waits until ptor is in sync with the project.
I got the same error message (Angular 1.2.13). My tests were kicked off too early and Protractor didn't seem to wait for Angular to load.
It appeared that I had misconfigured the protractor config file. When the ng-app directive is not defined on the body element but on a descendant, you have to adjust the rootElement property in your protractor config file to a selector that matches your angular root element, for example:
// protractor-conf.js
rootElement: '.my-app',
when your HTML is:
<div ng-app="myApp" class="my-app">
I'm using ChromeDriver and the above error usually occurs for the first test. I've managed to get around it like this:
ptor.ignoreSynchronization = true;
ptor.get(targetUrl);
ptor.wait(
function() {
return ptor.driver.getCurrentUrl().then(
function(url) {
return targetUrl == url;
});
}, 2000, 'It\'s taking too long to load ' + targetUrl + '!'
);
Essentially you are waiting for the current URL of the browser to become what you've asked for and allow 2s for this to happen.
You probably want to switch ignoreSynchronization back to false afterwards, possibly wrapping it in a ptor.wait(...). Just wondering, would uncommenting the ptor.sleep(5000); not help?
EDIT:
After some experience with Promise/Deferred I've realised the correct way of doing this would be:
loginBtn.click().then(function () {
ptor.getCurrentUrl().then(function (newURL){
expect(newURL).toBe(whatItShouldBe);
});
});
Please note that if you are changing the URL (that is, moving away from the current AngularJS-activated page to another one, meaning the AngularJS library needs to reload and init), then, at least in my experience, there's no way of avoiding the ptor.sleep(...) call. The above will only work if you are staying on the same Angular page but changing the part of the URL after the hashtag.
In my case, I encountered the error with the following code:
describe("application", function() {
it("should set the title", function() {
browser.getTitle().then(function(title) {
expect(title).toEqual("Welcome");
});
});
});
Fixed it by doing this:
describe("application", function() {
it("should set the title", function() {
browser.get("#/home").then(function() {
return browser.getTitle();
}).then(function(title) {
expect(title).toEqual("Welcome");
});
});
});
In other words, I was forgetting to navigate to the page I wanted to test, so Protractor was having trouble finding Angular. D'oh!
The rootElement param of the exports.config object defined in your protractor configuration file must match the element containing your ng-app directive. This doesn't have to uniquely identify the element -- 'div' suffices if the directive is on a div, as in my case.
From referenceConf.js:
// Selector for the element housing the angular app - this defaults to
// body, but is necessary if ng-app is on a descendant of <body>
rootElement: 'div',
I got started with Protractor by watching the otherwise excellent egghead.io lecture, where he uses a condensed exports.config. Since rootElement defaults to body, there is no hint as to what is wrong with your configuration if you don't start with a copy of the provided reference configuration, and even then the
Error while waiting for Protractor to sync with the page: {}
message doesn't give much of a clue.
I had to switch from doing this:
describe('navigation', function(){
browser.get('');
var navbar = element(by.css('#nav'));
it('should have a link to home in the navbar', function(){
//validate
});
it('should have a link to search in the navbar', function(){
//validate
});
});
to doing this:
describe('navigation', function(){
beforeEach(function(){
browser.get('');
});
var navbar = element(by.css('#nav'));
it('should have a link to home in the navbar', function(){
//validate
});
it('should have a link to search in the navbar', function(){
//validate
});
});
the key difference being:
beforeEach(function(){
browser.get('');
});
Hope this helps someone.
I was getting this error:
Failed: Error while waiting for Protractor to sync with the page: "window.angular is undefined. This could be either because this is a non-angular page or because your test involves client-side navigation, which can interfere with Protractor's bootstrapping. See http://git.io/v4gXM for details"
The solution was to call page.navigateTo() before page.getTitle().
Before:
import { AppPage } from './app.po';
describe('App', () => {
let page: AppPage;
beforeEach(() => {
page = new AppPage();
});
it('should have the correct title', () => {
expect(page.getTitle()).toEqual('...');
})
});
After:
import { AppPage } from './app.po';
describe('App', () => {
let page: AppPage;
beforeEach(() => {
page = new AppPage();
page.navigateTo();
});
it('should have the correct title', () => {
expect(page.getTitle()).toEqual('...');
})
});
If you are using
browser.restart()
in your spec, it sometimes throws the same error. Try using
await browser.restart()
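For example, inside an async spec (a sketch; the spec name and URL are made up):

it('continues after a browser restart', async function () {
    await browser.restart();    // wait for the new browser instance
    await browser.get('/home'); // then keep driving the restarted browser
});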