gulp keeping a sane source map after multiple processing - sass

I've optimized my css compilation for speed in gulp and essentially queued 3 operations together in one task. It works well, but now I'm a bit stuck on generating proper source maps.

I simplified your sample a bit, but you can try a solution similar to this one:
var gulp = require('gulp'),
    filter = require('gulp-filter'),
    less = require('gulp-less'),
    concat = require('gulp-concat'),
    sourcemaps = require('gulp-sourcemaps');

gulp.task('default', function () {
    var lessFilter = filter(['**/*.less']);
    var cssFilter = filter(['**/*.css']);

    return gulp.src('src/**/*')
        .pipe(sourcemaps.init())
        .pipe(lessFilter)
        .pipe(less())
        .pipe(lessFilter.restore()) // with gulp-filter >= 3, create the filter with { restore: true } and pipe lessFilter.restore instead
        .pipe(cssFilter) // needed only if the src folder contains files that are not compiled to css at this stage of the pipeline
        .pipe(concat('./final.css'))
        .pipe(sourcemaps.write())
        .pipe(gulp.dest('dest'));
});
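As a side note, if you prefer external map files over inline ones, passing a path to sourcemaps.write (for example sourcemaps.write('.')) writes the .map file next to final.css instead of embedding it in the output.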

Related

protractor does not recognize '\' at uploading a file

I've been reading a few answers to this and implemented one of those. Here is my code:
var path = require('path');
var fileToUpload = "C:\Users\patricio.lussenhoff\Desktop\test.txt",
    absolutePath = path.resolve(__dirname, fileToUpload);
var type3 = browser.element(by.css('[type="file"]'));
type3.sendKeys(absolutePath);
Protractor apparently is not recognizing the backslashes (I've tried the '/' way too), and the control shows like this:
here is the example I'm talking about
Any thoughts?
Try this:
var fileToUpload = "C:\\Users\\patricio.lussenhoff\\Desktop\\test.txt";
var type3 = browser.element(by.css('[type="file"]'));
type3.sendKeys(fileToUpload);
This way you don't need to use 'path.resolve' because you are passing the complete and correct path.
Depending on the OS, use single forward slashes (Linux, Unix, etc.) or escaped double backslashes (Windows) to read files:
var fileToUpload = "C:/Users/patricio.lussenhoff/Desktop/test.txt";
var fileToUpload = "C:\\Users\\patricio.lussenhoff\\Desktop\\test.txt";

Website File and Folder Browser: Should I use Static or Ajax?

Subjective question time!
I'm coding a website that hosts a large number of files and folders for an open organization that must post all documents online for public scrutiny. I have not yet begun coding the actual viewer, as I'm wondering what the standard, most accessible approach is.
The site must be easy to access and available to all devices from desktops to phones. That said, I don't have to code with older, outdated browsers in mind. The previous site used a static approach built on Python and Django. This is my first real node.js + Express job, and I'm not sure of the performance differences.
At present, I see two ways to accomplish my task:
1. Use Ajax
I know I can shove everyone onto a generic /documents page and allow them to navigate through the folders themselves. However, I want document links to work if shared, so I'll have to change the URL manually as users move around and submit plenty of Ajax requests back to the server.
I like this approach in that it will likely give nicer user interaction. I don't like the number of Ajax requests, and I fear that on less powerful devices like phones and tablets, all that Ajax and DOM manipulation will slow down or not work. Additionally, I'd have to parse the URL into a resource path on either the back end or the front end for retrieval. A rough sketch of what I mean is below.
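To illustrate, this approach would look roughly like the following (the /api/documents endpoint, the response shape, and the renderListing helper are all hypothetical):

function loadFolder(folderPath) {
    // ask the server for the folder contents (hypothetical endpoint)
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/api/documents/' + folderPath);
    xhr.onload = function () {
        // hypothetical JSON payload: { folders: [...], files: [...] }
        renderListing(JSON.parse(xhr.responseText));
    };
    xhr.send();
}

function openFolder(folderPath) {
    // keep links shareable by pushing the folder path into the address bar
    history.pushState({ path: folderPath }, '', '/documents/' + folderPath);
    loadFolder(folderPath);
}

// make the back/forward buttons re-render the right folder
window.addEventListener('popstate', function (e) {
    if (e.state) loadFolder(e.state.path);
});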
2. Go 'Static'
I'm using node.js and Jade on the back end, so I know I can just break apart a URL, find the folder hierarchy, and serve a whole new page to the user.
I like this approach because it doesn't require the user's machine to do any computation (and will likely be faster on slower devices), and it means not doing a ton of URL work. I don't like that desktop users will end up waiting on the synchronous operations I'll have to use to prepare the pages, nor the extra server load and reduced responsiveness.
Currently
I'm looking into the static approach right now for what I perceive to be a bit more accessibility (even at the cost of page load times), but I'm here for more information to guide the right choice. I'm looking for answers that explain why one way or the other is better, and that are impartial or share experiences. Thank you in advance for your help!
Right. No one else has responded yet, so I just went ahead and made the file browser anyway.
I ended up going with the static method. It turned out to be relatively easy, apart from having to manipulate a bunch of strings, and I can only imagine that the Ajax route would have required twice the work.
The response times are fairly long: a generic static page that does no computation on my site takes about 40-70ms, while the new documents one takes twice that at ~150ms. Although in practice 150ms isn't anything to get upset over for my needs, in a large scale environment I'm sure my glob functions in the documents folder would just bog down the system.
For anyone wondering, here's what I did.
Code
The hierarchy looks like this:

app/
  controllers/
    documents.js
  views/
    documents.jade
public/
  docs/
    (document folders)
documents.js
var express = require('express');
var router = express.Router();
var glob = require('glob');

module.exports = function(app) {
    app.use('/', router);
};

router.get('/documents*', function serveDocsHome(req, res) {
    // decode %20 in the requested url so it matches files with spaces
    // (a global regex is needed; a plain string replace would only fix the first space)
    req.originalUrl = req.originalUrl.replace(/%20/g, ' ');

    // fun string stuff to make links work: strip the '/documents' prefix
    var dir = '/docs' + req.originalUrl.substr(10);
    var url = req.originalUrl + '/';

    // for moving up a directory
    var goUp = false;
    var folderName = 'Home';
    if (req.originalUrl != '/documents') {
        var end = req.originalUrl.lastIndexOf('/');
        folderName = req.originalUrl.substr(end + 1);
        goUp = true;
    }

    // get all the folders
    var folders = glob.sync('*/', {
        cwd: 'public' + dir
    });
    for (var i = 0; i < folders.length; i++) {
        // drop the trailing slash glob leaves on directory matches
        folders[i] = folders[i].substr(0, folders[i].length - 1);
    }

    // get all the files
    var files = glob.sync('*', {
        cwd: 'public' + dir,
        nodir: true
    });

    // attach the files and folders
    res.locals.folders = folders;
    res.locals.files = files;
    res.locals.loc = dir + '/';
    res.locals.goUp = goUp;
    res.locals.url = url;
    res.locals.folderName = folderName;

    // render the doc
    res.render('documents', {
        title: 'Documents'
    });
});
documents.jade
extends layout

append css
  link(rel='stylesheet', href='/css/docs.css')

append js
  script(src='/js/docs.js')

block content
  .jumbotron(style='background: url(/img/docs.jpg); background-position: center 20%; background-repeat: no-repeat; background-size: cover;')
    .container
      h1= title
      p View minutes, policies, and guiding papers of the [name]
  .container#docs
    .row
      .col-xs-12.col-sm-3.sidebar.sidebar-wrap
        h3= folderName
        ul.no-style.jumplist
          hr
          if goUp
            li#go-up: a.message(href='./') #[img(src='/img/icons/folderOpen.png')] Up One Folder
          each val in folders
            li: a(href='#{url + val}') #[img(src='/img/icons/folder.png')] #{val}
      .col-xs-12.col-sm-9
        h3 Files
        ul.no-style
          if files.length != 0
            each val in files
              li: a(href='#{loc + val}')= val
          else
            li.message No Files Here
And here's part of the page.

Which way to require('koa-router')?

I have seen lots of code that does this one of two ways:
var router = require('koa-router')(); // makes it a function?
OR
var router = require('koa-router'); // doesn't make it a function?
Which is it? Or does it actually depend on the code in the file?
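For what it's worth, it depends on what the module exports. koa-router exports a router constructor, so requiring it gives you a function that you still have to call (with or without new) to get a router instance; a minimal sketch of the two styles:

// style 1: require and invoke in one go, yielding a router instance
var router = require('koa-router')();

// style 2: require the constructor first, then instantiate it explicitly
var Router = require('koa-router');
var router = new Router();

Both end up with an instance you can register routes on; which one you see in code samples is just a matter of taste.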

Gulp - Handling multiple themes and folders

I am trying to create an ultimate gulpfile that we can use on one of our big sites (one with multiple themes depending on the section of the site you are in). I'm trying to get it to only run the process it needs to run and not recompile everything.
Let me lay out exactly what I'm trying to achieve:
Folder Structure
src/
  master-theme/
    css/
      style.scss
      partials/
        _a.scss
        _b.scss
    img/
      a.jpg
      b.jpg
  sub-theme/
    css/
      style.scss
      partials/
        _c.scss
        _d.scss
    img/
      c.png
      d.jpg
I want these files to be compressed/compiled and to end up in the destination folder with the same folder structure (just replace src with dest in your mind)
The Problem
At the moment I can get it to do what I want - but the gulpfile compiles and compresses everything. E.g. if I add an image to sub-theme/img, it will run the image compression for all the "themes". I am using gulp-changed, but it still means that it is looking at all the images across the site.
The same goes for the Sass - if I update _c.scss, both the master css and the sub-theme css get compiled, which is obviously not desired.
Current Solution
I don't really have one at the moment. Right now I am using gulp-file-tree to generate a json file of the folder structure, then whenever a file is changed, looping through that with a function (which I know is horrible - but a solution which currently works):
var tree = require('./build/tree.json');
var children = tree.children;

for (var i = children.length - 1; i >= 0; i--) {
    var child = children[i];
    if (child.isDirectory)
        task(child);
}
Here task() is a gulp task passed in (e.g. Sass compilation).
The folder structure is not up for discussion - I don't want this to turn into a 'structure your files differently' kind of thing. There are several other factors, unrelated to this issue, behind why it is this way. (Sorry, I had to say that...)
I'm open to trying anything, as I've stared at this file for days now. The tasks I am trying to run are:
Sass compilation
Sprite generation
SVG sprite to PNG sprite
Image compression
Javascript compression
Thanks in advance for your help. If a solution is found, I'll write a proper post about it so that others will hopefully not feel my pain...
I'm doing pretty much the same thing, and I think I've nailed it.
gulpfile.js:
var gulp = require('gulp'),
    debug = require('gulp-debug'),
    merge = require('merge-stream'),
    sass = require('gulp-sass'),
    less = require('gulp-less'),
    changed = require('gulp-changed'),
    imagemin = require('gulp-imagemin'),
    prefix = require('gulp-autoprefixer'),
    minifyCSS = require('gulp-minify-css'),
    uglify = require('gulp-uglify'), // needed for the js stream below
    browserSync = require('browser-sync'),
    reload = browserSync.reload,
    path = require('path'),
    glob = require('glob');

// Log errors to the console
function errorHandler(error) {
    console.log(error.toString());
    this.emit('end');
}

function processThemeFolder(src) {
    function debugTheme(type) {
        return debug({ title: 'theme ' + theme + ' ' + type });
    }

    var theme = path.basename(src);
    var dest = 'public/themes/' + theme;

    return merge(
        gulp.src([src + '/sass/**/*.scss'])
            .pipe(changed(dest + '/css', { extension: '.css' }))
            .pipe(debugTheme('sass'))
            .pipe(sass())
            .pipe(minifyCSS())
            .pipe(gulp.dest(dest + '/css')),
        gulp.src([src + '/less/**/*.less'])
            .pipe(changed(dest + '/css', { extension: '.css' }))
            .pipe(debugTheme('less'))
            .pipe(less())
            .pipe(minifyCSS())
            .pipe(gulp.dest(dest + '/css')),
        gulp.src([src + '/js/**/*.js'])
            .pipe(changed(dest + '/js'))
            .pipe(debugTheme('js'))
            .pipe(uglify())
            .pipe(gulp.dest(dest + '/js')),
        gulp.src([src + '/img/**/*.{png,jpg,gif}'])
            .pipe(changed(dest + '/img'))
            .pipe(debugTheme('img'))
            .pipe(imagemin())
            .pipe(gulp.dest(dest + '/img'))
    ).on('change', reload);
}

gulp.task('themes', function() {
    var srcThemes = glob.sync('resources/themes/*');
    return merge(srcThemes.map(processThemeFolder));
});
// ...
The key here is to use gulp-changed to only pass the changed files through; the rest is cream on top. Each compilation stream shows a debug line detailing which files are going into the stream. On a change in a stream, browserSync is notified to reload the browsers, using streaming where possible. A theme's merged stream only completes once all of its compilation streams are done, and the overall themes task is only marked as done when all the themes are done.
The theme's source files live in resources/themes/themename, and the output is written to public/themes/themename.
This is working very well for me, YMMV. :-)
I would use the following plugin to manage a cache of your processed files. It will then use the cache to determine what has changed and what has already been processed.
https://github.com/wearefractal/gulp-cached
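For example, a minimal sketch for the Sass stream (the task name and globs are placeholders):

var gulp = require('gulp'),
    cache = require('gulp-cached'),
    sass = require('gulp-sass');

gulp.task('sass', function () {
    return gulp.src('src/**/css/**/*.scss')
        .pipe(cache('sass')) // only files that changed since the last run pass through
        .pipe(sass())
        .pipe(gulp.dest('dest'));
});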
HTH
You can create a function with parameters that compiles only the changed file, then call another one that combines the results. For example, generate a.css and b.css, and when a.scss is updated, only a.css should be recompiled. After each call, trigger a combine function that puts a and b together. Google to see how you get the path of the changed file; I don't remember which plugin I used.
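In gulp 3 the watch callback receives the changed file's path, which is one way to get it; a rough sketch (compileOne and combineCss are hypothetical helpers):

var gulp = require('gulp');

gulp.watch('src/**/*.scss', function (event) {
    // event.path is the absolute path of the file that changed (gulp 3.x)
    compileOne(event.path); // hypothetical: compile just this one file
    combineCss();           // hypothetical: concatenate the per-file outputs afterwards
});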

Anyway to load an external script once on an event in an all ajax web page?

The page I have is very simple, but all of the links are loaded with Ajax. What I am wondering is whether there is any way to load an external script once one of the links is clicked, but make sure that the script is only loaded once? Basically just for the sake of performance.
You can load an external script by creating a script element and appending it to head:
var script=$("<script>");
script.attr("type", "text/javascript");
script.attr("src", "some_external_script.js");
script.appendTo("head");
You can use a simple variable to make sure it's only included once.
var addedExternalScript = false;

// ...somewhere else where addedExternalScript is still in scope...
if (!addedExternalScript) {
    // load the script as above, then set addedExternalScript = true
}
// script is in the document
Note that having the script element in the document won't necessarily mean it's loaded. You may need to bind to the load event if you want to know when the script has loaded.
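Putting the two together, a minimal sketch using jQuery (the link selector and script URL are placeholders):

var externalScriptLoaded = false;

$('#some-link').on('click', function () {
    if (!externalScriptLoaded) {
        externalScriptLoaded = true; // set immediately so repeat clicks don't fetch twice
        // $.getScript appends the script and fires the callback once it has loaded and run
        $.getScript('some_external_script.js', function () {
            // safe to use the external script's functions from here
        });
    }
});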
Is this code helpful to you?
var s = document.createElement('script');
s.type = 'text/javascript';
s.async = true;
s.src = 'script.js';
(document.getElementsByTagName('head')[0] || document.getElementsByTagName('body')[0]).appendChild(s);
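If you need to know when it has finished loading, you can also assign s.onload before the appendChild call (a small sketch):

s.onload = function () {
    // the external script has been fetched and executed at this point
};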
I used this code before, and it worked:
var t = document;
var o = t.createElement('script');
o.setAttribute('type', 'text/javascript');
o.setAttribute('src', 'http://www.example.com/js/jquery-1.3.2.js');
// t.lastChild is <html>, whose first child is <head>
t.lastChild.firstChild.appendChild(o);
