How to list folders and files in a directory using ReactiveX - rxjs

When I use Observables for tasks that involve a lot of chaining and a lot of asynchronous operations, such as listing all the items in a folder and checking every subfolder for a specific file, I often end up either rebuilding the complex chain for each task (return Observable.of(folder)...) or forwarding some kind of special value to the end of the chain to signal the end of a batch (every operator starts with if (res === false) return Observable.of(false)).
Sort of like that stick that you put between your groceries and those of the person in front of you at the checkout.
It seems like there should be a better way that doesn't involve forwarding a stop value through all kinds of callbacks and operators.
So what is a good way to write a function that takes a folder path string and returns a list of all the files and folders in it? It should also specify whether each file is an HTML file, and whether each folder contains a file called tiddlywiki.json.
The only requirement is that it can't return anything like Observable.of(...).... It should probably have a subject at the top of the chain, but that is not a requirement.
function listFolders(folder) {
  return [
    { type: 'folder', name: 'folder1' },
    { type: 'datafolder', name: 'folder2' }, //contains "tiddlywiki.json" file
    { type: 'folder', name: 'folder3' },
    { type: 'htmlfile', name: 'test.html' },
    { type: 'other', name: 'mytest.txt' }
  ]
}

Here is one that does not follow the rules I laid out (see below for one that does), but it took about ten minutes, using the first one as a guide.
export function statFolder(subscriber, input: Observable<any>) {
  return input.mergeMap(([folder, tag]) => {
    return obs_readdir({ folder, tag })(folder);
  }).mergeMap(([err, files, { folder, tag }]) => {
    if (err) { return Observable.of({ error: err }) as any; }
    else return Observable.from(files).mergeMap(file => {
      return obs_stat([file, folder])(path.join(folder, file as string));
    }).map(statFolderEntryCB).mergeMap<any, any>((res) => {
      let [entry, [name, folder]] = res as [any, [string, string, number, any]];
      if (entry.type === 'folder')
        return obs_readdir([entry])(path.join(entry.folder, entry.name));
      else return Observable.of([true, entry]);
    }, 20).map((res) => {
      if (res[0] === true) return (res);
      let [err, files, [entry]] = res as [any, string[], [FolderEntry, number, any]];
      if (err) {
        entry.type = "error";
      } else if (files.indexOf('tiddlywiki.json') > -1)
        entry.type = 'datafolder';
      return ([true, entry]);
    }).reduce((n, [dud, entry]) => {
      n.push(entry);
      return n;
    }, []).map(entries => {
      return { entries, folder, tag };
    }) as Observable<{ entries: any, folder: any, tag: any }>;
  }).subscribe(subscriber);
}
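For context, obs_readdir and obs_stat are custom helpers that are not shown in the question. Judging from how they are used above, each one curries a tag value and emits a single [err, result, tag] tuple instead of erroring, so failures can be handled inline in the chain. A rough sketch of what they might look like (my guess at their shape, not the author's actual code):
import * as fs from 'fs';
import { Observable } from 'rxjs/Observable';

// Emit [err, result, tag] and complete, so errors stay inside the tuple.
const obs_readdir = (tag?: any) => (dirPath: string) =>
  new Observable<any[]>(subscriber => {
    fs.readdir(dirPath, (err, files) => {
      subscriber.next([err, files, tag]);
      subscriber.complete();
    });
  });

const obs_stat = (tag?: any) => (filePath: string) =>
  new Observable<any[]>(subscriber => {
    fs.stat(filePath, (err, stats) => {
      subscriber.next([err, stats, tag]);
      subscriber.complete();
    });
  });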
Original: This took a few hours to write, and it works, but it uses concatMap, so it can only take one request at a time. It also uses a custom operator that I wrote for the purpose.
export function statFileBatch(subscriber, input: Observable<any>) {
const signal = new Subject<number>();
var count = 0;
// use setTimeout to fire after the buffer receives this item
const sendSignal = (item) => setTimeout(() => { count = 0; signal.next(item); });
return input.concatMap(([folder, tag]) => {
return obs_readdir({ folder, tag })(folder);
}).lift({
call: (subs: Subscriber<any>, source: Observable<any>) => {
const signalFunction = (count) => signal.mapTo(1), forwardWhenEmpty = true;
const waiting = [];
const _output = new Subject();
var _count = new Subject<number>()
const countFactory = Observable.defer(() => {
return Observable.create(subscriber => {
_count.subscribe(subscriber);
})
});
var isEmpty = true;
const sourceSubs = source.subscribe(item => {
if (isEmpty && forwardWhenEmpty) {
_output.next(item);
} else {
waiting.push(item)
}
isEmpty = false;
})
const pulse = new Subject<any>();
const signalSubs = pulse.switchMap(() => {
return signalFunction(countFactory)
}).subscribe(count => {
//act on the closing observable value
var i = 0;
while (waiting.length > 0 && i++ < count)
_output.next(waiting.shift());
//if nothing was output, then we are empty
//if something was output then we are not
//this is meant to be used with bufferWhen
if (i === 0) isEmpty = true;
_count.next(i);
_count.complete();
_count = new Subject<number>();
pulse.next();
})
pulse.next(); //prime the pump
const outputSubs = Observable.create((subscriber) => {
return _output.subscribe(subscriber);
}).subscribe(subs) as Subscription;
return function () {
outputSubs.unsubscribe();
signalSubs.unsubscribe();
sourceSubs.unsubscribe();
}
}
}).mergeMap(([err, files, { folder, tag }]) => {
if (err) { sendSignal(err); return Observable.empty(); }
return Observable.from(files.map(a => [a, folder, files.length, tag])) as any;
}).mergeMap((res: any) => {
let [file, folder, fileCount, tag] = res as [string, string, number, any];
return obs_stat([file, folder, fileCount, tag])(path.join(folder, file))
}, 20).map(statFolderEntryCB).mergeMap<any, any>((res) => {
let [entry, [name, folder, fileCount, tag]] = res as [any, [string, string, number, any]];
if (entry.type === 'folder')
return obs_readdir([entry, fileCount, tag])(path.join(entry.folder, entry.name));
else return Observable.of([true, entry, fileCount, tag]);
}, 20).map((res) => {
//if (res === false) return (false);
if (res[0] === true) return (res);
let [err, files, [entry, fileCount, tag]] = res as [any, string[], [FolderEntry, number, any]];
if (err) {
entry.type = "error";
} else if (files.indexOf('tiddlywiki.json') > -1)
entry.type = 'datafolder';
return ([true, entry, fileCount, tag]);
}).map(([dud, entry, fileCount, tag]) => {
count++;
if (count === fileCount) {
sendSignal([count, tag]);
}
return entry;
}).bufferWhen(() => signal).withLatestFrom(signal).map(([files, [sigResult, tag]]: any) => {
return [
typeof sigResult !== 'number' ? sigResult : null, //error object
files, //file list
typeof sigResult === 'number' ? sigResult : null, //file count
tag //tag
];
}).subscribe(subscriber);
}
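For comparison, here is a much shorter sketch of the same classification (list a folder, mark HTML files, mark folders that contain tiddlywiki.json), written with plain Node fs and pipeable RxJS operators rather than the patch-operator style above. Treat it as an illustration, not a drop-in replacement for the batching logic:
import * as fs from 'fs';
import * as path from 'path';
import { bindNodeCallback, from, of } from 'rxjs';
import { catchError, map, mergeMap, toArray } from 'rxjs/operators';

// Wrap the Node callbacks explicitly so TypeScript picks a single overload.
const readdir$ = bindNodeCallback(
  (dir: string, cb: (err: NodeJS.ErrnoException | null, files: string[]) => void) => fs.readdir(dir, cb)
);
const stat$ = bindNodeCallback(
  (p: string, cb: (err: NodeJS.ErrnoException | null, stats: fs.Stats) => void) => fs.stat(p, cb)
);

function listFolderEntries(folder: string) {
  return readdir$(folder).pipe(
    mergeMap(names => from(names)),
    mergeMap(name =>
      stat$(path.join(folder, name)).pipe(
        mergeMap(stats => {
          if (!stats.isDirectory()) {
            return of({ name, type: name.endsWith('.html') ? 'htmlfile' : 'other' });
          }
          // A folder becomes a "datafolder" when it contains tiddlywiki.json.
          return readdir$(path.join(folder, name)).pipe(
            map(files => ({ name, type: files.indexOf('tiddlywiki.json') > -1 ? 'datafolder' : 'folder' }))
          );
        }),
        catchError(() => of({ name, type: 'error' }))
      ),
      20 // cap concurrency, mirroring the mergeMap(..., 20) used above
    ),
    toArray()
  );
}

// listFolderEntries('/some/folder').subscribe(entries => console.log(entries));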

Related

Execute a sequence of GET calls to an API, wait and do some processing on the results, then give the result as an argument to another method for a POST

I am new to Angular and I am facing some difficulties with a task. I have an array of IDs over which I want to execute the same GET call, and for every GET call result I have to do some operations and then add the result of each operation to some arrays. I managed to find a way to do that correctly, but my problem is that I can't manage to wait for the final result to be ready (after all the GET calls and operations are done) before giving it as an argument to another method that will send it with a POST call.
Here is the method where I do the GET calls and the operations on every call's result (the problem occurs when I am in the rollBackSPN condition):
async getComponentIds(taskName: String, selectedComponents: IComponent[]) {
const componentsId: number[] = [];
const componentsWithoutParams: IComponent[] = [];
let sendPortaPrecedente : boolean;
if(taskName == "rollBackSPN"){
from(selectedComponents).pipe(
concatMap(component =>{
return this.http.get<any>("Url"+component.idComponent).pipe(
tap(val => {
sendPortaPrecedente = true;
for(const obj of val){
if((obj.name == "z0bpqPrevious" && obj.value == null) || (obj.name == "datePortaPrevious" && obj.value == null) || (obj.name == "typePortaPrevious" && obj.value == null)){
sendPortaPrecedente = false;
}
}
if(sendPortaPrecedente){
componentsId.push(component.idComponent);
}else{
componentsWithoutParams.push(component);
}
}),
catchError(err => {
return of(err);
})
)
})
).subscribe(val => {
return { componentsId : componentsId, componentsWithoutParams : componentsWithoutParams, sendPortaPrecedente : sendPortaPrecedente};
});
}else{
for (const component of selectedComponents) {
componentsId.push(component.idComponent)
return { componentsId : componentsId, componentsWithoutParams : componentsWithoutParams, sendPortaPrecedente : sendPortaPrecedente};
}
}
}
Here is the method where I pass the getComponentIds(taskName: String, selectedComponents: IComponent[]) result so it can be sent with a POST call (again, when I am in the rollBackSPN condition):
executeTask(serviceIdSi: string, actionIdSi: string, actionClassName: string, componentName: string, taskName: string,
componentsId: number[], componentsWithoutParams: IComponent[], sendPortaPrecedente: boolean): Observable<any> {
const url = this.taskUrl + `?serviceId=${serviceIdSi}` + `&actionId=${actionIdSi}` + `&actionClassName=${actionClassName}`
+ `&componentName=${componentName}` + `&taskName=${taskName}`;
if(taskName == "rollBackSPN"){
if(sendPortaPrecedente && componentsWithoutParams.length == 0){
return this.http.post<any>(url, componentsId);
}else{
let errMessage = "Some Error Message"
for(const component of componentsWithoutParams){
errMessage = errMessage + component.idComponent +"\n";
}
throw throwError(errMessage);
}
}else{
return this.http.post<any>(url, componentsId);
}
}
Both of these methods are defined in a service called TaskService, and the service is called like this in a component, UnitTaskButtonsComponent:
async launchUnitTask() {
this.isLoading = true;
this.isClosed = false;
this.appComponent.currentComponentIndex = this.componentIndex;
let res = await this.taskService.getComponentIds(this.unitTaskLabel, this.selectedComponents);
this.taskService.executeTask(this.appComponent.currentService.identifiantSi,
this.appComponent.currentAction.identifiantSi,
this.appComponent.currentAction.className,
this.selectedComponents[0].name,
this.unitTaskLabel,
res.componentsId,
res.componentsWithoutParams,
res.sendPortaPrecedente).subscribe(
data => this.executeTaskSuccess(),
error => this.executeTaskError());
}
"res" properties are always undefined when it's a rollBackSPN task.
The main issue here is that getComponentIds does not return a Promise, so awaiting it does not work. I would suggest changing getComponentIds so that it returns an Observable instead.
getComponentIds(taskName: string, selectedComponents: IComponent[]) {
  // ^^^^^^ use string instead of String
  return forkJoin(
    selectedComponents.map((component) => {
      return this.http.get<any>("Url" + component.idComponent).pipe(
        map((val) => {
          let sendPortaPrecedente = true;
          for (const obj of val) {
            if (
              (obj.name == "z0bpqPrevious" && obj.value == null) ||
              (obj.name == "datePortaPrevious" && obj.value == null) ||
              (obj.name == "typePortaPrevious" && obj.value == null)
            ) {
              sendPortaPrecedente = false;
            }
          }
          return { component, sendPortaPrecedente }
        }),
        catchError((err) => of(err))
      );
    })
  ).pipe(
    map((result) => {
      const componentsId: number[] = [];
      const componentsWithoutParams: IComponent[] = [];
      for (const val of result) {
        if (val.sendPortaPrecedente) {
          componentsId.push(val.component.idComponent);
        } else {
          componentsWithoutParams.push(val.component);
        }
      }
      return { componentsId, componentsWithoutParams };
    })
  );
}
Instead of using concatMap, let's use forkJoin. forkJoin sends all the requests in parallel and emits the results as an array, but we have to pass it an array of Observables; that's why we map over selectedComponents.
In the lower map, we now get the complete result of the HTTP calls in the result parameter. Here we do the processing of the data. I was not really sure how to handle sendPortaPrecedente; you will have to fill that in.
We simply return the whole Observable.
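As a quick reminder of forkJoin's behaviour (the values below are made up): it waits for every input Observable to complete and then emits a single array containing each one's last value, in the same order the Observables were passed in.
import { forkJoin, of } from 'rxjs';
import { delay } from 'rxjs/operators';

forkJoin([
  of('first').pipe(delay(300)),  // slowest
  of('second').pipe(delay(100)), // fastest
]).subscribe(values => console.log(values)); // logs ['first', 'second'] after ~300 ms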
launchUnitTask() {
  this.taskService
    .getComponentIds(this.unitTaskLabel, this.selectedComponents)
    .pipe(
      // return the inner Observable from switchMap so its result reaches subscribe
      switchMap((res) =>
        this.taskService.executeTask(
          this.appComponent.currentService.identifiantSi,
          this.appComponent.currentAction.identifiantSi,
          this.appComponent.currentAction.className,
          this.selectedComponents[0].name,
          this.unitTaskLabel,
          res.componentsId,
          res.componentsWithoutParams,
          res.sendPortaPrecedente
        )
      )
    )
    .subscribe(
      (data) => this.executeTaskSuccess(),
      (error) => this.executeTaskError()
    );
}
In the launchUnitTask method, we don't use await anymore. Instead, we call getComponentIds and chain the call of executeTask with a switchMap.
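As an aside, and not part of the answer above: if keeping await in the component is preferred, the Observable returned by the new getComponentIds can be converted back to a Promise, assuming RxJS 7+ is available.
import { lastValueFrom } from 'rxjs';

async launchUnitTask() {
  const res = await lastValueFrom(
    this.taskService.getComponentIds(this.unitTaskLabel, this.selectedComponents)
  );
  // ...then call executeTask with res.componentsId and res.componentsWithoutParams as before
}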

How to update CSV in each it block in Cypress

I want to update CSV file data (e.g. column data like orderId, date, etc.) in each it block of a spec file. I have written code to update the CSV inside the cypress.config.js file and I am calling cy.task from the spec file, but every time the CSV updated in the first it block is passed to all the other it blocks. Please let me know how I can achieve this.
Under uploadOrders.cy.js
/// <reference types='Cypress'/>
import { ORDERS } from '../../selector/orders';
import BU from '../../fixtures/BU.json';
import Helper from '../../e2e/utils/Helper';
const helper = new Helper();
let getTodaysDate = helper.getTodaysDate(); // get today's date and store it in the getTodaysDate variable
let getTomorrowssDate = helper.getTomorrowsDate(); // get tomorrow's date and store it in the getTomorrowssDate variable
let getYesterdaysDate = helper.getYesterdaysDate(); // get yesterday's date and store it in the getYesterdaysDate variable
describe('Upload Orders', () => {
// Before start executing all it blocks
before(() => {
cy.login(); //login code written in cypress commands.js file
cy.get(ORDERS.ORDER_ICON).click();
});
// Before start executing each it block
beforeEach(() => {
cy.writeFile('cypress/fixtures/finalCsvToBeUploaded.csv', ''); // clears the file before each it block executes
});
// First it block
it('Upload orders and check its successful', () => {
let csvData = {
orderId: 'OrderId_' + helper.getCurrentDateAndTimeInMiliseconds(),
orderDate: getTomorrowssDate,
homebaseExecutionDate: getTomorrowssDate,
customerExecutionDate: getTomorrowssDate,
}
cy.readFile('cypress/fixtures/referenceCsvFile.csv')
.then((data) => {
cy.task('csvToJson', data)
.then(finalJsonArray => {
cy.task('updateCsvData', { csvData, finalJsonArray })
.then(finalUpdatedJsonArray => {
cy.log("Update JSON array: " + finalUpdatedJsonArray[0]['Order ID']);
cy.task('finalCsv', finalUpdatedJsonArray);
});
})
})
cy.get(ORDERS.ORDER_UPLOAD).click();
cy.attachCsvFile('finalCsvToBeUploaded.csv');
cy.validateToastMsg('Orders uploaded successfully');
cy.log('Order is uploaded successfully via csv file and orderId is ');
});
// Second it block
it('Upload orders and check validation for past customer execution date', () => {
cy.wait(5000);
let csvData = {
orderId: 'OrderId_' + helper.getCurrentDateAndTimeInMiliseconds(),
orderDate: getTomorrowssDate,
homebaseExecutionDate: getYesterdaysDate,
customerExecutionDate: getYesterdaysDate,
}
cy.readFile('cypress/fixtures/finalCsvToBeUploaded.csv')
.then((data) => {
cy.task('csvToJson', data)
.then(finalJsonArray => {
cy.task('updateCsvData', { csvData, finalJsonArray })
.then(finalUpdatedJsonArray => {
cy.log("Update JSON array: " + finalUpdatedJsonArray);
cy.task('finalCsv', finalUpdatedJsonArray);
});
})
})
cy.get(ORDERS.ORDER_UPLOAD).click();
cy.attachCsvFile('finalCsvToBeUploaded.csv');
cy.validateToastMsg('Orders uploaded successfully');
cy.log('Order is uploaded successfully via csv file and orderId is');
});
// After exection of all it blocks
after(() => {
// clear cookies and localStorage
cy.clearCookies();
cy.clearLocalStorage();
});
});
Under cypress.config.js file
const { defineConfig } = require("cypress");
const converter = require('json-2-csv');
const csv = require('csv-parser');
const fs = require('fs');
const { default: Helper } = require("./cypress/e2e/utils/Helper");
const csvToJson1 = require('convert-csv-to-json');
const helper = require("csvtojson");
module.exports = defineConfig({
chromeWebSecurity: false,
watchFileForChanges: false,
defaultCommandTimeout: 10000,
pageLoadTimeout: 50000,
viewportWidth: 1280,
viewportHeight: 800,
video: false,
screenshotOnRunFailure: true,
"reporter": "mochawesome",
"reporterOptions": {
"charts": true,
"overwrite": false,
"html": false,
"json": true,
"timestamp": 'dd_mm_yy_HH_MM_ss',
"reportDir": "cypress/reports/mochawesome-report"
},
e2e: {
//To invoke test runner to pick files from the below path
specPattern: 'cypress/e2e/**/*.cy.js',
setupNodeEvents(on, config) {
// implement node event listeners here
// return require('./cypress/plugins/index.js')(on, config)
// `on` is used to hook into various events Cypress emits
// `config` is the resolved Cypress config
require('#cypress/code-coverage/task')(on, config)
//Start full screen
on('before:browser:launch', (browser = {}, launchOptions) => {
console.log(launchOptions.args);
if (browser.family === 'chromium' && browser.name !== 'electron') {
launchOptions.args.push('--start-fullscreen');
}
if (browser.name === 'electron') {
launchOptions.preferences.fullscreen = true;
}
return launchOptions;
});
//Convert CSV to JSON
on('task', {
csvToJson(data) {
var lines = data.split("\n");
var result = [];
var headers = lines[0].split(",");
for (var i = 1; i < (lines.length); i++) {
var obj = {};
var currentline = lines[i].split(/,(?=(?:(?:[^"]*"){2})*[^"]*$)/);
for (var j = 0; j < headers.length; j++) {
obj[headers[j]] = currentline[j].replace(/["']/g, "");
}
result.push(obj);
}
console.log(result);
return result;
}
})
// Write updated csv data into file
on('task', {
finalCsv(updatedJSON) {
converter.json2csvAsync(updatedJSON).then(updatedCsv => {
fs.writeFileSync('cypress/fixtures/finalCsvToBeUploaded.csv', updatedCsv);
}).catch(err => console.log(err));
return null;
}
});
//For log purpose; prints message in the console
on('task', {
log(message) {
console.log(message);
return null;
},
});
on('task', {
updateCsvData({ csvData, finalJsonArray }) {
let updatedJSON,orderIds = [];
for (let i = 0; i < (finalJsonArray.length); i++) {
if ('orderId' in csvData) {
orderIds[i] = csvData.orderId;
finalJsonArray[i]['Order ID'] = csvData.orderId;
}
else {
// orderIds[i] = 'OrderId_' + helper.getCurrentDateAndTimeInMiliseconds();
orderIds[i] = 'OrderId_' + Date.now();
finalJsonArray[i]['Order ID'] = orderIds[i];
}
if ('orderDate' in csvData) {
finalJsonArray[i]['Order Date'] = csvData.orderDate;
}
if ('homebaseExecutionDate' in csvData) {
finalJsonArray[i]['Homebase Execution Date'] = csvData.homebaseExecutionDate;
}
if ('customerExecutionDate' in csvData) {
finalJsonArray[i]['Customer Execution Date'] = csvData.customerExecutionDate;
}
}
updatedJSON = finalJsonArray;
return updatedJSON;
}
})
on('task', {
updateCsvFile(csvData) {
const finalJsonArray = [];
let updatedJSON, orderIds = [];
//readFile
fs.createReadStream('cypress/fixtures/referenceCsvFile.csv')
.pipe(csv())
.on('data', (data) => finalJsonArray.push(data))
.on('end', () => {
console.log(finalJsonArray); // CSV converted to json object
//Logic to update json objects; for loop to update csv columns in json array
for (let i = 0; i < (finalJsonArray.length); i++) {
if ('orderId' in csvData) {
orderIds[i] = csvData.orderId;
finalJsonArray[i]['Order ID'] = csvData.orderId;
}
else {
orderIds[i] = 'OrderId_' + this.getCurrentDateAndTimeInMiliseconds();
finalJsonArray[i]['Order ID'] = orderIds[i];
}
if ('orderDate' in csvData) {
finalJsonArray[i]['Order Date'] = csvData.orderDate;
}
if ('homebaseExecutionDate' in csvData) {
finalJsonArray[i]['Homebase Execution Date'] = csvData.homebaseExecutionDate;
}
if ('customerExecutionDate' in csvData) {
finalJsonArray[i]['Customer Execution Date'] = csvData.customerExecutionDate;
}
}
updatedJSON = finalJsonArray;
converter.json2csvAsync(updatedJSON).then(csvFile => {
fs.writeFileSync('cypress/fixtures/' + fileName, csvFile)
}).catch(err => console.log(err));
})
return orderIds;
}
})
return config;
}
}
});
In the first it block the CSV file is updated correctly, but when control reaches the second it block it still works with the same CSV file that was updated in the first it block. I need to update the CSV file separately in the second it block.
Finally got the answer to my question:
csvData is an object where the column data are stored:
let csvData = {
  orderDate: getTodaysDate,
  homebaseExecutionDate: getTomorrowsDate,
  customerExecutionDate: getTomorrowsDate
};
Common method to update the CSV file, added under commands.js:
Cypress.Commands.add('updateCsvFileData', (csvData, referenceFileName, updatedCsvFileName) => {
  cy.readFile('cypress/fixtures/' + referenceFileName)
    .then((data) => {
      cy.wait(100).then(() => {
        cy.task('csvToJson', data)
          .then((finalJsonArray) => {
            cy.wait(100).then(() => {
              cy.task('updateCsvData', { csvData, finalJsonArray })
                .then((finalUpdatedJsonArray) => {
                  cy.wait(100).then(() => {
                    cy.task('finalCsv', { finalUpdatedJsonArray, updatedCsvFileName })
                  });
                })
            })
          })
      })
    })
})
The Node.js tasks, defined in cypress.config.js:
Convert CSV to JSON object
on('task', {
csvToJson(data) {
var lines = data.split("\n");
var result = [];
var headers = lines[0].split(",");
for (var i = 1; i < (lines.length); i++) {
var obj = {};
var currentline = lines[i].split(/,(?=(?:(?:[^"]*"){2})*[^"]*$)/);
for (var j = 0; j < headers.length; j++) {
obj[headers[j]] = currentline[j].replace(/["']/g, "");
}
result.push(obj);
}
return result;
}
})
Write updated csv data into CSV file
on('task', {
  finalCsv({ finalUpdatedJsonArray, updatedCsvFileName }) {
    converter.json2csvAsync(finalUpdatedJsonArray).then(updatedCsv => {
      // use writeFileSync here: fs.writeFile requires a callback argument
      fs.writeFileSync('cypress/fixtures/' + updatedCsvFileName, updatedCsv);
    }).catch(err => console.log(err));
    return null;
  }
});
Update required fields in CSV file
on('task', {
updateCsvData({ csvData, finalJsonArray }) {
let updatedJSON, orderIds = [];
for (let i = 0; i < (finalJsonArray.length); i++) {
if (csvData.hasOwnProperty('orderId')) {
orderIds[i] = csvData.orderId;
finalJsonArray[i]['Order ID'] = csvData.orderId;
}
else {
let orderIdRandomNum = Date.now() + "_" + Math.floor((Math.random() * 9999) + 1);
orderIds[i] = 'OrderId_' + orderIdRandomNum;
finalJsonArray[i]['Order ID'] = orderIds[i];
}
if ('orderDate' in csvData) {
finalJsonArray[i]['Order Date'] = csvData.orderDate;
}
if ('homebaseExecutionDate' in csvData) {
finalJsonArray[i]['Homebase Execution Date'] = csvData.homebaseExecutionDate;
}
if ('customerExecutionDate' in csvData) {
finalJsonArray[i]['Customer Execution Date'] = csvData.customerExecutionDate;
}
if ('teamId' in csvData) {
finalJsonArray[i]['Team ID'] = csvData.teamId;
}
}
updatedJSON = finalJsonArray;
return updatedJSON;
}
})
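With those pieces in place, each it block can build its own CSV through the custom command before attaching it. A usage sketch based on the spec file above (the file names and helpers are the ones from the earlier example):
it('Upload orders and check its successful', () => {
  const csvData = {
    orderId: 'OrderId_' + helper.getCurrentDateAndTimeInMiliseconds(),
    orderDate: getTomorrowssDate,
    homebaseExecutionDate: getTomorrowssDate,
    customerExecutionDate: getTomorrowssDate,
  };
  // always read from the untouched reference file and write a fresh upload file
  cy.updateCsvFileData(csvData, 'referenceCsvFile.csv', 'finalCsvToBeUploaded.csv');
  cy.get(ORDERS.ORDER_UPLOAD).click();
  cy.attachCsvFile('finalCsvToBeUploaded.csv');
  cy.validateToastMsg('Orders uploaded successfully');
});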

GraphQL relay connectionFromArraySlice

There isn't any documentation for how the array meta info (arrayLength and sliceStart) should be provided when using Facebook's graphql-relay-js helper library.
https://github.com/graphql/graphql-relay-js/issues/199
I managed to get it to work using the following implementation; however, I am guessing there is an easier/more correct way to do this.
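For reference, the meta argument simply tells the helper where the fetched slice sits inside the conceptual full array: sliceStart is the offset of the first row in the slice, and arrayLength is the total row count. A minimal illustration with made-up numbers:
import { connectionFromArraySlice, offsetToCursor } from 'graphql-relay';

// Imagine 100 rows in total; we fetched a page of 10 rows starting at offset 20.
const page = Array.from({ length: 10 }, (_, i) => ({ id: 20 + i }));

const connection = connectionFromArraySlice(
  page,
  { first: 10, after: offsetToCursor(19) }, // the connection args from the query
  {
    sliceStart: 20,   // offset of page[0] within the full result set
    arrayLength: 100, // total number of rows (drives hasNextPage)
  }
);
// connection.edges[0].cursor now encodes offset 20, and pageInfo.hasNextPage is true.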
Retrieve rows and row count from database
function transformRole(role: Role) {
return { ...role, roleId: role.id };
}
async function getRolesSlice({ roleId, after, first, last, before }: any): Promise<[Role[], number]> {
const queryBuilder = repository.createQueryBuilder();
if (roleId !== undefined) {
queryBuilder.where('id = :roleId', { roleId });
}
if (before) {
const beforeId = cursorToOffset(before);
queryBuilder.where('id < :id', { id: beforeId });
}
if (after) {
const afterId = cursorToOffset(after);
queryBuilder.where({
id: MoreThan(Number(afterId))
});
}
if (first === undefined && last === undefined) {
queryBuilder.orderBy('id', 'ASC');
}
if (first) {
queryBuilder.orderBy('id', 'ASC').limit(first);
}
if (last) {
queryBuilder.orderBy('id', 'DESC').limit(last);
}
return Promise.all([
queryBuilder.getMany()
.then(roles => roles.map(transformRole)),
repository.count() // Total number of roles
]);
}
Roles resolver
resolve: (_, args) =>
getRolesSlice(args)
.then(([results, count]) => {
const firstId = results[0] && results[0].roleId;
let sliceStart = 0;
if (args.first) {
sliceStart = firstId;
}
if (args.last) {
sliceStart = Math.max(firstId - args.last, 0);
}
if (args.after && args.last) {
sliceStart += 1;
}
return connectionFromArraySlice(
results,
args,
{
arrayLength: count + 1,
sliceStart
}
);
})
},
Edit:
This is what I came up with, which is a little cleaner and seems to be working correctly.
const initialize = () => {
repository = getConnection().getRepository(Role);
}
function transformRole(role: Role) {
return { ...role, roleId: role.id };
}
function getRolesSlice(args: any):
Promise<[
Role[],
any,
{ arrayLength: number; sliceStart: number; }
]> {
if (!repository) initialize();
const { roleId, after, first, last, before } = args;
const queryBuilder = repository.createQueryBuilder();
if (roleId !== undefined) {
queryBuilder.where('id = :roleId', { roleId });
}
if (before !== undefined) {
const beforeId = cursorToOffset(before);
queryBuilder.where({
id: LessThan(beforeId)
});
}
if (after !== undefined) {
const afterId = cursorToOffset(after);
queryBuilder.where({
id: MoreThan(Number(afterId))
});
}
if (first !== undefined) {
queryBuilder.orderBy('id', 'ASC').limit(first);
} else if (last !== undefined) {
queryBuilder.orderBy('id', 'DESC').limit(last);
} else {
queryBuilder.orderBy('id', 'ASC');
}
return Promise.all([
queryBuilder.getMany()
.then(roles => roles.map(transformRole))
.then(roles => last !== undefined ? roles.slice().reverse() : roles),
repository.count()
]).then(([roles, totalCount]) =>
[
roles,
args,
{
arrayLength: totalCount + 1,
sliceStart: roles[0] && roles[0].roleId
}
]
);
}
// Resolver
roles: {
type: rolesConnection,
args: {
...connectionArgs,
roleId: {
type: GraphQLString
}
},
resolve: (_, args) =>
getRolesSlice(args)
.then((slice) => connectionFromArraySlice(...slice))
},

How to check the maximum number of selectors in stylelint?

I need to check that there is only one root class in a file.
Is that possible?
// Error
.a { }
.b { }
Expected
// Success
.a {}
It is not possible to do this with the rules built into stylelint.
However, it is possible to create a stylelint plugin to do this.
The plugin would look something like:
// ./plugins/stylelint-root-max-rules/index.js
const isNumber = require("lodash/isNumber");
const {
createPlugin,
utils: { report, ruleMessages, validateOptions }
} = require("stylelint");
const ruleName = "plugin/root-max-rules";
const messages = ruleMessages(ruleName, {
expected: max =>
`Expected no more than ${max} ${max === 1 ? "rule" : "rules"}`
});
const rule = quantity => {
return (root, result) => {
const validOptions = validateOptions(result, ruleName, {
actual: quantity,
possible: isNumber
});
if (!validOptions) return;
const { length } = root.nodes.filter(node => node.type === "rule");
if (length <= quantity) return;
report({
message: messages.expected(quantity),
node: root,
result,
ruleName
});
};
};
module.exports = createPlugin(ruleName, rule);
module.exports.ruleName = ruleName;
module.exports.messages = messages;
You would then use the plugin like so:
{
  "plugins": ["./plugins/stylelint-root-max-rules"],
  "rules": {
    "plugin/root-max-rules": 1
  }
}
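To try the plugin out, one option is stylelint's Node API (the file paths here are illustrative):
// check.js — assumes the config above is saved as .stylelintrc.json
const stylelint = require("stylelint");

stylelint
  .lint({ files: "src/**/*.css", configFile: ".stylelintrc.json" })
  .then(({ results }) => {
    results.forEach((result) =>
      result.warnings.forEach((warning) =>
        // warnings raised by the plugin carry the "plugin/root-max-rules" message
        console.log(`${result.source}: ${warning.text}`)
      )
    );
  });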

RxJS Filter Array of Arrays

I'm trying to perform a filter on an array of arrays in RxJS. Consider the following:
function guard1(): boolean | Observable<boolean> {}
function guard2(): boolean | Observable<boolean> {}
function guard3(): boolean | Observable<boolean> {}

const routes = [
  { name: 'Foo', canActivate: [guard1, guard2] },
  { name: 'Bar', canActivate: [guard3] },
  { name: 'Moo' }
];
I want to filter the routes array down to only the routes whose combined canActivate results return true; if a route doesn't have canActivate, it should NOT be filtered out.
Let's say guard1 returned true and guard2 returned false; I'd expect route Foo not to be in the filtered list.
I took a stab at this but it's not quite doing what I expect:
this.filteredRoutes = forkJoin(of(routes).pipe(
  flatMap((route) => route),
  filter((route) => route.canActivate !== undefined),
  mergeMap((route) =>
    of(route).pipe(
      mergeMap((r) => r.canActivate),
      mergeMap((r) => r()),
      map((result) => {
        console.log('here', result, route);
        return route;
      })
    )
  )));
If I were writing this outside of RxJS, the code might look something like this:
this.filteredRoutes = [];
for (const route of this.routes) {
  if (route.canActivate) {
    let can = true;
    for (const act of route.canActivate) {
      let res = act(); // call the guard function
      if (res.subscribe) {
        res = await res.toPromise();
      }
      can = res;
      if (!can) {
        break;
      }
    }
    if (can) {
      this.filteredRoutes.push(route);
    }
  } else {
    this.filteredRoutes.push(route);
  }
}
Thanks!
I'm sure there are other (and likely better) ways to handle this, but it works...
from(routes).pipe(
concatMap((route) => {
// handle if nothing is in canActivate
if (!route.canActivate || route.canActivate.length === 0) {
// create an object that has the route and result for filtering
return of({route, result: true})
};
const results = from(route.canActivate).pipe(
// execute the guard
switchMap(guard => {
const result: boolean | Observable<boolean> = guard();
if (result instanceof Observable) {
return result;
} else {
return of(result);
}
}),
// aggregate the guard results for the route
toArray(),
// ensure all results are true
map(results => results.every(r => r)),
// create an object that has the route and result for filtering
map(result => ({route, result})),
);
return results;
}),
// filter out the invalid guards
filter(routeCanActivateResult => routeCanActivateResult.result),
// return just the route
map(routeCanActivateResult => routeCanActivateResult.route),
// turn it back into an array
toArray()
)
// verify it works
.subscribe(routes => routes.forEach(r => console.log(r.name)));
Also, here is a working example in stackblitz.
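A variation on the same idea, offered only as a sketch of mine rather than part of the answer above: combine each route's guard results with forkJoin, and keep guard-less routes via defaultIfEmpty, since forkJoin of an empty array completes without emitting. This assumes each guard Observable completes (for example, it emits once and finishes).
import { Observable, forkJoin, from, isObservable, of } from 'rxjs';
import { concatMap, defaultIfEmpty, filter, map, toArray } from 'rxjs/operators';

type Guard = () => boolean | Observable<boolean>;
interface Route { name: string; canActivate?: Guard[]; }

// Emits true when every guard passes; emits true immediately when there are no guards.
const canActivate$ = (guards: Guard[] = []) =>
  forkJoin(guards.map(guard => {
    const result = guard();
    return isObservable(result) ? result : of(result);
  })).pipe(
    map(results => results.every(Boolean)),
    defaultIfEmpty(true)
  );

// `routes` is the array from the question, treated here as Route[].
from(routes as Route[]).pipe(
  concatMap(route => canActivate$(route.canActivate).pipe(map(ok => ({ route, ok })))),
  filter(({ ok }) => ok),
  map(({ route }) => route),
  toArray()
).subscribe(filtered => console.log(filtered.map(r => r.name)));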
