Searching a CSV File Using Grep - shell

Let's say I have a CSV file like this:
a,b1,12,
a,b1,42,
d,e1,12,
r,12,33,
I want to use grep to return only the rows where the third column = 12. So it would return:
a,b1,12,
d,e1,12,
but not:
r,12,33,
Any ideas for a regular expression that will allow me to do this?

I'd jump straight to awk to test the value exactly:
awk -F, '$3 == 12' file.csv
This, and any regexp-based solution, assumes that the values of the first two fields do not contain commas.
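If the column number or target value changes often, awk variables keep the command generic without editing the program text. A sketch, with the sample data from the question written to a made-up file name:

```shell
# Create the sample data, then match rows whose Nth field equals a value;
# the field number and the value are passed in as awk variables.
printf '%s\n' 'a,b1,12,' 'a,b1,42,' 'd,e1,12,' 'r,12,33,' > sample.csv
awk -F, -v col=3 -v val=12 '$col == val' sample.csv
# prints:
# a,b1,12,
# d,e1,12,
```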

grep "^[^,]\+,[^,]\+,12," file.csv

Here's a variation:
egrep "^([^,]+,){2}12," file.csv
The advantage is that you can select the field simply by changing the number enclosed in curly braces without having to add or subtract literal copies of the pattern manually.
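For instance, to match on the second field instead of the third, reduce the count to {1}, since only one field precedes it. A quick sketch against sample data (file name made up):

```shell
printf '%s\n' 'a,b1,12,' 'r,12,33,' > sample.csv
# One preceding field, so {1}: matches rows whose 2nd column is 12
grep -E '^([^,]+,){1}12,' sample.csv
# prints:
# r,12,33,
```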

csvkit is a great toolkit for stuff like this, especially on the larger scale. After installing csvkit, the following isolates the rows you want:
# Find rows that have the value 12 in the 3rd column
$ csvgrep -c 3 -m 12 file.csv | csvlook
This should prettily print out the rows you want. The full documentation for csvkit (and a well-written tutorial) can be found here.

When you have CSV files with a distinct delimiter such as a comma, use the field/delimiter-splitting approach, not regular expressions. Tools that break strings up into fields, like awk, Perl, or Python, do the job easily (Perl and Python also have CSV modules for more complex CSV parsing).
Perl:
$ perl -F/,/ -alne 'print if $F[2]==12;' file
a,b1,12,
d,e1,12,
$ awk -F"," '$3==12' file
a,b1,12,
d,e1,12,
or with just the shell
while IFS="," read a b c d
do
  case "$c" in
    12) echo "$a,$b,$c,$d" ;;
  esac
done < "file"
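The CSV-module route mentioned above pays off once fields are quoted: a quoted field may itself contain a comma, which breaks all of the splitting and regex approaches. A sketch using Python's standard csv module (file name made up):

```shell
# Python's csv module correctly parses quoted fields containing commas.
printf '%s\n' 'a,"b,1",12,' 'd,e1,33,' > quoted.csv
python3 -c '
import csv, sys
for row in csv.reader(open(sys.argv[1])):
    if len(row) > 2 and row[2] == "12":
        print(row)
' quoted.csv
# prints: ['a', 'b,1', '12', '']
```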

I don't know about efficiency (would love to know), but this works:
cat path/to/file.csv | grep <some-text>
(The cat is unnecessary; grep <some-text> path/to/file.csv reads the file directly.)
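Be aware that a bare grep like this matches the text anywhere on the line, not just in the third column, so it can over-match. Compare, against made-up sample data:

```shell
printf '%s\n' 'a,b1,12,' 'r,12,33,' > sample.csv
grep 12 sample.csv                     # matches both lines (12 appears in col 2 or 3)
grep '^[^,]*,[^,]*,12,' sample.csv     # anchored: matches only a,b1,12,
```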

Line-oriented Linux tools cannot practically process CSV, because quoted fields can contain newline characters according to RFC 4180. Most dedicated utilities are garbage for various reasons.
Here's a Node.js (v7.10+) single-file executable that "just works" and produces converted JSON objects, one per line. It should run on Linux, macOS and Windows.
Usage for a file with header line:
cat infinite.csv | csv1480json --header
{"some header": "field value"}
Without header line:
echo abc | csv1480json
{1: "abc"}
The grep becomes:
grep '3: "12"'
Paste this as csv1480json accessible via your PATH and give executable permissions:
#!/usr/bin/env node
/******/ (function(modules) { // webpackBootstrap
/******/ // The module cache
/******/ var installedModules = {};
/******/
/******/ // The require function
/******/ function __webpack_require__(moduleId) {
/******/
/******/ // Check if module is in cache
/******/ if(installedModules[moduleId]) {
/******/ return installedModules[moduleId].exports;
/******/ }
/******/ // Create a new module (and put it into the cache)
/******/ var module = installedModules[moduleId] = {
/******/ i: moduleId,
/******/ l: false,
/******/ exports: {}
/******/ };
/******/
/******/ // Execute the module function
/******/ modules[moduleId].call(module.exports, module, module.exports, __webpack_require__);
/******/
/******/ // Flag the module as loaded
/******/ module.l = true;
/******/
/******/ // Return the exports of the module
/******/ return module.exports;
/******/ }
/******/
/******/
/******/ // expose the modules object (__webpack_modules__)
/******/ __webpack_require__.m = modules;
/******/
/******/ // expose the module cache
/******/ __webpack_require__.c = installedModules;
/******/
/******/ // define getter function for harmony exports
/******/ __webpack_require__.d = function(exports, name, getter) {
/******/ if(!__webpack_require__.o(exports, name)) {
/******/ Object.defineProperty(exports, name, {
/******/ configurable: false,
/******/ enumerable: true,
/******/ get: getter
/******/ });
/******/ }
/******/ };
/******/
/******/ // getDefaultExport function for compatibility with non-harmony modules
/******/ __webpack_require__.n = function(module) {
/******/ var getter = module && module.__esModule ?
/******/ function getDefault() { return module['default']; } :
/******/ function getModuleExports() { return module; };
/******/ __webpack_require__.d(getter, 'a', getter);
/******/ return getter;
/******/ };
/******/
/******/ // Object.prototype.hasOwnProperty.call
/******/ __webpack_require__.o = function(object, property) { return Object.prototype.hasOwnProperty.call(object, property); };
/******/
/******/ // __webpack_public_path__
/******/ __webpack_require__.p = "";
/******/
/******/ // Load entry module and return exports
/******/ return __webpack_require__(__webpack_require__.s = 0);
/******/ })
/************************************************************************/
/******/ ([
/* 0 */
/***/ (function(module, exports, __webpack_require__) {
"use strict";
var _extends = Object.assign || function (target) { for (var i = 1; i < arguments.length; i++) { var source = arguments[i]; for (var key in source) { if (Object.prototype.hasOwnProperty.call(source, key)) { target[key] = source[key]; } } } return target; };
var _CsvJsonConverter = __webpack_require__(1);
var _CsvJsonConverter2 = _interopRequireDefault(_CsvJsonConverter);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
launch({ fn: _CsvJsonConverter2.default, getOptions, errorHandler }).catch(errorHandler);
function getOptions() {
const { argv } = process;
console.log('getOptions argv', argv);
const useHeader = argv[2] === '--header';
if (argv.length !== (useHeader ? 3 : 2)) throw new Error('usage: csv1480json [--header]');
return { readStream: process.stdin, writeStream: process.stdout, useHeader };
}
async function launch({ fn, getOptions, errorHandler }) {
process.on('uncaughtException', errorHandler).on('unhandledRejection', errorHandler);
new fn(_extends({}, getOptions(), { errorHandler }));
}
function errorHandler(e) {
console.error(e instanceof Error ? e /*TODO .message*/ : `errorHandler value: ${typeof e} ${e}`);
process.exit(1);
}
/***/ }),
/* 1 */
/***/ (function(module, exports, __webpack_require__) {
"use strict";
Object.defineProperty(exports, "__esModule", {
value: true
});
var _Pipeline = __webpack_require__(2);
var _Pipeline2 = _interopRequireDefault(_Pipeline);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
// getField result
const FIELD_EOF = 1; // end of file
const FIELD_NONE = 2; // data for complete field not seen yet
const FIELD_RECORD = 3; // got a complete record
const separators = Array.from(',\r\n');
class CsvJsonConverter extends _Pipeline2.default {
constructor(o) {
super(o || false);
this.addData = string => this.csv += string;
this.isField = s => typeof s === 'string';
const { useHeader } = o || false;
this.useHeader = !!useHeader;
this.csv = '';
this.recordNo = 1;
console.log(this.useHeader, Object.keys(o || false));
}
getOutput(isEnd) {
if (isEnd) this.isEnd = true;
if (this.useHeader && !this.headers) if (!this.getHeader()) return;
let output = '';
for (let record; record = this.getRecord(); output += record + '\n');
return output || undefined;
}
getRecord() {
const fields = this.getFieldList();
if (fields) {
// got a record
const count = fields.length;
const { fieldCount, recordNo, useHeader, headers } = this;
if (!fieldCount) this.fieldCount = count;else if (count !== fieldCount) throw new Error(`Record ${recordNo} bad field count: ${count} expected ${fieldCount}`);
this.recordNo++;
return `{${fields.map((v, index) => `${useHeader ? headers[index] : index + 1}: ${JSON.stringify(v)}`).join(', ')}}`;
} else return false;
}
getHeader() {
const list = this.getFieldList();
if (list) {
this.headers = list.map(v => JSON.stringify(v));
this.fieldCount = list.length;
}
}
getFieldList() {
// array of string or false
let fields = this.fields || (this.fields = []);
let field;
while (this.isField(field = this.getField())) fields.push(field);
console.log('getFieldList end:', field, fields);
if (field === FIELD_RECORD) {
this.fields = null;
return fields;
} else return false; // need to wait for more data or end of records
}
getField() {
// string or FIELD_*
const { isEnd, recordNo } = this;
const fields = this.fields.length;
let { csv } = this;
let csvCh = csv[0];
if (csvCh === '\r' || csvCh === '\n') {
// skip the end of line terminating a previous record
if (csv.length < 2 && !isEnd) return FIELD_NONE; // must have two characters to find \r\n
const chs = csv.substring(0, 2) === '\r\n' ? 2 : 1;
this.csv = csv = csv.substring(chs);
return FIELD_RECORD; // we have a complete record
}
if (!csv && isEnd) return fields ? FIELD_RECORD : FIELD_EOF;
const m = `Record ${recordNo} field ${fields + 1}`;
if (fields) if (csvCh === ',') csvCh = (this.csv = csv = csv.substring(1))[0];else throw new Error(`${m} missing field-separating comma`); // TODO insert location
if (csvCh === '"') {
// double-quoted field
let quoteSearchIndex = this.quoteSearchIndex || 1; // where to start looking
let index;
for (;;) {
let index = csv.indexOf('"', quoteSearchIndex);
if (!~index) // no end-quote yet
if (!isEnd) {
this.quoteSearchIndex = quoteSearchIndex;
return FIELD_NONE; // no matching quote in data thus far
} else throw new Error(`${m} unmatched double quote`);
if (index - quoteSearchIndex < 2 || csv[index - 1] !== '\\') {
// found unescaped ending double quote
this.quoteSearchIndex = 0;
this.csv = csv.substring(index + 1);
return csv.substring(1, index);
}
quoteSearchIndex = index + 1; // skip escaped double quote
}
}
// it is an unquoted field
const index = separators.map(ch => csv.indexOf(ch)).reduce((r, index) => !~index ? r : !~r ? index : Math.min(r, index));
if (!~index) // none of the separators appeared
if (isEnd) {
this.csv = '';
return csv; // field is rest of line
} else return FIELD_NONE; // need more data
this.csv = csv.substring(index);
return csv.substring(0, index);
}
}
exports.default = CsvJsonConverter;
/***/ }),
/* 2 */
/***/ (function(module, exports, __webpack_require__) {
"use strict";
Object.defineProperty(exports, "__esModule", {
value: true
});
var _stream = __webpack_require__(3);
class PipeLine extends _stream.Transform {
constructor({ readStream, writeStream, errorHandler }) {
super({ decodeStrings: false, encoding: 'utf8' });
this._flush = callback => callback(null, this.getOutput(true));
const eh = typeof errorHandler;
if (eh !== 'function') throw new Error(`PipeLine: errorHandler not function: ${eh}`);
readStream.on('error', errorHandler).setEncoding('utf8').pipe(this.on('error', errorHandler)).pipe(writeStream.on('error', errorHandler));
}
_transform(chunk, encoding, callback) {
// callback(err, chunk)
if (chunk.length) this.addData(chunk);
callback(null, this.getOutput());
}
}
exports.default = PipeLine; /*
© 2017-present Harald Rudell <harald.rudell#gmail.com> (http://www.haraldrudell.com)
This source code is licensed under the ISC-style license found in the LICENSE file in the root directory of this source tree.
*/
/***/ }),
/* 3 */
/***/ (function(module, exports) {
module.exports = require("stream");
/***/ })
/******/ ]);
//# sourceMappingURL=csv1480json.js.map

Related

Execute a sequence of GET calls to an API, wait and do some processing on the results, then give the result as an argument to another method for a POST

I am new to Angular and I am facing some difficulties with a task. I have an array of IDs over which I want to execute the same GET call, and for every GET call result I have to do some operations and then add the result of each operation to some arrays. I managed to find a way to do it correctly. But my problem is that I can't wait for the final result to be ready (after all the GET calls and the operations are done) before giving it as an argument to another method that sends it with a POST call.
Here is the method where I do the GET calls and the operations on every call's result (the problem occurs in the rollBackSPN condition):
async getComponentIds(taskName: String, selectedComponents: IComponent[]) {
const componentsId: number[] = [];
const componentsWithoutParams: IComponent[] = [];
let sendPortaPrecedente : boolean;
if(taskName == "rollBackSPN"){
from(selectedComponents).pipe(
concatMap(component =>{
return this.http.get<any>("Url"+component.idComponent).pipe(
tap(val => {
sendPortaPrecedente = true;
for(const obj of val){
if((obj.name == "z0bpqPrevious" && obj.value == null) || (obj.name == "datePortaPrevious" && obj.value == null) || (obj.name == "typePortaPrevious" && obj.value == null)){
sendPortaPrecedente = false;
}
}
if(sendPortaPrecedente){
componentsId.push(component.idComponent);
}else{
componentsWithoutParams.push(component);
}
}),
catchError(err => {
return of(err);
})
)
})
).subscribe(val => {
return { componentsId : componentsId, componentsWithoutParams : componentsWithoutParams, sendPortaPrecedente : sendPortaPrecedente};
});
}else{
for (const component of selectedComponents) {
componentsId.push(component.idComponent)
return { componentsId : componentsId, componentsWithoutParams : componentsWithoutParams, sendPortaPrecedente : sendPortaPrecedente};
}
}
}
Here is the method to which I pass the getComponentIds(taskName: String, selectedComponents: IComponent[]) result so it can be sent with a POST call (again, in the rollBackSPN condition):
executeTask(serviceIdSi: string, actionIdSi: string, actionClassName: string, componentName: string, taskName: string,
componentsId: number[], componentsWithoutParams: IComponent[], sendPortaPrecedente: boolean): Observable<any> {
const url = this.taskUrl + `?serviceId=${serviceIdSi}` + `&actionId=${actionIdSi}` + `&actionClassName=${actionClassName}`
+ `&componentName=${componentName}` + `&taskName=${taskName}`;
if(taskName == "rollBackSPN"){
if(sendPortaPrecedente && componentsWithoutParams.length == 0){
return this.http.post<any>(url, componentsId);
}else{
let errMessage = "Some Error Message"
for(const component of componentsWithoutParams){
errMessage = errMessage + component.idComponent +"\n";
}
throw throwError(errMessage);
}
}else{
return this.http.post<any>(url, componentsId);
}
}
Both these methods are defined in a service called TaskService.
And the service is called like this in a component UnitTaskButtonsComponent.
async launchUnitTask() {
this.isLoading = true;
this.isClosed = false;
this.appComponent.currentComponentIndex = this.componentIndex;
let res = await this.taskService.getComponentIds(this.unitTaskLabel, this.selectedComponents);
this.taskService.executeTask(this.appComponent.currentService.identifiantSi,
this.appComponent.currentAction.identifiantSi,
this.appComponent.currentAction.className,
this.selectedComponents[0].name,
this.unitTaskLabel,
res.componentsId,
res.componentsWithoutParams,
res.sendPortaPrecedente).subscribe(
data => this.executeTaskSuccess(),
error => this.executeTaskError());
}
"res" properties are always undefined when it's a rollBackSPN task.
The main issue here is that getComponentIds does not return a Promise, so awaiting it does not work. I would suggest changing getComponentIds so that it returns an Observable instead.
getComponentIds(taskName: string, selectedComponents: IComponent[]) {
// ^^^^^^ use string instead of String
return forkJoin(
selectedComponents.map((component) => {
return this.http.get<any>("Url" + component.idComponent).pipe(
map((val) => {
let sendPortaPrecedente = true;
for (const obj of val) {
if (
(obj.name == "z0bpqPrevious" && obj.value == null) ||
(obj.name == "datePortaPrevious" && obj.value == null) ||
(obj.name == "typePortaPrevious" && obj.value == null)
) {
sendPortaPrecedente = false;
}
}
return { component, sendPortaPrecedente }
}),
catchError((err) => of(err))
);
})
).pipe(
map((result) => {
const componentsId: number[] = [];
const componentsWithoutParams: IComponent[] = [];
for (const val of result) {
if (val.sendPortaPrecedente) {
componentsId.push(val.component.idComponent);
} else {
componentsWithoutParams.push(val.component);
}
}
return { componentsId, componentsWithoutParams };
})
);
}
Instead of using concatMap, let's use a forkJoin. The forkJoin allows sending all requests in parallel and returns the result in an array. But we have to pass in an array of Observables. That's why we map over the selectedComponents.
In the lower map, we can now get the complete result of the http calls in the result parameter. Here we do the processing of the data. I was not really sure how to handle the sendPortaPrecedente. You will have to fill that in.
We simply return the whole Observable. Then, in the component:
launchUnitTask() {
this.taskService
.getComponentIds(this.unitTaskLabel, this.selectedComponents)
.pipe(
switchMap((res) =>
// return the inner Observable so its result flows on to subscribe
this.taskService.executeTask(
this.appComponent.currentService.identifiantSi,
this.appComponent.currentAction.identifiantSi,
this.appComponent.currentAction.className,
this.selectedComponents[0].name,
this.unitTaskLabel,
res.componentsId,
res.componentsWithoutParams,
res.sendPortaPrecedente
)
)
).subscribe(
(data) => this.executeTaskSuccess(),
(error) => this.executeTaskError()
);
}
In the launchUnitTask method, we don't use await anymore. Instead, we call getComponentIds and chain the call of executeTask with a switchMap.

How can I add / replace / remove a type from a GraphQLSchema?

I am working on a GraphQL schema validation tool. I would like to update in memory my GraphQLSchema object.
For instance to replace a type I tried to do:
const replaceType = (schema: GraphQLSchema, oldType: GraphQLNamedType, newType: GraphQLNamedType) => {
const config = schema.toConfig();
config.types = config.types.filter(t => t.name !== oldType.name);
config.types.push(newType);
return new GraphQLSchema(config);
}
This however fails here with
Schema must contain uniquely named types but contains multiple types named "MyType".
at typeMapReducer (../../node_modules/graphql/type/schema.js:262:13)
at Array.reduce (<anonymous>)
at new GraphQLSchema (../../node_modules/graphql/type/schema.js:145:28)
It looks like there are existing references to the old type that I am not updating.
If this is useful to someone else, the following update of type references seems to work:
const replaceType = (schema: GraphQLSchema, oldType: GraphQLNamedType, newType: GraphQLNamedType) => {
const config = schema.toConfig();
config.types = config.types.filter(t => t.name !== oldType.name);
config.types.push(newType);
makeConfigConsistent(config);
return new GraphQLSchema(config);
}
/**
* As we add types that originally come from a different schema, we need to update all the references to maintain consistency
* within the set of types we are including.
*
* Types from the original schema need to update their references to point to the new types,
and types from the new schema need to update their references to point to the original types that were not replaced.
*/
const makeConfigConsistent = (config: SchemaConfig) => {
const typeMap: { [typeName: string]: GraphQLNamedType } = {};
// Update references for root types
config.query = null;
config.mutation = null;
config.subscription = null;
config.types.forEach(type => {
typeMap[type.name] = type;
if (isObjectType(type)) {
if (type.name === 'Query') {
config.query = type;
} else if (type.name === 'Mutation') {
config.mutation = type;
} else if (type.name === 'Subscription') {
config.subscription = type;
}
}
});
// Update references to only point to the final set of types.
const finalTypes = config.types;
if (config.query) {
finalTypes.push(config.query);
}
if (config.mutation) {
finalTypes.push(config.mutation);
}
if (config.subscription) {
finalTypes.push(config.subscription);
}
const updatedType = (type: any): any | undefined => {
if (isNamedType(type)) {
if (type === typeMap[type.name]) {
return type;
}
}
if (isListType(type)) {
const subType = updatedType(type.ofType);
if (!subType) {
return undefined;
}
return new GraphQLList(subType);
}
if (isNonNullType(type)) {
const subType = updatedType(type.ofType);
if (!subType) {
return undefined;
}
return new GraphQLNonNull(subType);
}
if (isScalarType(type)) {
if (type === typeMap[type.name]) {
return type;
}
if (['Int', 'String', 'Float', 'Boolean', 'ID'].includes(type.name)) {
// This is a default scalar type (https://graphql.org/learn/schema/#scalar-types)
return type;
}
}
if (isNamedType(type)) {
const result = typeMap[type.name];
if (!result) {
return undefined;
}
return result;
}
throw new Error(`Unhandled cases for ${type}`);
};
finalTypes.forEach(type => {
if (isObjectType(type) || isInterfaceType(type)) {
const anyType = type as any;
anyType._fields = arraytoDict(
Object.values(type.getFields())
.filter(field => updatedType(field.type) !== undefined)
.map(field => {
field.type = updatedType(field.type);
field.args = field.args
.filter(arg => updatedType(arg.type) !== undefined)
.map(arg => {
arg.type = updatedType(arg.type);
return arg;
});
return field;
}),
field => field.name,
);
if (isObjectType(type)) {
anyType._interfaces = type.getInterfaces().map(int => updatedType(int));
}
} else if (isInputObjectType(type)) {
const anyType = type as any;
anyType._fields = arraytoDict(
Object.values(type.getFields())
.filter(field => updatedType(field.type) !== undefined)
.map(field => {
field.type = updatedType(field.type);
return field;
}),
field => field.name,
);
} else if (isUnionType(type)) {
const anyType = type as any;
anyType._types = type
.getTypes()
.map(t => updatedType(t))
.filter(t => t !== undefined);
}
});
};
function arraytoDict<T>(array: T[], getKey: (element: T) => string): { [key: string]: T } {
const result: { [key: string]: T } = {};
array.forEach(element => {
result[getKey(element)] = element;
});
return result;
};

How to check the maximum number of selectors in stylelint?

I need to check that there is one root class in one file.
Is it possible?
// Error
.a { }
.b { }
Expected
// Success
.a {}
It is not possible to do this with the rules built into stylelint.
However, it is possible to create a stylelint plugin to do this.
The plugin would look something like:
// ./plugins/stylelint-root-max-rules/index.js
const isNumber = require("lodash/isNumber");
const {
createPlugin,
utils: { report, ruleMessages, validateOptions }
} = require("stylelint");
const ruleName = "plugin/root-max-rules";
const messages = ruleMessages(ruleName, {
expected: max =>
`Expected no more than ${max} ${max === 1 ? "rule" : "rules"}`
});
const rule = quantity => {
return (root, result) => {
const validOptions = validateOptions(result, ruleName, {
actual: quantity,
possible: isNumber
});
if (!validOptions) return;
const { length } = root.nodes.filter(node => node.type === "rule");
if (length <= quantity) return;
report({
message: messages.expected(quantity),
node: root,
result,
ruleName
});
};
};
module.exports = createPlugin(ruleName, rule);
module.exports.ruleName = ruleName;
module.exports.messages = messages;
You would then use the plugin like so:
{
"plugins": ["./plugins/stylelint-root-max-rules"],
"rules": {
"plugin/root-max-rules": 1
}
}

How to list folders and files in a directory using ReactiveX

When using Observables for certain tasks that involve a lot of chaining and a lot of asynchronous operations, such as listing all the items in a folder and checking all of the folders in it for a specific file, I often end up either needing to build the complex chain for each task (return Observable.of(folder)...) or having some kind of special value that gets forwarded to the end to signal the end of a batch (every operator starts with if(res === false) return Observable.of(false)).
Sort of like that stick that you put between your groceries and those of the person in front of you at the checkout.
It seems like there should be a better way that doesn't involve forwarding a stop value through all kinds of callbacks and operators.
So, what is a good way to write a function that takes a folder path string and returns a list of all the files and folders in it? It also specifies whether the files are HTML files or not, and whether or not the folders contain a file called tiddlywiki.json.
The only requirement is that it can't return anything like Observable.of(...).... It should probably have a subject at the top of the chain, but that is not a requirement.
function listFolders(folder) {
return [
{ type: 'folder', name: 'folder1' },
{ type: 'datafolder', name: 'folder2' }, //contains "tiddlywiki.json" file
{ type: 'folder', name: 'folder3' },
{ type: 'htmlfile', name: 'test.html' },
{ type: 'other', name: 'mytest.txt' }
]
}
Here is one that does not follow the rules I laid out (see below for one that does), but it took about ten minutes, using the first one as a guide.
export function statFolder(subscriber, input: Observable<any>) {
return input.mergeMap(([folder, tag]) => {
return obs_readdir({ folder, tag })(folder);
}).mergeMap(([err, files, { folder, tag }]) => {
if (err) { return Observable.of({ error: err }) as any; }
else return Observable.from(files).mergeMap(file => {
return obs_stat([file,folder])(path.join(folder, file as string));
}).map(statFolderEntryCB).mergeMap<any, any>((res) => {
let [entry, [name, folder]] = res as [any, [string, string, number, any]];
if (entry.type === 'folder')
return obs_readdir([entry])(path.join(entry.folder, entry.name));
else return Observable.of([true, entry]);
}, 20).map((res) => {
if (res[0] === true) return (res);
let [err, files, [entry]] = res as [any, string[], [FolderEntry, number, any]];
if (err) {
entry.type = "error";
} else if (files.indexOf('tiddlywiki.json') > -1)
entry.type = 'datafolder';
return ([true, entry]);
}).reduce((n, [dud, entry]) => {
n.push(entry);
return n;
}, []).map(entries => {
return { entries, folder, tag };
}) as Observable<{ entries: any, folder: any, tag: any }>;
}).subscribe(subscriber);
}
Original: This took a few hours to write...and it works...but...it uses concatMap, so it can only take one request at a time. It uses a custom operator that I wrote for the purpose.
export function statFileBatch(subscriber, input: Observable<any>) {
const signal = new Subject<number>();
var count = 0;
//use setTimeout to fire after the buffer receives this item
const sendSignal = (item) => setTimeout(() => { count = 0; signal.next(item); });
return input.concatMap(([folder, tag]) => {
return obs_readdir({ folder, tag })(folder);
}).lift({
call: (subs: Subscriber<any>, source: Observable<any>) => {
const signalFunction = (count) => signal.mapTo(1), forwardWhenEmpty = true;
const waiting = [];
const _output = new Subject();
var _count = new Subject<number>()
const countFactory = Observable.defer(() => {
return Observable.create(subscriber => {
_count.subscribe(subscriber);
})
});
var isEmpty = true;
const sourceSubs = source.subscribe(item => {
if (isEmpty && forwardWhenEmpty) {
_output.next(item);
} else {
waiting.push(item)
}
isEmpty = false;
})
const pulse = new Subject<any>();
const signalSubs = pulse.switchMap(() => {
return signalFunction(countFactory)
}).subscribe(count => {
//act on the closing observable value
var i = 0;
while (waiting.length > 0 && i++ < count)
_output.next(waiting.shift());
//if nothing was output, then we are empty
//if something was output then we are not
//this is meant to be used with bufferWhen
if (i === 0) isEmpty = true;
_count.next(i);
_count.complete();
_count = new Subject<number>();
pulse.next();
})
pulse.next(); //prime the pump
const outputSubs = Observable.create((subscriber) => {
return _output.subscribe(subscriber);
}).subscribe(subs) as Subscription;
return function () {
outputSubs.unsubscribe();
signalSubs.unsubscribe();
sourceSubs.unsubscribe();
}
}
}).mergeMap(([err, files, { folder, tag }]) => {
if (err) { sendSignal(err); return Observable.empty(); }
return Observable.from(files.map(a => [a, folder, files.length, tag])) as any;
}).mergeMap((res: any) => {
let [file, folder, fileCount, tag] = res as [string, string, number, any];
return obs_stat([file, folder, fileCount, tag])(path.join(folder, file))
}, 20).map(statFolderEntryCB).mergeMap<any, any>((res) => {
let [entry, [name, folder, fileCount, tag]] = res as [any, [string, string, number, any]];
if (entry.type === 'folder')
return obs_readdir([entry, fileCount, tag])(path.join(entry.folder, entry.name));
else return Observable.of([true, entry, fileCount, tag]);
}, 20).map((res) => {
//if (res === false) return (false);
if (res[0] === true) return (res);
let [err, files, [entry, fileCount, tag]] = res as [any, string[], [FolderEntry, number, any]];
if (err) {
entry.type = "error";
} else if (files.indexOf('tiddlywiki.json') > -1)
entry.type = 'datafolder';
return ([true, entry, fileCount, tag]);
}).map(([dud, entry, fileCount, tag]) => {
count++;
if (count === fileCount) {
sendSignal([count, tag]);
}
return entry;
}).bufferWhen(() => signal).withLatestFrom(signal).map(([files, [sigResult, tag]]: any) => {
return [
typeof sigResult !== 'number' ? sigResult : null, //error object
files, //file list
typeof sigResult === 'number' ? sigResult : null, //file count
tag //tag
];
}).subscribe(subscriber);
}

jquery plugin creation issue

I have created a plugin with following codes:
var myplugin = {
init: function(options) {
$.myplugin.settings = $.extend({}, $.myplugin.defaults, options);
},
method1: function(par1) {
.....
},
method2: function(par1) {
.....
}
};
$.myplugin = function(method){
if ( myplugin[method] ) {
return myplugin[ method ].apply( this, Array.prototype.slice.call( arguments, 1 ));
} else if (typeof method === 'object' || !method) {
return myplugin.init.apply(this, arguments);
} else {
$.error( 'Method "' + method + '" does not exist in myplugin!');
}
};
$.myplugin.defaults = {
option1: 'test',
option2: '',
option3: ''
};
$.myplugin.settings = {};
$.myplugin();
This works well, but the issue is that when I try to set more than one option and then read the values back, they come up empty; setting one option works fine. For example,
if on changing the first combo box value I call this:
$.myplugin({option1: 'first test'});
it works, but when I then call it for the second combo box it doesn't keep the option; instead it is reset to empty.
Is there any fix?
The settings are lost because init always extends from the defaults, discarding whatever was set before. I would re-organize the plugin to use this structure:
var methods = {
settings: {
foo: "foo",
bar: "bar"
},
init: function(options) {
this.settings = $.extend({}, this.settings, options);
},
method1: function(par1) {
alert(this.settings.foo);
},
method2: function(par1) {
alert(this.settings.bar);
}
};
function MyPlugin(options) {
this.init(options);
return this;
}
$.extend(MyPlugin.prototype, methods);
$.myPlugin = function(options) {
return new MyPlugin(options);
}
/* usage */
// without parameters
var obj1 = $.myPlugin();
// with parameters
var obj2 = $.myPlugin({foo: "foobar"});
// each has its own settings
obj1.method1();
obj2.method1();
Demo: http://jsfiddle.net/ypXdS/
Essentially $.myPlugin simply creates and returns a new instance of the MyPlugin class. You could get rid of it completely and use new MyPlugin(options) in its place.
