Produce a stream of values with data-driven delays in RxJS

Given an array of objects which contain a message payload and time parameter like this:
var data = [
  { message: "Deliver me after 1000ms", time: 1000 },
  { message: "Deliver me after 2000ms", time: 2000 },
  { message: "Deliver me after 3000ms", time: 3000 }
];
I would like to create an observable sequence which returns the message part of each element of the array and then waits for the corresponding amount of time specified in the object. I'm open to reorganising the data structure of the array if that is necessary.
I've seen Observable.delay but can't see how it could be used with a dynamic value in this way. I'm working in RxJS 5.

You could use delayWhen:
var data = [
  { message: "Deliver me after 1000ms", time: 1000 },
  { message: "Deliver me after 2000ms", time: 2000 },
  { message: "Deliver me after 3000ms", time: 3000 }
];

Rx.Observable
  .from(data)
  .delayWhen(datum => Rx.Observable.timer(datum.time))
  .do(datum => console.log(datum.message))
  .subscribe();
<script src="https://unpkg.com/@reactivex/rxjs@5.0.3/dist/global/Rx.js"></script>
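Note that delayWhen delays each value independently from the moment it arrives; since from(data) emits everything synchronously, the time values above effectively act as offsets from subscription, which happens to match the data here. If you instead want strictly sequential waits (emit a message, then wait that message's time before moving on), here is a minimal sketch using concatMap, assuming the same data array and the RxJS 5 bundle loaded above:

Rx.Observable
  .from(data)
  .concatMap(datum =>
    // emit the message immediately, then hold the inner observable open
    // for datum.time ms so concatMap waits before starting the next one
    Rx.Observable.of(datum.message)
      .concat(Rx.Observable.timer(datum.time).ignoreElements()))
  .subscribe(message => console.log(message));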

Related

How to modify just a property from a dexie store without deleting the rest?

I have the Dexie stores shown in the print screen below:
Dexie stores print screen
My goal is to update a field in a row of a store without losing the rest of the data.
For example: when I edit and save the field "com_name" from the second row (key={2}), I want to update "com_name" only and not lose the rest of the properties; see the first and third rows.
I already tried with collection.modify and table.update, but both deleted the rest of the properties when I used the code below:
dexieDB.table('company').where('dexieKey').equals('{1}')
  //USING table.update
  //.update(dexieRecord.dexiekey, {
  //    company: {
  //        com_name: "TOP SERVE 2"
  //    }
  //})
  .modify({
    company: {
      com_name: "TOP SERVE 2"
    }
  })
  .then(function (updated) {
    if (updated)
      console.log("Success.");
    else
      console.log("Nothing was updated.");
  })
  .catch(function (err) { console.log(err); });
Any idea how I can accomplish that?
Thanks,
Alex
You were right to use Table.update or Collection.modify. They should never delete properties other than the ones specified. Can you paste a jsitor.com or jsfiddle repro of that so someone can help you pinpoint why the code doesn't work as expected?
Now that you mention it, I realise that the company and contact stores are created dynamically, while the editedRecords store has its indexes explicitly declared; therefore, when updating the company or contact store, since Dexie doesn't see the indexes, it will overwrite. I haven't tested it yet, but I suspect this is the behaviour.
See the print screen below:
Dexie stores overview
Basically I have raw JSON data from the DB, and in the browser I create the stores and store the data based on it; see the code below:
function createDexieTables(jsonData) { // jsonData - array, is the json from db
  const stores = {};
  const editedRecordsTable = 'editedRecords';
  jsonData.forEach((jsonPackage) => {
    for (table in jsonPackage) {
      if (_.find(dexieDB.tables, { 'name': table }) == undefined) {
        stores[table] = 'dexieKey';
      }
    }
  });
  stores[editedRecordsTable] = 'dexieKey, table';
  addDataToDexie(stores, jsonData);
}

function addDataToDexie(stores, jsonData) {
  dbv1 = dexieDB.version(1);
  if (jsonData.length > 0) {
    dbv1.stores(stores);
    jsonData.forEach((jsonPackage) => {
      for (table in jsonPackage) {
        jsonPackage[table].forEach((tableRow) => {
          dexieDB.table(table).add(tableRow)
            .then(function () {
              console.log(tableRow, ' added to dexie db.');
            })
            .catch(function () {
              console.log(tableRow, ' already exists.');
            });
        });
      }
    });
  }
}
This is the JSON, which I convert to an object and save to Dexie in the value column, and the key is "dexieKey":
[
  {
    "company": [
      {
        "dexieKey": "{1}",
        "company": {
          "com_pk": 1,
          "com_name": "CloudFire",
          "com_city": "Round Rock",
          "serverLastEdit": [
            { "com_pk": "2021-06-02T11:30:24.774Z" },
            { "com_name": "2021-06-02T11:30:24.774Z" },
            { "com_city": "2021-06-02T11:30:24.774Z" }
          ],
          "userLastEdit": []
        }
      }
    ]
  }
]
Any idea why indexes were not populated when generating them dynamically?
Given the JSON data, I understand what's going wrong.
Instead of passing the following to update():
{
  company: {
    com_name: "TOP SERVE 2"
  }
}
You probably meant to pass this:
{
"company.com_name": "TOP SERVE 2"
}
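For reference, a minimal sketch of the same fix applied to the asker's snippet (the key value '{2}' is assumed from the screenshots above); Dexie's dotted key path syntax updates only that nested property and leaves the rest of the row intact:

dexieDB.table('company')
  .update('{2}', { "company.com_name": "TOP SERVE 2" }) // dotted path: only com_name changes
  .then(function (updated) {
    if (updated) console.log("Success.");
    else console.log("Nothing was updated.");
  })
  .catch(function (err) { console.log(err); });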
Another hint is to do the adds within an 'rw' transaction, or even better, use bulkAdd() instead to optimize performance.
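A hedged sketch of what that could look like inside addDataToDexie() above (the table and jsonPackage variables are taken from that snippet; error handling kept minimal):

dexieDB.transaction('rw', dexieDB.table(table), function () {
  // bulkAdd inserts the whole array of rows in one operation
  return dexieDB.table(table).bulkAdd(jsonPackage[table]);
}).catch(function (err) {
  console.log('bulkAdd failed: ', err);
});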

Wait for Subscription set Recursively to Complete

I have an array of objects with children and need to set a field (hidden) in each of those objects recursively. The value for each is set in a subscription. I want to wait until each item in the array is recursively updated before the subscription completes.
The hidden field will be set based on roles and permissions derived from another observable. In the example I added a delay to simulate that.
Here's my first pass at it. I'm certain there is a much cleaner way of going about this.
https://codesandbox.io/s/rxjs-playground-hp3wr
// Array structure. Note children.
const navigation = [
  {
    id: "applications",
    title: "Applications",
    children: [
      {
        id: "dashboard",
        title: "Dashboard"
      },
      {
        id: "clients",
        title: "Clients"
      },
      {
        id: "documents",
        title: "Documents",
        children: [
          {
            id: "dashboard",
            title: "Dashboard"
          },...
        ]
      },
      {
        id: "reports",
        title: "Reports"
      },
      {
        id: "resources",
        title: "Resources"
      }
    ]
  }
];
In the code sandbox example, looking at the console messages, I get the correct result. However, I would like to avoid having to subscribe in setHidden and recursivelySetHidden. I would also like to avoid using Subject if possible.
Here is my approach:
const roleObservable = timer(1000).pipe(mapTo("**************"));

function populateWithField(o, field, fieldValue) {
  if (Array.isArray(o)) {
    return from(o).pipe(
      concatMap(c => populateWithField(c, field, fieldValue)),
      toArray()
    );
  }
  if (o.children) {
    return roleObservable.pipe(
      tap(role => (fieldValue = role)),
      concatMap(role => populateWithField(o.children, field, role)),
      map(children => ({
        ...o,
        [field]: fieldValue,
        children
      }))
    );
  }
  return roleObservable.pipe(
    map(role => ({
      [field]: role,
      ...o
    }))
  );
}

of(navigation)
  .pipe(concatMap(o => populateWithField(o, "hidden")))
  .subscribe(console.log, e => console.error(e.message));
The main thing to notice is the frequent use of concatMap. It is a higher-order mapping operator, which means, among other things, that it will automatically subscribe to and unsubscribe from its inner observables.
What differentiates concatMap from other flattening operators is that it keeps a buffer of emitted values, meaning it will wait for the current inner observable to complete before subscribing to the next one.
In this case, you'd have to deal with a lot of Observables-of-Observables (higher-order observables), which is why you have to use concatMap every time you encounter a children property. Any child in that property could have its own children property, so you must make sure each emitted Observable contains only first-order Observables.
You can read more about higher-order and first-order observables here.
Here is a CodeSandbox example
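To illustrate the ordering guarantee described above, here is a small self-contained sketch (not from the answer; it assumes RxJS 6 pipeable operators, as in the answer's code):

import { from, timer } from "rxjs";
import { concatMap, mapTo } from "rxjs/operators";

// concatMap waits for each inner timer to complete before starting the next,
// so values come out in source order even though the delays vary.
from([3000, 1000, 2000])
  .pipe(concatMap(ms => timer(ms).pipe(mapTo(ms))))
  .subscribe(ms => console.log(`finished ${ms}ms timer`));
// logs 3000, 1000, 2000 (in that order), taking ~6s in total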

RxJS Map array to observable and back to plain object in array

I have an array of objects, and I need to pass each object separately into an async method (the process behind it is handled with a Promise and then converted back to an Observable via Observable.fromPromise(...); this approach is needed because the same method is also used whenever just a single object is passed. The process saves objects into a database). For example, this is the array of objects:
[
  {
    "name": "John",
    ...
  },
  {
    "name": "Anna",
    ...
  },
  {
    "name": "Joe",
    ...
  },
  {
    "name": "Alexandra",
    ...
  },
  ...
]
Now I have a method called insert which inserts an object into the database. The store method on the database instance returns the newly created id. At the end, the initial object is copied and mapped with its new id:
insert(user: User): Observable<User> {
  return Observable.fromPromise(this.database.store(user)).map(
    id => {
      let storedUser = Object.assign({}, user);
      storedUser.id = id;
      return storedUser;
    }
  );
}
This works well when I insert a single object. However, I would like to add support for inserting multiple objects by reusing the single-insert method. Currently this is what I have, but it doesn't work:
insertAll(users: User[]): Observable<User[]> {
  return Observable.forkJoin(
    users.map(user => this.insert(user))
  );
}
The insertAll method inserts the users as expected (unless something else filled up the database with those users), but I don't get any response back from it. While debugging, it seems that forkJoin gets a response just from the first mapped user, and the others are ignored. Subscribing to insertAll does not do anything; there is also no error, either via catch on insertAll or via the second parameter of subscribe on insertAll.
So I'm looking for a solution where the Observable (in insertAll) would emit back an array of the new user objects in this form:
[
  {
    "id": 1,
    "name": "John",
    ...
  },
  {
    "id": 2,
    "name": "Anna",
    ...
  },
  {
    "id": 3,
    "name": "Joe",
    ...
  },
  {
    "id": 4,
    "name": "Alexandra",
    ...
  },
  ...
]
I would be very happy for any suggestion pointing in the right direction. Thanks in advance!
To convert from array to observable you can use Rx.Observable.from(array).
To convert from observable to array, use obs.toArray(). Notice this does return an observable of an array, so you still need to .subscribe(arr => ...) to get it out.
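A tiny self-contained illustration of that round trip (not from the question's code; it assumes the RxJS 5 operator patches such as rxjs/add/observable/from and rxjs/add/operator/toArray are imported):

// array -> stream of values -> single emitted array
Observable.from([1, 2, 3])
  .map(x => x * 10)
  .toArray()                            // emits one array once the source completes
  .subscribe(arr => console.log(arr));  // [10, 20, 30]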
That said, your code with forkJoin does look correct. But if you do want to try from, write the code like this:
insertAll(users: User[]): Observable<User[]> {
  return Observable.from(users)
    .mergeMap(user => this.insert(user))
    .toArray();
}
Another, more Rx-like way to do this would be to emit values as they complete, rather than waiting for all of them like forkJoin or toArray does. We can just omit the toArray from the previous example and we've got it:
insertAll(users: User[]): Observable<User> {
  return Observable.from(users)
    .mergeMap(user => this.insert(user));
}
As @cartant mentioned, the problem might not be in Rx; it might be that your database does not support multiple concurrent connections. In that case, you can replace the mergeMap with concatMap to make Rx send only 1 request at a time:
insertAll(users: User[]): Observable<User[]> {
  return Observable.from(users)
    .concatMap(user => this.insert(user))
    .toArray(); // still optional
}

Perform sequential api calls with RxJs?

Is there a way in RxJS to perform two API calls, where the second requires data from the first, and return a combined result as a stream? What I'm trying to do is call the Facebook API to get a list of groups and the cover image in various sizes. Facebook returns something like this:
// call to facebook /1234 to get the group 1234, cover object has an
// image in it, but only one size
{ id: '1234', cover: { id: '9999' } }
// call to facebook /9999 to get the image 9999 with an array
// with multiple sizes, omitted for simplicity
{ images: [ <image1>, <image2>, ... ] }
// desired result:
{ id: '1234', images: [ <image1>, <image2>, ... ] }
So I have this:
var result = undefined;
rxGroup = fbService.observe('/1234');
rxGroup.subscribe(group => {
  rxImage = fbService.observe(`/${group.cover.id}`);
  rxImage.subscribe(images => {
    group.images = images;
    result = group;
  });
});
I want to create a method that accepts a group id and returns an Observable that will have the combined group + images (result here) in the stream. I know I can create my own observable and call the next() function in there where I set 'result' above, but I'm thinking there has to be an rx-way to do this. select/map lets me transform, but I don't know how to shoehorn in the results from another call. when/and/then seems promising, but also doesn't look like it supports something like that. I could map and return an observable, but the caller would then have to do two subscribes.
Looks like flatMap is the way to go (fiddle). It is called like subscribe and gives you a value from the stream. You return an observable from that, and it outputs the values from all the created observables (one for each element in the base stream) into the resulting stream.
var sourceGroup = { // result of calling api /1234
  id: '1234',
  cover: {
    id: '9999'
  }
};

var sourceCover = { // result of calling api /9999
  id: '9999',
  images: [{
    src: 'image1x80.png'
  }, {
    src: 'image1x320.png'
  }]
};

var rxGroup = Rx.Observable.just(sourceGroup);

var rxCombined = rxGroup.flatMap(group =>
  Rx.Observable.just(sourceCover)
    .map(images => ({
      id: group.id,
      images: images.images
    }))
);

rxCombined.subscribe(x =>
  console.log(JSON.stringify(x, null, 2)));
<script src="https://cdnjs.cloudflare.com/ajax/libs/rxjs/4.1.0/rx.all.min.js"></script>
Result:
{
  "id": "1234",
  "images": [
    {
      "src": "image1x80.png"
    },
    {
      "src": "image1x320.png"
    }
  ]
}
You should use concatMap instead of flatMap; it will preserve the order of the source emissions.
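Putting it together against the fbService.observe() helper from the question (RxJS 4 style, as in the snippet above), a hedged sketch of the method the asker describes:

function getGroupWithImages(groupId) {
  // The inner observable's values are flattened into the output stream,
  // so subscribers get a single combined object per group id.
  return fbService.observe('/' + groupId)
    .concatMap(function (group) {
      return fbService.observe('/' + group.cover.id)
        .map(function (cover) {
          return { id: group.id, images: cover.images };
        });
    });
}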

Map reduce to count tags

I am developing a web app using Codeigniter and MongoDB.
I am trying to get the map reduce to work.
I have a file document with the structure below. I would like to do a map reduce to check how many times each tag is being used and output the result to the collection files.tags.
{
  "_id": {
    "$id": "4f26f21f09ab66c1030d0000e"
  },
  "basic": {
    "name": "The filename"
  },
  "tags": [
    "lorry",
    "house",
    "car",
    "bicycle"
  ],
  "updated_at": "2012-02-09 11:08:03"
}
I tried this map reduce command but it does not count each individual tag:
$map = new MongoCode("function() {
    emit({tags: this.tags}, {count: 1});
}");

$reduce = new MongoCode("function(key, values) {
    var count = 0;
    values.forEach(function(v) {
        count += v['count'];
    });
    return {count: count};
}");

$this->mongo_db->command(array(
    "mapreduce" => "files",
    "map" => $map,
    "reduce" => $reduce,
    "out" => "files.tags"
));
Change your Map function to:
function map() {
    if (!this.tags) return;
    this.tags.forEach(function(tag) {
        emit(tag, {count: 1});
    });
}
Yeah, this map/reduce simply calculates a total count per complete tags array, not per individual tag.
The MongoDB cookbook has the example you are looking for.
You have to emit each tag instead of the entire collection of tags:
map = function() {
    if (!this.tags) {
        return;
    }
    for (index in this.tags) {
        emit(this.tags[index], 1);
    }
}
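If you go with this second map (which emits a plain 1 per tag), the reduce just needs to sum numbers rather than {count: ...} objects; a minimal sketch:

reduce = function(key, values) {
    // values holds the emitted 1s (or partial sums from earlier reduce passes)
    var total = 0;
    values.forEach(function(v) {
        total += v;
    });
    return total;
}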
You'll need to call emit once for each tag in the input documents. The MongoDB documentation, for example, says:
A map function calls emit(key,value) any
number of times to feed data to the reducer. In most cases you will
emit once per input document, but in some cases such as counting tags,
a given document may have one, many, or even zero tags.
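For reference, with the first answer's map (emit(tag, {count: 1})) and the original reduce, the out collection files.tags ends up with one document per tag; for the single sample document above, the result would look roughly like this (actual counts depend on your data):

{ "_id": "bicycle", "value": { "count": 1 } }
{ "_id": "car",     "value": { "count": 1 } }
{ "_id": "house",   "value": { "count": 1 } }
{ "_id": "lorry",   "value": { "count": 1 } }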
