I am new to reactive programming.
I have this observable:
[[7,2,5],[4,3]]
I need to convert it to this:
[7,2,5,4,3]
How can I convert it with the help of the flatMap operator?
Using mergeAll or concatAll
source.pipe(mergeAll());
source.pipe(concatAll());
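For example, a minimal sketch (assuming the source emits the two inner arrays, and RxJS 6+):
import { of, from } from 'rxjs';
import { mergeAll, mergeMap } from 'rxjs/operators';

// Assumed source: emits the two inner arrays one after the other
const source = of([7, 2, 5], [4, 3]);

// mergeAll flattens each emitted array into individual values
source.pipe(mergeAll()).subscribe(v => console.log(v)); // 7, 2, 5, 4, 3

// mergeMap is RxJS's name for flatMap and works the same way here
source.pipe(mergeMap(arr => from(arr))).subscribe(v => console.log(v)); // 7, 2, 5, 4, 3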
In Project Reactor, is it possible to implement a stream with switchIfEmpty and onErrorResume at the same time?
infoRepository.findById(id); // returns Mono<Info>
In case of an empty result or an error, can I switch to the same backup stream?
There's no single operator that does these things together, but you can trivially switch to an empty publisher on an error and then handle both cases through switchIfEmpty, like so:
infoRepository.findById(id)
.onErrorResume(e -> Mono.empty())
.switchIfEmpty(newPublisher);
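If building the backup publisher is expensive, a common refinement is to wrap it in Mono.defer so it is only assembled when the switch actually happens. A sketch, where loadBackup is a hypothetical method returning Mono<Info>:
infoRepository.findById(id)
    .onErrorResume(e -> Mono.empty())
    // defer: loadBackup(id) is only invoked if the switch is actually taken
    .switchIfEmpty(Mono.defer(() -> loadBackup(id)));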
I have a Flux and I want to convert it to a List. How can I do that?
Flux<Object> getInstances(String serviceId); // what I currently have
List<Object> getInstances(String serviceId); // what I need
Do Java 8 or the reactive components provide a ready-made method to map or convert it to a List?
I think I should use .map():
final List<ServiceInstance> sis = convertedStringList.parallelStream()
.map( this.reactiveDiscoveryClient::getInstances )
// It should be converted to List<Object>
1. Make sure you want this
A fair warning before diving into anything else: converting a Flux to a List/Stream makes the whole thing non-reactive in the strict sense, because you leave the push domain and trade it for a pull domain. You may or may not want this (usually you don't), depending on the use case. Just wanted to leave the note.
2. Converting a Flux to a List
According to the Flux documentation, the collectList method returns a Mono<List<T>>. It returns immediately, but what it returns is not the resulting list itself; it is a lazy structure, the Mono, which promises that the result will be there once the sequence completes.
According to the Mono documentation, the block method will return the contents of the Mono when it completes. Keep in mind that block may return null.
Combining both, you could use someFlux.collectList().block(). Provided that someFlux is a Flux<Object>, the result would be a List<Object>.
The block method will never return if the Flux is infinite. As an example, the following returns a list with two words:
Flux.fromArray(new String[]{"foo", "bar"}).collectList().block()
But the following will never return:
Flux.interval(Duration.ofMillis(1000)).collectList().block()
To avoid blocking indefinitely or for too long, you may pass a Duration argument to block, but then block will time out with an exception if the subscription does not complete in time.
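A minimal sketch of the timed variant (the 5-second budget is arbitrary; block raises an IllegalStateException if the sequence has not completed by then):
List<String> words = Flux.fromArray(new String[]{"foo", "bar"})
    .collectList()
    .block(Duration.ofSeconds(5)); // completes well within the budget here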
3. Converting a Flux to a Stream
According to the Flux documentation, the toStream method converts a Flux<T> into a Stream<T>. This is friendlier to operators such as flatMap. Consider this simple example, for the sake of demonstration:
Stream.of("f")
.flatMap(letter ->
Flux.fromArray(new String[]{"foo", "bar"})
.filter(word -> word.startsWith(letter)).toStream())
.collect(Collectors.toList())
One could simply use .collectList().block().stream(), but not only is it less readable, it could also result in an NPE if block returned null. The toStream approach does not finish for an infinite Flux either, but because the result is a stream of unknown size, you can still apply some operations to it before it completes, without having to block for the entire sequence first.
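If you do go the block-then-stream route, a null-safe variant is possible on Reactor versions where Mono exposes blockOptional (3.2+). A sketch, reusing someFlux from above:
Stream<Object> stream = someFlux.collectList()
    .blockOptional()                 // Optional.empty() instead of null
    .map(List::stream)
    .orElseGet(Stream::empty);       // fall back to an empty Stream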
I'm using an Azure Stream Job to parse incoming JSON data from an IoT Hub.
I'm even using ...
CROSS APPLY GetArrayElements(event.NestedRows) as nestedrows
... to expand and denormalize additional events within each event - works great, no issues.
However, I have a new JSON property that is of type string and it is actually an embedded JSON array. For example:
{
"escapedArray": "[ 1, 2, 3 ]"
}
I'd like to use CROSS APPLY on this array as well, however I don't see any way to parse the string and convert it to a JSON array.
I considered a User Defined Function (UDF), but I read that it can only return scalars, and not arrays.
Is there a trick I'm missing inside the Stream Job to parse this string, or do I have to expand it in the event stream, prior to the Stream Job?
(FYI, I have no way to change this stream in the device event source.)
According to the official tutorial, you can create a UDF in Stream Analytics to convert a string to an array.
Define a UDF udf.toJson as below.
function main(arrStr) {
return JSON.parse(arrStr);
}
Then use the UDF in a query on the property escapedArray to return an array.
SELECT
UDF.toJson(escapedArray) as uarr
INTO
[YourOutputAlias]
FROM
[YourInputAlias]
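If you also want one output row per element, as in your existing CROSS APPLY, the UDF result should be usable directly as the argument of GetArrayElements. A sketch under that assumption, reusing the aliases above:
SELECT
    arrayElement.ArrayIndex,
    arrayElement.ArrayValue AS item
INTO
    [YourOutputAlias]
FROM
    [YourInputAlias] AS event
CROSS APPLY GetArrayElements(UDF.toJson(event.escapedArray)) AS arrayElement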
Hope it helps.
I have an Observable that emits a List of entries as below:
val obsList: Observable[List[MyEntry]] = Observable.from(getListEntry)
// getListEntry would give me a List[MyEntry]
How could I now get the contents piped into another Observable that takes a MyEntry? I mean, for each entry in the original Observable, I would need to pipe it into another Observable with the type signature:
Observable[MyEntry]
I figured out how to do this! I'm using the Monifu library!
So, I would be doing:
Observable.fromIterable(listOfEntry)
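For the original shape, an Observable[List[MyEntry]] rather than a plain list, the same building block combines with flatMap. A sketch assuming Monifu's Observable.fromIterable and flatMap:
// Flatten each emitted List[MyEntry] into individual MyEntry values
val entries: Observable[MyEntry] =
  obsList.flatMap(list => Observable.fromIterable(list))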
I have reviewed Jasmine's documentation of the toHaveBeenCalledWith matcher to understand whether it's possible to pass in a regular expression for an argument, if that argument is expected to be a string. Unfortunately, this functionality is unsupported. There's also an issue open on GitHub requesting it.
I've dug a bit into the codebase, and I see how it might be possible to implement this inside the existing matcher. I think it would be more appropriate to implement it as a separate matcher though, so that the abstraction is captured individually.
In the meantime, what might be a good workaround?
After doing some digging, I've discovered that Jasmine spy objects have a calls property, which in turn has a mostRecent() function. The object that this function returns has an args property, an array of the call arguments.
Thus, when you want to check that the string arguments match a specific regular expression, you can run the match against those call arguments directly:
var mySpy = jasmine.createSpy('foo');
mySpy("bar", "baz");
expect(mySpy.calls.mostRecent().args[0]).toMatch(/bar/);
expect(mySpy.calls.mostRecent().args[1]).toMatch(/baz/);
Pretty straightforward.
As of Jasmine 2.2, you can use jasmine.stringMatching:
var mySpy = jasmine.createSpy('foo');
mySpy('bar', 'baz');
expect(mySpy).toHaveBeenCalledWith(
jasmine.stringMatching(/bar/),
jasmine.stringMatching(/baz/)
);
In Jasmine 2.0 the signature changed a bit. Here it would be:
var mySpy = jasmine.createSpy('foo');
mySpy("bar", "baz");
expect(mySpy.calls.mostRecent().args[0]).toMatch(/bar/);
expect(mySpy.calls.mostRecent().args[1]).toMatch(/baz/);
And the documentation for Jasmine 1.3 has moved.
Alternatively, if you are spying on a method on an object:
spyOn(obj, 'method');
obj.method('bar', 'baz');
expect(obj.method.argsForCall[0][0]).toMatch(/bar/);
expect(obj.method.argsForCall[0][1]).toMatch(/baz/);
Sometimes it is more readable to write it this way:
spyOn(obj, 'method').and.callFake(function(arg1, arg2) {
expect(arg1).toMatch(/bar/);
expect(arg2).toMatch(/baz/);
});
obj.method('bar', 'baz');
expect(obj.method).toHaveBeenCalled();
It gives clearer visibility of the method arguments (instead of indexing into an array).
As jammon mentioned, the Jasmine 2.0 signature has changed. If you are spying on the method of an object in Jasmine 2.0, Nick's solution should be changed to something like this:
spyOn(obj, 'method');
obj.method('bar', 'baz');
expect(obj.method.calls.mostRecent().args[0]).toMatch(/bar/);
expect(obj.method.calls.mostRecent().args[1]).toMatch(/baz/);