I've tried the following code, but the task finishes before the response arrives:
buildscript {
    dependencies {
        classpath("io.ktor:ktor-client-core:1.6.5")
        classpath("io.ktor:ktor-client-cio:1.6.5")
    }
}

tasks {
    register("suspendCall") {
        doLast {
            kotlinx.coroutines.GlobalScope.launch {
                val client = io.ktor.client.HttpClient()
                val response = client.get<io.ktor.client.statement.HttpResponse>("https://ktor.io/")
                println(response)
            }
        }
    }
}
Is there a correct way to wait for async code to complete?
To my knowledge, to work asynchronously or in parallel, you will want to use the Worker API:
The Worker API provides the ability to break up the execution of a task action into discrete units of work and then to execute that work concurrently and asynchronously. This allows Gradle to fully utilize the resources available and complete builds faster.
As a result, you should not need a language-specific asynchrony mechanism such as Kotlin's coroutines. Instead, you can use plain Kotlin or Java and let Gradle handle the asynchronous bits.
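For reference, here is a minimal Worker API sketch in build.gradle.kts; the FetchPage/FetchPageTask names are purely illustrative. Plain blocking I/O is fine inside the work action, and Gradle waits for submitted work to finish before the build completes:

import javax.inject.Inject

// The unit of work: plain blocking code, executed on a Gradle-managed worker thread.
abstract class FetchPage : WorkAction<WorkParameters.None> {
    override fun execute() {
        val body = java.net.URL("https://ktor.io/").readText()
        println("Fetched ${body.length} characters")
    }
}

// A task that submits the work; WorkerExecutor is injected by Gradle.
abstract class FetchPageTask @Inject constructor(
    private val workerExecutor: WorkerExecutor
) : DefaultTask() {
    @TaskAction
    fun fetch() {
        // Work queued here is completed by Gradle before the build finishes.
        workerExecutor.noIsolation().submit(FetchPage::class.java) {}
    }
}

tasks.register<FetchPageTask>("fetchPage")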
Hi guys, I'm trying to improve the performance of some computation in my system. Basically, I want to generate a series of actions based on some data. This doesn't scale well, so I want to try doing it in parallel and collecting the results afterwards (a bit like how futures work).
I have an interface with a series of implementations that each get a collection of actions, and I want to call all of these in parallel and await the results at the end.
The issue is that, when I view the logs, it's clearly doing this sequentially, waiting on each action getter before moving on to the next one. I thought async would do this asynchronously, but it's not.
The method containing the runBlocking is within a Spring transaction; maybe that has something to do with it.
runBlocking {
    val actions = actionsReportGetters.map { actionReportGetter ->
        async {
            getActions(actionReportGetter, abstractUser)
        }
    }.awaitAll().flatten()
    allActions.addAll(actions)
}

private suspend fun getActions(actionReportGetter: ActionReportGetter, traderUser: TraderUser): List<Action> {
    return actionReportGetter.getActions(traderUser)
}

interface ActionReportGetter {
    fun getActions(traderUser: TraderUser): List<Action>
}
It looks like you are doing a blocking operation in ActionReportGetter.getActions in a single-threaded environment (probably on the main thread).
For such IO operations you should launch your coroutines on Dispatchers.IO, which provides a thread pool with multiple threads.
Update your code to this:
async(Dispatchers.IO) { // Switch to the IO dispatcher
    getActions(actionReportGetter, abstractUser)
}
Also getActions need not be a suspending function here. You can remove the suspend modifier from it.
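Putting that together, a fuller sketch of the corrected block (reusing the actionsReportGetters, abstractUser and allActions names from the question, and calling the interface directly since the wrapper no longer needs to be suspending) could look like this:

// Imports assumed: kotlinx.coroutines.Dispatchers, async, awaitAll, runBlocking
runBlocking {
    val actions = actionsReportGetters.map { actionReportGetter ->
        // Each blocking getActions call gets a thread from the IO pool, so the getters run in parallel.
        async(Dispatchers.IO) { actionReportGetter.getActions(abstractUser) }
    }.awaitAll().flatten()
    allActions.addAll(actions)
}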
Kotlin coroutines and 'suspending functions' make it easy for the programmer to await the result of I/O without stopping the thread (the thread is given other work to do until the I/O is complete).
jOOQ is a Java-first product for writing and executing SQL in a typesafe way, but does not itself make explicit use of Kotlin coroutines.
Can jOOQ be called from a Kotlin coroutine scope to get the nice-to-write and thread-is-productive-even-during-IO benefits?
suspend fun myQuery() {
    return dsl.select()
        // .etc()
        .fetch() // <- could this be a 'suspend' call?
}
A: Yes.
BECAUSE
org.jooq.ResultQuery<R extends Record> has a `@NotNull CompletionStage<Result<R>> fetchAsync()` method which hooks into Java's (JDK 8+) machinery for 'futures'.
AND
kotlinx-coroutines-jdk8 provides a bunch of extension methods for adapting between Kotlin suspending functions and JDK 8+ futures.
THEREFORE we can do:
import kotlinx.coroutines.future.await
...
suspend fun myQuery(): Result<Record> {
    return dsl.select()
        // .etc()
        .fetchAsync()
        .await() // <- IS a suspending call !!!
}
It should be noted that there are a bunch of overloaded fetchX() methods on ResultQuery which provide a lot of convenience for synchronous calls, but they are not similarly overloaded for fetchAsync(). That's where the Kotlin programmer may wish to be familiar with the Java futures machinery: any sort of manipulation can be accomplished asynchronously using the thenApply {} method on CompletionStage<T>. For example, mapping the results:
suspend fun myQuery(): List<Film> {
    return dsl.select()
        // .etc()
        .fetchAsync()
        .thenApply { it.map(mapping(::Film)) } // <- done as part of the 'suspend'
        .await()
}
although it should be fine to do it after the suspend:
suspend fun myQuery(): List<Film> {
    val records = dsl.select()
        // .etc()
        .fetchAsync()
        .await()
    return records.map(mapping(::Film))
}
jOOQ added R2DBC support; the ResultQuery is a Reactive Streams Publisher.
Currently the simplest approach is to use the Reactor Kotlin extensions to convert the Reactor APIs to Kotlin Coroutines APIs.
Flux.from(
    ctx.select()
        ... // etc
        ... // do not call fetch
)
    .asFlow()
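As a slightly fuller sketch of the same conversion (ACTOR stands in for any jOOQ-generated table, and ctx must be backed by an R2DBC connection), the resulting Flow can then be collected from a suspend function:

import kotlinx.coroutines.flow.toList
import kotlinx.coroutines.reactive.asFlow
import org.jooq.DSLContext
import org.jooq.Record
import reactor.core.publisher.Flux

suspend fun firstTenActors(ctx: DSLContext): List<Record> =
    Flux.from(ctx.select().from(ACTOR).limit(10)) // do not call fetch(): the query itself is the Publisher
        .asFlow()  // kotlinx-coroutines-reactive
        .toList()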
Update: the jOOQ team cancelled the PR for Kotlin Coroutines support on DSLContext, so for now you have to use kotlinx-coroutines-reactor as above.
When switching to 3.17 or later, use fetchAwait directly; a series of xxxAwait methods has been added.
ctx.select()
    ... // etc
    .fetchAwait()
I did not find direct support for Kotlin Coroutines at first, but **jOOQ 3.17 ships official Kotlin Coroutines support in the `jooq-kotlin` module**.
I'm working on converting a blocking sequential orchestration framework to reactive. Right now, these tasks are dynamic and are fed into the engine by a JSON input. The engine pulls classes and executes the run() method and saves the state with the responses from each task.
How do I achieve the same chaining in Reactor? If this were a static DAG, I would have chained it with flatMap or then operators, but since it is dynamic, how do I proceed with executing a reactive task and collecting the output from each task?
Examples:
Non-reactive interface:
public interface OrchestrationTask {
    OrchestrationContext run(IngestionContext ctx);
}
Core Engine
public Status executeDAG(String id) {
    IngestionContext ctx = ContextBuilder.getCtx(id);
    List<OrchestrationTask> tasks = app.getEligibleTasks(id);
    for (OrchestrationTask task : tasks) {
        // Eligible tasks are executed sequentially and results are collected.
        OrchestrationContext stepContext = task.run(ctx);
        if (!evaluateResult(stepContext)) break;
    }
    return Status.SUCCESS;
}
Following the above example, if I convert the tasks to return Mono<?>, how do I wait for or chain other tasks so that they operate on the results of previous tasks?
Any help is appreciated. Thanks.
Update:
Reactive task example:
public class SampleTask implements OrchestrationTask {
    @Override
    public Mono<OrchestrationContext> run(OrchestrationContext context) {
        // I'm simulating a delay here. Treat this as a long-running task (a web call),
        // but the next task needs the response from the call below.
        return Mono.just(context).delayElement(Duration.ofSeconds(2));
    }
}
So I will have a series of tasks that accomplish various things, but the response from each task depends on the previous one and is stored in the OrchestrationContext. Any time an error occurs, the orchestration context flag will be set to false and the flux should stop.
Sure, we can:
Create the flux from the task list (if it's appropriate to generate the task list reactively then you can replace that ArrayList with the flux directly; if not, keep it as-is);
flatMap() each task to your task.run() method (which, as per the question, now returns a Mono);
Ensure we only consume elements while evaluateResult() is true;
...then finally just return the SUCCESS status as before.
So putting all that together, just replace your loop & return statement with:
Flux.fromIterable(tasks)
    .flatMap(task -> task.run(ctx))
    .takeWhile(stepContext -> evaluateResult(stepContext))
    .then(Mono.just(Status.SUCCESS));
(Since we've made it reactive, your method will obviously need to return a Mono<Status> rather than just Status too.)
Update as per the comment - if you just want this to execute "one at a time" rather than with multiple concurrently, you can use concatMap() instead of flatMap().
I have this sync modeled as a Single, and only 1 sync can be running at a time.
I'm trying to subscribe the "job" on Schedulers.single(), which mostly works, but inside the chain there are scheduler hops (to a DB-writes scheduler), which break the natural queue created by single().
Then I looked at flatMap(maxConcurrency = 1), but this won't work, as it always requires the same instance, i.e. from what I understand, some sort of Subject of sync requests, which however is uncomposable, as my use case mostly looks like this:
fun someAction1AndSync(): Single<Unit> {
    return someAction1()
        .flatMap { sync() }
}

fun someAction2AndSync(): Single<Unit> {
    return someAction2()
        .flatMap { sync() }
}
...
as you can see, these are separate sync Single instances :/
Also note that someActionXAndSync should not emit until the sync is also done.
Basically, I'm looking for something like the coroutines Semaphore.
I can think of three ways:
use a single thread for the whole sync operation (decoupling through a queue)
use a semaphore to protect the sync method from being entered multiple times (not recommended, because it will block the caller)
fast return when a sync is already in progress (AtomicBoolean)
There might be other solutions which I am not aware of.
fast return when a sync is already in progress
Also note someActionXAndSync should not emit until the sync is also done
This solution will not queue up sync requests, but will fail fast. The caller must handle the error appropriately.
SyncService
class SyncService {
    val isSync: AtomicBoolean = AtomicBoolean(false)

    fun sync(): Completable {
        return if (isSync.compareAndSet(false, true)) {
            Completable.fromCallable { "" }.doOnEvent { isSync.set(false) }
        } else {
            Completable.error(IllegalStateException("whatever"))
        }
    }
}
Handling
When a sync process is already happening, you will receive an onError. This must be handled somehow, because the onError will be emitted to the subscriber: either you are fine with it, or you can just ignore it with onErrorComplete.
fun someAction1AndSync(): Completable {
    return Single.just("")
        .flatMapCompletable {
            sync().onErrorComplete()
        }
}
use a single thread for the whole sync operation
You have to make sure that the whole sync process is processed as a single job. When the sync process is composed of multiple reactive steps on other threads, it could happen that another sync process is started while one is already in progress.
How?
You need a scheduler with one thread. Each sync invocation must be invoked on that scheduler, and the sync operation must complete within that one running job.
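A minimal sketch of that approach (RxJava 2 imports; doTheActualSyncBlocking() is a hypothetical method that performs the whole sync synchronously inside the callable, with no further scheduler hops):

import io.reactivex.Completable
import io.reactivex.schedulers.Schedulers
import java.util.concurrent.Executors

// One dedicated thread: every sync() invocation is funnelled through it, one at a time, in order.
private val syncScheduler = Schedulers.from(Executors.newSingleThreadExecutor())

fun sync(): Completable =
    Completable
        .fromCallable { doTheActualSyncBlocking() } // hypothetical blocking body doing the whole sync
        .subscribeOn(syncScheduler)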
I would use this:
fun Observable<Unit>.forceOnlyOneSubscriber(): Observable<Unit> {
    val subscriberCount = AtomicInteger(0)
    return doOnSubscribe { subscriberCount.incrementAndGet() }
        .doFinally { subscriberCount.decrementAndGet() }
        .doOnSubscribe { check(subscriberCount.get() <= 1) }
}
You can always generify the Unit using generics if you need to; a sketch follows.
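If you do generify it, it could look like this (same behaviour, just parameterised on the element type; RxJava 2 imports assumed):

import io.reactivex.Observable
import java.util.concurrent.atomic.AtomicInteger

fun <T : Any> Observable<T>.forceOnlyOneSubscriber(): Observable<T> {
    val subscriberCount = AtomicInteger(0)
    return doOnSubscribe { subscriberCount.incrementAndGet() }
        .doFinally { subscriberCount.decrementAndGet() }
        .doOnSubscribe { check(subscriberCount.get() <= 1) }
}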
I have an async method that returns Task. From time to time my process is recycled/restarted, and work is interrupted in the middle of the Task. Is there a more or less general approach in TPL so that I can at least log that the Task was interrupted?
I am hosting in ASP.NET, so I could use IRegisteredObject to cancel tasks with a CancellationToken. I do not like this, however: I would need to pass the CancellationToken into all methods, and I have many of them.
try..finally in each method does not even seem to fire; ContinueWith also does not work.
Any advice?
I have a single place where I start my async tasks; however, each task can have any number of child tasks. To give an idea:
class CommandRunner
{
    public Task Execute(object cmd, Func<object, Task> handler)
    {
        return handler(cmd).ContinueWith(t =>
        {
            if (t.Status == TaskStatus.Faulted)
            {
                // Handle faults, log them
            }
            else if (t.Status == TaskStatus.RanToCompletion)
            {
                // Audit
            }
        });
    }
}
Tasks don't just get "interrupted" somehow. They always get completed, faulted or cancelled. There is no global hook to find out about those completions. So the only option to do your logging is to either instrument the bodies of your tasks or hook up continuations for everything.