How to pass context between asynchronous method calls in Python 3.5? - async-await

How can I pass context from one asynchronous method call to another without using method parameters?
I need this to enrich log messages with a kind of flow ID, so that I can easily trace all log messages of a specific call flow.
I use Python's async and await keywords (Python 3.5.x).

You should use the context variables introduced in Python 3.7 (the contextvars module); for Python < 3.7 I maintain a backport, aiocontextvars.
Previous answer:
You may want to take a look at tasklocals and aiolocals.

I solved the problem by setting a custom task factory. It turned out that having a context per task (as opposed to a context per async call) was sufficient for me.
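A custom task factory along these lines might look like the following. This is a minimal sketch, not the author's actual code: the `context` attribute name and the copy-on-spawn inheritance rule are my own choices (the same idea aiotask-context uses), written for Python 3.8+.

```python
import asyncio

def context_task_factory(loop, coro, **kwargs):
    # Sketch of a context-propagating task factory: each new task gets a
    # shallow copy of its parent task's "context" dict. The attribute name
    # "context" is an illustrative choice, not a standard asyncio field.
    task = asyncio.Task(coro, loop=loop, **kwargs)
    parent = asyncio.current_task(loop=loop)
    task.context = dict(getattr(parent, "context", {}) or {})
    return task

async def child():
    # Reads the flow ID without it being passed as a parameter.
    return asyncio.current_task().context.get("flow_id")

async def main():
    asyncio.current_task().context = {"flow_id": "flow-42"}
    return await asyncio.ensure_future(child())

loop = asyncio.new_event_loop()
loop.set_task_factory(context_task_factory)
flow_id = loop.run_until_complete(main())
loop.close()
print(flow_id)  # flow-42
```

Because the child task copies (rather than shares) the dict, mutations made by one task don't leak into its siblings.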

I'm working on the aiotask-context package. It's a really simple way of passing context between tasks (called with await or yield from). If you don't want to use the package, you can still use the idea :).
I'm working on how to propagate it for the ensure_future calls too.

import contextvars

c_id = contextvars.ContextVar("context_id", default=None)

def get_context_id():
    return c_id.get()

def set_context_id(value):
    c_id.set(value)
I struggled a lot to get this right. If anyone is still searching for the answer, they can refer to the snippet above. This works from Python 3.7 onwards.
Create an instance of contextvars.ContextVar.
Here you can give the context variable a name and a default value; the default is used in case the variable is not found in the current context.
Set the value using the setter, and you can get the same value back using the getter inside the same context.
Define the ContextVar once at the top level, not inside closures (i.e. functions, classes, etc.), as garbage collection for the context is not handled properly in that case.
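Putting this together, here is a small self-contained demo of how such a context variable flows through awaited calls and concurrent tasks. The handler/do_work names are made up for illustration; the point is that do_work reads the ID without any extra parameter.

```python
import asyncio
import contextvars

c_id = contextvars.ContextVar("context_id", default=None)

async def do_work():
    # No parameter needed: the value travels with the execution context.
    return c_id.get()

async def handler(flow_id):
    c_id.set(flow_id)  # visible to everything awaited from here on
    return await do_work()

async def main():
    # asyncio.gather wraps each coroutine in its own task, and every task
    # runs in a copy of the current context, so the two IDs don't clash.
    return await asyncio.gather(handler("req-1"), handler("req-2"))

results = asyncio.run(main())
print(results)  # ['req-1', 'req-2']
```

This per-task isolation is exactly why contextvars (unlike a module-level global) is safe for concurrent request handling.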

Related

What is PreserveThreadContext() when calling async functions in Kephas?

I noticed in all Kephas examples that, when invoking async methods, at the end there is a call to PreserveThreadContext(). What does this do?
Some example:
var result = await dataContext.Query<Document>()
.ToListAsync()
.PreserveThreadContext();
I know about ConfigureAwait(false), is this something similar?
In a way, yes: in a server environment it also includes a call to ConfigureAwait(false). But it additionally restores the thread-bound culture (and UI culture) upon returning from the async call, so that strings can be localized in a consistent way. This is because you may find yourself on another thread upon returning, where the culture is the default one, not the configured one.
Also, you can add your own behaviors for storing/restoring other thread-bound information.
For this purpose, check the class https://github.com/kephas-software/kephas/blob/master/src/Kephas.Core/Application/PreserveCultureThreadContextAppLifecycleBehavior.cs, which adds the culture-preservation behavior. Typically, you would implement this in an AppLifecycleBehavior, in the BeforeAppInitializeAsync method.

Is there a static way to get current request/context in Go?

I want my logging methods to output some sort of request ID. To do that I need to pass a request object (or some derivative) to the logging functions.
This is all fine when logging from the request handlers, but it is problematic when logging from within library-style methods, as I don't wish those to be aware of any ongoing HTTP requests.
Furthermore, those same library methods might be used in the future by console applications, and then the request ID would be replaced by some sort of worker-thread ID.
Using a context solves the problem, but this means I will have to add a context parameter to all my methods, which is somewhat annoying.
So, basically, what I need is some sort of static storage that is passed between method and goroutine calls.
I'm not sure there's anything like that in Go, so maybe my approach is totally off-base, in that case, I would be happy to hear what is a better approach to solve the above problem.
Thanks,
Try another logging library. For instance, in log15 (https://github.com/inconshreveable/log15) a Logger has an embedded key/value context.
logger := log.New("request_id", "8379870928")
You can pass the logger object to anyone that needs to log. Later on:
logger.Warn("blabalbla")
... will embed the request_id that you have set.

best practice to get the default object wrapper?

When creating a custom method, I implement TemplateMethodModelEx and return a SimpleSequence object.
According to the API, I should use this constructor:
SimpleSequence(ObjectWrapper wrapper)
Since I set incompatibleImprovements to 2.3.24, the docs say I can simply use the Configuration instance's getObjectWrapper(). My problem is that when implementing TemplateMethodModelEx, I have no access to the current configuration unless I pass cfg to the method's constructor. Then the root.put would look like:
root.put("getMeList", new GetMeListMethod(cfg));
This looks odd to me. I wonder what the right way is to construct this kind of SimpleSequence model, and what the right way is to get the default object wrapper.
Thanks a lot
You should pass in the ObjectWrapper as the constructor parameter. (It's unrelated to incompatibleImprovements 2.3.24.) Any TemplateModel that creates other TemplateModel-s (like TemplateSequenceModel-s, TemplateHashModel-s, TemplateMethodModel-s) works like that. This is normally not apparent, because they are created by an ObjectWrapper. If you create the TemplateModel-s manually, however (which is fine), then you will face this fact.

Spring state machine builder using StateTransition object

I couldn't find any reference for this functionality. Shall I just implement a helper method in the builder that reads the fields of a StateTransition object and populates the chained configureTransitions() call myself?
Just asking to confirm, so I don't reinvent the wheel.
UPDATE:
I'm trying to use StateMachineBuilder to configure with some pre-defined states and transitions in a properties file. In Builder, they use this chained call to generate configuration:
builder.configureTransitions().withExternal().source(s1)....
What I have in mind is that everything read from the file is stored in an object, and the Spring State Machine library has this StateTransition object. But as far as I can tell from the API, there is no way to use it directly to configure a state machine. Instead, I can read the individual fields of the object and use the chained call above.
Thanks!
If you want to do it like that, what you mentioned is pretty much the only option. Hopefully we will get real support for external state machine definitions; this is tracked in https://github.com/spring-projects/spring-statemachine/issues/78.

Spring BatchSqlUpdate vs NamedParameterJdbcTemplate using named parameters

I have been using the BatchSqlUpdate class successfully for a while. The only annoyance in using it is that named parameters need to be registered before running any query, via the declareParameter or setParameter methods. This means that the types of the parameters have to be declared as well. However, Spring also provides a NamedParameterJdbcTemplate class, which has a very convenient batchUpdate method that takes named parameters as input (an array of maps or SqlParameterSource objects) without the need to declare them first. On top of that, this class can be reused easily, and I also believe it is thread-safe.
So I have a couple of questions about this:
What's the recommended way to perform (multiple) batch updates?
Why is this feature duplicated across two different classes that also behave differently?
Why does BatchSqlUpdate require declared parameters if NamedParameterJdbcTemplate does not?
Thanks for the thoughts!
Giovanni
After doing some research, I reached the following conclusions.
First of all, I realized that the NamedParameterJdbcTemplate class is the only one accepting named parameters for batch updates. The method batchUpdate(String sql, Map[] batchValues) was added in Spring 3 to achieve this.
The BatchSqlUpdate class contains an overridden update(Object... params) method that adds the given statement parameters to the queue rather than executing them immediately, as stated in the javadoc. This means that the statements are executed only when the flush() method is called or the batch size exceeds the maximum. This class doesn't support named parameters, although it inherits an updateByNamedParam() method from SqlUpdate. This is unfortunate, since that method allows reuse of the same map for the named parameters, whereas the NamedParameterJdbcTemplate.batchUpdate() method requires an array of maps, with the associated overhead, code bloat, and complications in reusing the array of maps if the batch size is variable.
I think it would be useful to have an overridden version of updateByNamedParam(Map paramMap) in BatchSqlUpdate that behaves just like update(Object... params), but with added support for named parameters.
