Kafka Streams transform() state store - apache-kafka-streams

I have a use case where I need to use headers in the DSL, so I used transformValues(), but I'm not doing a stateful transformation; it's more decision making based on the headers.
I need to pass the state store name in this function. Is there any alternative to providing a state store with some default or dummy value?

I need to pass the state store name in this function
That is not correct. The signature of KStream#transformValues() is
<VR> KStream<K, VR> transformValues(final ValueTransformerSupplier<? super V, ? extends VR> valueTransformerSupplier,
                                    final String... stateStoreNames);
Note that the second argument is a var-arg, so you don't need to provide any value for it. Hence, you can call transformValues() with a single argument.
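For example, a minimal sketch (the topic, header name, and transformer body are made up for illustration) that reads headers from the ProcessorContext and never registers or references a state store:

import org.apache.kafka.common.header.Header;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.ValueTransformer;
import org.apache.kafka.streams.processor.ProcessorContext;

StreamsBuilder builder = new StreamsBuilder();
KStream<String, String> input = builder.stream("input-topic");

// No state store names passed: the var-arg is simply left empty.
KStream<String, String> output = input.transformValues(() -> new ValueTransformer<String, String>() {
    private ProcessorContext context;

    @Override
    public void init(final ProcessorContext context) {
        this.context = context; // gives access to context.headers()
    }

    @Override
    public String transform(final String value) {
        // Pure decision making based on headers, no state involved.
        Header flag = context.headers().lastHeader("my-flag");
        return flag != null ? value.toUpperCase() : value;
    }

    @Override
    public void close() { }
});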

Related

How do I handle null values during model binding

I have the following URL:
http://localhost:7975/test?parameter01=X
In my model, parameter01 is a List<int?>. If a non-integer value (e.g., a string) is passed to this parameter, the model binding process sets this value to null.
How do I intercept this as early as possible in the pipeline so I can return an HTTP status and description without handling this condition in the controller action?
Since you have tagged the question with asp.net-web-api2, I would recommend using Attribute Routing, which enables you to constrain the parameter type. With this you'd be able to switch handling according to the validity of your input. You can read up on this here.
A second possibility would be to write an HTTP handler that tests for valid input information. This one might be a bit trickier.

How to use a response field in a new request in JMeter

I have this issue that I want to solve.
I want to create a new HTTP request using a field from a previous response:
I send a request.
I used a JSON Extractor to store the response string in a variable (let's call this string nurl).
I used a Regular Expression Extractor and put the field that I want into "Reference Name"
(meaning from nurl I just want tt_cid).
Now I want to make a new call and use that field tt_cid in the new call.
How shall I reference tt_cid? Since it is not defined under User Defined Variables,
when I use tt_cid, I don't think JMeter knows it, since it is not written there; I just pulled it from the response.
I've provided a picture of what I have done.
Regards to you all
Short answer: call it ${tt_cid}.
Since it is not defined under User Defined Variables, when I use tt_cid,
I don't think JMeter knows it,
For your understanding, add a Debug Sampler after the Regular Expression Extractor:
you will see all your JMeter variables, including tt_cid, which can be referenced like any other variable as ${tt_cid} inside other Samplers.
It's called Reference Name and not Variable Name because it's a bit more involved: you should read up on JMeter's Regular Expression Extractor to understand how it works internally, but basically it saves more than just one variable.
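For example (the endpoint and parameter names below are made up), the Path field of the next HTTP Request sampler could simply contain:
/next/call?cid=${tt_cid}
JMeter substitutes the extracted value at run time.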

Apache NiFi to split data based on condition

Our requirement is to split flow data based on a condition.
We thought of using the "ExecuteStreamCommand" processor for that (internally it would use a Java class), but it gives a single flow data file only. We would like to have two flow data files: one for the matched criteria and another for the unmatched.
I looked at the "RouteText" processor, but it has no feature to use a Java class as part of it.
Let me know if anyone has any suggestions.
I think you could use GetMongo to read those definition values and store them in a map accessed by DistributedMapCacheClientService, then use RouteOnContent to route the incoming flowfiles based on the absence/presence of the retrieved values.
If that doesn't work, you could instead route the query result from GetMongo to PutFile and then use ScanContent, which reads from a dictionary file on the file system and routes flowfiles based on the absence/presence of those keywords in the content.
Finally, if all else fails, you can use ExecuteScript to combine those steps into a single processor and route to matched/unmatched relationships. It processes Groovy code easily, so you can directly invoke your existing Java class if necessary.
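A rough Groovy sketch of that last option, assuming a simple keyword check on the flowfile content (the keyword is made up, and ExecuteScript's built-in success/failure relationships stand in for matched/unmatched):

import org.apache.nifi.processor.io.InputStreamCallback

def flowFile = session.get()
if (flowFile == null) return

// Read the flowfile content and decide whether it matches the criteria.
def matched = false
session.read(flowFile, { inputStream ->
    def text = inputStream.getText('UTF-8')
    // Replace this check with a call into your existing Java class if needed.
    matched = text.contains('SOME_KEYWORD')
} as InputStreamCallback)

// ExecuteScript only exposes success and failure, so use them as matched/unmatched.
session.transfer(flowFile, matched ? REL_SUCCESS : REL_FAILURE)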

Laravel - Modify Received Request Parameters

When we receive a Request object in Laravel, is there a way to modify or add data to it? For instance, could I rename a parameter (not the value, but the parameter name itself) to something else? For example, the input might be called fname but I want to change it to first_name. Or could I add new inputs and values that weren't in the original request?
The reason I ask is that I have a method that accepts a Request object, and expects certain input names. I'd like to be able to reuse the method, but the request input names will be different.
If you have an object, you can edit it and add new items:
$request->url = $new_url;
$request->new_item = 1;
If the item does not exist on the object, it will be created automatically; if it exists, it will be modified.
I tested @marc-garcia's answer, and that will not persist through your script execution. This will:
// Merge defaults into the request.
// This makes it consistent everywhere (Blade, controller, ...).
request()->merge([
    // Read the input if it exists; the second parameter is the default value.
    'reservable' => request('reservable', (self::RESERVABLE_BY_DEFAULT ? 1 : 0)),
]);
You may also use request()->replace([...]); but that will remove all other parameters from the request and replace them with the array you provide.
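Applied to the original question, a minimal sketch (using the fname/first_name names from the question) for renaming an input before passing the request on might look like this:

// Copy the incoming 'fname' input to the name the reusable method expects...
$request->merge([
    'first_name' => $request->input('fname'),
]);

// ...and optionally drop the old key so only 'first_name' remains.
$request->replace($request->except('fname'));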

Drupal6: Load a field, like one would load a node?

Is there a field_load() function equivalent to node_load()? I want to get information about the type of a field and other validation constraints without going to the database myself.
Better yet, is there any function that will validate it for me, like is_valid_for_field(field_name, input), that would take a field name and a potential input and return a boolean indicating whether or not the potential input is valid (within min/max, etc) for the specified field?
There is the content_fields() function, which will get you the metadata for a field. In terms of validation, IIRC, you can call content_field() with the operation set to validate and the relevant data. However, by calling node_save() with your completed node, the CCK module will take care of all the relevant validation hooks for the entire node structure, so you may be better off taking that route.
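For reference, a minimal sketch of the content_fields() suggestion (the field and content type names are hypothetical):

// Drupal 6 / CCK: load the field definition without querying the database yourself.
$field = content_fields('field_price', 'product');

// The returned array includes the field type, widget settings and
// validation-related settings (e.g. min/max for number fields).
$type = $field['type'];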

Resources