Get parameter contexts from a NiFi custom processor - apache-nifi

I need to create a NiFi (v1.17.0) custom processor with a property that has several possible values. These values are generated during the execution of the previous processors in the flow.
The flow is as follows:
Processor A: generates a YAML file.
Processor B (the one I've been trying to create): reads the values from the previously generated YAML file and uses them to populate a property so the user can select the one they need. The processing is then adapted depending on the selected value.
I've found a solution based on the property method identifiesControllerService(DistributedMapCacheClient.class). It offers the possibility of having a property based on a dropdown menu with several possible choices (using a controller service or parameter contexts).
My idea is to use a Parameter Context available to the processor, so I would like to add the values I need to a Parameter Context. This has to be done in Processor A (which is also a custom processor) so that they can be used in the property of Processor B.
But I haven't found any way to add a parameter to the Parameter Context programmatically in the onTrigger method of Processor A. How can I add this parameter?
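For reference, this is roughly what a controller-service-backed property looks like in a custom processor (a minimal sketch; the property name, display name, and description are illustrative):

import org.apache.nifi.components.PropertyDescriptor;
import org.apache.nifi.distributed.cache.client.DistributedMapCacheClient;

// Renders in the UI as a dropdown listing all enabled DistributedMapCacheClient services
public static final PropertyDescriptor CACHE_CLIENT = new PropertyDescriptor.Builder()
        .name("distributed-map-cache-client")
        .displayName("Distributed Map Cache Client")
        .description("Controller service used to look up the values produced by Processor A")
        .identifiesControllerService(DistributedMapCacheClient.class)
        .required(true)
        .build();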

Related

JMeter Custom Plugin Variable Substitution

Context
I am developing a custom JMeter plugin which generates test data dynamically from a tree-like structure.
The editor for the tree generates GUI input fields as needed, and therefore I have no set of defined configuration properties which are set in the respective TestElement. Instead, I serialize the tree as a whole in the GUI class, set the result as one property and deserialize it in the config element where it is processed further during test execution.
Problem
This works just fine, except that JMeter variable/function expressions like ${foo} or ${__bar(..)} in the dynamic input fields are not evaluated. As far as I understand the JMeter source code, the evaluation is triggered somehow when the respective property setters in org.apache.jmeter.testelement.TestElement are used, which is not possible for my plugin.
Unfortunately, I was not able to find a proper implementation which can be used in my config element to evaluate such expressions explicitly after deserialization.
Question
I need a pointer to JMeter source code or documentation for evaluating variable/function expressions explicitly.
After I managed to set up the JMeter project properly in my IDE, I found org.apache.jmeter.engine.util.CompoundVariable, which can be used like this:
CompoundVariable compoundVariable = new CompoundVariable();
compoundVariable.setParameters("${foo}");
// execute() returns the value of the expression in the current context
String value = compoundVariable.execute();

NiFi: Reading external properties in a custom Processor

I have updated the variable registry to point to a custom properties file, and I am able to read those variables in my processors using Expression Language without any issues.
However, I want to read them in my custom processor's onTrigger() (the processor extends AbstractProcessor).
I tried flowFile.getAttributes() and context.getAllProperties(), but the variables are not getting picked up.
I'd appreciate any input.
Thanks
To clarify, you want to reference the value of these externally-defined variables inside the application logic of your CustomProcessor#onTrigger() method?
You can:
Option 1: Load the variable definitions by querying NiFiProperties#getVariableRegistryProperties() or NiFiProperties#getVariableRegistryPropertiesPaths(). Once you have a reference to the variable definitions, you can parse and use them as you wish.
Option 2: Reference them via flowfile attributes or processor properties, if those attributes or properties support Expression Language and the property is appropriately scoped. The PropertyDescriptor declares this via expressionLanguageSupported(), which takes an ExpressionLanguageScope, an enum consisting of NONE, VARIABLE_REGISTRY, and FLOWFILE_ATTRIBUTES (which also includes the variable registry).
I don't understand the scenario where you want your code to load custom variables that aren't controllable by the flow administrator, which would normally be populated via processor properties or flowfile attributes. If you really feel you need to access custom variables that aren't available via the context or flowfile, you can use Option 1 above, but you could also theoretically store those values in environment variables, System properties, etc.
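A minimal sketch of Option 2, assuming a custom property named "greeting" whose value references a variable from the registry (the property name, description, and variable name are illustrative):

import org.apache.nifi.components.PropertyDescriptor;
import org.apache.nifi.expression.ExpressionLanguageScope;
import org.apache.nifi.processor.util.StandardValidators;

public static final PropertyDescriptor GREETING = new PropertyDescriptor.Builder()
        .name("greeting")
        .description("May reference a registry variable, e.g. ${my.custom.var}")
        .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
        .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
        .required(false)
        .build();

// inside onTrigger(): evaluateAttributeExpressions() resolves the expression
// against the variable registry before the value is returned
String resolved = context.getProperty(GREETING).evaluateAttributeExpressions().getValue();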

Using derived attributes in the same processor in Apache NiFi

I am adding two attributes in an UpdateAttribute processor, where one depends on the other.
For example, say I create an attribute a = "Hello" and another attribute b = ${a}. Then the value of "b" is set to an empty string. Is there any way in NiFi to use the value of "a" in the same processor, or do I always need to create a new processor to use it?
Currently each UpdateAttribute property is evaluated independently of every other configured property (on the same processor). So in order to use "a" to create another property "b", you'd need to add a second UpdateAttribute processor.

Multi-value option in Apache NiFi Processor

I was designing my own custom processor. I added a couple of simple property descriptors to it with simple non-empty validators. I am looking for a validator by which I can allow multiple values in one property descriptor, something like below.
My property descriptor would have a multi-value selection option.
Does anyone know how I can achieve this?
Multi-value selection for a single property descriptor is not supported. I'd be curious to better understand the use case. Of course, you can have many properties and even support dynamically generated (at runtime) properties.
Thanks
Joe
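For the dynamically generated properties mentioned above, a custom processor accepts user-defined properties by overriding getSupportedDynamicPropertyDescriptor, roughly like this (a minimal sketch):

import org.apache.nifi.components.PropertyDescriptor;
import org.apache.nifi.processor.util.StandardValidators;

@Override
protected PropertyDescriptor getSupportedDynamicPropertyDescriptor(final String propertyDescriptorName) {
    // Any property name the user adds in the UI becomes a valid, non-empty dynamic property
    return new PropertyDescriptor.Builder()
            .name(propertyDescriptorName)
            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
            .dynamic(true)
            .build();
}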

How to pass different sets of data to two different mappers of the same job

I have a single Mapper, say SingleGroupIdentifierMapper.java.
This is a generic mapper which does all the filtering on a single line of mapper input (value/record) based on a property file (containing filters and key-value field indexes) passed to it from the driver class using the distributed cache.
Only the reducer business logic is different; the mapper logic is kept generic and is driven by the property file as mentioned above.
My problem is that I now have input from multiple sources, with different formats. That means I have to do something like:
MultipleInputs.addInputPath(conf, new Path("/inputA"),TextInputFormat.class, SingleGroupIdentifierMapper.class);
MultipleInputs.addInputPath(conf, new Path("/inputB"),TextInputFormat.class, SingleGroupIdentifierMapper.class);
But the cached property file which I pass from the driver class to the mapper (to implement the filters based on field indexes) is shared. So how can I pass two different property files to the same mapper, such that if it processes, say, input A, it uses PropertyFileA (to filter and create key-value pairs), and if it processes, say, input B, it uses PropertyFileB?
It is possible to change the generic code of the mapper to take care of this scenario, BUT how should this be approached in the generic class, and how can the same mapper class identify whether the input is from inputA or inputB and apply the corresponding property file configuration to the data?
Can we pass arguments to the constructor of this mapper class to specify whether the input is from inputB, or which property file in the cache it needs to read?
E.g. something like:
MultipleInputs.addInputPath(conf, new Path("/inputB"),TextInputFormat.class, args[], SingleGroupIdentifierMapper.class);
where args[] is passed to the SingleGroupIdentifierMapper class's constructor, which we define to take it as input and set it as an attribute.
Any thoughts or expertise are most welcome.
I hope I was able to express my problem clearly; kindly ask me in case the question needs more clarity.
Thanks in advance,
Cheers :)
Unfortunately, MultipleInputs is not that flexible. But there is a workaround: match InputSplit paths to the property files in the setup method of the Mapper. If you are not using any sort of Combine*Format, then a single mapper will process a single split from a single file:
When adding the property files to the distributed cache, use /propfile_1#PROPS_A and /propfile_2#PROPS_B.
Add the input paths to the configuration: job.getConfiguration().set("PROPS_A", "/inputA") and job.getConfiguration().set("PROPS_B", "/inputB").
In the Mapper.setup(Context context) method, use context.getInputSplit().toString() to get the path of the split, then match it against the paths saved in context.getConfiguration().get("PROPS_A") or "PROPS_B" (see the sketch after this list).
If you are using some Combine*Format, then you would need to extend it and override getSplits to use information from the JobContext to build the PathFilter[] and call createPool, which will create splits containing files from the same group (inputA or inputB).
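A minimal sketch of the setup step above, assuming the PROPS_A/PROPS_B symlinks and configuration keys from the list (the class name follows the question; field names and input/output types are illustrative):

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class SingleGroupIdentifierMapper extends Mapper<LongWritable, Text, Text, Text> {

    private String propsSymlink;  // which cached property file this task should read

    @Override
    protected void setup(Context context) throws IOException, InterruptedException {
        Configuration conf = context.getConfiguration();
        // For FileSplit-based formats this looks like "hdfs://nn/inputA/part-00000:0+1234"
        String splitPath = context.getInputSplit().toString();

        if (splitPath.contains(conf.get("PROPS_A"))) {
            propsSymlink = "PROPS_A";  // symlink created by adding /propfile_1#PROPS_A to the cache
        } else if (splitPath.contains(conf.get("PROPS_B"))) {
            propsSymlink = "PROPS_B";
        } else {
            throw new IOException("Could not match split " + splitPath + " to a property file");
        }
        // load the filters and key-value field indexes from new File(propsSymlink) here
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // apply the filters loaded in setup() to this record and emit key-value pairs
    }
}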
