Multi-Value Option in an Apache NiFi Processor

I was designing my own custom processor. I added a couple of simple property descriptors with simple non-empty validators. Now I am looking for a way to let a single property descriptor offer a multi-value selection, so the user can pick several values for one property.
Does anyone know how I can achieve this?

Multi-value selection for a single property descriptor is not supported. I'd be curious to better understand the use case. Of course, you can have many properties, and even support dynamically generated (runtime) properties; see the sketch below.
Thanks
Joe
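
A minimal sketch of that dynamic-property approach, assuming a processor that extends AbstractProcessor (the names and validator choice are illustrative):

    import org.apache.nifi.components.PropertyDescriptor;
    import org.apache.nifi.processor.util.StandardValidators;

    // Accept user-defined (dynamic) properties, so the user can add as many
    // name/value pairs as needed instead of packing several values into one
    // property descriptor.
    @Override
    protected PropertyDescriptor getSupportedDynamicPropertyDescriptor(final String propertyDescriptorName) {
        return new PropertyDescriptor.Builder()
                .name(propertyDescriptorName)
                .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
                .dynamic(true)
                .build();
    }

    // In onTrigger(), the dynamic values can then be collected:
    // for (Map.Entry<PropertyDescriptor, String> entry : context.getProperties().entrySet()) {
    //     if (entry.getKey().isDynamic()) {
    //         String name = entry.getKey().getName();
    //         String value = entry.getValue();
    //     }
    // }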

Related

Get parameter contexts from a NiFi custom processor

I need to create a NiFi (v1.17.0) custom processor with a property having several possible values. These values are generated during the execution of the previous processors in the flow.
The flow is as follows:
Processor A: generates a YAML file.
Processor B (the one I've been trying to create): reads the values from the YAML file generated previously and uses them to populate a property, so the user can select the one they need. Processing is then adapted depending on the selected value.
I've found a solution based on the property method identifiesControllerService(DistributedMapCacheClient.class). It offers the possibility of having a property backed by a dropdown menu with several possible choices (using a controller service or parameter contexts).
My idea is to use a parameter context available to the processor, adding the values I need to that parameter context. This has to be done in Processor A (which is also a custom processor) so that they can be used in the property of Processor B.
But I haven't found any way to add a parameter to the parameter context programmatically in the onTrigger method of Processor A. How can I add this parameter?
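
For context, a minimal sketch of the identifiesControllerService approach described above (the property name and description are illustrative); declaring the property this way is what makes NiFi render it as a dropdown of all matching controller services:

    import org.apache.nifi.components.PropertyDescriptor;
    import org.apache.nifi.distributed.cache.client.DistributedMapCacheClient;

    // A property that identifies a controller service: the UI shows a dropdown
    // listing every DistributedMapCacheClient configured on the canvas.
    public static final PropertyDescriptor CACHE_CLIENT = new PropertyDescriptor.Builder()
            .name("distributed-cache-client") // illustrative name
            .displayName("Distributed Cache Client")
            .description("Cache client used to share values produced by Processor A.")
            .identifiesControllerService(DistributedMapCacheClient.class)
            .required(true)
            .build();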

NiFi: Reading external properties in a custom processor

I have updated the variable registry to point to a custom properties file, and I am able to read those variables in my processors using Expression Language without any issues.
However, I want to read them in my custom processor's onTrigger() (the processor extends AbstractProcessor).
I tried flowFile.getAttributes() and context.getAllProperties(), but the variables are not picked up.
Appreciate any inputs.
Thanks
To clarify, you want to reference the value of these externally-defined variables inside the application logic of your CustomProcessor#onTrigger() method?
You can do one of the following:
1 - Load the variable definitions by querying NiFiProperties#getVariableRegistryProperties() or NiFiProperties#getVariableRegistryPropertiesPaths(). Once you have a reference to the variable definitions, you can parse and use them as you wish.
2 - Reference them via flowfile attributes or processor properties, provided those attributes or properties support Expression Language and it is appropriately scoped. The PropertyDescriptor exposes this via expressionLanguageSupported(), which returns an ExpressionLanguageScope, an enum consisting of NONE, VARIABLE_REGISTRY, and FLOWFILE_ATTRIBUTES (the latter also includes the variable registry).
I don't understand the scenario where you would want your code to load custom variables that aren't controllable by the flow administrator via processor properties or flowfile attributes. If you really feel you need to access custom variables that aren't available via the context or flowfile, you can use option 1 above, but you could also theoretically store those values in environment variables, System properties, etc.
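
As a sketch of option 2, assuming a property whose value is set to something like ${my.custom.var} in the UI (the property name and variable are illustrative):

    import org.apache.nifi.components.PropertyDescriptor;
    import org.apache.nifi.expression.ExpressionLanguageScope;
    import org.apache.nifi.processor.util.StandardValidators;

    // A property scoped to the variable registry, so a value such as
    // ${my.custom.var} is resolved from the externally defined variables.
    public static final PropertyDescriptor MY_PROPERTY = new PropertyDescriptor.Builder()
            .name("my-property") // illustrative name
            .displayName("My Property")
            .expressionLanguageSupported(ExpressionLanguageScope.VARIABLE_REGISTRY)
            .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
            .required(true)
            .build();

    // Then, inside onTrigger():
    // String resolved = context.getProperty(MY_PROPERTY)
    //         .evaluateAttributeExpressions() // resolves against the variable registry
    //         .getValue();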

How to push a value of an unchanged field into the target in a plugin's input parameters?

I'm deleting an instance of an entity and, depending on the value of an option set in it, I wish to carry out a different course of action. The problem is that the field isn't changed and is therefore not provided in the plugin's Target.
How can I easily tell the stupid plugin to fetch all the fields?
The way I do it now is to use the pre-image, but I'll be showing the plugin to some rookies and they will definitely not like it. And they won't believe me that this is the way to go, because they're a cocky bunch.
Is there a work-around for that?
Using the pre-image is the suggested approach in this scenario; the alternative is to instantiate a service factory to get an IOrganizationService and retrieve the entity using the Target's Id.
The pre-image is part of the IPluginExecutionContext (of which Target is also one part). I think beginners get confused if they think of Target as anything more than a property of IPluginExecutionContext.
It wouldn't make sense to have these unchanged values as part of Target, because that would amount to updating each field to its current value; if you forced them into Target, you would see the update in the audit details.
Thus CRM has PreEntityImages, Target, and PostEntityImages; if Target were used the way "they" want, it would not be possible to differentiate between the values being updated, the previous values, and the final state of the entity.

How to define templates for Spring Batch jobs?

I'm using Spring Batch 2.1.5. I have many jobs that are very similar to one another, and I'm looking for a way to have a smaller XML file acting as a job template.
The things shared among the jobs are the readers, processors, writers, and one tasklet. Some of the parameters for each of these beans are also the same; for instance, they all use the same data source.
I have thought about four approaches, and three of them don't work:
1 - Using a postprocessor to add the common beans and attributes as default values is not possible, because the Spring Batch class JobParserJobFactoryBean is not a public class.
2 - Adding an XML extension seems to be the wrong thing, because I'm not adding any custom tags to the XML file.
3 - Using a PropertyOverrideConfigurer, I can put default values into properties, but I have to define those values for each bean, so I'll have many repeated values and will only move the problem into the properties file.
4 - Using some kind of custom factory bean. This seems to be the only remaining choice, but I don't know exactly how to plug it into the existing code.
Has anybody tried to do this? Can somebody give tips or recommend resources on how to do it?
Spring Batch provides the ability to define abstract jobs (and steps, etc.) to inherit from. Take a look at the reference manual for more information. Using this, you should be able to accomplish exactly what you are looking for.
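
For illustration, a minimal sketch of that inheritance mechanism using the batch XML namespace (the bean ids are illustrative): an abstract parent declares the shared pieces, and concrete jobs point at it through the parent attribute.

    <!-- Abstract parent job: never instantiated on its own, only inherited from. -->
    <job id="parentJob" abstract="true">
        <listeners>
            <listener ref="sharedListener"/>
        </listeners>
    </job>

    <!-- Concrete job: inherits the parent's listener and any other settings. -->
    <job id="concreteJob" parent="parentJob">
        <step id="step1">
            <tasklet ref="sharedTasklet"/>
        </step>
    </job>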

Validate a Collection Has at Least One Item using Validation Application Block

Using the Enterprise Library 4.1 Validation Application Block, how can I validate that a collection property contains at least one item?
I'm assuming you mean out of the box. If so, then I don't think there is a way to directly validate the number of items in a collection.
These are some other ways you could try:
1 - Decree that you only deal with null collections, never empty collections, and use a Not Null Validator. Not practical, though.
2 - Use self-validation and have the object validate in code that the collection(s) contain the correct number of items. This will work, but it's nice to have the validation in the configuration file.
3 - Expose the collection count as a property. Assuming an employee collection, for example, this could be done with an EmployeeCount property on the object that contains the collection, or you could create your own custom collection that exposes a Count property. Then you could use a Range Validator on that Count property.
4 - Create a custom validator that can validate the number of items in a collection -- something like CollectionCountRangeValidator.
If I wanted to develop something quickly, I would probably go with option 3. However, option 4 fits in well with the Enterprise Library approach and also allows your class design to be independent of the validation requirements. Plus you could always reuse it on your next project. :) And does anyone really miss creating their own collections when a List will do nicely?
This is already implemented in EntLib Contrib; it is called CollectionCountValidator.
