I am using the IBM Integration Toolkit to implement a WMB message flow. Can logging be done from within this node, or do I need to use an additional Java Compute node for this purpose? Any references would be helpful.
The only way I am aware of implementing such functionality is to add SLF4J (or any other Java logging framework of your preference) to the IIB runtime classpath and then call LOGGER.log() in your Java Compute nodes exactly as you would in plain Java. You can also create static wrappers around your log() methods so they can be called from ESQL Compute nodes.
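For illustration, here is a minimal sketch of what that can look like inside a Java Compute node, assuming SLF4J (plus a binding such as Logback or Log4J) has already been added to the IIB runtime classpath; the class name and log messages are only examples:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.ibm.broker.javacompute.MbJavaComputeNode;
import com.ibm.broker.plugin.MbException;
import com.ibm.broker.plugin.MbMessageAssembly;
import com.ibm.broker.plugin.MbOutputTerminal;

public class LoggingJavaCompute extends MbJavaComputeNode {

    private static final Logger LOGGER = LoggerFactory.getLogger(LoggingJavaCompute.class);

    // Static wrapper so the same logging can also be reached from ESQL
    // via an external function/procedure declaration.
    public static void logInfo(String message) {
        LOGGER.info(message);
    }

    @Override
    public void evaluate(MbMessageAssembly assembly) throws MbException {
        LOGGER.info("Message received by flow");      // plain SLF4J call, as in any Java code
        logInfo("Forwarding message to the out terminal");

        MbOutputTerminal out = getOutputTerminal("out");
        out.propagate(assembly);                      // pass the message on unchanged
    }
}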
You can use the built-in Trace node, which can write records to the user trace file, to another file, or to the local error log (which contains error and information messages written by all other IBM® Integration Bus components).
For example, the Trace node can be used to log response times.
Here are the steps:
Download whatever version you want of Log4J.
Create a Java project and import the JAR files into it, then add the JARs to the project classpath (open the Java perspective, right-click the project, select Properties, then Java Build Path, select the Libraries tab, and add the JARs).
Create a library and add the JavaProject to it.
Create an ESQL file in that library and declare an external function. Below is an example of what it should look like
CREATE FUNCTION myProc1( IN P1 INTEGER, OUT P2 INTEGER, INOUT P3 INTEGER )
RETURNS INTEGER
LANGUAGE JAVA
EXTERNAL NAME "com.ibm.broker.test.MyClass.myMethod1";
For further elaboration, you can refer to the Knowledge Center page:
https://www.ibm.com/support/knowledgecenter/en/SSMKHH_9.0.0/com.ibm.etools.mft.doc/ak04960_.htm
Now you can call the ESQL function that you mapped in that ESQL file.
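To tie the steps together, here is a hedged sketch of what the Java side of a Log4J logging wrapper might look like; the package, class, and method names are illustrative, and the matching ESQL declaration (CHARACTER maps to java.lang.String) is shown in the comment:

package com.acme.logging; // illustrative package name

import org.apache.log4j.Logger;

public class FlowLogger {

    private static final Logger LOGGER = Logger.getLogger(FlowLogger.class);

    // Callable from ESQL with a declaration along the lines of:
    //   CREATE PROCEDURE logInfo ( IN message CHARACTER )
    //   LANGUAGE JAVA
    //   EXTERNAL NAME "com.acme.logging.FlowLogger.logInfo";
    public static void logInfo(String message) {
        LOGGER.info(message);
    }
}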
We're using Avro for (de)serialization of messages that flow through a message broker. For storing the Avro schemas, a schema registry (Apicurio) is used. This provides two benefits: schema validation and compatibility validation. However, I'm wondering if there is a way to bypass the schema registry and achieve the same locally, using a script/plugin. Validating whether an Avro schema is syntactically/semantically valid should be possible. The same applies to compatibility validation, since checking whether a new schema version is backward/forward compatible against a list of other schemas (the previous versions) also sounds doable locally.
Is there a library that does that? Ideally a Gradle plugin, but a Java/Python library would do as well, as it can easily be called from a Gradle task.
I believe this is confluent's Java class for checking schema compatibility within its schema registry:
https://github.com/confluentinc/schema-registry/blob/master/core/src/test/java/io/confluent/kafka/schemaregistry/avro/AvroCompatibilityTest.java
You can use it to validate schemas locally.
Expedia has used it as a basis to create their own compatibility tool:
https://github.com/ExpediaGroup/avro-compatibility
I could not find a plugin that does just what you ask for. Plugins seem to aim at generating classes from the schema files (e.g. https://github.com/davidmc24/gradle-avro-plugin). Without getting into why you want to do this, I think you could use this simple approach (How do I call a static Java method from Gradle) to hook your custom code into Gradle and check for schema validity and compatibility.
Refer to the following Avro Java API classes:
https://avro.apache.org/docs/current/api/java/org/apache/avro/SchemaCompatibility.html
https://avro.apache.org/docs/current/api/java/org/apache/avro/SchemaValidatorBuilder.html
Also, this particular class can be helpful for running validation against a schema:
https://github.com/apache/avro/blob/master/lang/java/tools/src/main/java/org/apache/avro/tool/DataFileReadTool.java
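As a rough illustration of those two classes, here is a minimal sketch of local syntax and compatibility checks using only the plain Avro library; the schema definitions are made up for the example:

import java.util.List;

import org.apache.avro.Schema;
import org.apache.avro.SchemaCompatibility;
import org.apache.avro.SchemaValidationException;
import org.apache.avro.SchemaValidator;
import org.apache.avro.SchemaValidatorBuilder;

public class LocalSchemaChecks {

    public static void main(String[] args) throws SchemaValidationException {
        // Parsing already gives you syntactic validation: Schema.Parser throws
        // on an invalid schema definition.
        Schema v1 = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"User\",\"fields\":["
          + "{\"name\":\"id\",\"type\":\"string\"}]}");
        Schema v2 = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"User\",\"fields\":["
          + "{\"name\":\"id\",\"type\":\"string\"},"
          + "{\"name\":\"age\",\"type\":[\"null\",\"int\"],\"default\":null}]}");

        // Pairwise check: can a reader using v2 read data written with v1?
        SchemaCompatibility.SchemaPairCompatibility pair =
            SchemaCompatibility.checkReaderWriterCompatibility(v2, v1);
        System.out.println("v2 reads v1: " + pair.getType());

        // Backward compatibility of the new version against all previous versions;
        // throws SchemaValidationException if any pair is incompatible.
        SchemaValidator backward = new SchemaValidatorBuilder()
            .canReadStrategy()
            .validateAll();
        backward.validate(v2, List.of(v1));
    }
}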
I am using NiFi to chain several API calls. I would like to make my flow more configurable by setting the API keys/endpoints in an external configuration file (for example JSON, or even the nifi.properties file).
How can I use the information in this config file in the properties of my processors?
Thank you in advance!
Currently the easiest way to do this is by setting values in bootstrap.conf which are then available through NiFi expression language. For example, if you created a new java arg like:
java.arg.15=-DmyProperty=myValue
Then in your processor, your properties need to support expression language. This is done on the property descriptor builder:
.expressionLanguageSupported(true)
Then from the UI you would set the value of that property to ${myProperty}
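For context, here is a hedged sketch of how such a property might be declared in a custom processor and read with expression language applied; the property and variable names are illustrative, and newer NiFi versions take an ExpressionLanguageScope argument instead of a boolean:

import org.apache.nifi.components.PropertyDescriptor;
import org.apache.nifi.processor.ProcessContext;
import org.apache.nifi.processor.util.StandardValidators;

public class ApiCallProcessorSupport {

    // Property declared with expression language support, so ${myProperty}
    // can be referenced in its value from the UI.
    public static final PropertyDescriptor API_KEY = new PropertyDescriptor.Builder()
        .name("API Key")
        .description("Key used to call the external API")
        .required(true)
        .expressionLanguageSupported(true)
        .addValidator(StandardValidators.NON_EMPTY_VALIDATOR)
        .build();

    // Evaluates the expression language, so a property value of ${myProperty}
    // resolves to the -DmyProperty=myValue set in bootstrap.conf.
    public static String resolveApiKey(ProcessContext context) {
        return context.getProperty(API_KEY).evaluateAttributeExpressions().getValue();
    }
}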
In a future release there will be a new capability that makes this a little easier: an external properties file that is loaded and made accessible from expression language, so you won't have to edit bootstrap.conf. For now, though, this is the approach.
I have an EJB which has as input argument and return value a JAXB mapped complex structure (with subclasses etc).
Now I want to deploy this on the Oracle Service Bus 11g. I can create a business proxy invoking the EJB, but only with basic types (int, ...).
How do I tunnel the XML between the EJB and OSB? Any advanced OSB information is appreciated, as I don't know much about it.
After playing around, it turns out that OSB supports (as far as I know, only) Apache XMLBeans. So if you declare parameters and return values of type org.apache.xmlbeans.XmlObject, it works. I did get some errors about DOM Level 3 not being implemented and some crashes in the Oracle DOM implementation, so I just use the XmlObject to create an XML string and then reparse it.
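A minimal sketch of that workaround (serialize the XmlObject to text and reparse it), assuming Apache XMLBeans is on the classpath:

import org.apache.xmlbeans.XmlException;
import org.apache.xmlbeans.XmlObject;

public class XmlObjectRoundTrip {

    // Serialize the incoming XmlObject to an XML string and reparse it,
    // so the result is backed by a fresh XMLBeans store rather than the
    // DOM implementation that caused the errors described above.
    public static XmlObject roundTrip(XmlObject incoming) throws XmlException {
        String xml = incoming.xmlText();
        return XmlObject.Factory.parse(xml);
    }
}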
@Euclides: I have XMLObject and XmlObject in my classpath. I need the second one (lower case). Thanks for the hint, anyway.
Is there a graphical tool, included in Fuse ESB or available outside it, that I can use for creating data transformations between source and target data elements?
There is currently no such tool out of the box with JBoss Fuse. It is on the roadmap for a future release (likely 6.2) and we have some work in progress code which needs to be finished first.
It will allow you to use the web console (based on hawtio, http://hawt.io/) to do the data transformation mapping, leveraging the Dozer library for the actual mapping implementation.
Is there a way to make WebSphere Application Server use the JAR placed inside the application's WEB-INF/libs folder and ignore the one available in the server's plugin folder?
I am using EMF in my application, and the version provided by the server doesn't include support for EMF GenericType, so I want to make the application use the JAR file inside the libs folder.
Thanks for any help
This is quite possible. You need to change the classloading order from parent first to parent last. This will cause the server runtime's version of the code to be consulted after your application's. This is documented in the Information Center.
Another approach you might want to consider is an isolated shared library. There is a good introduction to those in the IBM Education Assistant. They are more complex to set up, but using the parent-last classloading order can sometimes cause unexpected side effects that don't occur with isolated shared libraries.
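Once the classloading order (or the isolated shared library) is in place, a quick way to confirm which copy of a class actually won is to log where it was loaded from. A small sketch using only standard JDK calls, with an illustrative EMF class:

import java.net.URL;

import org.eclipse.emf.ecore.EcoreFactory; // illustrative: any class present in both JAR versions

public class ClassLoaderCheck {

    // Returns the location of the JAR a class was actually loaded from.
    public static URL locationOf(Class<?> clazz) {
        return clazz.getProtectionDomain().getCodeSource().getLocation();
    }

    public static void main(String[] args) {
        // In the deployed application you would log this from a servlet or
        // startup bean; the URL shows whether the WEB-INF copy or the
        // server's copy was picked up.
        System.out.println("EcoreFactory loaded from: " + locationOf(EcoreFactory.class));
    }
}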