JMS filtering using XPath

I have a requirement in which I need to filter JMS messages according to an XPath expression.
I tried to use a JCA property as shown below, but it did not pick up any JMS messages.
<property name="MessageSelector" value="JMS_BEA_SELECT('xpath','/ns1:books/ns1:book[2]/ns1:price/text()') = '20'"/>
I tried multiple variants in the value part, but none of them work.
Is there any syntax available for this, or any other way of doing this task?
NOTE: I do not want to use headers.

Usually, you cannot use XPath in message selectors with JMS. However, ActiveMQ supports XPath selectors: http://activemq.apache.org/selectors.html
Obviously, your messages must then be TextMessages containing XML bodies, as the documentation above suggests.
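If you go the ActiveMQ route, it can help to sanity-check the XPath expression itself against a sample body before relying on the broker to filter. A minimal sketch using only the JDK's built-in XPath support; the sample XML is my own, and the expression drops the ns1: prefix for simplicity (resolving prefixes would additionally need a NamespaceContext):

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;

public class XPathSelectorCheck {
    // Evaluate the selector's XPath against a sample body to confirm it
    // matches before wiring it into a broker-side selector.
    static String secondBookPrice(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));
        // Same path shape as the selector above, minus the ns1: prefix
        return XPathFactory.newInstance().newXPath()
                .evaluate("/books/book[2]/price/text()", doc);
    }

    public static void main(String[] args) throws Exception {
        String xml = "<books><book><price>10</price></book>"
                   + "<book><price>20</price></book></books>";
        System.out.println(secondBookPrice(xml)); // prints 20
    }
}
```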

Issue on Camel route - parsing XML tags

I have a complex Camel route which starts with an initialization route that tries to set headers using information from the XML given as input.
I don't understand why the route is not able to parse the XML content using XPath.
Before calling the route, I print the XML in my Java JUnit test, and it prints correctly, with all XML tags.
So I know the information is being sent as I am expecting.
But that route, which should set the headers using XPath, returns empty for any expression I try to use! I even used an XPath tool (https://codebeautify.org/Xpath-Tester) to check whether I had made some XPath coding mistake, but I get the results I want from there.
So, let's suppose, I have an XML as:
<bic:Test>
<bic:context>
<bic:memberCode>GOOGLE</bic:memberCode>
</bic:context>
</bic:Test>
So, with the line below:
<setHeader headerName="myHeader">
<xpath resultType="java.lang.String">//anyTag/text()</xpath>
</setHeader>
or
<setHeader headerName="myHeader">
<xpath resultType="java.lang.String">//anyTag</xpath>
</setHeader>
I will see the header with empty content.
I tried so many different things that I finally decided to print all the content, using / as the XPath expression.
It prints only the text content ("GOOGLE"), not the tags.
Could you please assist me?
Thank you in advance!
This is probably a namespace-related issue.
You have to define the bic namespace in the Camel context and then use it in the XPath expression.
Have a look at the documentation at https://github.com/apache/camel/blob/master/camel-core/src/main/docs/xpath-language.adoc, particularly the "Using XML configuration" example.
Also look at "Namespace auditing to aid debugging" for further information about debugging namespace-related issues in Camel.
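In the Spring XML DSL, it is enough to declare the namespace on an enclosing element (such as the camelContext) so the XPath expression can use the prefix. A sketch, assuming a hypothetical namespace URI http://example.com/bic — substitute the real URI from your payload:

```xml
<camelContext xmlns="http://camel.apache.org/schema/spring"
              xmlns:bic="http://example.com/bic">
  <route>
    <from uri="direct:start"/>
    <setHeader headerName="myHeader">
      <!-- the bic prefix now resolves against the declaration above -->
      <xpath resultType="java.lang.String">/bic:Test/bic:context/bic:memberCode/text()</xpath>
    </setHeader>
  </route>
</camelContext>
```

Without that declaration the bic-prefixed steps match nothing, which is consistent with the empty headers described in the question.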

Overriding Spring cloud sleuth Trace Id format

We are looking at leveraging Spring Cloud Sleuth for distributed tracing, and we've worked on a POC. It seems like a great solution and works out of the box.
We had a follow-up question, though:
We use random UUIDs rather than 64-bit ids as trace ids. We understand that custom headers (a new trace id, for example) can be added along with the Sleuth headers, but would it be possible to override the default trace id format for Sleuth? We have looked through the documentation, and perhaps Propagation is the way to go. Can someone who has done this point us in the right direction, and to some examples if possible?
We are using the latest release, 2.0.1, which uses the Brave library.
Any help/pointers would be greatly appreciated.
Thanks,
GK
Spring Cloud Sleuth doesn't provide a way to override the default ids. According to OpenZipkin, "Trace identifiers are 64 or 128-bit, but all span identifiers within a trace are 64-bit. All identifiers are opaque."
Refer to this:
https://github.com/openzipkin/b3-propagation#identifiers
So you can either put the generated request id in a tag ('tag':'requestID'), or you can place the generated UID in a different field and use the propagation technique. Refer to ExtraFieldPropagationTest for reference:
https://github.com/openzipkin/brave/blob/master/brave/src/test/java/brave/propagation/ExtraFieldPropagationTest.java
Even though this is not possible (AFAIK), if your use case is to use custom headers for log correlation, all that's needed is to set these properties (related SO answer):
# add the x-request-id baggage field to the MDC and propagate it downstream
spring.sleuth.baggage.correlation-fields=x-request-id
spring.sleuth.baggage.remote-fields=x-request-id
And then this can be used in your logging pattern:
%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level [%X{traceId:-},%X{spanId:-},%X{x-request-id:-}] [%thread] %logger{40} : %msg%n
Now along with the built-in traceId & spanId, the value of the header x-request-id will also be logged:
2022-06-28 19:55:40.071 WARN [8add19deba73c0f3,cda65c8122e5e025,some-request-id] [reactor-http-epoll-8] c.i.p.g.s.MyService : My warn log
To make this more concise, you can skip traceId & spanId if they are not required. A better approach could have been to use them as a fallback when your own custom correlation header is not available, but Logback currently does not (and probably will not) support nesting default values for MDC.
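For context, the pattern above belongs in the encoder of your Logback configuration. A sketch of a logback-spring.xml; the appender name and root level are illustrative:

```xml
<configuration>
  <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <!-- traceId/spanId come from Sleuth's MDC; x-request-id from the baggage correlation field -->
      <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level [%X{traceId:-},%X{spanId:-},%X{x-request-id:-}] [%thread] %logger{40} : %msg%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="CONSOLE"/>
  </root>
</configuration>
```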
What you can do is to generate the id in a different field and propagate it further on. Check this part of the documentation https://cloud.spring.io/spring-cloud-static/Finchley.SR1/single/spring-cloud.html#_propagating_extra_fields
52.1 Propagating extra fields

Sometimes you need to propagate extra fields, such as a request ID or an alternate trace context. For example, if you are in a Cloud Foundry environment, you might want to pass the request ID, as shown in the following example:

// when you initialize the builder, define the extra field you want to propagate
Tracing.newBuilder().propagationFactory(
    ExtraFieldPropagation.newFactory(B3Propagation.FACTORY, "x-vcap-request-id")
);

// later, you can tag that request ID or use it in log correlation
requestId = ExtraFieldPropagation.get("x-vcap-request-id");

You may also need to propagate a trace context that you are not using. For example, you may be in an Amazon Web Services environment but not be reporting data to X-Ray. To ensure X-Ray can co-exist correctly, pass through its tracing header, as shown in the following example:

tracingBuilder.propagationFactory(
    ExtraFieldPropagation.newFactory(B3Propagation.FACTORY, "x-amzn-trace-id")
);

[Tip] In Spring Cloud Sleuth all elements of the tracing builder Tracing.newBuilder() are defined as beans. So if you want to pass a custom PropagationFactory, it's enough for you to create a bean of that type and we will set it in the Tracing bean.

XPath expression to filter incoming XML Documents

I'm trying to use a "Choice" to route XML documents based on part of their content. But as I'm new to Mule, I'm struggling to understand how to get the XPath expression to parse the incoming message.
As an SSCCE I've set up this flow:
The "when" for each choice is set to
#[xpath('//foo/bar').text] == baz
and
#[xpath('//foo/bar').text] != baz
respectively.
No matter what I seem to try, it always gives a response to the client of:
Execution of the expression "xpath('//foo/bar').text] == baz" failed. (org.mule.api.expression.ExpressionRuntimeException). Message payload is of type: ContentLengthInputStream
Do I need to convert the input into something first? If so then what? Or is there something else I should be doing to make this work?
Edit
Having checked the logs, it seems that the error was the same as this one: more than one instance of a JAXB context. I found two ways to solve this.
One was to refactor all our code to have only one JAXB instance (not easy and not preferable, but it does fix the issue).
The other was to stop using XPath in MEL. Instead I created a Java transformer which manually uses a Java DocumentBuilder and XPath objects to extract the information and place it in the registry. This also worked.
Note we couldn't use getPayloadAsString() without hitting "More than one JAXB context". Don't ask me why Mule needs a JAXB context to convert an input stream to a string. So instead we placed a byte-array-to-string transformer in the flow.
If anyone has a good way to use XPath in MEL when more than one JAXB context is present, feel free to leave an answer.
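The transformer workaround described above can be sketched with plain JDK classes; the class name, sample payload, and expression here are illustrative, not from the original code:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;

public class PayloadXPathExtractor {
    // Parse the stream payload directly and evaluate an XPath against it,
    // so neither MEL's xpath() nor getPayloadAsString() is involved.
    static String extract(InputStream payload, String expression) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(payload);
        return XPathFactory.newInstance().newXPath().evaluate(expression, doc);
    }

    public static void main(String[] args) throws Exception {
        InputStream in = new ByteArrayInputStream(
                "<foo><bar>baz</bar></foo>".getBytes("UTF-8"));
        System.out.println(extract(in, "//foo/bar/text()")); // prints baz
    }
}
```

In a real Mule transformer, the payload InputStream would come from the message instead of a ByteArrayInputStream.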
The Mule documentation shows examples like this:
<when expression="#[payload.getPurchaseType() == 'book']">
where the comparison operator and its second operand are all inside the #[...]. Have you tried that? E.g.
<when expression="#[xpath('//foo/bar').text == 'baz']">

JMS-like Message Selectors for AMQP (ActiveMQ / RabbitMQ)

Is there anything similar to JMS message selectors for RabbitMQ? Or must some code be written to parse and select the messages?
Thanks.
It's called the "AMQP routing key".
You can find the differences here:
http://www.wmrichards.com/amqp.pdf
And you can find some examples of routing keys here:
http://www.rabbitmq.com/tutorials/tutorial-four-python.html
AMQP routing keys with direct/topic exchanges work well if the selector is always on a single string field. If your selectors are all of the form message_type = 'foo', then you would use message_type as your routing key.
If the message filter uses multiple/different fields, then you could use the amq.match (headers) exchange, which routes messages that match any or all header values to the bound queue. This would handle selectors like field1 = 'value' OR field2 = 'value', and cases where different consumers selectively consume based on different attributes.
I think JMS message selectors also let you do more complex logic and comparison operators (greater-than, less-than, etc.), and I haven't found an equivalent of that in AMQP/RabbitMQ.

How to use SLF4J to log to two different files based on the type of message?

I am running a client-server program in Spring.
I am trying to implement SLF4J + Logback for logging.
My client (which in real life would be a device/sensor) sends me data as a string containing various fields separated by commas; the exact pattern is: deviceID,DeviceName,DeviceLocation,TimeStamp,someValue
What I want is to filter the message in Logback by deviceID and then write the whole string to a file whose name includes that id, like device1.log. For example, 1,indyaah,Scranton,2011-8-10 12:00:00,34 should be logged to the file device1.log dynamically.
So how can I use an EvaluatorFilter in Logback/Janino?
Thanks in advance.
Logback provides all the features you need out of the box. You need to learn about SiftingAppender and probably MDC.
SiftingAppender wraps several homogeneous appenders and picks a single one for each logging message based on user-defined criteria (called a discriminator). The documentation is pretty good, and it has some nice examples.
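A sketch of what that could look like: put the device id into the MDC when a message arrives, and let a SiftingAppender discriminate on it. The key name deviceID and file paths below are illustrative:

```xml
<configuration>
  <appender name="SIFT" class="ch.qos.logback.classic.sift.SiftingAppender">
    <!-- default MDC-based discriminator: picks the appender by MDC key -->
    <discriminator>
      <key>deviceID</key>
      <defaultValue>unknown</defaultValue>
    </discriminator>
    <sift>
      <!-- one FileAppender is created lazily per distinct deviceID value -->
      <appender name="FILE-${deviceID}" class="ch.qos.logback.core.FileAppender">
        <file>device${deviceID}.log</file>
        <encoder>
          <pattern>%msg%n</pattern>
        </encoder>
      </appender>
    </sift>
  </appender>
  <root level="INFO">
    <appender-ref ref="SIFT"/>
  </root>
</configuration>
```

In the receiving code, parse the first comma-separated field and call MDC.put("deviceID", id) before logging; the discriminator then routes each message to the matching device<id>.log.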
