Mule Cache - Cluster In Memory - deserialization issue - caching

My Mule 3.9 application exposes a REST endpoint.
The application is clustered, runs on-prem, and is managed through Runtime Manager.
The requirement is that the endpoint which kicks off the batch process should be a singleton, meaning only one process may be running across the entire cluster. If the REST endpoint is invoked again while a process is running, it should return HTTP status 409.
For this use case, I have used Mule's clustered in-memory caching with the configuration below:
<ee:object-store-caching-strategy name="caching_strategy" doc:name="caching_strategy" keyGenerationExpression="some_key" synchronized="false" entryTTL="14400000" persistent="false" />
My flow looks like this:
<flow name="some-flow" doc:description="some-flow">
<message-properties-transformer scope="invocation" doc:name="Initialize Message Properties" mimeType="application/java">
<add-message-property key="messageId" value="#[message.rootId]"/>
</message-properties-transformer>
<ee:cache doc:name="initiation" cachingStrategy-ref="caching_strategy" >
<logger message="process cache miss" level="INFO" doc:name="process cache miss"/>
<set-payload doc:name="initialize cache map" value="#[{'id' : flowVars.messageId}]" />
</ee:cache>
<choice doc:name="Is process already running ?" >
<when expression="#[payload.id == flowVars.messageId]">
<logger message="New process started" level="INFO" />
</when>
<otherwise>
<logger message="Process is already running" level="WARN" />
</otherwise>
</choice>
</flow>
As you can see, I am putting a java.util.HashMap with one key-value pair into the cache and checking whether it already exists:
<set-payload doc:name="initialize cache map" value="#[{'id' : flowVars.messageId}]" />
The actual functionality works great in the cluster and serves the purpose!
However, the application logs are full of the following **WARN** statements:
org.mule.util.store.MonitoredObjectStoreWrapper -
Running expirty on org.mule.util.store.ObjectStorePartition#4648ce75 threw java.lang.IllegalArgumentException:
Cannot deserialize with a null classloader:
Cannot deserialize with a null classloader
I am not sure what the issue is. The object in the cache is a java.util.HashMap, which is serializable, and the only key-value pair consists of Strings.
I suspect some kind of classloader issue, but I have not been able to pin it down.
Does anybody have any clue?
Thanks
Vikas

I got my hands dirty and managed to resolve the issue with the configuration below:
<ee:object-store-caching-strategy name="caching_strategy" doc:name="caching_strategy" keyGenerationExpression="some_key" synchronized="false" >
<!-- this is because my flow returns a different message than the cache -->
<ee:serializable-event-copy-strategy/>
<!-- managed store without persistence -->
<managed-store storeName="MyTaskInMemoryStoreForClusteredCaching"
maxEntries="1" entryTTL="14400000"
expirationInterval="3660000" persistent="false"/>
</ee:object-store-caching-strategy>
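For completeness, the flow in the question only logs a warning when the process is already running; it does not yet return the required 409. Below is a minimal sketch, assuming the flow is fronted by the Mule 3 HTTP listener (the HTTP source is not shown in the question, and HTTP_Listener_Configuration is a hypothetical config name) and that a hypothetical flow variable httpStatus drives the response status via the listener's response-builder:
<flow name="some-flow">
    <http:listener config-ref="HTTP_Listener_Configuration" path="/batch" doc:name="HTTP">
        <!-- assumption: statusCode accepts a MEL expression, so the flow decides 200 vs 409 -->
        <http:response-builder statusCode="#[flowVars.httpStatus]" />
    </http:listener>
    <!-- message-properties-transformer and ee:cache steps from the question go here -->
    <choice doc:name="Is process already running ?">
        <when expression="#[payload.id == flowVars.messageId]">
            <set-variable variableName="httpStatus" value="#[200]" doc:name="New process started" />
        </when>
        <otherwise>
            <set-variable variableName="httpStatus" value="#[409]" doc:name="Process already running" />
        </otherwise>
    </choice>
</flow>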

Related

spring-cloud-gcp-starter-logging: child-log not shown when "This request caused a new process to be started"

I configured the Cloud Logging implementation in my Java 11 Google App Engine application following the guide.
From the Cloud Logging UI I can see the log lines of the same request being grouped under a common parent, which is exactly what I wanted, and it's working.
However, I found that when a request actually triggers a new instance, this grouping does not work:
You can see that there aren't any children, only the log about the instance being created:
This request caused a new process to be started for your application, and thus caused your application code to be loaded for the first time. This request may thus take longer and use more CPU than a typical request for your application.
If I filter by the specific traceId I can actually see the logs of that specific request; the only missing part is that they are not displayed under the common parent.
Am I missing a configuration? Is this a known behaviour?
This is my current logback-spring.xml
<!-- https://spring-gcp.saturnism.me/app-dev/observability/logging#logback -->
<configuration>
<include resource="org/springframework/boot/logging/logback/defaults.xml" />
<include resource="org/springframework/boot/logging/logback/console-appender.xml" />
<springProfile name="development | stage | production">
<include resource="org/springframework/cloud/gcp/logging/logback-json-appender.xml" />
<root level="INFO">
<appender-ref ref="CONSOLE_JSON" />
</root>
</springProfile>
<springProfile name="default | localhost">
<root level="INFO">
<appender-ref ref="CONSOLE" />
</root>
</springProfile>
</configuration>

Mule fails to download file from SFTP location

I am trying to connect to an SFTP location and download a .zip file using the Mule SFTP connector. It looks like a very straightforward configuration, but I am not sure what is missing in my setup, and I cannot figure out why it is not working for me. Can someone please look at it and suggest what I should change to make it work?
In the flow configuration below, I start with an HTTP endpoint (http://localhost:8181/invoice). "ftpconnectivityFlow1" is called; it checks the value of the "ftp" variable and, based on that value, goes either to my FTP location or to my SFTP location.
When I set the "ftp" variable to true it works as expected: the file from the FTP location is downloaded into my output folder and deleted from the FTP location. When I set it to false there is no error, but the file at the SFTP location is still there, which makes me think it cannot read the file, and nothing is downloaded to my output folder.
For debugging I added a custom transformer so that I can inspect the payload. When the flow connects to the FTP location, the payload contains binary data (all numbers), which I am guessing is my .zip file. But when the "ftp" variable is set to false, meaning it connects to the SFTP location, the payload contains "/invoice", which is my HTTP relative path. As a result my output folder contains a file named "null" whose only content is "/invoice".
Any help is greatly appreciated.
<flow name="ftpconnectivityFlow1">
<logger message="ftp:#[message.outboundProperties['ftp']]" doc:name="Logger" level="INFO"/>
<choice doc:name="Choice">
<when expression="#[message.outboundProperties['ftp']==true]">
<flow-ref name="FTPConnection" doc:name="FTPFileDownloadConnection"/>
</when>
<otherwise>
<flow-ref name="SFTPConnection" doc:name="SFTPFileDownloadConnection"/>
</otherwise>
</choice>
</flow>
<flow name="FTPConnection">
<ftp:inbound-endpoint host="host" port="22" path="abc" user="user" password="password" responseTimeout="10000" doc:name="FTP"/>
<custom-transformer class="abc.transformer.CustomeFileTransformer" />
<logger message="connected to FTP" level="INFO" doc:name="Logger"/>
<file:outbound-endpoint path="output" outputPattern="#[message.inboundProperties['originalFilename']]" responseTimeout="10000" doc:name="File"/>
</flow>
<flow name="SFTPConnection">
<sftp:inbound-endpoint connector-ref="sftp-default" doc:name="SFTP" responseTimeout="10000" host="host" password="password" path="/Inbound" port="21" user="user"/>
<custom-transformer class="abc.transformer.CustomeFileTransformer" />
<logger level="INFO" doc:name="Logger"/>
<file:outbound-endpoint path="output" outputPattern="#[message.inboundProperties['originalFilename']]" responseTimeout="10000" doc:name="File"/>
</flow>
<ftp:inbound-endpoint host="host" port="22" ... doc:name="FTP"/>
...
<sftp:inbound-endpoint ... port="21" user="user"/>
You might have those port numbers backwards. FTP normally runs on port 21 and SFTP (SSH) normally uses port 22.
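A minimal sketch of the two inbound endpoints with the ports swapped; everything else is copied from the configuration in the question:
<!-- FTP normally listens on port 21 -->
<ftp:inbound-endpoint host="host" port="21" path="abc" user="user" password="password"
    responseTimeout="10000" doc:name="FTP"/>
<!-- SFTP runs over SSH, which normally listens on port 22 -->
<sftp:inbound-endpoint connector-ref="sftp-default" host="host" port="22" path="/Inbound"
    user="user" password="password" responseTimeout="10000" doc:name="SFTP"/>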

Spring property resolution in Mule Groovy script

I'm trying to access Spring initialized properties within a Mule Groovy script without success.
Loading the property file in Mule application XML:
<context:property-placeholder location="application.properties"/>
Contents of property file:
key=value
Accessing the property from inside the application XML works fine:
<logger message="key: ${key}" level="INFO" doc:name="Logger" />
Produces the following output:
key: value
Attempting the same in a Groovy script:
log.info "key: ${key}"
Results in an exception:
Exception stack is:
1. No such property: key for class: Script1 (groovy.lang.MissingPropertyException)
org.codehaus.groovy.runtime.ScriptBytecodeAdapter:53 (null)
2. groovy.lang.MissingPropertyException: No such property: key for class: Script1 (javax.script.ScriptException)
org.codehaus.groovy.jsr223.GroovyScriptEngineImpl:326 (http://java.sun.com/j2ee/sdk_1.3/techdocs/api/javax/script/ScriptException.html)
3. Failed to invoke ScriptComponent{main-flow.component.32928685}. Component that caused exception is: ScriptComponent{main-flow.component.32928685}. Message payload is of type: ContentLengthInputStream (org.mule.component.ComponentException)
org.mule.component.AbstractComponent:144 (http://www.mulesoft.org/docs/site/current3/apidocs/org/mule/component/ComponentException.html)
********************************************************************************
Root Exception stack trace:
groovy.lang.MissingPropertyException: No such property: key for class: Script1
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.unwrap(ScriptBytecodeAdapter.java:53)
at org.codehaus.groovy.runtime.callsite.PogoGetPropertySite.getProperty(PogoGetPropertySite.java:52)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callGroovyObjectGetProperty(AbstractCallSite.java:307)
at Script1.run(Script1.groovy:18)
<snip>
I'm assuming that the properties are available somewhere through the context references exposed to the Groovy script, but I've found no documentation on how to navigate to them. I've looked through the JavaDocs, but have been unable to piece together the proper way to resolve Spring-initialized properties. Any help is greatly appreciated.
Thanks,
Steve
I found the answer to my question in the following post. In short:
Spring property placeholders are resolved at configuration time and not stored anywhere, so they can't be loaded afterwards.
If you need to store it you can always inject them into a bean and retrieve that from the registry.
Based on the other answer posted here, this behavior may have changed in newer releases of the Mule runtime, but my experience using version 3.5.4 is consistent with the linked post's description of how Spring properties are evaluated by the MEL interpreter. A sketch of the bean-and-registry approach follows below.
Steve
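A minimal sketch of that registry approach, assuming a hypothetical String bean named propertyValueHolder that captures ${key} when Spring builds the context; the standard muleContext script binding is used to look it up at runtime:
<!-- the placeholder is resolved once, at configuration time, into this (hypothetical) bean -->
<spring:bean id="propertyValueHolder" class="java.lang.String">
    <spring:constructor-arg value="${key}" />
</spring:bean>
<flow name="main-flow">
    <scripting:component doc:name="Groovy">
        <scripting:script engine="Groovy"><![CDATA[
            // muleContext is one of the standard bindings exposed to Mule script components;
            // look the value up from the registry instead of expecting ${key} to resolve here
            def value = muleContext.registry.lookupObject('propertyValueHolder')
            log.info("key: " + value)
            return payload
        ]]></scripting:script>
    </scripting:component>
</flow>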
Which version of the Mule runtime are you using?
I am able to get the value from the properties file using the following example on Mule runtime 3.7.3:
<http:listener-config name="HTTP_Listener_Configuration" host="0.0.0.0" port="8081" doc:name="HTTP Listener Configuration"/>
<flow name="teFlow">
<http:listener config-ref="HTTP_Listener_Configuration" path="/test" doc:name="HTTP"/>
<logger message="key: ${key}" level="INFO" doc:name="Logger" />
<scripting:component doc:name="Groovy">
<scripting:script engine="Groovy"><![CDATA[
System.out.print("In Groovy Script "+${key}+"\n");
log.info "In Groovy logger: ${key}";
]]></scripting:script>
</scripting:component>
</flow>
application.properties contains:
key=555
And the expected output appears in the Mule console.

Can we use JMS in the middle of a proxy flow in Mule?

We have built JAX-WS web services, and they are working fine. Now we want to use Mule ESB's JMS, but I'm unable to configure that.
I have tried Mule's proxy for web services, and it works fine. But when we try to put JMS in between the HTTP endpoints, the body of the SOAP message is not transferred to the other end (i.e. to our services).
The JMS server is ActiveMQ.
Thanks in advance.
Flow copied from the comments:
<flow name="finalFlow1" doc:name="finalFlow1">
<http:inbound-endpoint exchange-pattern="request-response"
host="localhost" port="8888" contentType="text/xml" doc:name="HTTP" />
<jms:outbound-endpoint exchange-pattern="request-response"
queue="servicesQueue" doc:name="JMS" connector-ref="Active_MQ" />
<http:outbound-endpoint exchange-pattern="request-response"
method="POST" address="localhost:5050/MyServices" ; mimeType="text/xml"
contentType="text/xml" doc:name="HTTP" />
</flow>
In the flow that you posted you are consuming the message via an HTTP-based inbound endpoint. If you just want to consume the message from JMS and send it to another HTTP endpoint, you need to use a JMS inbound endpoint:
<flow name="finalFlow1" doc:name="finalFlow1">
<jms:inbound-endpoint exchange-pattern="request-response"
queue="servicesQueue" doc:name="JMS" />
<logger level="INFO" doc:name="Logger" />
<http:outbound-endpoint exchange-pattern="request-response"
host="localhost" port="5050" method="POST" doc:name="HTTP" path="MyServices"
mimeType="text/xml" />
</flow>
However, this will just send the payload as is and is not going to convert it into a SOAP payload. If you want to convert the message consumed from JMS into a SOAP payload, you need to use CXF:
<flow name="finalFlow1" doc:name="finalFlow1">
<jms:inbound-endpoint exchange-pattern="request-response"
queue="servicesQueue" doc:name="JMS" />
<logger level="INFO" doc:name="Logger" message="#[payload]" />
<logger message="SOAP call started" level="INFO" doc:name="Logger"/>
<http:outbound-endpoint
mimeType="text/xml" doc:name="HTTP" exchange-pattern="request-response" method="POST" path="MyServices" host="localhost" port="5050">
<cxf:proxy-client payload="body"
enableMuleSoapHeaders="false">
<cxf:inInterceptors>
<spring:bean class="org.apache.cxf.interceptor.LoggingInInterceptor" />
</cxf:inInterceptors>
<cxf:outInterceptors>
<spring:bean class="org.apache.cxf.interceptor.LoggingOutInterceptor" />
</cxf:outInterceptors>
</cxf:proxy-client>
</http:outbound-endpoint>
<logger message="SOAP call completed" level="INFO" doc:name="Logger"/>
</flow>
If you just want to kick off the JMS consumption via HTTP you can go with what Seba suggested and use the Mule Requester module: https://github.com/mulesoft/mule-module-requester/blob/master/mulerequesterdemo/src/main/app/MuleRequesterDemo.xml
If you need to consume messages from a queue in the middle of a flow, you should take a look at this (a sketch follows below): http://blogs.mulesoft.org/introducing-the-mule-requester-module
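A minimal sketch of that mid-flow consumption with the Mule Requester module, assuming the mulerequester namespace and module are added to the project and reusing the queue and endpoints from the question:
<flow name="finalFlow1" doc:name="finalFlow1">
    <http:inbound-endpoint exchange-pattern="request-response"
        host="localhost" port="8888" contentType="text/xml" doc:name="HTTP" />
    <!-- pull one message from the queue in the middle of the flow -->
    <mulerequester:request resource="jms://servicesQueue" timeout="10000" doc:name="Mule Requester" />
    <http:outbound-endpoint exchange-pattern="request-response"
        host="localhost" port="5050" path="MyServices" method="POST"
        mimeType="text/xml" contentType="text/xml" doc:name="HTTP" />
</flow>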

Mule FTP: how to move up one directory?

I need to configure an FTP inbound endpoint in Mule; what I have so far is this:
<ftp:connector name="ftpConnector" pollingFrequency="1000"
validateConnections="true"
moveToDirectory="C:\Users\jonbrynjar.FRETT\Documents\national_registry"
moveToPattern="*.txt"/>
<ftp:inbound-endpoint host="ftp1.xxxx.is" port="21"
user="xxxx" password="xxxx" binary="false"
pollingFrequency="5000" responseTimeout="10000"
connector-ref="ftpConnector">
<file:filename-wildcard-filter pattern="../einst.txt" />
</ftp:inbound-endpoint>
I can access this server from the command prompt this way:
user:xxxx
pass:xxx
cd ..
get K0274K.N4503.EIN.E32 einst.txt
get K0274K.N301.F300 fyrirt.txt
bye
I think the problem is that I am not able to move up one directory, as implied in the command sequence above.
How would I implement this in Mule?
I would suggest using Mule's composite source to poll multiple sources (with a different folder path in each):
<flow name="MuleRunnerFlow1" doc:name="MuleRunnerFlow1">
<composite-source doc:name="Composite Source">
<ftp:inbound-endpoint host="ftp1.xxxx.is" port="21" user="xxxx" password="xxxx" binary="false" pollingFrequency="5000" responseTimeout="10000" connector-ref="ftpConnector" doc:name="FTP" path="/parent">
<file:filename-wildcard-filter pattern="einst.txt" />
</ftp:inbound-endpoint>
<ftp:inbound-endpoint host="ftp1.xxxx.is" port="21" user="xxxx" password="xxxx" binary="false" pollingFrequency="5000" responseTimeout="10000" connector-ref="ftpConnector2" doc:name="FTP" path="/parent/children">
<file:filename-wildcard-filter pattern="einst.txt" />
</ftp:inbound-endpoint>
</composite-source>
<logger level="INFO" doc:name="Logger" />
</flow>
You may use two connectors or the same connector based on your requirement, and you can take the path and other properties from a property file if necessary, as sketched below.
Hope this helps.
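A minimal sketch of that externalization, assuming a hypothetical ftp.properties file; the endpoint attributes are otherwise copied from the composite-source example above:
<!-- hypothetical ftp.properties:
     ftp.host=ftp1.xxxx.is
     ftp.path.parent=/parent
     ftp.path.children=/parent/children
-->
<context:property-placeholder location="ftp.properties" />
<ftp:inbound-endpoint host="${ftp.host}" port="21" user="xxxx" password="xxxx"
    binary="false" pollingFrequency="5000" responseTimeout="10000"
    connector-ref="ftpConnector" path="${ftp.path.parent}" doc:name="FTP">
    <file:filename-wildcard-filter pattern="einst.txt" />
</ftp:inbound-endpoint>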
