I've been working with Apache Camel for a while, doing some basic stuff, but now I'm trying to create a route in which I can have multiple "consumers" on the same route, or add a consumer to the route and then process the message.
My idea is to have an Event Driven Consumer which is triggered by an event and then reads a file from an FTP server, for example. I was planning to do something like this:
from("direct:processFile")
.from("ftp://localhost:21/folder?fileName=${body.fileName}") // etc.
.log("Start downloading file ${file:name}.")
.unmarshal().bindy(BindyType.Csv, MyFile.class)
.to("bean:fileProcessor")
.log("Downloaded file ${file:name} complete.");
So the idea is that I have an event (for example from a direct endpoint or a message queue) that carries a "fileName" property, and I then use that property to download/consume a file with that name from an FTP server.
I believe the problem is having from().from() in the same route. But if I instead put the FTP component inside a "to", my queue event gets written to a file on the FTP server, which is the opposite of what I want; it behaves as a producer instead of a consumer.
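That is, if I write something like the following (the endpoint options are just placeholders), the body of my event simply gets uploaded as a file, because the ftp component used in a to() acts as a producer:

from("direct:processFile")
    .to("ftp://localhost:21/folder?fileName=test.csv"); // writes the message body to the FTP server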
Is there any possible way to achieve what I'm trying to do or does it conflict with what Camel is for?
Thanks to the comment from Claus Ibsen I found what I was looking for; the component I needed, and which made it work, was the Content Enricher.
Here is the route that worked for me:
from("direct:processFile")
.pollEnrich().simple("ftp://localhost:21/folder?fileName=${body.fileName}")
.log("Start downloading file ${file:name}.")
.unmarshal().bindy(BindyType.Csv, MyFile.class)
.to("bean:fileProcessor")
.log("Downloaded file ${file:name} complete.");
How about something like this?
.from("direct:processFile")
.transform(simple("${body.fileName}"))
.from("ftp://localhost:21/folder?fileName=${body.fileName}") // etc.
.log("Start downloading file ${file:name}.")
.unmarshal().bindy(BindyType.Csv, MyFile.class)
.to("bean:fileProcessor")
.log("Downloaded file ${file:name} complete.");
I just wanted to ask if it is possible to specify the name for the temporary auto-delete queues, which are bound to the destination when I subscribe to a webstomp queue/exchange.
The reason is that I would like to implement fine-grained JWT permission control, so I would like to grant permission, for example, to "stomp-subscriptions-user123-abcde"; therefore I would like the temporary queue to be named not "stomp-subscription-randomstring" but "stomp-subscriptions-user123-randomstring".
Is this possible?
I looked through the available documentation but couldn't find anything (only the subscription id can be set, not the temporary queue name).
Documentation: https://stomp-js.github.io/
Here is the source code for the function that generates a queue name:
https://github.com/rabbitmq/rabbitmq-server/blob/main/deps/rabbitmq_stomp/src/rabbit_stomp_util.erl#L368-L382
Notice that it only auto-generates a name if the x-queue-name header is NOT present. So, it looks like you can specify whatever name you'd like via that header. Here is the documentation for it:
https://www.rabbitmq.com/stomp.html#d.ugqn
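For illustration, here is a minimal sketch of passing that header, written with Spring's Java STOMP client, since the header belongs to the STOMP SUBSCRIBE frame itself. The broker URL, destination, and queue name are placeholder assumptions; with stomp-js you would pass the same x-queue-name entry in the headers argument of subscribe().

import java.lang.reflect.Type;

import org.springframework.messaging.simp.stomp.StompFrameHandler;
import org.springframework.messaging.simp.stomp.StompHeaders;
import org.springframework.messaging.simp.stomp.StompSession;
import org.springframework.messaging.simp.stomp.StompSessionHandlerAdapter;
import org.springframework.web.socket.client.standard.StandardWebSocketClient;
import org.springframework.web.socket.messaging.WebSocketStompClient;

public class NamedQueueSubscriber {

    public static void main(String[] args) throws Exception {
        WebSocketStompClient client = new WebSocketStompClient(new StandardWebSocketClient());
        // Disable heartbeats so no TaskScheduler is needed for this sketch
        client.setDefaultHeartbeat(new long[] {0, 0});

        // Placeholder URL for RabbitMQ's Web STOMP plugin
        client.connect("ws://localhost:15674/ws", new StompSessionHandlerAdapter() {
            @Override
            public void afterConnected(StompSession session, StompHeaders connectedHeaders) {
                StompHeaders subscribe = new StompHeaders();
                subscribe.setDestination("/exchange/amq.topic/user123.events"); // placeholder destination
                // Ask RabbitMQ to use this queue name instead of auto-generating one
                subscribe.set("x-queue-name", "stomp-subscriptions-user123-abcde");
                session.subscribe(subscribe, new StompFrameHandler() {
                    @Override
                    public Type getPayloadType(StompHeaders headers) {
                        return String.class;
                    }

                    @Override
                    public void handleFrame(StompHeaders headers, Object payload) {
                        System.out.println("Received: " + payload);
                    }
                });
            }
        });

        Thread.currentThread().join(); // keep the demo process alive
    }
}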
NOTE: the RabbitMQ team monitors the rabbitmq-users mailing list and only sometimes answers questions on StackOverflow.
I am trying to create a 3rd-party persistent store and I am done with the configuration. The only problem is that the samples show me writing code to get the cache instance and then calling the loadCache function.
Is there a way to invoke it via configuration (default-config.xml)?
I think what you might need is an Ignite LifecycleBean, which can be configured inside the XML file. More information here: https://apacheignite.readme.io/docs/ignite-life-cycle#section-lifecyclebean
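As a rough sketch (the cache name "myCache" is just a placeholder), such a bean could trigger the initial load once the node has started; you would then register it under the lifecycleBeans property of IgniteConfiguration in default-config.xml:

import org.apache.ignite.Ignite;
import org.apache.ignite.lifecycle.LifecycleBean;
import org.apache.ignite.lifecycle.LifecycleEventType;
import org.apache.ignite.resources.IgniteInstanceResource;

public class CacheLoadBean implements LifecycleBean {

    // Ignite injects the local node instance into the bean
    @IgniteInstanceResource
    private Ignite ignite;

    @Override
    public void onLifecycleEvent(LifecycleEventType evt) {
        if (evt == LifecycleEventType.AFTER_NODE_START) {
            // Kick off the initial load from the 3rd-party persistent store;
            // "myCache" is a placeholder for your actual cache name
            ignite.cache("myCache").loadCache(null);
        }
    }
}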
I don't think that it can be done via the XML configuration file. You have to call IgniteCache.loadCache() explicitly.
I have a service of class EnsLib.HL7.Service.FTPService that picks up files from multiple subfolders and sends them to an EnsLib.HL7.MsgRouter.RoutingEngine. What I want to do is somehow capture the subfolder as a variable for use in the routing rules. Is this possible?
Let's say I have the following files and directory structure on my FTP Server
/incoming/green/apple.dat
/incoming/yellow/banana.dat
I want the Routing Rule to be able to send anything that came from the /green/ folder to one operation and from /yellow/ to another.
In the Message Viewer you can trace any message and see its properties; one of them is Source. The text in this property looks like this:
Source apple.dat via FTP localhost:21 path '/incoming/green/'
So, using this data, you can create a rule on this property in the Rule Editor.
I am working on a use case using the FTP endpoint in Mule.
I expect the FTP inbound endpoint to poll the oldest files first. Currently it is polling files in random order.
Is there a way to configure the FTP connector to poll the oldest file (the one that was placed in the FTP shared folder first) first?
Can someone please suggest a solution for this use case?
Thank you
I believe you would have to extend the default Mule FtpConnectionFactory class and set your new class as an attribute named connectionFactoryClass in the FTP Connector. For the actual ordering you would need to extend the Apache FTPClient class the factory uses, with a method that orders the file listings by timestamp. Overriding the FTPClient.initiateListParsing() method would probably be enough.
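As a rough sketch of only the sorting part (the class name here is made up, the connection-factory wiring is omitted, and this overrides listFiles() rather than initiateListParsing() because sorting the parsed FTPFile array is simpler), an FTPClient subclass could order listings oldest-first:

import java.io.IOException;
import java.util.Arrays;
import java.util.Comparator;

import org.apache.commons.net.ftp.FTPClient;
import org.apache.commons.net.ftp.FTPFile;

public class OldestFirstFtpClient extends FTPClient {

    @Override
    public FTPFile[] listFiles(String pathname) throws IOException {
        FTPFile[] files = super.listFiles(pathname);
        // Order the listing by remote modification time, oldest first;
        // assumes the server reports a timestamp for each entry
        Arrays.sort(files, Comparator.comparing(FTPFile::getTimestamp));
        return files;
    }
}

Whether the connector actually goes through this exact method depends on the Mule and commons-net versions in use, so treat it as a starting point rather than a drop-in solution.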
I have done a similar thing with the SFTP connector, but I have a timestamp in the file names.
We can try this approach if there is a timestamp, or some other basis on which we can sort the files.
It polls the directory and brings back all the file names; I configured a service override on the connector that sorts the file names by timestamp and then routes them. I'm pasting the sample code for reference; we can try a similar thing with FTP as well.
Override poll() in your service override class:
public void poll() {
    // List the files currently available on the SFTP server
    String[] files = sftpRRUtil.getAvailableFiles(false);
    // Sort the file names by timestamp so the oldest file is routed first
    List<String> orderedFileList = sortAllFileNamesByTime(files);
    Map<String, String> eventStateMap = new HashMap<String, String>();
    for (String file : orderedFileList) {
        // Stop routing if the connector is shutting down
        if (getLifecycleState().isStopping()) {
            break;
        }
        routeFile(file);
    }
    ...
I want to implement an AJAX file upload that uses XHR to send the file data.
On the client I am using this:
http://valums.com/ajax-upload/
How do I accept this data in Node and save the file on the server with Node.js? Which module do I need to use?
I've created an uploader with a progress bar using the formidable module; it's really easy to use and provides a lot of useful callbacks.
Have a look here:
https://github.com/felixge/node-formidable (scroll down to get the Docs)
http://debuggable.com/posts/parsing-file-uploads-at-500-mb-s-with-node-js:4c03862e-351c-4faa-bb67-4365cbdd56cb
Due to the lack of an example file in Valums' ajax-uploader, I've just created one.
It handles the XHR upload if possible, falling back to the old form-based method otherwise.
All of this is built around Valums' ajax-uploader.
https://github.com/aldipower/file-uploader/blob/master/server/nodejs.js
Maybe Valums will accept the pull request at some point and the sample file will get merged into the standard repository.