I am new to Spring Integration, so please help me resolve this problem.
The requirement is: we have multiple locations from which we have to read files, and the number of locations can grow in the future. A location can be any file system, so I am trying to use a file inbound channel adapter to meet this requirement.
The locations are stored in our database, along with the polling time for each location we have to poll to get the files.
But if I go with XML configuration, I have to create a new file inbound channel adapter, with all its details, every time we want to poll a new location. Something like this:
<int-file:inbound-channel-adapter id="AdapterOne" prevent-duplicates="false"
        directory="${FileInputLoc}" filter="compositeFilter">
    <int:poller fixed-rate="${poolingTime}"/>
</int-file:inbound-channel-adapter>

<int:service-activator input-channel="AdapterOne" ref="testbean"/>

<bean id="testbean" class="com.SomeActivatorClass"/>
Please suggest how I can achieve this in code, so that, based on the database rows, it creates different channel adapters that poll different locations at different times.
Thanks,
Ashu
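A hedged sketch of one possible approach, in case it helps: instead of declaring each adapter in XML, the adapters can be created and started programmatically, one per database row. The `PollLocation` class and `registerAdapters` method below are hypothetical placeholders for your own DB access code; `FileReadingMessageSource`, `SourcePollingChannelAdapter`, and `PeriodicTrigger` are real Spring types (assuming Spring Integration 4.x or later):

```
import java.io.File;
import java.util.List;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.endpoint.SourcePollingChannelAdapter;
import org.springframework.integration.file.FileReadingMessageSource;
import org.springframework.messaging.MessageChannel;
import org.springframework.scheduling.support.PeriodicTrigger;

@Configuration
public class DynamicAdapterConfig {

    @Autowired
    private ConfigurableApplicationContext context;

    // All adapters feed one channel, wired to your service activator.
    @Autowired
    private MessageChannel filesChannel;

    // PollLocation (directory + polling interval) is a hypothetical
    // class representing one row from your locations table.
    public void registerAdapters(List<PollLocation> locations) {
        for (PollLocation loc : locations) {
            FileReadingMessageSource source = new FileReadingMessageSource();
            source.setDirectory(new File(loc.getDirectory()));

            SourcePollingChannelAdapter adapter = new SourcePollingChannelAdapter();
            adapter.setSource(source);
            adapter.setOutputChannel(filesChannel);
            adapter.setTrigger(new PeriodicTrigger(loc.getPollingTime()));
            adapter.setBeanFactory(context.getBeanFactory());
            adapter.afterPropertiesSet();
            adapter.start();
        }
    }
}
```

Each adapter gets its own trigger, so different locations can be polled at different rates; re-running the method after a DB change would need matching stop/remove logic, which is omitted here.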
I am trying to validate JSON using the ValidateRecord processor with an AvroSchemaRegistry. I need to store the validation error message in a SQL table, so I tried to capture the error message in an attribute, but I am unable to. Any idea how to do it?
After your ValidateRecord processor, you can route flow files that are 'invalid' to a separate log and on to your SQL table; you can do the same if they 'fail'. I am assuming that by 'error message' you mean the 'bulletin' that is produced when the processor can neither validate nor invalidate the flow file against your schema.
A potential solution is to use the SiteToSiteBulletinReportingTask.
You can build a dataflow to receive these bulletin events, manipulate them as you want and store them in a location of your choice for your auditing needs.
From the sounds of it, the SiteToSiteBulletinReportingTask should be able to achieve what you want. To implement this, add a SiteToSiteBulletinReportingTask under 'Reporting Tasks' in the NiFi Settings.
You can name your input port and have it flow towards your SQL store and you should have what you're after.
You need to allow NiFi nodes to receive data via site-to-site on the input port and you also need to grant the correct permissions on the root process group so the nodes are able to see the component, view and modify the data.
Side note: I would usually log everything, and have all failures and invalid records route to log files, which I then put into a store, e.g. HBase/SQL. One suggestion I've seen is to configure the logging subsystem to additionally send specific error categories to your destination of choice (i.e. active notification versus passive parsing of logs). NiFi leverages a very flexible logback setup (an evolution of log4j). The best part: changes to the $NIFI_HOME/conf/logback.xml configuration file do not require an instance restart and are picked up within 30 seconds or less.
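As an illustration of that logback approach (the appender name, file paths, and logger name below are hypothetical and would need tailoring to the categories you care about), an addition to $NIFI_HOME/conf/logback.xml might look like:

```
<!-- Hypothetical addition: route ERROR events from processor classes
     to their own rolling file for downstream parsing/auditing. -->
<appender name="ERRORS_FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>logs/nifi-errors.log</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
        <fileNamePattern>logs/nifi-errors_%d.log</fileNamePattern>
        <maxHistory>30</maxHistory>
    </rollingPolicy>
    <encoder>
        <pattern>%date %level [%thread] %logger{40} %msg%n</pattern>
    </encoder>
</appender>

<logger name="org.apache.nifi.processors" level="ERROR" additivity="true">
    <appender-ref ref="ERRORS_FILE"/>
</logger>
```

With additivity left on, the events still reach the normal app log; the extra appender just gives you a clean error-only feed to ship to your store.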
I am using IBM MQ MFT hosted on-premises, and I have a request to transfer a new file from one server to another.
Both servers already have MQ agents and are already sending some files across.
I would like to be able to add this new file configuration so that the file can also be picked up.
Please advise the steps required to achieve this. Do I just edit the agent's XML?
If you simply wish to send a file from one agent to another, and those agents are already configured...
Use the fteCreateTransfer command for a 'one off':
https://www.ibm.com/support/knowledgecenter/en/SSFKSJ_8.0.0/com.ibm.wmqfte.doc/start_new_transfer_cmd.htm
Consider a resource monitor (via fteCreateMonitor) or scheduled transfer (via fteCreateTransfer) if you're doing it on a regular basis:
https://www.ibm.com/support/knowledgecenter/en/SSFKSJ_8.0.0/com.ibm.wmqfte.doc/create_monitor_cmd.htm
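For illustration, the agent names, queue manager names, and paths below are hypothetical; the commands and flags are from the MFT command set. A one-off transfer might look like:

```
fteCreateTransfer -sa AGENT1 -sm QM1 -da AGENT2 -dm QM2 \
    -df /dest/dir/newfile.csv /source/dir/newfile.csv
```

And a resource monitor that fires whenever a matching file lands in the source directory:

```
fteCreateMonitor -ma AGENT1 -mn NewFileMonitor -md /source/dir \
    -mt transfer_task.xml -tr "match,newfile*.csv" -pi 30 -pu seconds
```

Here `-tr` is the trigger condition and `-pi`/`-pu` set the polling interval; the task definition `transfer_task.xml` can be generated from an `fteCreateTransfer` invocation with the `-gt` option rather than written by hand. So you generally do not edit the agent's XML by hand for this; you define the transfer or monitor with these commands.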
Remember, this is not a standalone environment.
On one physical server I have set up a cell with one deployment manager, one node, and one application server, and configured them. Now I need to create 12 more similar cells with the same configuration on different physical servers. Is it possible to copy/export the configuration from one environment to another?
Creating the cells is not a problem for me; I want to skip the step of configuring them again and again.
Start by looking at this IBM KnowledgeCenter topic on properties-based configuration and administration. It has a number of links to other topics with additional information. The property file based configuration allows you to extract a text file of properties from an existing WebSphere Application Server configuration, perform some processing on the text file (like changing hostnames, ports, etc) using your favorite tools and then apply that configuration to another cell, node, or server.
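As a sketch of that workflow (file paths here are illustrative; `extractConfigProperties`, `validateConfigProperties`, and `applyConfigProperties` are real `AdminTask` commands), the wsadmin invocations look roughly like:

```
# On the source cell: extract the configuration to a properties file
wsadmin -lang jython -c "AdminTask.extractConfigProperties('[-propertiesFileName /tmp/cell.props]')"

# Edit /tmp/cell.props (hostnames, ports, cell/node names), then on each target cell:
wsadmin -lang jython -c "AdminTask.validateConfigProperties('[-propertiesFileName /tmp/cell.props -reportFileName /tmp/report.txt]')"
wsadmin -lang jython -c "AdminTask.applyConfigProperties('[-propertiesFileName /tmp/cell.props]')"
```

Running the validate step first against each target lets you catch environment-specific mismatches (ports, hosts) before anything is changed.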
Is there a preserve-timestamp property in the file:inbound-channel-adapter like in the ftp:inbound-channel-adapter or sftp:inbound-channel-adapter?
I have a requirement to copy some files from one folder to another folder (not FTP) and I need to keep the timestamp.
If there isn't such a property, can anyone suggest how to do this with Spring Integration?
For information, I'm using spring-integration 3.0.8.RELEASE.
Thanks a lot for the help.
The payload of the message from the file adapter is a java.io.File. If you want to move it to some other directory on the same physical disk, you could use a simple <service-activator ... expression="payload.renameTo(...)" />.
If you are using the file outbound channel adapter to copy the file to another disk, there is (currently) no option to preserve the timestamp.
I have opened a JIRA issue to add it as a new feature.
In the meantime, you could save off the lastModified in a header (using a header enricher), make the last channel before the outbound adapter a pub/sub channel, and add a second subscriber to set lastModified on the new file.
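A rough sketch of that wiring, assuming the copy and the original end up on the same host; the channel ids and target directory are illustrative, and the SpEL may need adjusting for your setup:

```
<!-- Save the source file's timestamp before the copy -->
<int:header-enricher input-channel="filesIn" output-channel="toCopy">
    <int:header name="sourceLastModified" expression="payload.lastModified()"/>
</int:header-enricher>

<!-- pub/sub channel: both subscribers see the same message, in order -->
<int:publish-subscribe-channel id="toCopy"/>

<!-- subscriber 1: copy the file -->
<int-file:outbound-channel-adapter channel="toCopy" directory="/target/dir" order="1"/>

<!-- subscriber 2: restore the timestamp on the copied file -->
<int:service-activator input-channel="toCopy" order="2"
    expression="new java.io.File('/target/dir', headers.file_name).setLastModified(headers.sourceLastModified)"/>
```

The `order` attributes make sure the copy completes before the timestamp is reset on the new file.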
Thanks for your attention. I am using Spring Integration in my project. I want to retrieve files from servers into a tmp folder with an int-ftp:inbound-channel-adapter and then move the files to the original folder with an int-file:outbound-gateway for future batch processing. But when a file name is a duplicate, the int-file:outbound-gateway does not transmit the file and seems to ignore it. How can I solve this problem?
<int-file:outbound-gateway id="tmp-mover"
request-channel="ready-to-process-inbound-tmp-mover"
reply-channel="ready-to-process-inbound"
directory="${backupRootPath}/ali/in//"
mode="REPLACE" delete-source-files="true"/>
Set the local-filter in the ftp inbound channel adapter to an AcceptAllFileListFilter. By default, it's an AcceptOnceFileListFilter.
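For example (the ids, directories, and session factory reference are illustrative):

```
<bean id="acceptAllFilter"
      class="org.springframework.integration.file.filters.AcceptAllFileListFilter"/>

<int-ftp:inbound-channel-adapter id="ftpInbound"
        session-factory="ftpSessionFactory"
        local-directory="/tmp/ftp-in"
        remote-directory="/remote/in"
        local-filter="acceptAllFilter"
        channel="ready-to-process-inbound-tmp-mover">
    <int:poller fixed-rate="5000"/>
</int-ftp:inbound-channel-adapter>
```

With the default AcceptOnceFileListFilter, a local file name that has been seen before is silently skipped, which matches the behavior you describe; AcceptAllFileListFilter lets every local file through again.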