Error while adding a timestamp to a file in Apache NiFi

I am using HDP 2.5. I am trying to add a timestamp to the name of a file located in HDFS. For that I use GetHDFS -> UpdateAttribute -> PutHDFS.
First I get the file from HDFS through the GetHDFS processor, then I change the filename in UpdateAttribute by adding the property ${filename}.${now():format("yyyy-MM-dd-HH:mm:ss.SSS'z'")}. Finally I put the file back into HDFS. At this stage I have an issue: if the destination folder (in HDFS) contains a file that already has a timestamp, then after running the flow the same file ends up with two or more timestamps.
[screenshot: a file whose name already contains a timestamp]
[screenshot: after the NiFi flow, the file name contains two timestamps]
Can anyone tell me how to resolve this issue?

If you don't want to change your current workflow, the best option is probably to use the "File filter" property of the GetHDFS processor to pick up only the files that do not already contain the date in the filename (assuming your files follow some naming convention). Another option is to send the renamed files to another directory.
As a general comment, I'd recommend the combination of the ListHDFS and FetchHDFS processors, as it is a more efficient pattern when working with a NiFi cluster. You could then use a RouteOnAttribute in the middle to do more advanced filtering than the "File filter" option allows.
Another comment: your approach is not the most performant one, as you are downloading the data from HDFS and then uploading it back. A rename/move operation in HDFS would be cleaner (or having correct naming in the first place). You could use the WebHDFS interface to perform the renaming with an InvokeHTTP processor, in combination with ListHDFS.
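For reference, a WebHDFS rename is a single HTTP call, so an InvokeHTTP configuration could look roughly like this (host, directory, and the date format are placeholders; 50070 is the default NameNode HTTP port on HDP 2.5):
InvokeHTTP
  HTTP Method: PUT
  Remote URL: http://namenode.example.com:50070/webhdfs/v1/data/in/${filename}?op=RENAME&destination=/data/in/${filename}.${now():format("yyyy-MM-dd")}
This is just a sketch of the standard WebHDFS RENAME operation; in a secured cluster you would also have to handle authentication.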

You can use Expression Language to delete the previous timestamp and then add the current timestamp. There are several string functions, such as substringBefore and substringAfter, that you can use depending on the logic of your file names.
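For example, if the base names contain no other dots (an assumption about your naming convention), you could strip everything after the first dot before appending the fresh timestamp; substringBefore returns the whole value when the search string is absent, so this also works on files that have no timestamp yet:
${filename:substringBefore('.')}.${now():format("yyyy-MM-dd-HH:mm:ss.SSS'z'")}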

Related

Process latest file from GetSFTP/ListSFTP processor

I am getting multiple files from the ListSFTP processor. However, the requirement is to process only the latest file, based on the file's last modification time. I tried merging files via the MergeContent processor, but the last modification time goes away. Our current version of NiFi is 1.6, so a record set writer can't be used. How can a solution for this be implemented?
You can use an AttributesToJSON processor to create new flow file content from the filename and file.lastModifiedTime attributes. Then you can use MergeContent to get a single flow file holding both the filenames and the modification times. You should be able to pick the latest file from there.
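For example, a sketch of that first step (the attribute names are the ones ListSFTP writes):
AttributesToJSON
  Attributes List: filename, file.lastModifiedTime
  Destination: flowfile-content
Each flow file then carries its name and modification time as JSON content, so after MergeContent you can compare the entries and keep the latest one.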

Need to use 1 Processor instead of 5 FetchHDFS in NiFi

I have 5 XML files in HDFS which I am fetching using Apache NiFi. This is the flow: first I use a GenerateFlowFile processor, and then I have to use 5 different FetchHDFS processors. I can't use GetHDFS because it deletes all the files from the directory, and I don't have permission to ingest the files back. Hence, I am looking for a way to avoid the 5 FetchHDFS processors; what else can I do? All the files are in the same directory, and I want to keep them so that I can test multiple times.
I am ingesting those files into a TransformXML processor and converting them to JSON.
Instead of the GetHDFS processor, try the ListHDFS processor, as it lists the entire directory and doesn't delete the files. The ListHDFS documentation says: "Unlike GetHDFS, this Processor does not delete any data from HDFS."
Thanks everyone for answering. I am unable to vote on anyone's answer, so I am writing up what I did.
First I used the ListHDFS processor, which lists out all the filenames.
Then I used FetchHDFS and, for its HDFS Filename property, I put ${path}/${filename}.
Both ${path} and ${filename} are attributes written by ListHDFS, so leave them as they are; that is where FetchHDFS picks up each listed file from.
This way there is no need for loops or anything, and as soon as a new file is uploaded to the directory, it will be picked up by the ListHDFS processor.
So just leave the whole flow running.
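Putting it together, the two processors need very little configuration; a sketch (the directory and Hadoop config paths below are placeholders for your environment):
ListHDFS
  Hadoop Configuration Resources: /etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
  Directory: /data/xml/input
FetchHDFS
  HDFS Filename: ${path}/${filename}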

Is it possible to get NiFi to Put to multiple HDFS folders?

I need to stream a bunch of JSON files to NiFi, which will then go to HDFS. NiFi needs to look at the creation date (UNIX format) within each JSON file and then route it to the appropriate HDFS folder. So far I have the processors set up like this:
ConsumeKafka -> RouteOnContent (using regex ^"creationDate": \"[0-9]{4}-[0-9]{2}-[0-9]{2}$) -> PutHDFS
There is an HDFS folder for every day, like "2019-01-28", "2019-01-29", "2019-01-30", etc. However, the PutHDFS processor will just output to a single directory, and I obviously don't want to have 365 processors. And as far as I know, NiFi doesn't have a way to create HDFS folders dynamically, so is there an elegant way to handle this?
https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.8.0/org.apache.nifi.processors.hadoop.PutHDFS/index.html
There is a Directory property in the PutHDFS processor:
The parent HDFS directory to which files should be written. The directory will be created if it doesn't exist.
Supports Expression Language: true (will be evaluated using flow file attributes and variable registry)
So you can use an expression like ${creationDate} for this property.
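Note that ${creationDate} has to exist as a flow file attribute before PutHDFS runs; RouteOnContent alone does not create it. A sketch, assuming the JSON field is at the top level and the directory root is /data/events (both assumptions):
EvaluateJsonPath
  Destination: flowfile-attribute
  creationDate: $.creationDate   (dynamic property)
PutHDFS
  Directory: /data/events/${creationDate}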

Write time series data into HDFS partitioned by month and day?

I'm writing a program which saves time series data from Kafka into Hadoop, and I designed the directory structure like this:
event_data
|- 2016
|  |- 01
|     |- data01
|     |- data02
|     |- data03
|- 2017
|  |- 01
|     |- data01
Because this is a daemon task, I wrote an LRU-based manager to keep track of the open files and close inactive ones in time, to avoid leaking resources. But the incoming data stream is not sorted by time, so it is very common to reopen an existing file to append new data.
I tried using the FileSystem#append() method to open an OutputStream when the file already existed, but it raised an error on my HDFS cluster (sorry, I can't provide the specific error here because it was several months ago and I have since tried another solution).
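For reference, the append-or-create pattern described here looks roughly like this in the Hadoop Java API (the path and payload are placeholders, and it assumes appends are enabled on the cluster, e.g. dfs.support.append=true on older versions):
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class AppendOrCreate {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        Path file = new Path("/event_data/2016/01/data01"); // placeholder path
        byte[] payload = "one event record\n".getBytes("UTF-8"); // placeholder record
        // Reopen the existing file for append, or create it on the first write.
        try (FSDataOutputStream out = fs.exists(file) ? fs.append(file) : fs.create(file)) {
            out.write(payload);
        }
    }
}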
Then I used another way to achieve my goal: adding a sequence suffix to the file name whenever a file with the same name already exists. Now I have a lot of files in my HDFS, and it looks very messy.
My question is: what is the best practice for these circumstances?
Sorry that this is not a direct answer to your programming problem, but if you are open to options other than implementing it yourself, I'd like to share our experience with fluentd and its HDFS (WebHDFS) output plugin.
Fluentd is an open-source, pluggable data collector with which you can build your data pipeline easily; it reads data from inputs, processes it, and then writes it to the specified outputs. In your scenario, the input is Kafka and the output is HDFS. What you need to do is:
Configure the fluentd input following the fluentd Kafka plugin docs; you will configure the source section with your Kafka/topic info.
Enable WebHDFS and the append operation on your HDFS cluster; you can find out how in the HDFS (WebHDFS) output plugin docs.
Configure your match section to write your data to HDFS; there are examples on the plugin docs page. To partition your data by month and day, you can configure the path parameter with time slice placeholders, something like:
path "/event_data/%Y/%m/data%d"
With this option to collect your data, you can then write your MapReduce jobs to do ETL or whatever you like.
I don't know whether this is suitable for your problem; just offering one more option here.

How to process files with same name in Apache NiFi?

I'm learning NiFi and I'm working on a flow where I get files using GetFile, do some processing, and then store them into HDFS using the PutHDFS processor. The thing is, I will most probably get files with the same name: for example, I might get a file every 30 minutes, and the file that is generated every 30 minutes will have the same name.
Now when I put such a file into HDFS, I get a "file with the same name already exists" error. How do I overcome this? Is there any way to change the file name on the fly?
It is a very easy one: just use an UpdateAttribute processor to change the file name. For example, you can append a timestamp to the file name.
In the UpdateAttribute processor, add a property filename with the value ${filename}.${now()}
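If you want a readable, sortable suffix, you can also format the timestamp explicitly with format(); the pattern below is just an example:
${filename}.${now():format("yyyyMMddHHmmssSSS")}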
