Use Nifi to copy/move logs from different Nifi servers into AWS S3 - apache-nifi

We have a NiFi cluster of 4 servers and we want to ingest the logs of all the servers into S3. Is there a way in NiFi to ingest the logs of each NiFi server to S3? Logs on each node are stored on its own local disk (a separate disk mounted for NiFi logs: /data/logs/nifi).
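For orientation (no answer is recorded here): the per-node operation the question describes amounts to uploading the files under /data/logs/nifi to S3, keyed per node; inside NiFi this is commonly a ListFile -> FetchFile -> PutS3Object flow scheduled to run on all nodes. A minimal boto3 sketch of that same operation, with a hypothetical bucket name, looks like this:

import socket
from pathlib import Path

import boto3  # assumes AWS credentials are already available on the node

BUCKET = "nifi-node-logs"          # hypothetical bucket name
LOG_DIR = Path("/data/logs/nifi")  # log location from the question

s3 = boto3.client("s3")
node = socket.gethostname()

for log_file in LOG_DIR.glob("*.log*"):
    # One object per log file, prefixed by node so the four servers do not collide.
    s3.upload_file(str(log_file), BUCKET, f"{node}/{log_file.name}")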

Related

Using Logstash to pass Airflow logs to Elasticsearch

When using Logstash to retrieve Airflow logs from a folder you have access to, would I still need to make any changes in the airflow.cfg file?
For instance, I have Airflow and ELK deployed on the same EC2 instance. The Logstash .conf file has access to the Airflow logs path since they are on the same instance. Do I need to turn on remote logging in the Airflow config?
In fact you have two options to push Airflow logs to Elasticsearch:
Using a log collector (Logstash, Fluentd, ...) to collect the Airflow logs and send them to the Elasticsearch server. In this case you don't need to change any Airflow config; you can just read the logs from the files or stdout and send them to ES (a minimal sketch of that idea follows below).
Using the Airflow remote logging feature. In this case Airflow will log directly to your remote logging server (ES in your case), and will store a local copy of the log to show when the remote server is unavailable.
So the answer to your question is no: if you have Logstash, you don't need the Airflow remote logging config.
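A minimal sketch of what option 1 boils down to, independent of the collector you choose: read the Airflow log files and index each line into Elasticsearch. This is only an illustration of the idea, not a Logstash replacement; the index name, the log path, and the elasticsearch-py 8.x client usage are assumptions.

from pathlib import Path

from elasticsearch import Elasticsearch  # elasticsearch-py 8.x client (assumption)

es = Elasticsearch("http://localhost:9200")  # ELK is on the same EC2 instance
LOG_ROOT = Path("/opt/airflow/logs")         # hypothetical Airflow log directory

for log_file in LOG_ROOT.rglob("*.log"):
    for line in log_file.read_text(errors="ignore").splitlines():
        # Index one document per log line, tagged with the file it came from.
        es.index(index="airflow-logs", document={"path": str(log_file), "message": line})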

Elasticsearch: move data from a local device to Elastic Cloud

Is there any way to copy all the data of an Elasticsearch index from my computer to Elastic Cloud?
I'm working on localhost and now I want to migrate it to cloud.elastic.co.
Cheers!
You can do this with a snapshot of your local cluster into S3, then a restore of that snapshot on Elastic Cloud.
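As a rough sketch of the snapshot half of that approach (repository name, bucket, and index are hypothetical placeholders, and the local cluster needs S3 access via the repository-s3 plugin), the two REST calls against the local cluster look like this; the restore is then run against the Elastic Cloud deployment:

import requests

ES = "http://localhost:9200"

# Register an S3 snapshot repository on the local cluster.
requests.put(f"{ES}/_snapshot/my_s3_repo", json={
    "type": "s3",
    "settings": {"bucket": "my-es-snapshots"},  # hypothetical bucket
}).raise_for_status()

# Snapshot the index and wait for completion.
requests.put(
    f"{ES}/_snapshot/my_s3_repo/snapshot_1",
    params={"wait_for_completion": "true"},
    json={"indices": "my-index"},  # hypothetical index name
).raise_for_status()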
Which cloud provider are you using? If it is AWS OpenSearch, it does not allow a remote reindex operation from a local Elasticsearch; it only allows one if the source is an Elasticsearch instance in the AWS cloud that is exposed over HTTPS.
If the data is critical, you can pull the data and send bulk requests to the Elasticsearch cluster in the cloud; I had to do so.
You can write your own application, or you can send multi-threaded requests with a tool like JMeter.
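For reference, the "pull the data and send bulk requests" approach looks roughly like the following with the elasticsearch-py scan/bulk helpers; the cloud endpoint, API key, and index name are hypothetical placeholders.

from elasticsearch import Elasticsearch, helpers

local = Elasticsearch("http://localhost:9200")
cloud = Elasticsearch("https://<your-cloud-endpoint>:443", api_key="<api-key>")  # placeholders

def actions():
    # Stream every document out of the local index with the scroll API...
    for hit in helpers.scan(local, index="my-index"):
        # ...and re-emit it as a bulk index action for the cloud cluster.
        yield {"_index": "my-index", "_id": hit["_id"], "_source": hit["_source"]}

helpers.bulk(cloud, actions())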

Elasticsearch/Kibana logs export in raw log format

We have Elasticsearch, Filebeat, and Kibana in a stateful deployment inside a Kubernetes cluster. We have an NFS server (a VM) outside the Kubernetes cluster, from which we use static provisioning of an NFS volume mounted inside the Elasticsearch pods to preserve logs.
Is there any way by which we can export logs from Elasticsearch/Kibana in raw format?
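No answer is recorded here. One common approach (an assumption, not the only option) is to pull the stored documents out of Elasticsearch with the scroll API and write their _source to a file, e.g. as NDJSON; Kibana's CSV export from Discover is the other usual route. A minimal sketch, with a hypothetical index pattern and output path:

import json

from elasticsearch import Elasticsearch, helpers

es = Elasticsearch("http://localhost:9200")  # or the in-cluster service URL

with open("/mnt/nfs/exported-logs.ndjson", "w") as out:  # hypothetical output path
    for hit in helpers.scan(es, index="filebeat-*"):     # hypothetical index pattern
        # Write each log event exactly as stored (its _source), one JSON object per line.
        out.write(json.dumps(hit["_source"]) + "\n")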

How to set up a NiFi Registry on a server

I have been able to set up a NiFi Registry locally and connect it to my local NiFi cluster.
But my organization has a NiFi cluster (which is on a different port) and I want to set up a NiFi Registry for it, so I have set up the NiFi Registry on a server.
Can anyone help me with the procedure for doing this?
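No procedure is recorded here. Broadly it is the same as the local setup: configure the server's nifi-registry.properties, start the registry, and add it as a Registry Client in the cluster's Controller Settings. One quick sanity check (assuming the default unsecured HTTP port 18080 and a hypothetical hostname) is to confirm the registry's REST API is reachable from the cluster nodes:

import requests

# Hypothetical registry host; 18080 is the NiFi Registry default HTTP port.
REGISTRY = "http://registry.example.com:18080"

# Listing buckets is a cheap way to confirm the registry API is reachable
# from a NiFi node before adding it as a Registry Client.
resp = requests.get(f"{REGISTRY}/nifi-registry-api/buckets", timeout=10)
resp.raise_for_status()
print([bucket["name"] for bucket in resp.json()])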

Could not establish site-to-site communication for apache nifi

I am working with two instances of NiFi.
Instance-1: a secure single-node NiFi instance.
Instance-2: a secure 3-node NiFi cluster on AWS.
My site-to-site settings have the configurations below:
Instance-1:
nifi.remote.input.host=<hostname running locally>
nifi.remote.input.secure=true
nifi.remote.input.socket.port=10443
nifi.remote.input.http.enabled=true
Instance-2:
nifi.remote.input.host=<ec2 public fqdn>.compute.amazonaws.com
nifi.remote.input.secure=true
nifi.remote.input.socket.port=10443
nifi.remote.input.http.enabled=true
My remote process group is in the locally running NiFi and I am trying to push a FlowFile from the local instance to the AWS cluster. I am getting the error below:
Error while trying to connect RPG
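No resolution is recorded here. A common first check (an assumption, not a diagnosis) when a remote process group cannot connect over RAW site-to-site is whether the remote input socket port is actually reachable from the client, i.e. whether the EC2 security groups and any firewalls allow port 10443 from the local instance:

import socket

# Placeholder for the EC2 public FQDN configured in nifi.remote.input.host.
host = "<ec2 public fqdn>.compute.amazonaws.com"
port = 10443  # nifi.remote.input.socket.port on Instance-2

try:
    # Plain TCP connect; if this fails, the RPG cannot reach the RAW S2S port either.
    with socket.create_connection((host, port), timeout=5):
        print(f"{host}:{port} is reachable")
except OSError as err:
    print(f"{host}:{port} is NOT reachable: {err}")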
