InfluxDB proxy, or InfluxDB fetching data from another InfluxDB

Is it possible to set up InfluxDB to fetch data from another InfluxDB, or to have some local log proxy for Influx data?
For example, I have 10 backend servers. I want every backend server's middleware code to log events locally, either in InfluxDB or in some local log proxy.
I then want to set up another server with InfluxDB that fetches the logs from every backend server and accumulates them.
I can't find whether such a configuration is possible; maybe I'm just missing the right search term to google for.

Take a look at Telegraf from InfluxData: it is an agent for collecting and reporting metrics and data.
You can install a Telegraf instance on each of your 10 middleware servers and have the middleware report data to the local Telegraf instead of InfluxDB.
In Telegraf you can either configure the output to point at your central InfluxDB, or point it at a persistent stream like Kafka and configure another Telegraf that reads the data from Kafka and persists it to InfluxDB.
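A minimal sketch of what the per-server Telegraf config could look like for the first option (assuming an InfluxDB 1.x-style output; the listener port, central InfluxDB URL, and database name are placeholders, not values from the answer):

```toml
# /etc/telegraf/telegraf.conf on each middleware server (hypothetical values)

# Accept metrics from the local middleware code over the InfluxDB line protocol.
[[inputs.influxdb_listener]]
  service_address = ":8186"   # middleware writes to http://localhost:8186/write

# Forward everything to the central InfluxDB server.
[[outputs.influxdb]]
  urls = ["http://central-influxdb.example.com:8086"]   # placeholder hostname
  database = "middleware_events"                        # placeholder database name
```

For the Kafka variant, the `[[outputs.influxdb]]` section would be swapped for a Kafka output on the middleware servers, with a second Telegraf instance consuming that topic and writing to InfluxDB.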

Related

Elasticsearch Fleet Server Outputs > Specify where agents will send data

Is it possible to send data to an external Elasticsearch deployment with Fleet Server?
I have tried with the Kibana Fleet UI settings, but there are no username/password fields for the connection, and if I specify those in the Advanced YAML configuration I get this error: cannot set both api_key and username/password accessing 'elasticsearch'
Fleet > Settings > Outputs | Specify where agents will send data
I can see the Kibana Fleet setting xpack.fleet.outputs > config, described as "Extra config for that output", which should let me set this manually, but there is no example of how to set this config variable.
Kibana version: kibana:8.5.3
Elasticsearch version: elasticsearch:8.5.3
Install method: Elastic ECK 2.6
Agents don't support sending logs to a remote cluster, so you won't be able to send data to any external Elasticsearch per se.
However, you can either opt for Beats and provide the list of ES hosts where you want to send the logs, or use Logstash to receive input from the Agent and configure its output with the list of ES hosts.
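A minimal sketch of the Logstash route, assuming the Agent policies are switched to a Logstash output in Fleet; the port, external hostname, and credentials below are placeholders:

```conf
# Logstash pipeline: receive events from Elastic Agent, forward them to an external cluster.
input {
  elastic_agent {
    port => 5044                       # Agents point their Logstash output here
  }
}

output {
  elasticsearch {
    hosts => ["https://external-es.example.com:9200"]   # placeholder external cluster
    user => "ingest_user"                               # placeholder credentials
    password => "changeme"
    data_stream => true
  }
}
```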

Using Logstash to pass airflow logs to Elasticsearch

When using Logstash to retrieve Airflow logs from a folder you have access to, would I still need to make any changes in the airflow.cfg file?
For instance, I have Airflow and ELK deployed on the same EC2 instance. The Logstash .conf file has access to the Airflow logs path since they are on the same instance. Do I need to turn on remote logging in the Airflow config?
In fact you have two options to push Airflow logs to Elasticsearch:
Using a log collector (Logstash, Fluentd, ...) to collect the Airflow logs and send them to the Elasticsearch server; in this case you don't need to change any Airflow config, you can just read the logs from the files or from stdout and send them to ES.
Using the Airflow remote logging feature; in this case Airflow will log directly to your remote logging server (ES in your case), and will keep a local copy of the log to show when the remote server is unavailable.
So the answer to your question is no: if you have Logstash, you don't need the Airflow remote logging config.
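A minimal sketch of the first option for this setup, where Logstash tails the log files directly; the log path (which depends on your base_log_folder) and the index name are placeholders:

```conf
# Logstash pipeline: tail Airflow task log files and index them into Elasticsearch.
input {
  file {
    path => "/opt/airflow/logs/**/*.log"   # placeholder: adjust to your base_log_folder
    start_position => "beginning"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]     # ES runs on the same instance in this question
    index => "airflow-logs-%{+YYYY.MM.dd}"
  }
}
```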

elasticsearch move data from local device to Elastic Cloud

Is there any way to copy all the data of an index from Elasticsearch on my computer to Elastic Cloud?
I'm working on localhost and now I want to migrate it to cloud.elastic.co
Cheers!
You can do this with a snapshot of your local cluster into S3, then a restore of that snapshot on Elastic Cloud.
Which cloud provider are you using? If it is AWS OpenSearch, it does not allow a remote reindex operation from a local Elasticsearch; it only allows one when the source is an Elasticsearch running in the AWS cloud and reachable over HTTPS.
If the data is critical, you can pull the data yourself and send bulk requests to the Elasticsearch in the cloud. I had to do so.
You can write your own application, or send the requests multi-threaded with a tool like JMeter.
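A minimal sketch of the pull-and-bulk approach using the 8.x Python client; the cloud endpoint, credentials, and index name are placeholders, and it assumes the index is small enough to stream in one pass:

```python
# Copy one index from a local cluster to a cloud cluster with scan + bulk.
from elasticsearch import Elasticsearch
from elasticsearch.helpers import scan, bulk

local = Elasticsearch("http://localhost:9200")
cloud = Elasticsearch(
    "https://my-deployment.es.us-east-1.aws.found.io:9243",  # placeholder endpoint
    basic_auth=("elastic", "changeme"),                      # placeholder credentials
)

INDEX = "my-index"  # placeholder index name

def actions():
    # Stream every document from the local index and re-emit it as a bulk index action.
    for hit in scan(local, index=INDEX):
        yield {"_index": INDEX, "_id": hit["_id"], "_source": hit["_source"]}

ok, _ = bulk(cloud, actions())
print(f"indexed {ok} documents")
```

Note this copies documents only; index settings and mappings have to be created on the cloud cluster separately, which is one reason the snapshot/restore route mentioned above can be simpler.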

ElasticSearch and Redis Remote Servers

I deployed a Laravel application on AWS Elasticbeanstalk.
I want to incorporate caching with Redis as my cache driver as well as Elasticsearch.
I managed to run these two services locally (Redis on port 6379 and Elasticsearch on 9200),
but now I want them to run on remote servers, and to simply specify their endpoints in my .env file.
Can anyone let me know how I can obtain remote URLs for Redis and Elasticsearch?
Update:
I found out that Heroku offers the ability to create a Redis instance, from which one can obtain a URL for Redis. I presume a similar thing exists for Elasticsearch.
If this is not the right way to do so, please let me know how it works.
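For illustration, a minimal sketch of what the .env entries might look like once you have managed Redis and Elasticsearch instances; the hostnames and credentials are placeholders, and the ELASTICSEARCH_* variable names are hypothetical since they depend on the package your application uses:

```ini
# .env (placeholder endpoints for managed Redis and Elasticsearch)
CACHE_DRIVER=redis
REDIS_HOST=my-redis.example.com
REDIS_PORT=6379
REDIS_PASSWORD=changeme

# Hypothetical keys: use whatever variables your Elasticsearch client config reads
ELASTICSEARCH_HOST=https://my-es.example.com:9243
ELASTICSEARCH_USER=elastic
ELASTICSEARCH_PASSWORD=changeme
```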

how to get logs into logstash server without using filebeats

I have Logstash installed on a server, where it will process logs and publish them to Elasticsearch. But is it possible for Logstash to pull logs from remote (Linux) servers without installing Filebeat on those servers?
Or, if Filebeat is installed on the same server as Logstash, can it fetch the logs? Please let me know if there is any other option as well.
Thanks in advance
Neither Logstash nor Filebeat can pull/fetch log files from remote servers; you need some tool installed on the remote servers that ships the logs elsewhere.
Logstash can consume logs from message queue systems like Kafka, Redis, or RabbitMQ, for example, but your remote servers still need to send the logs to those systems, so you would need a log shipper on the remote servers anyway.
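A minimal sketch of the message-queue route, assuming the remote servers already publish their logs as JSON to a Kafka topic; the broker address, topic, and index name are placeholders:

```conf
# Logstash pipeline: consume logs from Kafka and index them into Elasticsearch.
input {
  kafka {
    bootstrap_servers => "kafka.example.com:9092"   # placeholder broker
    topics => ["app-logs"]                          # placeholder topic
    codec => "json"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "remote-logs-%{+YYYY.MM.dd}"
  }
}
```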
