How to transfer data to a host with a username and password - elasticsearch

I have a problem. I want to transfer some data to a host with a username and password, but I keep getting the same error message. I would be very happy if you could help.
My conf file:
input {
  file {
    path => "........../*.txt"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}
filter {
  .............
}
output {
  elasticsearch {
    hosts => "xx.xx.xxx.xxx:xxxx"
    manage_template => false
    index => "my_index_name"
    document_type => "my_index_name"
    user => "my_user_name"
    password => "my_password"
  }
}
Error message:
[WARN ][logstash.outputs.elasticsearch][main] Attempted to resurrect
connection to dead ES instance, but got an error
{:url=>"http://elastic_user_name:xxxxxx@xx.xx.xxx.xxx:xxxx/",
:exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError,
:message=>"Got response code '403' contacting Elasticsearch at URL
'http://xx.xx.xxx.xxx:xxxx/'"}
I also made changes to logstash.yml and elasticsearch.yml files as follows, but I got the same error.
elasticsearch.yml:
xpack.management.elasticsearch.username: my_elastic_user_name
xpack.management.elasticsearch.password: my_password
logstash.yml:
xpack.monitoring.elasticsearch.username: my_elastic_user_name
xpack.monitoring.elasticsearch.password: my_password

Receiving an HTTP 403 response code ("Forbidden") indicates that your user does not have permission to index data into Elasticsearch (see this answer for the difference between Unauthorized (401) and Forbidden (403)).
Your user should have the permissions needed to write to your index. Setting up such a role is described here.
Please refer to the documentation about security privileges in order to adapt it to your use case.
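As a rough sketch (the role name, user name, password, and index pattern below are placeholders, not values from your setup), a writer role and a matching user can be created from the Kibana Dev Tools console roughly like this:
POST /_security/role/logstash_writer
{
  "cluster": ["monitor", "manage_index_templates"],
  "indices": [
    {
      "names": ["my_index_name*"],
      "privileges": ["write", "create", "create_index"]
    }
  ]
}
POST /_security/user/logstash_internal
{
  "password": "a_strong_password",
  "roles": ["logstash_writer"],
  "full_name": "Internal Logstash user"
}
The user and password in the elasticsearch output of your pipeline should then be this writer user, not a monitoring-only account.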
For xpack monitoring you should create a user with the logstash_system built-in role.
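Those monitoring credentials belong in logstash.yml, not in the pipeline's elasticsearch output. A minimal sketch, assuming the logstash_system built-in user (the password and host are placeholders):
xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.username: logstash_system
xpack.monitoring.elasticsearch.password: your_logstash_system_password
xpack.monitoring.elasticsearch.hosts: ["http://xx.xx.xxx.xxx:xxxx"]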
I hope this helps.

Related

Got response code '400' contacting Elasticsearch at URL in logstash

I am new to Elasticsearch. I tried to configure Elasticsearch, Kibana, and Logstash with the MQTT plugin. I want to send logs to Elasticsearch through the Logstash MQTT plugin. I installed them locally on a Mac, but when starting Logstash it throws the following error.
[2021-11-12T17:26:37,976][ERROR][logstash.outputs.elasticsearch][logstash_pipeline] Unable to get license information {:url=>"http://localhost:9200/", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError, :message=>"Got response code '400' contacting Elasticsearch at URL 'http://localhost:9200/_license'"}
My Logstash configuration file looks like:
input {
mqtt {
host => "localhost"
port => 1883
topic => "test"
qos => 0
certificate_path => "/Users/john/logstash-7.10.2/logstash/m2mqtt_ca.crt"
}
}
output {
elasticsearch {
hosts => ["http://localhost:9200"]
index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
#user => "elastic"
#password => "changeme"
}
}
Can anybody tell me what the issue is?

Got response code '401' in logstash. If restart it automatically disappear

Can I know why I am getting response code '401'? Everything looks fine. After I restart Logstash, it does not happen again.
I am using docker-compose
[2021-08-18T04:49:18,326][ERROR][logstash.outputs.elasticsearch][main][2b6d754adc23b7c8ea56f9a46472ea071a1e60f0a221ed2f896a7d3e34026d00] Elasticsearch setup did not complete normally, please review previously logged errors {:message=>"Got response code '401' contacting Elasticsearch at URL 'https://es1:9200/_ilm/policy/logstash-policy'", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError}
These may help:
https://stackoverflow.com/a/62224289
BadResponseCodeError, :error=>"Got response code '401' contacting Elasticsearch at URL
Elastic environment settings:
xpack.monitoring.elasticsearch.username: "logstash_system"
xpack.monitoring.elasticsearch.password: "l12345"
and
logstash config
output {
elasticsearch {
hosts => ["elasticsearch:9200"]
user => "logstash_system"
password => "l12345"
index => "logs-topic"
workers => 1
}
}
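Since this setup runs under docker-compose, the same monitoring credentials can also be supplied to the Logstash container as environment variables; the official Logstash Docker image maps upper-cased, underscore-separated variable names onto logstash.yml settings. A rough sketch (the image tag is a placeholder; the host and password reuse the values from the question):
services:
  logstash:
    image: docker.elastic.co/logstash/logstash:7.14.0
    environment:
      XPACK_MONITORING_ELASTICSEARCH_USERNAME: logstash_system
      XPACK_MONITORING_ELASTICSEARCH_PASSWORD: l12345
      XPACK_MONITORING_ELASTICSEARCH_HOSTS: https://es1:9200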

kafka connect elastic sink Could not connect to Elasticsearch. General SSLEngine problem

I'm trying to deploy the Confluent Kafka Connect Elasticsearch sink. My Elastic Stack is deployed on Kubernetes with HTTPS encryption and authentication. I'm port-forwarding Elasticsearch from Kubernetes to localhost.
Caused by: org.apache.kafka.connect.runtime.rest.errors.BadRequestException: Connector configuration
is invalid and contains the following 3 error(s):
Could not connect to Elasticsearch. Error message: General SSLEngine problem
Could not authenticate the user. Check the 'connection.username' and 'connection.password'. Error
message: General SSLEngine problem
Could not authenticate the user. Check the 'connection.username' and 'connection.password'. Error
message: General SSLEngine problem
I'm sure that the username and password are right. The Elasticsearch sink properties file looks like:
name=elasticsearch-sink
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
tasks.max=1
topics=pwp-alerts
key.ignore=true
connection.url=https://localhost:9200
type.name=kafka-connect
errors.tolerance = all
behavior.on.malformed.documents=warn
schema.ignore = true
connection.username ="elastic"
connection.password ="my_password"
Does anyone know what can cause the problem?
I guess the failure is caused by an unsuccessful connection to your Elasticsearch engine. That can happen for many reasons, for example a wrong port or the wrong listener type (an advertised listener instead of a simple consumer). I recommend using Logstash instead: add a Kafka input to your Logstash configuration. You can easily set the consumer group, bootstrap servers, and many other properties in the input, and the Elasticsearch index, port, and authorization in the output.
Your Logstash configuration file with a Kafka input may look like the example below.
input {
kafka{
group_id => "Your group consumer group id"
topics => ["Your topic name"]
bootstrap_servers => "Your consumer port, Default port is 9092"
codec => json
}
}
filter {
}
output {
file {
path => "Some path"
}
elasticsearch {
hosts => ["localhost:9200"]
document_type => "_doc"
index => "Your index name"
user => "Your username"
password => "Your password"
}
stdout { codec => rubydebug }
}
You can remove the file output if you don't also want to store the data on disk alongside sending it from your Logstash pipeline to Elasticsearch.
Find out more about the Logstash Kafka input properties here.

Logstash Elastic Cloud 401 Unauthorized error

Official logstash elastic cloud module
Official doc for starting with
My logstash.yml looks like:
cloud.id: "Test:testkey"
cloud.auth: "elastic:password"
With 2 spaces in front and no space at the end, within double quotes.
This is all I have in logstash.yml, nothing else.
And I am getting:
[2018-08-29T12:33:52,112][WARN ][logstash.outputs.elasticsearch] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"https://myserverurl:12345/", :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError, :error=>"Got response code '401' contacting Elasticsearch at URL 'https://myserverurl:12345/'"}
And my_config_file_name.conf looks like:
input{jdbc{...jdbc here... This works, as I see data in windows console}}
output {
stdout { codec => json_lines }
elasticsearch {
hosts => ["myserverurl:12345"]
index => "my_index"
# document_id => "%{brand}"
}
}
What I am doing is running bin/logstash from the Windows command prompt.
It loads the data from the database configured in the input of the conf file and then shows me the error. I want to index my data from MySQL to Elasticsearch on Elastic Cloud. I took the 14-day trial and created a test index for learning purposes, as I later have to deploy it.
My Pipeline looks like:
- pipeline.id: my_id
path.config: "./config/conf_file_name.conf"
pipeline.workers: 1
If the logs won't include sensitive data, I can also provide them.
Basically I want to sync (on a scheduled check) my MySQL data with Elasticsearch on the cloud, i.e. AWS.
The output should be:
elasticsearch {
  hosts => ["https://yourhost:yourport/"]
  user => "elastic"
  password => "password"
  # protocol => https
  # port => "yourport"
  index => "test_index"
  # document_id => "%{table_id}"
}
(lines starting with # are comments)
as stated at: Configuring logstash with elastic cloud docs
The documentation provided while deploying the app does not give a config for jdbc; the jdbc pipeline also needs user and password in its elasticsearch output, even if they are defined in the settings file, i.e. logstash.yml.
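For illustration, a minimal jdbc-to-Elastic-Cloud pipeline could look like the sketch below; the connection string, driver, database credentials, and index name are placeholders, not values from the question:
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    # jdbc_driver_library => "/path/to/mysql-connector.jar"  # needed if the driver is not on the classpath
    jdbc_user => "db_user"
    jdbc_password => "db_password"
    statement => "SELECT * FROM my_table"
    schedule => "* * * * *"   # poll every minute to keep MySQL and the index in sync
  }
}
output {
  elasticsearch {
    hosts => ["https://yourhost:yourport/"]
    user => "elastic"
    password => "password"
    index => "test_index"
  }
}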
Also, if you created your API key in the web UI you will not be able to get the values needed to configure Logstash. You must use the Dev Tools console found at /app/dev_tools#/console with something like this:
POST /_security/api_key
{
"name": "logstash"
}
The output of which is something like:
{
"id": "<id value>",
"name": "logstash",
"api_key": "<api key>",
"encoded": "<encoded api key>"
}
And in your logstash pipeline config you use the values like this:
output {
elasticsearch {
cloud_id => "<cloud id>"
api_key => "<id value>:<api key>"
data_stream => true
ssl => true
}
stdout { codec => rubydebug }
}
Note the combined "api_key" value separated by ":". Also, you can find the "cloud id" under your "Deployments" menu option.
I had the same issue in my dev environment. After scouring Google for hours, I understood that X-Pack is installed by default when you install Logstash. The doc https://www.elastic.co/guide/en/logstash/current/setup-xpack.html states that:
"X-Pack is an Elastic Stack extension that provides security, alerting, monitoring, machine learning, pipeline management, and many other capabilities."
As I don't need X-Pack in my dev environment while streaming to Elasticsearch, I disabled it by setting ilm_enabled to false in the output of my indexing configuration file.
output {
elasticsearch {
hosts => [.. ]
ilm_enabled => false
}
}
The link below may help:
https://discuss.opendistrocommunity.dev/t/logstash-oss-with-non-removable-x-pack/655/3

logstash not pushing logs to AWS Elasticsearch

I am trying to push my logs from Logstash to Elasticsearch but it's failing. Here is my logstash.conf file:
input {
file {
path => "D:/shweta/ELK_poc/test3.txt"
start_position => "beginning"
sincedb_path => "NUL"
ignore_older => 0
}}
output {
elasticsearch {
hosts => [ "https://search-test-domain2-2msy6ufh2vl2ztfulhrtoat6hu.us-west-2.es.amazonaws.com" ]
index => "testindex4-5july"
document_type => "test-file"
}
}
The ES endpoint that I have provided in hosts is open, so there should not be an access issue, but it still gives the following error:
[2018-07-05T13:59:05,753][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>https://search-test-domain2-2msy6ufh2vl2ztfulhrtoat6hu.us-west-2.es.amazonaws.com:9200/, :path=>"/"}
[2018-07-05T13:59:05,769][WARN ][logstash.outputs.elasticsearch] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"https://search-test-domain2-2msy6ufh2vl2ztfulhrtoat6hu.us-west-2.es.amazonaws.com:9200/", :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :error=>"Elasticsearch Unreachable: [https://search-test-domain2-2msy6ufh2vl2ztfulhrtoat6hu.us-west-2.es.amazonaws.com:9200/][Manticore::ResolutionFailure] This is usually a temporary error during hostname resolution and means that the local server did not receive a response from an authoritative server (search-test-domain2-2msy6ufh2vl2ztfulhrtoat6hu.us-west-2.es.amazonaws.com)"}
I am stuck here. But when I downloaded ES, installed it on my machine, and ran it locally, replacing hosts in the output with hosts => [ "localhost:9200" ], it worked fine and pushed the data to the local ES.
I have tried a lot of things but have not been able to resolve the issue; can anyone please help? I don't want to use localhost but the AWS ES domain endpoint. Any hints or leads will be highly appreciated.
Thanks in advance
Shweta
In my opinion, you simply need to explicitly add the port 443 and it will work. I think the elasticsearch output plugin automatically uses port 9200 if no port is explicitly given.
elasticsearch {
hosts => [ "https://search-test-domain2-2msy6ufh2vl2ztfulhrtoat6hu.us-west-2.es.amazonaws.com:443" ]
index => "testindex4-5july"
document_type => "test-file"
}
An alternative would be to not add the port but to specify ssl => true, as depicted in the official AWS ES docs:
elasticsearch {
hosts => [ "https://search-test-domain2-2msy6ufh2vl2ztfulhrtoat6hu.us-west-2.es.amazonaws.com" ]
index => "testindex4-5july"
document_type => "test-file"
ssl => true
}
