Grafana: provision an Elasticsearch datasource over Ansible

Can anybody advise how to map the configuration properties seen in the Grafana UI to their equivalents in the configuration file provisioned over Ansible?
This is what I have that is working well:
grafana_datasources:
  - name: elasticsearch
    type: elasticsearch
    access: server
    database: "metricbeat-7.5.2"
    url: 'http://localhost:9200'
    readOnly: false
    editable: true
    basicAuth: false
    jsonData:
      timeField: "@timestamp"
      esVersion: 70
      maxConcurrentShardRequests: 5
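(For context: whatever the Ansible role does internally, Grafana itself reads datasource provisioning files of roughly this shape, so the role variables map onto it. A sketch assuming the standard datasource provisioning format; the path is illustrative:)
# /etc/grafana/provisioning/datasources/ansible.yml (illustrative path)
apiVersion: 1
datasources:
  - name: elasticsearch
    type: elasticsearch
    access: proxy   # shown as "Server" in the UI
    url: http://localhost:9200
    database: "metricbeat-7.5.2"
    jsonData:
      timeField: "@timestamp"
      esVersion: 70
      maxConcurrentShardRequests: 5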
I managed to set up everything except the Auth section. So far I have only covered the "Basic auth" field, by adding basicAuth: false. Now I am stuck on setting up the following fields:
TLS Client Auth
Skip TLS Verify
Forward OAuth Identity
I tried adding:
tlsAuth: false
tlsAuthWithCACert: false
tlsSkipVerify: false
but nothing happens. I also tried adding the same keys under jsonData, but still no luck...
Thanks in advance.
Cheers,
Dragan

This is how I resolved it. In order to get these three fields I added the following to my playbook:
isDefault: false
How did I figure it out? Well, I created the datasource manually in the UI and then exported it to JSON with the following command:
mkdir -p data_sources && curl -s "http://localhost:3000/api/datasources"  -u admin:password | jq -c -M '.[]'|split -l 1 - data_sources/
Then I went through the exported datasource JSON files and found the keys and values to use in my playbook.
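For reference, Grafana's datasource provisioning documentation (at least for recent versions) puts the three toggles in question under jsonData; a hedged sketch, not verified against the Ansible role used here:
jsonData:
  tlsAuth: false            # "TLS Client Auth"
  tlsAuthWithCACert: false  # "With CA Cert"
  tlsSkipVerify: false      # "Skip TLS Verify"
  oauthPassThru: false      # "Forward OAuth Identity"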
Cheers

Related

Supabase Self-Hosted Connection to Default Project (Still connecting)?

I deployed Supabase to a DigitalOcean droplet.
.env:
POSTGRES_PASSWORD=MY_PASSWORD
JWT_SECRET=MY_JWT_SECRET
ANON_KEY=MY_ANON_KEY
SERVICE_ROLE_KEY=MY_SERVICE_ROLE_KEY
## General
SITE_URL=http://mydropletip:3000
ADDITIONAL_REDIRECT_URLS=
JWT_EXPIRY=3600
DISABLE_SIGNUP=false
API_EXTERNAL_URL=http://mydropletip:8000
STUDIO_PORT=3000
# replace if you intend to use Studio outside of localhost
SUPABASE_PUBLIC_URL=http://mydropletip:8000
volumes/api/kong.yml:
consumers:
  - username: anon
    keyauth_credentials:
      - key: MY_ANON_KEY
  - username: service_role
    keyauth_credentials:
      - key: MY_SERVICE_ROLE
When I run docker compose up -d I get the output shown (screenshot not included), but it's still connecting.
How do I fix it? Thank you for any reply 🙏🏼
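Not a confirmed fix, but one thing worth double-checking: as I understand the self-hosted Docker setup, the key-auth credentials in kong.yml must be the literal JWT strings from .env. A sketch of the expected correspondence (angle brackets are placeholders):
# volumes/api/kong.yml — keys must match .env exactly
consumers:
  - username: anon
    keyauth_credentials:
      - key: <exact value of ANON_KEY from .env>
  - username: service_role
    keyauth_credentials:
      - key: <exact value of SERVICE_ROLE_KEY from .env>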

Integration between ELK and LDAP

I recently started managing an open-source-based infrastructure composed of multiple Debian servers. On some of them, the ELK stack is installed.
I am verifying the presence of any integration between ELK and LDAP or other IAM systems. On the dedicated monitoring node, I looked for IAM-related info in the following configuration files:
/etc/elasticsearch/elasticsearch.yml
/etc/kibana/kibana.yml
/etc/logstash/logstash.yml
but the only login/account credentials I have been able to find are in the kibana.yml file:
elasticsearch.username: "username"
elasticsearch.password: "password"
In /etc/kibana/kibana.yml and /etc/elasticsearch/elasticsearch.yml I find the following:
xpack.security.enabled: false
which leads me to think that an "xpack" plugin is somehow related to LDAP. Where should I look for the LDAP integration?
Thanks to @Wonka for suggesting the presence of ReadonlyREST. I found a readonlyrest.yml in /etc/elasticsearch. There, the following was present:
ldaps:
  - name: ldap1
    host: "ourldapserver.ourdomain"
    [...]
This is where the LDAP integration occurred.
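For comparison: had the integration been done through X-Pack instead of ReadonlyREST, elasticsearch.yml would contain an LDAP realm along these lines (a sketch in the pre-7.x realm syntax; the host and DNs are placeholders):
xpack.security.authc.realms.ldap1:
  type: ldap
  order: 0
  url: "ldaps://ourldapserver.ourdomain:636"
  bind_dn: "cn=admin,dc=ourdomain"
  user_search:
    base_dn: "ou=users,dc=ourdomain"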

How to run MongoDB as a service with authentication on a Windows machine

Remark: I am using Windows 10.
My goal is for MongoDB to start as a Windows service with authentication enabled at boot (so that nobody can access the database without authenticating), but I cannot manage to do it on a Windows machine (on Linux it worked).
Here are the steps I tried:
Download MongoDB.
Change the config from the default to the following:
# mongod.conf
# http://docs.mongodb.org/manual/reference/configuration-options/

# Where and how to store data.
storage:
  dbPath: C:\MongoDB\Server\4.0\data
  journal:
    enabled: true

# where to write logging data.
systemLog:
  destination: file
  logAppend: true
  path: C:\MongoDB\Server\4.0\log\mongod.log

# network interfaces
net:
  port: 27017
  bindIp: 127.0.0.1

security:
  authorization: enabled

setParameter:
  enableLocalhostAuthBypass: false
Create an admin user in the admin database:
db.createUser(
  {
    user: "....",
    pwd: "...",
    roles: [
      { role: "root", db: "admin" }
    ]
  }
)
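With authorization enabled, connecting should then require credentials, e.g. (a sketch; the username is a placeholder):
mongo --port 27017 -u "adminUser" -p --authenticationDatabase "admin"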
Made it a service:
sc.exe create MongoDB binPath= "\"C:\MongoDB\Server\4.0\bin\mongod.exe\" --service --config=\"C:\MongoDB\Server\4.0\bin\mongod.cfg\"" DisplayName= "MongoDB" start= "auto"
The command reports success.
but when I restart the computer, mongod does not start, and if I don't run mongod --auth I can still connect without authentication.
How can I run mongod as a service with authentication? What am I doing wrong?
When I try to start the service manually, I get the following error:
(error screenshot)
The issue is with the security tag. I had the same issue when I wanted to start the service on Windows 10. I copied the command from the Windows service properties and then ran it at the command prompt.
The prompt showed me the error:
Unrecognized category : security
I found the solution: the security tag and its options have to be written properly.
YAML needs specific indentation, I guess. Here is the solution:
security:
  authorization: enabled
I had the same issue.
In your mongod.cfg, use 2 spaces (instead of a TAB) to indent authorization: enabled.
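After fixing the indentation, you can verify the fix by starting the service by hand from an elevated prompt (assuming the service name MongoDB used above):
net start MongoDB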

"curl: (52) Empty reply from server" / timeout when querying ElastiscSearch

I've run into an annoying issue with my Elasticsearch (version 1.5.2): queries immediately return a timeout (when I use Python's Requests) or
curl: (52) Empty reply from server
when I use curl.
This only happens when the expected output is large. When I send a similar (but smaller) query, it comes back just fine.
What's going on here? And how can I overcome this?
Just open
sudo nano /etc/elasticsearch/elasticsearch.yml
and set this setting to false:
# Enable security features
xpack.security.enabled: false
Another explanation can be making a plain HTTP request when SSL/security is activated on the cluster.
In this case use:
curl -X GET "https://localhost:9200/_cluster/health?wait_for_status=yellow&timeout=50s&pretty" --cacert certificates/elasticsearch-ca.pem -k -u elasticuser
As stated by @FanchenBao, one can read the docs about ELK with SSL.
I met the same issue on Elasticsearch 8.1.3, which was the latest version at the time.
I fixed the issue by changing the following setting from true to false in the config/elasticsearch.yml file:
# Enable security features
xpack.security.enabled: false
I installed Elasticsearch by downloading the tar file, unzipping it, then going into the elasticsearch folder and running:
./bin/elasticsearch
The first time you run this command, it modifies the elasticsearch.yml file with the following content, i.e. auto-generated default security settings:
#----------------------- BEGIN SECURITY AUTO CONFIGURATION -----------------------
#
# The following settings, TLS certificates, and keys have been automatically
# generated to configure Elasticsearch security features on 01-05-2022 06:59:12
#
# --------------------------------------------------------------------------------

# Enable security features
xpack.security.enabled: true
xpack.security.enrollment.enabled: true

# Enable encryption for HTTP API client connections, such as Kibana, Logstash, and Agents
xpack.security.http.ssl:
  enabled: true
  keystore.path: certs/http.p12

# Enable encryption and mutual authentication between cluster nodes
xpack.security.transport.ssl:
  enabled: true
  verification_mode: certificate
  keystore.path: certs/transport.p12
  truststore.path: certs/transport.p12

# Create a new cluster with the current node only
# Additional nodes can still join the cluster later
cluster.initial_master_nodes: ["DaMings-MacBook-Pro.local"]

# Allow HTTP API connections from localhost and local networks
# Connections are encrypted and require user authentication
http.host: [_local_, _site_]

# Allow other nodes to join the cluster from localhost and local networks
# Connections are encrypted and mutually authenticated
#transport.host: [_local_, _site_]
#----------------------- END SECURITY AUTO CONFIGURATION -------------------------
This issue was caused by Elasticsearch running out of memory: it simply couldn't hold all the documents in memory. Unfortunately there's no explicit error code for this case.
There are a bunch of options to work around this (besides adding more memory):
You can tell Elasticsearch not to attach the source, by specifying "_source": false. The results will then just list the matching documents, and you will need to retrieve them separately.
You can use "source filtering" to return just the parts of each document you need; that worked for me (see the sketch below).
You can also split your query into a bunch of sub-queries. Not pretty, but it does the trick.
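For illustration, a source-filtered search might look like this (a sketch only: the index and field names are invented, and on recent Elasticsearch versions you would also pass -H 'Content-Type: application/json'):
curl -XGET 'http://localhost:9200/my_index/_search?pretty' -d '{
  "_source": ["title", "timestamp"],
  "query": { "match_all": {} }
}'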
When running in Docker, you can disable security by setting the environment variable xpack.security.enabled to false, e.g. in docker-compose.yml:
environment:
  - xpack.security.enabled=false
  - discovery.type=single-node
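In context, a minimal single-node service definition might look like this (a sketch; the image tag is only an example):
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.1.3
    environment:
      - xpack.security.enabled=false
      - discovery.type=single-node
    ports:
      - "9200:9200"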
In version 6.2, the checking is stricter.
For example:
curl -XPUT -H 'Content-Type: application/json' 'http://localhost:9200/us/user/2?pretty=1' -d '{"email" : "mary@jones.com", "name" : "Mary Jones", "username" : "@mary"}'
curl: (52) Empty reply from server
If you remove =1:
curl -XPUT -H 'Content-Type: application/json' 'http://localhost:9200/us/user/2?pretty' -d '{"email" : "mary@jones.com", "name" : "Mary Jones", "username" : "@mary"}'
{
  "_index" : "us",
  "_type" : "user",
  "_id" : "2",
  "_version" : 1,
  "result" : "created",
  "_shards" : {
    "total" : 2,
    "successful" : 1,
    "failed" : 0
  },
  "_seq_no" : 0,
  "_primary_term" : 1
}
it works!
In my case it was because the URL scheme https:// was missing from the endpoint URL.

Logstash authentication error with Shield

I'm getting the following error while trying to output data to elasticsearch from logstash:
Failed to install template: [401]
{"error":"AuthenticationException[unable to authenticate user [es_admin] for REST request [/_template/logstash]]","status":401} {:level=>:error}
I have the following configuration in logstash:
if [type] == "signup" {
  elasticsearch {
    protocol => "http"
    user => "*****"
    password => "*******"
    document_type => "signup"
    host => "localhost"
    index => "signups"
  }
}
I have tried adding a user with the following command:
esusers useradd <username> -p <password> -r logstash
I also tried giving the admin role, but logstash does not work with the admin user either.
localhost:9200 asks for a password, and after entering it, it works; but logstash still gives the error.
I also had a similar issue. There is a known issue with elasticsearch when the password has an "@" symbol; see the link below:
https://github.com/logstash-plugins/logstash-output-elasticsearch/issues/232
Also, some elasticsearch documentation has instructions to include a "shield" configuration in elasticsearch.yml, but if you have only one Shield realm, this is not needed. I don't have a shield configuration in elasticsearch.yml.
I see that you tried with both the logstash and admin users but failed.
To try with an admin-privileged user:
Please make sure your /etc/elasticsearch/shield/roles.yml has the content below for the admin role:
# All cluster rights
# All operations on all indices
admin:
  cluster: all
  indices:
    '*':
      privileges: all
Then test something like below:
curl -u es_admin:es_admin_password localhost:9200/_cat/health
To make a user with the logstash role work, the logstash role needs to be tweaked in roles.yml; see the sketch below. I configured logstash to use an admin-privileged user to write to elasticsearch. I hope this helps.
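For completeness, a sketch of the kind of tweak meant, based on the Shield-era example roles (the privilege names should be checked against the Shield docs for your version, and the index pattern adjusted to cover signups):
# Lets the logstash user manage index templates and write to matching indices
logstash:
  cluster: indices:admin/template/get, indices:admin/template/put
  indices:
    'logstash-*':
      privileges: create_index, index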
