How to get Prometheus metrics from a SonarQube Kubernetes pod? - sonarqube

I am running SonarQube in Kubernetes and want to get metrics from the SonarQube pod into Prometheus. I added prometheus.io/scrape: "true" to the SonarQube service and can see the endpoint in the Prometheus dashboard, but it shows DOWN status even though my pod is up and running.
Endpoint: http://sonar_ip:9000/metrics. I don't think SonarQube exposes metrics on the /metrics path, because running 'curl http://sonar_ip:9000/metrics' does not return a metrics list. Does the SonarQube pod expose any Prometheus metrics, and if so, on what path? Let me know if you need any further information.

SonarQube 9.3 added built-in support for Prometheus monitoring in all editions:
https://www.sonarqube.org/sonarqube-9-3/
Follow the latest documentation on this:
https://docs.sonarqube.org/latest/instance-administration/monitoring/
Prometheus monitors your SonarQube instance by collecting metrics from the /api/monitoring/metrics endpoint.
Results are returned in OpenMetrics text format.
See Prometheus' documentation on Exposition Formats for more information on the OpenMetrics text format.
The endpoint requires authentication in one of the following ways:
Authorization: Bearer xxxx header: You can use a bearer token during database upgrade and when SonarQube is fully operational. Define the bearer token in the sonar.properties file using the sonar.web.systemPasscode property.
X-Sonar-Passcode: xxxxx header: You can use X-Sonar-Passcode during database upgrade and when SonarQube is fully operational. Define X-Sonar-Passcode in the sonar.properties file using the sonar.web.systemPasscode property.
username:password and JWT token: When SonarQube is fully operational, system admins logged in with local or delegated authentication can access the endpoint.
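As an illustration, assuming the bearer-token approach, the wiring could look like the sketch below. The passcode value, job name, and target address are placeholders, and a dedicated scrape job is used because the usual prometheus.io/* annotations typically cannot attach an Authorization header.
# sonar.properties (placeholder passcode)
sonar.web.systemPasscode=my-sonar-passcode
# prometheus.yml (sketch of a scrape job for the monitoring endpoint)
scrape_configs:
  - job_name: 'sonarqube'
    metrics_path: '/api/monitoring/metrics'
    bearer_token: 'my-sonar-passcode'
    static_configs:
      - targets: ['sonar_ip:9000']
You can verify the endpoint manually with curl -H "Authorization: Bearer my-sonar-passcode" http://sonar_ip:9000/api/monitoring/metrics before pointing Prometheus at it.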

Related

How to configure Elasticsearch to enable X-Pack security?

After completing the necessary installation for Kibana, when I tried to add the logs integration it showed "To use central management for Elastic Agents, enable the following Elasticsearch security features."
When I set the xpack.security value to true and restarted Elasticsearch, the browser shows "Kibana is not ready yet" or a 502 Service Unavailable error with the message "license is not available".
You can use Filebeat to monitor the Elasticsearch log files, collect log events, and ship them to the monitoring cluster.
Your recent logs are visible on the Monitoring page in Kibana.
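A minimal sketch of that setup, assuming the stock Filebeat elasticsearch module and default log paths, is:
# enable the Filebeat module that parses Elasticsearch logs
filebeat modules enable elasticsearch
# load index templates and dashboards, then run Filebeat in the foreground
filebeat setup
filebeat -e
The output.elasticsearch section of filebeat.yml should point at the monitoring cluster.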

Why does it need the cluster:monitor/main permission?

ES 7.17 Cloud is hosted at https://xxx.elastic-cloud.com:9243
Using the Java High Level REST Client, I was able to connect to the server using Spring Data ES 4.1.5 (ES client 7.9.3). Now I have updated to Spring Data ES 4.4.1 (ES client 7.17.4), but it gives me a permission issue.
"root_cause":[{"type":"security_exception","reason":"action [cluster:monitor/main] is unauthorized for user [xxxx] with roles
Why does it need this permission?
It's because the client pings the cluster using the / endpoint, which is exactly why the cluster:monitor/main privilege is required.
I believe this was added after the 7.10.2 OpenSearch fork.
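If you control the role assigned to that user, one sketch of a fix is to grant the cluster-level monitor privilege (which covers cluster:monitor/main); the role name and index pattern here are hypothetical:
curl -X PUT "https://xxx.elastic-cloud.com:9243/_security/role/app_client_role" \
  -u elastic -H "Content-Type: application/json" -d '
{
  "cluster": ["monitor"],
  "indices": [
    { "names": ["my-index-*"], "privileges": ["read", "write"] }
  ]
}'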
Val is right: / endpoint requests are being sent out.
I found more details on my issue.
Spring Data ES 4.4.1 sends HEAD / and GET / requests.

Kibana embedded dashboard not logging in after enabling the X-Pack security plugin

I am working with ELK Stack version 7.x. I am trying to embed a dashboard in another HTML page, but I am not able to log in to the embedded dashboard since I configured the X-Pack security plugin.
You should configure anonymous access:
xpack.security.authc.providers:
  anonymous.anonymous1:
    order: 0
    credentials:
      username: "anonymous_service_account"
      password: "anonymous_service_account_password"
Anyone with access to the network Kibana is exposed to will be able to access Kibana. Make sure that you’ve properly restricted the capabilities of the anonymous service account so that anonymous users can’t perform destructive actions or escalate their own privileges.

Search Guard for SSL communication in ELK

I was able to set up basic username/password authentication for ELK using Search Guard on Windows.
Now I am trying to establish secure (SSL) communication. I have made the following changes:
In elasticsearch.yml
searchguard.ssl.http.enabled: true
searchguard.ssl.http.keystore_filepath: D:\Softwares\ELK\elasticsearch-5.4.0\elasticsearch-5.4.0\config\CN=localhost-keystore.jks
searchguard.ssl.http.keystore_password: 221749a2add117cf889f
searchguard.ssl.http.truststore_filepath: D:\Softwares\ELK\elasticsearch-5.4.0\elasticsearch-5.4.0\config\truststore.jks
searchguard.ssl.http.truststore_password: 6d6cf1cc017dc874960b
searchguard.authcz.admin_dn:
- CN=sgadmin
searchguard.ssl.transport.keystore_filepath: D:\Softwares\ELK\elasticsearch-5.4.0\elasticsearch-5.4.0\config\CN=localhost-keystore.jks
searchguard.ssl.transport.keystore_password: 221749a2add117cf889f
searchguard.ssl.transport.truststore_filepath: D:\Softwares\ELK\elasticsearch-5.4.0\elasticsearch-5.4.0\config\truststore.jks
searchguard.ssl.transport.truststore_password: 6d6cf1cc017dc874960b
In kibana.yml
elasticsearch.url: "https://localhost:9200"
elasticsearch.username: "admin"
elasticsearch.password: "admin"
If I log in to Kibana using http://localhost:5601, it asks for a username and password. I don't know what credentials to enter here; I tried admin/admin, and it's not working. Before I added the searchguard.ssl.http... configuration, admin/admin was working fine.
After I added all the searchguard.ssl.http related configuration, the credentials stopped working.
I am sure some other configuration is needed in kibana.yml with respect to the Search Guard configuration, but I cannot find it online. Can anyone please help me with what is missing?
Enabling TLS on the REST layer does not have any impact on user authentication/authorization. The only difference is:
If you enable TLS on the REST layer, only HTTPS access is allowed; you will see an error message if you try to access ES with HTTP.
If you disable TLS on the REST layer, only HTTP access is allowed; you will see an error message if you try to access ES with HTTPS.
Search Guard will authenticate the credentials against the configured authentication backend in sg_config.yml. If you use the default configuration that ships with Search Guard, it will use the internal user database and the users defined in internal_users.yml. The default "admin" user has full access to all indices and types, so you should be able to use this user to log into Kibana. If you need a user with limited access, the corresponding demo user is "kibanaro". Please refer to internal_users.yml to see all demo users.
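For example, if you log in with the read-only demo user rather than admin, the Kibana side would point at that user in kibana.yml (a sketch; the password placeholder must match whatever is defined for kibanaro in internal_users.yml):
elasticsearch.url: "https://localhost:9200"
elasticsearch.username: "kibanaro"
elasticsearch.password: "<kibanaro password from internal_users.yml>"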
Since ES 5.0.2, you need to install the Search Guard Kibana plugin for full authentication and session support. You can find the plugin on the corresponding GitHub repository. You install it like any other Kibana plugin, for example:
bin/kibana-plugin install https://github.com/floragunncom/search-guard-kibana-plugin/releases/download/v5.4.3-3/searchguard-kibana-5.4.3-3.zip
If you do not need advanced stuff like multi-tenancy or JWT Single Sign-on, it should start and work out of the box.
If this does not help, please post the output of your Elasticsearch log files when trying to log in.
Disclaimer: I work for floragunn, makers of Search Guard.

How do I connect to Google Cloud Datastore in golang from a GCE VM?

When I try running these example functions to connect to Cloud Datastore, I get a 401 Invalid Credentials error.
I'm running the Go code from a VM within a Google Cloud project. I have enabled the Datastore API and generated a JSON key, which is loaded by the example code.
This question is very similar and even mentions the same repo, but does not use the same authentication shown in the examples, and was related to a 403 Unauthorized error.
For some reason, the Datastore documentation does not mention Go outside the context of App Engine.
