How to log every query in Solr using the slow query log?

I have been using Elasticsearch, where you just set the slow query log threshold to 0 and all queries get logged, so I tried the same in Solr.
I am using the techproducts example here and just added the following config to the file
/home/ygrover/software/solr-8.3.1/server/solr/configsets/sample_techproducts_configs/conf/solrconfig.xml
<slowQueryThresholdMillis>0</slowQueryThresholdMillis>
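(For reference, the guide places this element inside the <query> section of solrconfig.xml; a minimal sketch of the placement:)

<query>
  <!-- requests slower than this threshold, in ms, are logged to solr_slow_requests.log;
       0 should log every request -->
  <slowQueryThresholdMillis>0</slowQueryThresholdMillis>
</query>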
I also changed the logging level in Solr to ALL via http://localhost:8983/solr/#/~logging/level.
The log folder is at the location /home/ygrover/software/solr-8.3.1/server/logs
but no logs are being printed to the file solr_slow_requests.log.
Am I missing something here?
Note: I am doing this for testing on a local environment only. If there is an alternative way, please suggest it, but I need to know what the missing piece is here, as this process works seamlessly in Elasticsearch.
Edit 1:
I am facing this problem in cloud mode only, when launching the techproducts example. I followed this tutorial: https://lucene.apache.org/solr/guide/8_4/solr-tutorial.html
I have edited the _default config as well and set the slow query threshold to 0 there too. This config works when I don't run in cloud mode, and I can then see all queries logged in solr_slow_requests.log.
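A likely missing piece in cloud mode (an assumption, worth verifying): the active configset is read from ZooKeeper, not from the conf directory on disk, so a local edit only takes effect after the config is re-uploaded and the collection reloaded, roughly:

bin/solr zk upconfig -z localhost:9983 -n techproducts -d server/solr/configsets/sample_techproducts_configs/conf
curl "http://localhost:8983/solr/admin/collections?action=RELOAD&name=techproducts"

(The ZooKeeper port matches the tutorial's embedded default; the config name passed to -n must match what your collection actually uses in ZooKeeper.)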

Related

Configure Apache Solr logging to show warnings and slow queries via global config file

I start Solr in the foreground like so: C:\solr-8.10.1\bin\solr start -p 8983 -m 1536m -f -v
It shows a command window and it logs a massive amount of DEBUG info, which I don't need.
I want to reduce the amount of logging here, and I found this: https://solr.apache.org/guide/8_5/configuring-logging.html
This seems exactly like what I need for my scenario:
I have many cores, each with its own solrconfig.xml:
C:\solr-8.10.1\server\solr\core1
C:\solr-8.10.1\server\solr\core2
C:\solr-8.10.1\server\solr\core3
C:\solr-8.10.1\server\solr\coreX
I don't want to have to make the logging changes to each core separately; I want one global setting that applies to all of them.
I don't use the Solr API; I want to be able to change settings via config files.
I want ERRORS to be logged, and also any slow queries.
After reading the tutorial I decided I need to:
start Solr using solr start -p 8983 -m 1536m -f -q
add an element <slowQueryThresholdMillis>1000</slowQueryThresholdMillis>
However, it's that last part where I have questions. I see a reference made to so-called configsets, but I have no idea if that's the place where I need to configure my global settings.
I inspected the sample files, e.g. \solr-8.10.1\server\solr\configsets\sample_techproducts_configs\conf\solrconfig.xml
But I can't figure out if that's the right config file or how it would even apply to all other cores without any reference to the other cores.
I've had a look at these already, but they seem to want to handle things via code, whereas I'm looking for a file configuration:
configure Logger via global config file
Use of readConfiguration method in logging activities
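For the logging side specifically, note that Solr reads one global Log4j2 configuration from server\resources\log4j2.xml, independent of any core's solrconfig.xml, so log levels can be set once for all cores there. A trimmed sketch of the <Loggers> section (assuming the appender names from the stock 8.x file):

<Loggers>
  <!-- only errors reach the console and solr.log -->
  <Root level="error">
    <AppenderRef ref="MainLogFile"/>
    <AppenderRef ref="STDOUT"/>
  </Root>
  <!-- keep the slow-request logger enabled so solr_slow_requests.log is still written -->
  <Logger name="org.apache.solr.core.SolrCore.SlowRequest" level="info" additivity="false">
    <AppenderRef ref="SlowLogFile"/>
  </Logger>
</Loggers>

The <slowQueryThresholdMillis> element itself, though, lives in each core's solrconfig.xml; sharing one configset between cores is the usual way to avoid editing it per core.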

Elastic Cloud APM not showing logs in Transactions Page

What makes Kibana not show Docker container logs in the APM "Transactions" page under the "Logs" tab?
I verified that the logs are successfully being generated, with the "trace.id" attached for proper linking.
I have the exact same environment and configs (7.16.2) up locally via docker-compose, and it works perfectly.
I could not figure out why this feature works locally but does not show up in the Elastic Cloud deployment.
UPDATE with Solution:
I just solved the problem.
It's related to the Filebeat version.
From 7.16.0 on, the transaction/logs linking stops working.
I reverted Filebeat to version 7.15.2 and it started working again.
If you are not using Filebeat: we, for example, rolled our own logging implementation to send logs from a queue in batches using the Bulk API.
We have our own "ElasticLog" class and then use attributes to match the logs-* schema for the Log Stream.
In particular, we had to make sure that trace.id was the same as the actual trace's trace.id property. Then the logs started to show up there (it does take a few minutes sometimes).
Some more info on how to get the IDs:
We use the OpenTelemetry exporter for traces and ILoggerProvider for logs. They fire off batches independently of each other.
We populate the trace IDs at the time the class is instantiated, as a default value. That way you are still in the context of the Activity; it also helps set the timestamp to exactly when the log was created.
This LogEntry then gets passed into the ElasticLogger processor and mapped, as described above, to the ElasticLog entry with the attributes needed for ES.
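As an illustration, the minimum a hand-rolled log document needs for this linking looks something like the following (a hypothetical example; the field names follow ECS, all values are made up):

{
  "@timestamp": "2022-01-20T12:34:56.789Z",
  "log.level": "Information",
  "message": "Order 42 created",
  "service.name": "checkout-api",
  "trace.id": "0af7651916cd43dd8448eb211c80319c"
}

The trace.id here has to match the trace.id on the transaction documents exactly, or the Logs tab will not pick the entry up.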

How to configure Elasticsearch in Jaeger without Docker

I am trying to set up jaeger-all-in-one on one server. If I run the jaeger-all-in-one exe, everything works as expected (using in-memory storage). I am not able to run a help command to see the options available for ES. Now, my requirement is to specify an Elasticsearch URL. I have set up the environment variables SPAN_STORAGE_TYPE and ES_SERVER_URLS, but couldn't find out how to run jaeger-all-in-one.exe so that it picks up these environment variables.
Are you connecting it to Elasticsearch only, or to a stack like ELK/EFK? I had tried, but you cannot configure ELK with jaeger-all-in-one.exe alone on Windows without Docker. You can do it by running the Jaeger collector, Jaeger agent, and Jaeger query services individually, passing each the ELK-related configuration.
In the Jaeger collector and Jaeger query you need to set the variables SPAN_STORAGE_TYPE and ES_SERVER_URLS.
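If you do try the environment-variable route with the all-in-one binary, the variables just need to be set in the same console session before launching it; a minimal sketch on Windows (assuming a local Elasticsearch on port 9200):

set SPAN_STORAGE_TYPE=elasticsearch
set ES_SERVER_URLS=http://localhost:9200
jaeger-all-in-one.exe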

Native application to query ELK?

I'm using Logstash, Elasticsearch and Kibana to process, store and visualize my logs.
My setup works fine, but now I'm looking for a new tool: before ELK I used to read my logs in Notepad++ or glogg (I'm on Windows), and now I'm using only Kibana's Discover tab.
Do you think I can find a native application that looks like a read-only Notepad++, queries Elasticsearch, and displays my logs like before?
The three features I actually need are :
querying logs from multiple sources,
for a specified date range,
and displaying them quickly in a concise, fast viewer.
I don't think it's very complicated to implement, so that's why I'm wondering if it already exists :)
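For what it's worth, the query such a viewer would need to issue is straightforward; a sketch of a single search request covering all three features (the logstash-* index pattern, the @timestamp field, and the search terms are assumptions):

GET /logstash-*/_search
{
  "query": {
    "bool": {
      "filter": [
        { "query_string": { "query": "error AND payment" } },
        { "range": { "@timestamp": { "gte": "now-24h", "lte": "now" } } }
      ]
    }
  },
  "sort": [ { "@timestamp": "asc" } ],
  "size": 1000
}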

Elastic Search JDBC River Plugin SQL Server Integrated Security

So I've been working on implementing Elasticsearch using the JDBC River plugin to get data from our SQL Server DB into Elasticsearch.
I've got it working fine using the SQL Server credentials, but trying to use integrated security doesn't work. It will create the index, but it doesn't have data in it.
The parameters I've been using are:
PUT /_river/test_river/_meta
{
  "type": "jdbc",
  "jdbc": {
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
    "url": "jdbc:sqlserver://testServer:1433;databaseName=TestDb;integratedSecurity=true;",
    "user": "",
    "password": "",
    "sql": "select * from users",
    "poll": "30s",
    "index": "testindex",
    "type": "testusers"
  }
}
I've tried quite a few things, including removing the user and password fields completely and removing integratedSecurity=true, but it gave the same result.
I've checked their GitHub for the river plugin and it says this issue was fixed back in January, but it still doesn't seem to be working.
Also, I'm using Elasticsearch version 1.5.1 and JDBC River plugin version 1.4.0.10.
Any help would be much appreciated.
Get rid of the user and password options. You're not gonna need them.
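With integrated security, the river definition from the question would then shrink to something like this (a sketch based on the original, with only the credential fields dropped):

PUT /_river/test_river/_meta
{
  "type": "jdbc",
  "jdbc": {
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
    "url": "jdbc:sqlserver://testServer:1433;databaseName=TestDb;integratedSecurity=true;",
    "sql": "select * from users",
    "poll": "30s",
    "index": "testindex",
    "type": "testusers"
  }
}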
Check the console when running elasticsearch.bat; you should see an error message when it tries to update the river. I'm going to go out on a limb and assume you're probably seeing an error stating that the file sqljdbc_auth.dll can't be found. If this is the case, you can download this file from here and copy the x64 version of sqljdbc_auth.dll to your Java lib folder. For me, this folder is C:\ProgramData\Oracle\Java\javapath, but you can type echo %path% in a console window to find yours.
Once you have followed these steps, restart elasticsearch.bat, and it should start processing your river. If not, post back with the output you're seeing when running elasticsearch.bat.
