Adjust ElasticsearchSinkOptions.NumberOfShards in Serilog not working in .NET Core - elasticsearch

I have a problem setting NumberOfShards for Elasticsearch while writing logs with Serilog.
I configure Serilog like this in .NET Core:
.WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri(config.ElasticConnectionUrl))
{
    AutoRegisterTemplate = true,
    IndexFormat = config.ElasticIndex + "-{0:yyyy.MM.dd}",
    NumberOfShards = 2,
    NumberOfReplicas = 0
}));
But when I query the settings of the created index in Kibana, number_of_shards is still 5 (the default value). NumberOfReplicas has no effect either.
I am using the ELK stack to trace logs.
Does anyone know why?

You can configure Serilog in code or in appsettings.json. If you are doing this:
var loggerConfiguration = new LoggerConfiguration()
    .ReadFrom.Configuration(configuration) // <= this reads from config
    .WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri(config.ElasticConnectionUrl))
    {
        AutoRegisterTemplate = true,
        IndexFormat = config.ElasticIndex + "-{0:yyyy.MM.dd}",
        NumberOfShards = 2,
        NumberOfReplicas = 0
    }); // this gets partially ignored
And you have the following configuration file:
"Serilog": {
"MinimumLevel": "Debug",
"WriteTo": [
{
"Name": "Elasticsearch",
"Args": {
"nodeUris": "http://localhost:9100"
}
}
]
}
If you have left the index format empty in the JSON file, then you'll probably have an index named like "logstash-{0:yyyy.MM.dd}" (the sink's default), but you'll also have the one you've added in code. The same goes for the URIs: you might end up with two nodes if you've added one in the JSON sink's WriteTo args and another in the ElasticsearchSinkOptions in code.
Config as JSON or as code. Choose one (sadly). See this for further information: https://github.com/serilog/serilog-sinks-elasticsearch/issues/180
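Applied to the question above, that means keeping the Elasticsearch sink in one place only. A minimal sketch of the code-only route (reusing config and configuration from the snippets above): drop the "Elasticsearch" entry from the JSON "WriteTo" array and define the sink in code so the options actually apply.
// sketch: the "Elasticsearch" entry has been removed from "Serilog:WriteTo" in appsettings.json,
// so the options below (including NumberOfShards and NumberOfReplicas) are the only ones used
Log.Logger = new LoggerConfiguration()
    .ReadFrom.Configuration(configuration) // still fine for MinimumLevel, enrichers, other sinks
    .WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri(config.ElasticConnectionUrl))
    {
        AutoRegisterTemplate = true,
        IndexFormat = config.ElasticIndex + "-{0:yyyy.MM.dd}",
        NumberOfShards = 2,
        NumberOfReplicas = 0
    })
    .CreateLogger();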
I was looking for a way to have a default configuration in code that could potentially be overridden by the configuration file, since I'm trying to create a generic, opinionated IHostBuilder for our company's various APIs, but Serilog isn't playing nicely. My solution is to move the sink config out of the Serilog config section, define it separately in the same format (a string Name and an Args Dictionary<string, string>), load it myself, and then create the configuration manually in code, as sketched below.
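A rough sketch of that workaround, assuming a custom "LoggingSinks" configuration section (the section name and the flattened args dictionary are assumptions, not Serilog features):
using Microsoft.Extensions.Configuration;
using Serilog;
using Serilog.Sinks.Elasticsearch;

// "LoggingSinks" lives outside the "Serilog" section, so ReadFrom.Configuration below
// does not register the Elasticsearch sink a second time
var esArgs = configuration.GetSection("LoggingSinks:Elasticsearch")
                          .Get<Dictionary<string, string>>() ?? new Dictionary<string, string>();

var loggerConfiguration = new LoggerConfiguration()
    .ReadFrom.Configuration(configuration); // minimum level, enrichers, other sinks

if (esArgs.TryGetValue("nodeUris", out var nodeUri))
{
    loggerConfiguration.WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri(nodeUri))
    {
        AutoRegisterTemplate = true,
        IndexFormat = esArgs.TryGetValue("indexFormat", out var format) ? format : "logstash-{0:yyyy.MM.dd}",
        NumberOfShards = 2,
        NumberOfReplicas = 0
    });
}

Log.Logger = loggerConfiguration.CreateLogger();
Because the sink definitions live outside the "Serilog" section, there is no duplicate index or node, and the options set in code are the ones that take effect.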

Related

Azure Application Insights - recreate the singleton instance with the modified configuration

I would like to track SQL command texts using Application Insights (setting/overriding the value of EnableSqlCommandTextInstrumentation in a running application) on demand through a reloadable configuration. The ConfigureTelemetryModule registration in the Microsoft.ApplicationInsights SDK is a singleton, which prevents me from using IOptionsSnapshot. Can anyone suggest some ideas for overriding the config value EnableSqlCommandTextInstrumentation at runtime? Thank you.
Program.cs
var builder = WebApplication.CreateBuilder(args);
...
var myOptions = new MyOptions();
var configSection = builder.Configuration.GetSection(MyOptions.Name);
configSection.Bind(myOptions);
builder.Services.Configure<MyOptions>(configSection);

builder.Services
    .AddSingleton<ITelemetryInitializer>(_ => new MyTelemetryInitializer(applicationName))
    .ConfigureTelemetryModule<DependencyTrackingTelemetryModule>(
        (module, _) =>
        {
            module.EnableSqlCommandTextInstrumentation = myOptions.EnableSqlCommandTextInstrumentation;
        })
    .AddApplicationInsightsTelemetry(configuration);
Can anyone suggest some ideas to override the config value EnableSqlCommandTextInstrumentation at runtime?
AFAIK, we cannot change the EnableSqlCommandTextInstrumentation value at runtime.
module.EnableSqlCommandTextInstrumentation accepts a bool, so you can only enable or disable it when the module is configured.
As per the MS docs, you keep such settings in your host.json file, e.g. "EnableDependencyTracking": true and the enableSqlCommandTextInstrumentation option shown below.
Program.cs
builder.Services
    .AddSingleton<ITelemetryInitializer>(_ => new MyTelemetryInitializer(applicationName))
    .ConfigureTelemetryModule<DependencyTrackingTelemetryModule>(
        (module, _) =>
        {
            module.EnableSqlCommandTextInstrumentation = true;
        });
host.json
{
  "version": "2.0",
  "logging": {
    "applicationInsights": {
      "samplingSettings": {
        "isEnabled": false,
        "excludedTypes": "Exception"
      },
      "dependencyTrackingOptions": {
        // Enable SQL command text instrumentation to collect the command text
        "enableSqlCommandTextInstrumentation": true
      }
    }
  }
}
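For the ASP.NET Program.cs in the question, the closest you can get is reading the flag from configuration once at startup. A minimal sketch (the "MyOptions:EnableSqlCommandTextInstrumentation" key is an assumption); the delegate only runs when the module is initialized, so later configuration reloads are not picked up:
using Microsoft.ApplicationInsights.DependencyCollector;

var builder = WebApplication.CreateBuilder(args);

builder.Services.ConfigureTelemetryModule<DependencyTrackingTelemetryModule>((module, _) =>
{
    // evaluated once during startup; not re-evaluated when appsettings.json changes
    module.EnableSqlCommandTextInstrumentation =
        builder.Configuration.GetValue<bool>("MyOptions:EnableSqlCommandTextInstrumentation");
});

builder.Services.AddApplicationInsightsTelemetry();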

Logging into ElasticSearch with Serilog in ASP.NET Web API

I am trying to log to Elasticsearch from one of my ASP.NET Web API projects using Serilog, but unfortunately I can't find the logs in Kibana.
public class Logger
{
    private readonly ILogger _localLogger;

    public Logger()
    {
        ElasticsearchSinkOptions options = new ElasticsearchSinkOptions(new Uri("xxx"))
        {
            IndexFormat = "log-myservice-dev",
            AutoRegisterTemplate = true,
            ModifyConnectionSettings = (c) => c.BasicAuthentication("yyy", "zzz"),
            NumberOfShards = 2,
            NumberOfReplicas = 0
        };

        _localLogger = new LoggerConfiguration()
            .MinimumLevel.Information()
            .WriteTo.File(HttpContext.Current.Server.MapPath("~/logs/log-.txt"), rollingInterval: RollingInterval.Day)
            .WriteTo.Elasticsearch(options)
            .CreateLogger();
    }

    public void LogError(string error)
    {
        _localLogger.Error(error);
    }

    public void LogInformation(string information)
    {
        _localLogger.Information(information);
    }
}
I can see the logs in the file specified above, just not in Elasticsearch. So I am wondering: is there any way I can debug why it failed to log to Elasticsearch? I am also open to using another logging framework to log to Elasticsearch.
*The credentials and URL for Elasticsearch are valid, as I have used them in my other AWS Lambda project (.NET Core).
To see exactly what went wrong, the easiest way is to enable Serilog's self-log and write it somewhere visible; in an ASP.NET project that can be Debug.WriteLine. So the code to see what went wrong would be:
Serilog.Debugging.SelfLog.Enable(msg => Debug.WriteLine(msg));

ElasticsearchSinkOptions options = new ElasticsearchSinkOptions(new Uri("xxx"))
{
    IndexFormat = "log-myservice-dev",
    AutoRegisterTemplate = true,
    ModifyConnectionSettings = (c) => c.BasicAuthentication("yyy", "zzz"),
    NumberOfShards = 2,
    NumberOfReplicas = 1,
    EmitEventFailure = EmitEventFailureHandling.WriteToSelfLog,
    MinimumLogEventLevel = Serilog.Events.LogEventLevel.Information
};
The following error message was retrieved from the output console:
Failed to create the template.
Elasticsearch.Net.ElasticsearchClientException: The request was aborted: Could not create SSL/TLS secure channel. Call: Status code unknown from: HEAD /_template/serilog-events-template ---> System.Net.WebException: The request was aborted: Could not create SSL/TLS secure channel.
The issue is quite clear cut. Adding the following to my logger class constructor resolved it:
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;
Hope this helps others who run into this issue when using Serilog to log to Elasticsearch from .NET Framework.
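For reference, here is the constructor from the question with both fixes applied, as a sketch assembled only from the code above ("xxx", "yyy" and "zzz" remain placeholders):
public Logger()
{
    // surface sink failures while debugging
    Serilog.Debugging.SelfLog.Enable(msg => Debug.WriteLine(msg));

    // allow TLS 1.1/1.2 on .NET Framework before the first call to Elasticsearch
    ServicePointManager.SecurityProtocol =
        SecurityProtocolType.Tls | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;

    var options = new ElasticsearchSinkOptions(new Uri("xxx"))
    {
        IndexFormat = "log-myservice-dev",
        AutoRegisterTemplate = true,
        ModifyConnectionSettings = (c) => c.BasicAuthentication("yyy", "zzz"),
        NumberOfShards = 2,
        NumberOfReplicas = 0,
        EmitEventFailure = EmitEventFailureHandling.WriteToSelfLog
    };

    _localLogger = new LoggerConfiguration()
        .MinimumLevel.Information()
        .WriteTo.File(HttpContext.Current.Server.MapPath("~/logs/log-.txt"), rollingInterval: RollingInterval.Day)
        .WriteTo.Elasticsearch(options)
        .CreateLogger();
}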

Elastic cloud logs not appearing in observability logs

The logs appear in my index pattern search when I go to the Analytics > Discover section, but don't appear in Observability > Logs. Not sure what I have to do to get them to show up there. I have already added the application-* prefix to the settings.
It looks like the Pino logger for Elasticsearch no longer formats output using the ECS format by default, so I had to enable that for the logs to show up there.
ECS support
If you want to use the Elastic Common Schema, you should install @elastic/ecs-pino-format, as the ecs option of this module has been removed.
const pino = require('pino')
const ecsFormat = require('@elastic/ecs-pino-format')()
const pinoElastic = require('pino-elasticsearch')

const streamToElastic = pinoElastic({
  index: 'an-index',
  consistency: 'one',
  node: 'http://localhost:9200',
  'es-version': 7,
  'flush-bytes': 1000
})

const logger = pino({ level: 'info', ...ecsFormat }, streamToElastic)
logger.info('hello world')

Usage of AlternativeLanguageCodes in Google Cloud Speech to Text API v1p1beta1 RPC

I am working with the Google Cloud Speech-to-Text API (RPC v1p1beta1) and its Go client. The API works as expected, but if AlternativeLanguageCodes is set in the RecognitionConfig, I get no responses back.
GoogleRecognitionConfig: &speech.StreamingRecognitionConfig{
    SingleUtterance: c.SingleUtterance,
    InterimResults:  false,
    Config: &speech.RecognitionConfig{
        Encoding:        speech.RecognitionConfig_LINEAR16,
        SampleRateHertz: 8000,
        LanguageCode:    lang,
        // AlternativeLanguageCodes: []string{"en-US"},
        SpeechContexts: []*speech.SpeechContext{
            {Phrases: c.Phrases},
        },
    },
},
I am aware it's in beta, but I am wondering if anyone else is having issues as well, or whether it's just a bug in my code.
Thanks
I have tried this today (C#, 1.0.0-beta02), but I never get results for the alternative language codes, only for the primary language code.
ENGINE = SpeechClient.Create();
ENGINE_CONFIG = new StreamingRecognitionConfig()
{
    Config = new RecognitionConfig()
    {
        Encoding = RecognitionConfig.Types.AudioEncoding.Linear16,
        SampleRateHertz = settings.ArchiveSampleRate,
        LanguageCode = firstLanguageCode,
        ProfanityFilter = false,
        MaxAlternatives = Constants.MASTER_SETTINGS.SpeechRecognitionAlternatives,
        SpeechContexts = { new HintsManager(settings).GetHintsBasedOnContext(Contexts) }
    },
    InterimResults = Constants.MASTER_SETTINGS.RecognitionConfigSettings.InterimResultsReturned
};

// NOTE: 10062019 - ADD ALTERNATIVE LANGUAGE CODES HERE
foreach (var alternativeCode in otherAlternativeLanguageCodes)
{
    ENGINE_CONFIG.Config.AlternativeLanguageCodes.Add(alternativeCode);
}
EDIT: After upgrading yesterday to the new beta via NuGet:
Install-Package Google.Cloud.Speech.V1P1Beta1 -Version 1.0.0-beta03
everything seems to be working OK. The only thing I noticed is that interim results are never returned?
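As a side note, the alternative codes can also be set inline when building the config, since repeated fields on the generated C# proto types accept collection initializers. A sketch (the language codes here are just placeholders):
var config = new RecognitionConfig()
{
    Encoding = RecognitionConfig.Types.AudioEncoding.Linear16,
    SampleRateHertz = 8000,
    LanguageCode = "en-US",
    // repeated field: add the extra candidate languages directly in the initializer
    AlternativeLanguageCodes = { "es-ES", "de-DE" }
};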

What's the right BlobStorageService configuration format?

When creating a Microsoft Bot Framework 4 project - the Startup.cs has the following code which can be uncommented.
const string StorageConfigurationId = "<NAME OR ID>";
var blobConfig = botConfig.FindServiceByNameOrId(StorageConfigurationId);
if (!(blobConfig is BlobStorageService blobStorageConfig))
{
    throw new InvalidOperationException($"The .bot file does not contain an blob storage with name '{StorageConfigurationId}'.");
}
This code provides a way to configure an Azure Storage account via JSON configuration (the .bot file).
However, the project lacks an example of what the config JSON should look like for the "is BlobStorageService" check to work.
I have tried various formats and searched for examples, but cannot make it work.
Has anyone got this nailed?
Got it working using this json...
{
  "type": "blob", // must be 'blob'
  "name": "<NAME OF CONFIG - MUST BE UNIQUE (CAN BE ID)>",
  "connectionString": "<COPY FROM AZURE DASHBOARD>",
  "container": "<NAME OF CONTAINER IN STORAGE>"
}
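With that entry in the .bot file, the Startup.cs snippet above can use the resolved service to build the storage. A minimal sketch, assuming Bot Builder v4's Microsoft.Bot.Builder.Azure package:
// blobStorageConfig comes from the "is BlobStorageService" check shown in the question
IStorage dataStore = new AzureBlobStorage(blobStorageConfig.ConnectionString, blobStorageConfig.Container);
var conversationState = new ConversationState(dataStore);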
