Diagnostic Monitor Trace Listener - Windows Azure

I would like to know if it is possible to modify the way Trace records trace information.
Trace.Listeners.Add(new DiagnosticMonitorTraceListener());
Trace.TraceInformation("OnStart");
I would like to keep using the current WADLogsTable but add one or more custom columns to the table.
Right now the default table created by the DiagnosticMonitorConfiguration looks like this:
PartitionKey|RowKey|Timestamp|EventTickCount|DeploymentID|Role|RoleInstance|Level|EventID|Pid|TiD|Message|
I would like to add some custom columns at the end, like:
PartitionKey|RowKey|Timestamp|EventTickCount|DeploymentID|Role|RoleInstance|Level|EventID|Pid|TiD|Message|Custom1|Custom2
so that every time I trace something I can add data for those two custom columns.
Thanks

I don't think you'll be able to do this. While Windows Azure Diagnostics is quite extensible, you won't be able to modify the schema for trace logging. I would recommend looking into implementing custom diagnostics. You may find this link useful: http://convective.wordpress.com/2009/12/08/custom-diagnostics-in-windows-azure/.

As Gaurav mentioned, this is not doable with the default implementation of Trace.
I'd recommend using something like Log4Net and implementing a custom Table Storage appender. I've done this on a number of projects and it works wonderfully. Log4Net can also consume regular Trace messages and log them into its storage.

Related

Best way to concatenate/collapse stacktraces in logs

My goal is to include my stacktrace and log message into a single log message for my Spring Boot applications. The problem I'm running into is each line of the stacktrace is a separate log message. I want to be able to search my logs for a log level of ERROR and find the log message with the stacktrace. I've found two solutions but not sure which to use.
I can use Logback to put them all on one line, but I would like to keep the newlines for a pretty format. Also, the guide I found might override defaults that I want to keep. https://fabianlee.org/2018/03/09/java-collapsing-multiline-stack-traces-into-a-single-log-event-using-spring-backed-by-logback-or-log4j2/
I could also use ECS and concatenate it there, but it could affect other logs (though I think we only have Java apps). https://docs.aws.amazon.com/AmazonECS/latest/developerguide/firelens-concatanate-multiline.html
Which would be the best way to do it? Also is there a better way to do it in Spring compared to the guide that I found?
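For reference, the Logback route boils down to rewriting the newlines in the formatted event with Logback's %replace pattern converter, with %nopex suppressing the stack trace Logback would otherwise append a second time. A minimal sketch (the appender name, pattern layout, and replacement token are illustrative, not taken verbatim from the guide):

```xml
<!-- logback-spring.xml: emit each event (message + stack trace) as a single line -->
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <!-- %replace rewrites every newline in the message and exception text;
           %nopex stops Logback from appending the stack trace again afterwards -->
      <pattern>%d{ISO8601} %-5level [%thread] %logger - %replace(%msg%n%ex){'\n', '\\n'}%nopex%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="STDOUT"/>
  </root>
</configuration>
```

If a literal "\n" token is too ugly, the replacement text can instead be a character that the log viewer still renders as a line break (the linked guide uses this trick), which keeps the pretty multi-line display while the log pipeline still sees one event.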

Celigo integrator.io logging

I have several flows. I want to save the result of these flows (success, failure, or something else) into an MS SQL table. I cannot find how I can achieve that.
Maybe we can save the output result into a file (not manually). I have never worked with Celigo before; I am just trying to find some options. Or maybe we don't have logging at all?

Disabling specific context path in azure app insights

I have a spring-boot application that I have used alongside azure app insights and it works as expected.
Had a question on this: is it possible to exclude specific context paths? I have just dropped in the starter and configured the instrumentation key.
Are there any specific settings on application.properties that can fix this? Or do I need to create my own instance of TelemetryClient?
I am assuming that you don't want to store certain events in the AI instance. If so, you can use the sampling feature to suppress some of the events.
You can set sampling in the Java SDK in the ApplicationInsights.xml file.
You can include or exclude specific types of telemetry from sampling using the following tags inside the "FixedRateSamplingTelemetryProcessor" Processor element:
<Processor type="FixedRateSamplingTelemetryProcessor">
  <ExcludedTypes>
    <ExcludedType>Request</ExcludedType>
  </ExcludedTypes>
  <IncludedTypes>
    <IncludedType>Exception</IncludedType>
  </IncludedTypes>
</Processor>
You can read more about sampling here.
Hope it helps.
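Sampling thins telemetry statistically rather than by URL, so if the goal is to drop specific context paths entirely, a custom telemetry processor is the more direct route: the App Insights Java SDK lets a processor veto individual items, and for request telemetry the veto can be a simple path-prefix check. The SDK types aside, that decision logic is plain Java; a sketch (the class name and prefix list are illustrative, not part of the SDK):

```java
import java.util.List;

// Sketch of the veto logic you would embed in a custom
// com.microsoft.applicationinsights.extensibility.TelemetryProcessor:
// returning false from its process() method drops the telemetry item.
class PathFilter {
    private final List<String> excludedPrefixes;

    PathFilter(List<String> excludedPrefixes) {
        this.excludedPrefixes = excludedPrefixes;
    }

    // true = keep the telemetry, false = drop it
    boolean shouldKeep(String contextPath) {
        for (String prefix : excludedPrefixes) {
            if (contextPath.startsWith(prefix)) {
                return false;
            }
        }
        return true;
    }
}
```

A processor built around this check would still need to be registered with the SDK (in ApplicationInsights.xml, or as a bean in newer Spring starters); the exact registration mechanism should be verified against the SDK version in use.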

How to save start log and end log when using Integration Service IIB?

I'm deploying a project with IIB.
A good feature is the Integration Service, but I don't know how to save a log before and after each operation.
Does anyone know how to resolve that?
Thanks!
There are three ways in my project. Refer to the following.
Code level:
1. JavaComputeNode (using log4j)
Flow level:
1. TraceNode
2. Message flow monitoring
In addition to the other answers there is one more option, which I often use: the IAM3 SupportPac.
It adds a log4j node and also provides the ability to log from ESQL and Java compute nodes.
There are two ways of doing this:
You can use the Log Node to create audit logging. This option only stores to files, and the files are not rotated.
You can use the IBM Integration monitoring events to create an external flow that intercepts messages and stores them in whatever way you prefer.

Configure Elmah to use existing stored procedure/tables

We have a system that already has a table and some stored procedures used for logging (Oracle). I am currently working on another system which is going to use the same database, but does not have a system for logging errors yet.
I read that Elmah was an easy-to-use system and have tried to set it up, but it seems that by default it tries to use tables and procedures that can be created with the scripts that came with the Elmah download.
So, my question is: is it possible to configure Elmah to use myStoredProcedure, and if so, how do I configure this?
To change the stored procedure that Elmah calls, and the parameters that get sent, you would have to download the source code, edit the OracleErrorLog.cs file, and recompile. If you feel comfortable with that, it shouldn't be too hard.
Alternatively, you could just edit the Oracle.sql script to alter the built-in Elmah packages to point to your own tables.
