logging:
  level:
    root: info
    com:
      demo:
        mapper: debug
          insertBigData: info
I want to see my MyBatis SQL at DEBUG level, but there is one function that inserts a huge value, and I want to exclude that particular function from the SQL logging. YAML doesn't seem to support nesting the statement name under the mapper like this.
I know that moving the function to another mapper would work. What else can I do to suppress the log output of the insertBigData function?
It might not be ideal, but specifying a partially-qualified key seems to work:
logging:
  level:
    root: info
    com:
      demo:
        mapper: debug
        mapper.insertBigData: info
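This works because logger levels are hierarchical and MyBatis logs each statement through a logger named after the mapped statement (namespace plus statement id), so the more specific mapper.insertBigData entry overrides the DEBUG level for that one statement only. If the statement lives in a concrete mapper interface, the same idea applies to the fully-qualified name; a sketch with a hypothetical DemoMapper interface:

logging:
  level:
    root: info
    com.demo.mapper.DemoMapper: debug
    com.demo.mapper.DemoMapper.insertBigData: info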
I want to optionally apply a VPC configuration based on whether an environment variable is set.
Something like this:
custom:
  vpc:
    securityGroupIds:
      - ...
    subnetIds:
      - ...

functions:
  main:
    ...
    vpc: !If
      - ${env:USE_VPC}
      - ${self:custom.vpc}
      - ~
I'd also like to do something similar for alerts (optionally adding emails to receive alerts) and for other fields too.
How can this be done?
I've tried the above configuration and a variety of others, but I just receive various errors.
For example:
Configuration error:
at 'functions.main.vpc': must have required property 'securityGroupIds'
at 'functions.main.vpc': must have required property 'subnetIds'
at 'functions.main.vpc': unrecognized property 'Fn::If'
Currently, the best way to achieve such behavior is to use a JS/TS-based configuration instead of YAML. With JS/TS you get the full power of a programming language to shape your configuration however you want, including conditional checks like this to exclude certain parts of the configuration. It's not documented especially well, but you can use this as a starting point: https://github.com/serverless/examples/tree/v3/legacy/aws-nodejs-typescript
In general, you can do whatever you want, as long as you export a valid object (or a promise that resolves to a valid object) with serverless configuration.
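For illustration, a minimal sketch of what such a serverless.ts could look like for the VPC case above, assuming the layout of the linked aws-nodejs-typescript example; the service name, handler path, runtime, and IDs are placeholders:

// serverless.ts
import type { AWS } from '@serverless/typescript';

// Decide in plain code whether the optional VPC block should be included.
const useVpc = process.env.USE_VPC === 'true';

const vpc = {
  securityGroupIds: ['sg-00000000'],  // placeholder
  subnetIds: ['subnet-00000000'],     // placeholder
};

const serverlessConfiguration: AWS = {
  service: 'my-service',              // placeholder
  frameworkVersion: '3',
  provider: {
    name: 'aws',
    runtime: 'nodejs14.x',
  },
  functions: {
    main: {
      handler: 'src/handler.main',    // placeholder
      // Spread in the vpc block only when USE_VPC is set.
      ...(useVpc ? { vpc } : {}),
    },
  },
};

module.exports = serverlessConfiguration;

The same pattern of building a plain object and conditionally spreading or omitting parts of it works for the optional alert e-mail addresses and any other field you want to toggle.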
I'm working on a Spring Boot application using Gradle. I'm probably going to give a lot of unnecessary context here, because I'm not sure where my problem is.
I have an SLF4J logger that's resolving to an org.slf4j.impl.Log4jLoggerAdapter. I'm trying to change the log pattern layout for the console logs. The default pattern looks like:
2022-03-02 17:42:48.892 [ INFO] 19296 --- [ main] ggestions.jvm.JVMLatestVersionSuggestion
In the console output, the level, timestamp, and class name are all highlighted. However, I really don't like the format and would like to override it.
I've added the following to my application.yml:
logging.pattern.console: '%d [%t] %-5level %logger{36} - %m%n'
This gives me the layout I want, but the highlighting disappears and everything is the default color. I've seen advice to add %highlight, but that doesn't render correctly; it just shows up as the literal text %highlight in the output, like so:
2022-03-02 17:50:09,525 [main] %highlight(INFO ) com.indeed.common.boot.suggestions.jvm.JVMLatestVersionSuggestion
The logger is still resolving to org.slf4j.impl.Log4jLoggerAdapter. My guess is that this is because Log4j doesn't support highlighting, but then I'm not clear on what made it work in the first place. Is it possible to get it back?
(Note: this is work code, and I don't have the ability to change the SLF4J implementation to Log4j2.)
I'm wondering if it's possible to use placeholder replacement not only in values but also for a key in a Spring Boot application.yaml (or .properties).
For example:
openapi:
  security:
    - my-resource
    - ${some.path.role-a}
    - ${some.path.role-b}
Works like a charm, nothing special here.
But what if I also need to pull the "my-resource" entry from a placeholder?
Is that possible?
I've already tried different variants, such as:
openapi:
  security:
    - ${some.path.resource}
    - ${some.path.role-a}
    - ${some.path.role-b}
or
openapi:
  security:
    - "${some.path.resource}"
    - ${some.path.role-a}
    - ${some.path.role-b}
But every attempt just yields the literal variable name as the key.
I have a Spring Boot + Sleuth based application. Everything works as expected. For now I have logs like this:
2017-05-04 17:55:52.226 INFO [alert,692d0eeca479e216,c3c8b680dc29ad02,false] 17292 --- [cTaskExecutor-1] c.k.a.b.s.alert.impl.AlertServiceImpl : Alert state to process: xxx
Now I want to add a custom MDC entry to my logs, for example the contract reference. I want to have logs like this:
2017-05-04 17:55:52.226 INFO [alert,692d0eeca479e216,c3c8b680dc29ad02,false] [CONTRACT_REF] 17292 --- [cTaskExecutor-1] c.k.a.b.s.alert.impl.AlertServiceImpl : Alert state to process: xxx
I tried various things with no success:
using the Spring Sleuth Tracer to add a tag;
adding logging.pattern.level=%5p %mdc to my application.properties file together with MDC.put(xxx, xxx).
How can I add custom MDC/tags to my log?
For versions before 2.x, you have to create your own implementation of SpanLogger. The easiest way is to extend Slf4jSpanLogger and provide your own code to add, update, and remove the entries from the MDC context. Then you can change your logging pattern so that your logs contain what they need.
I was able to add data to the MDC fairly easily by doing MDC.put("yourCoolKey", "your cool value") (see MDC.put JavaDoc).
Once you put the value into the MDC, you can use the sequence %X{yourCoolKey} in your logging pattern (in my case, the value of logging.pattern.console) to print the string "your cool value" as part of each log statement.
Optionally, you can specify a default value in the pattern string by adding :-<defaultValue> after the key, such as %X{yourCoolKey:-N/A}, which will print the string "N/A" whenever the MDC does not have an entry for "yourCoolKey". The default, if not specified, is a blank string ("").
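To make that concrete, a minimal sketch assuming a hypothetical service class and the MDC key contractRef (the names are made up, not taken from the question's code):

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

public class AlertProcessor {

    private static final Logger log = LoggerFactory.getLogger(AlertProcessor.class);

    public void process(String contractRef) {
        // Put the contract reference into the MDC before logging.
        MDC.put("contractRef", contractRef);
        try {
            log.info("Alert state to process: xxx");
        } finally {
            // Remove it again so the value does not leak into unrelated log lines.
            MDC.remove("contractRef");
        }
    }
}

With a console pattern such as %d [%X{contractRef:-N/A}] %-5level %logger{36} - %m%n (set via logging.pattern.console), every statement logged while the key is present includes the contract reference.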
I'm trying to filter data in the built-in alfresco-access audit application, but it's not working.
I want to audit only READ and DELETE actions and to exclude one particular user called synchronizer, so I put this in my alfresco-global.properties:
# Audit
audit.enabled=true
audit.tagging.enabled=false
audit.alfresco-access.enabled=true
# audit access-filter
audit.filter.alfresco-access.default.enabled=false
audit.filter.alfresco-access.default.user=~System;~null;~synchronizer;.*
audit.filter.alfresco-access.default.type=cm:folder;cm:content
audit.filter.alfresco-access.default.path=/app:company_home/.*
audit.filter.alfresco-access.transaction.user=~System;~null;~synchronizer;.*
audit.filter.alfresco-access.transaction.action=READ;DELETE
audit.filter.alfresco-access.login.user=~System;~null;~synchronizer;.*
In the log I can see that logins from the synchronizer user are still stored in the audit tables:
2017-02-01 18:18:45,067 DEBUG [repo.audit.AuditComponentImpl] [http-bio-8881-exec-5]
Extracted audit data:
Application: AuditApplication[ name=alfresco-access, id=2, disabledPathsId=5694]
Values:
/alfresco-access/login=null
/alfresco-access/loginUser=synchronizer
New Data:
/alfresco-access/login/user=synchronizer
2017-02-01 18:18:45,070 DEBUG [repo.audit.AuditComponentImpl] [http-bio-8881-exec-5]
New audit entry:
Application ID: 2
Entry ID: 58797
Values:
/alfresco-access/login=null
/alfresco-access/loginUser=synchronizer
Audit Data:
/alfresco-access/login/user=synchronizer
Have a look at http://docs.alfresco.com/5.2/concepts/audit-example-filter.html, in particular the note "It is important to note that it is the data producer that is specified and not the name of the audit application." I believe that's what's tripping you up.
The problem is that the data producer is alfresco-api, not alfresco-access, so the correct filter is this:
audit.filter.alfresco-api.post.AuthenticationService.authenticate.args.userName=~System;~null;~synchronizer;.*
You also have to look at alfresco-audit-access.xml to understand how to build the filter.
Thanks to Axel Faust. Sorry, but I also posted the same question here, because it was driving me crazy :-) : alfresco-42-audit-filter
A couple of things to consider:
Set the enabled property to true:
audit.filter.alfresco-access.default.enabled=true
Verify your properties file is in a location where it is being picked up and read by Alfresco.
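Taken together, the suggestions above would give a filter section along these lines (a sketch based only on the properties already discussed, not a verified configuration):

# apply the filters below
audit.filter.alfresco-access.default.enabled=true
audit.filter.alfresco-access.default.user=~System;~null;~synchronizer;.*
audit.filter.alfresco-access.transaction.action=READ;DELETE
# login events come from the alfresco-api data producer, so filter the user name there
audit.filter.alfresco-api.post.AuthenticationService.authenticate.args.userName=~System;~null;~synchronizer;.*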