jOOQ LoggerListener extensive DEBUG logging - spring-boot

I need, for performance reasons, to get rid of the org.jooq.tools.LoggerListener DEBUG log messages in a Spring Boot application running inside Docker. None of the usual Spring Boot options, such as the (Docker) env variable LOGGING_LEVEL_ORG_JOOQ=INFO in docker-compose.yml or the Java system property -Dlogging.level.org.jooq=INFO passed to the Docker container in entry.sh, removes these DEBUG messages reporting query execution details. Both options have been verified at the Docker container level.
Even a custom logback-perf.xml config file, as in https://github.com/jOOQ/jOOQ/blob/master/jOOQ-examples/jOOQ-spring-boot-example/src/main/resources/logback.xml with DEBUG changed to INFO, pointed to by the LOGGING_CONFIG env var from docker-compose.yml, does not prevent these debug messages. I have verified that the custom logback-perf.xml config file is in use by changing the appender patterns.

The best way to remove those messages in jOOQ directly is to specify Settings.executeLogging = false; see here.
Obviously, there are also ways to set up loggers correctly, but I cannot see what you did from your description, or why that failed.
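A minimal sketch of that approach in Spring Boot, assuming your Spring Boot version's jOOQ auto-configuration applies a user-defined Settings bean to the DSLContext it creates (the class and bean names here are illustrative):

import org.jooq.conf.Settings;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class JooqConfig {

    // Disable jOOQ's execute logging so LoggerListener no longer
    // renders query execution details at DEBUG level.
    @Bean
    public Settings jooqSettings() {
        return new Settings().withExecuteLogging(false);
    }
}

This switches the logging off at the source, so it also avoids the cost of rendering the messages, not just the cost of writing them.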

Related

Redirect task log of Spring Cloud Data Flow

Is there any way to locate, copy, or manipulate the logs of a task execution in SCDF when running locally?
I'm currently seeing logs whenever I execute a (batch or non-batch) task in the command line of the shell where I've started the Data Flow server locally. On both CentOS 7 and Windows 10, it says it located their stdout/stderr logs inside
/tmp (temp in windows)/[SOME_NUMBER_I_DON'T_KNOW]/${task_name_i_defined}_${SOME_HEX_CODE_RELATED_TO_TASK_EXECUTION_ID}
I want to use that location whenever I need it.
Passing properties to the Data Flow jar doesn't work. It just creates one file and overwrites it at each task execution, instead of storing each task execution in a different folder.
Modifying properties like logging.file.path in the task launching configuration doesn't work either. Only the stdout of the task is written, under the name 'spring.log', at the specific location I designated. The behavior is the same as in the above case.
Spring Cloud Data Flow Task logs
I looked at this answer, but it does not work either.
I know there are a lot of parameters that I could pass to the Data Flow server or to tasks, but I don't think any of them satisfies this condition. Please enlighten me.
The only configuration property available to affect the log location is the working-directories-root deployer property.
Because it is a deployer property, it cannot simply be set as spring.cloud.deployer.local.working-directories-root.
It can be set at task launch time, prefixed with deployer.*.local (details).
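For example, from the Data Flow shell (the task name mytask and the path are placeholders):

task launch mytask --properties "deployer.mytask.local.working-directories-root=/Users/foo/logz"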
It can also be configured globally via the "task platform" properties (details).
When configured at the platform level, it can be done in yml:
spring:
  cloud:
    dataflow:
      task:
        platform:
          local:
            accounts:
              default:
                working-directories-root: /Users/foo/logz
or via an env var:
SPRING_CLOUD_DATAFLOW_TASK_PLATFORM_LOCAL_ACCOUNTS_DEFAULT_WORKING_DIRECTORIES_ROOT=/Users/foo/logz
Details
The STDOUT log location is created at <task-work-dir>/stdout.log (details).
The <task-work-dir> is defined as <working-directories-root>/System.nanoTime()/<taskLaunchId> (details).
The <working-directories-root> is the value of the working-directories-root local deployer property, or the "java.io.tmpdir" system property when the local deployer property is not specified.
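Putting that together with the YAML above, a launch would write its stdout to a path of this shape (the numeric directory and the launch id below are made up for illustration):

/Users/foo/logz/1234567890123456/mytask-3b9f2a/stdout.log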

Duplicate log entries with spring boot logback

In our projects we have a strange problem with duplicate log entries in the log file.
We have multiple appenders but a single logger.
If the Spring Boot application is started on a local machine using java -jar, the problem is not reproducible.
The problem occurs only when the application is started as a service.
How can I solve the problem?
The problem occurs only if a file appender is configured and the Spring Boot application is started via an /etc/init.d/ symlink.
Spring Boot's default start script redirects all console logs into the configured log file.
As a result, both the logback logger and the start script write to the same file, which is why we see duplicate entries in the log file.
Using systemctl (or setting the LOG_FILE or LOG_FOLDER environment variables) will solve this problem.
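A minimal systemd unit for that (paths, names, and the user are placeholders); since systemd starts the jar directly, there is no start script redirecting console output into the log file:

[Unit]
Description=My Spring Boot service

[Service]
User=myapp
ExecStart=/usr/bin/java -jar /opt/myapp/myapp.jar

[Install]
WantedBy=multi-user.target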
If you cannot switch to systemd, you can set the environment variables so that the start script's log target resolves to /dev/null; the script's redirected console output is then discarded, while logback keeps writing the real log file:
export LOG_FOLDER=/dev
export LOG_FILENAME=null

Spring Boot log file not being created

I have a Spring Boot app (1.5.10.RELEASE) that logs fine when running as a standalone application in the Eclipse IDE. I am using Spring Cloud Config, and the properties file says this:
logging.level.com.myco.impl=DEBUG
logging.path=/log/myService
When I run it as a standalone application, I see everything logged to the console and to a file in the above directory (called spring.log).
I then build my "uber jar" and run it like this: java -jar my-service-0.1.0.jar. I see all the console logging just like when running in the IDE, and I can see it looking for my configuration in Spring Cloud Config, but I do not see any log file created.
I could use some ideas on what to look at.
In short - I (almost :-) ) believe that you have not set proper permissions on the logging path, as pointed out in the comments. The logging configuration options are quite simple and are described here. You can't easily get these wrong, and yours seem OK.
The tricky part is how to diagnose the exact problem. For example, on my system, if I change the owner of the logging directory I can reproduce with the uber jar the behaviour you described.
Next - I guess we both use slf4j with logback (e.g. coming via spring-boot-starter). The laziest and quickest way to understand what's wrong is to print logback's status messages, as explained here. For example, programmatically - there are other options too, but I'll take the dirtiest :-) . Put this somewhere in your code:
import ch.qos.logback.classic.LoggerContext;
import ch.qos.logback.core.util.StatusPrinter;
import org.slf4j.LoggerFactory;

LoggerContext lc = (LoggerContext) LoggerFactory.getILoggerFactory();
// print logback's internal status, including configuration errors
StatusPrinter.print(lc);
LOGGER.debug("Some other message");
If I do that in my sample app (Boot 1.5.10.RELEASE), I can immediately see the problem printed in the console, along with many more status messages:
23:23:07,857 |-ERROR in ch.qos.logback.core.rolling.RollingFileAppender[FILE] - openFile(/tmp/so/spring.log,true) call failed. java.io.FileNotFoundException: /tmp/so/spring.log (Permission denied)
See the comments above. I had not included a jar I needed that was critical for reinitializing the logging system once it contacted Spring Cloud Config.
The issue may be a bug in Spring Boot 1.5.10, see:
https://github.com/spring-projects/spring-boot/issues/11951
I've faced a similar issue where the application could not run because the log file was owned by the root user and could not be changed by the application user.
Try upgrading Spring Boot to 1.5.11 -- the issue is fixed in that version.

Spring Boot and Logback logging.config file with spring properties placeholders

I have a set of applications installed on a server, and I want them to start sending logs through syslog to a remote Logstash server. For that, I've created a single external Logback configuration file containing a SyslogAppender pointing to the desired remote server.
As multiple applications will log to the same server, I've changed the log pattern to be the following:
<suffixPattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [${server_instance}] [${application_name}] %p ${PID:- } --- [%15.15t] %logger : %m%n</suffixPattern>
Where server_instance and application_name are command-line option arguments provided at startup.
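For context, the appender in that shared Logback file looks roughly like this (the host and facility are placeholders):

<appender name="SYSLOG" class="ch.qos.logback.classic.net.SyslogAppender">
    <syslogHost>logstash.example.com</syslogHost>
    <facility>USER</facility>
    <!-- the suffixPattern shown above goes here -->
</appender>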
Now I just set the logging.config property of all my apps to point to the same Logback config file, and all the applications start sending logs with the specified pattern to the desired server. That part works like a charm.
The only problem I have is that Logback isn't able to resolve the server_instance and application_name properties, and they appear as [server_instance_IS_NOT_DEFINED] and [application_name_IS_NOT_DEFINED] respectively.
Can I achieve this somehow using a single external configuration file?
Logback isn't able to see command-line arguments; those are seen by Spring alone.
I moved those command-line option arguments to system properties (i.e. -Dserver_instance and -Dapplication_name instead of --server_instance and --application_name) and now everything's working as expected.
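For illustration, with a placeholder jar name and values, the launch command changes like this; JVM system properties are visible to Logback's ${...} substitution, while Spring-style arguments are not:

# before: only Spring sees these
java -jar app.jar --server_instance=node1 --application_name=billing
# after: Logback resolves ${server_instance} and ${application_name}
java -Dserver_instance=node1 -Dapplication_name=billing -jar app.jar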

Intellij, Spring dev tools remote, Docker, error Unexpected 404 response uploading class files

I'm trying to use Spring Boot dev tools (Spring Remote) to automatically upload recompiled files to my Docker container.
I keep receiving:
Unexpected 404 response uploading class files
This is my Dockerfile:
FROM java:8
WORKDIR /first
ADD ./build/libs/first.jar /first/first.jar
EXPOSE 8080
RUN bash -c 'touch /first/first.jar'
ENTRYPOINT ["java","-Dspring.data.mongodb.uri=mongodb://mongodb/micros", "-Djava.security.egd=file:/dev/./urandom", "-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005","-jar", "first.jar"]
This is my run configuration, and the error I'm receiving is the 404 above.
As of Spring Boot 1.5.0, devtools defaults were changed to exclude the devtools from fat jars.
If you want to include them, you have to set the excludeDevtools flag to false.
However, the devtools documentation doesn't explain how to do this. The necessary documentation is actually in the spring-boot-gradle-plugin documentation.
To do it, you can put this snippet of code in your build.gradle file:
bootRepackage {
    excludeDevtools = false
}
Unfortunately, this was buggy at first and had no effect in Spring Boot 1.5.0. The workaround was to do this instead:
springBoot {
    excludeDevtools = false
}
However, I have verified that the bootRepackage approach works with Spring Boot 1.5.8.
I ran into the same issue as yours while using docker-compose to compose my application (a web service + a Redis server + a Mongo server).
As the Spring developer tools documentation points out, "Developer tools are automatically disabled when running a fully packaged application. If your application is launched using java -jar or if it’s started using a special classloader, then it is considered a “production application”."
I think that when we run a Spring web application inside a Docker container, the developer tools are disabled, so we can't remotely restart it.
Currently, I'm running my web application on the host machine and keeping the Redis and Mongo servers inside containers, so I can restart the web app quickly when the code changes during development.
In my case I had to put the application context in the program argument of the IDE RemoteSpringApplication run configuration.
For example, my application root context was /virtue, so I had to configure it as sketched below.
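In essence (the host and port here are assumptions), the program argument passed to RemoteSpringApplication is the remote application's URL including the context path:

http://localhost:8080/virtue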
