Can I delete the Spring log file?

I'm not very familiar with Spring logging.
The log file is getting huge (>100 GB) and needs to be deleted. I assume a new log file will be created after deletion, but I'm not sure whether that will cause any new problems. How can I delete it safely?
Also, how can I configure it to create a new log file every week?
Much appreciated!

It can be deleted; then restart the API so a new log file is created.
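For the weekly file, assuming this is a Spring Boot application (2.4 or newer) with the default Logback setup, you can put a week-based date pattern in the rolling policy's file name, e.g. in application.properties (the paths and names below are only examples; the %i counter is included because Spring Boot's default rolling policy is size-and-time based and usually expects it):

logging.file.name=/var/logs/myapp.log
# Logback derives the rollover period from the smallest date unit in %d, so with ww it should roll weekly
logging.logback.rollingpolicy.file-name-pattern=/var/logs/myapp-%d{yyyy-'W'ww}.%i.log
# keep roughly two months of weekly files
logging.logback.rollingpolicy.max-history=8

Verify the weekly rollover against your Logback version before relying on it; if nothing rolls, check the application's startup output for rolling-policy errors.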

Related

Spring Boot Logging Pattern For Rolling Logs

I am using Spring Boot 2.7 and trying to configure the logging to roll daily.
I am currently using just the application.properties file to configure logging, as that's the preference.
I added the following line to the properties file, but it does not seem to work:
logging.logback.rollingpolicy.file-name-pattern=myservice-%d{yyyy-MM-dd}.log
Any clues as to what I may be missing?
Also, is there a way to check daily log rolling without having to wait for EOD? :)
First, you have to specify the file name:
logging.file.name=myservice.log
Then you can use the rolling file name pattern:
logging.logback.rollingpolicy.file-name-pattern=myservice-%d{yyyy-MM-dd}.log
To force a rollover so you can see the change, you can set the maximum file size to something small:
logging.logback.rollingpolicy.max-file-size=100K
To specify the directory, you must set this property:
logging.file.path=/var/logs
The documentation can be found here:
https://docs.spring.io/spring-boot/docs/current/reference/html/features.html#features.logging
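Putting the pieces together, a minimal application.properties sketch (names and paths are just examples; here the directory is folded into logging.file.name, since as far as I can tell logging.file.path only takes effect when logging.file.name is not set). One caveat, stated tentatively: Spring Boot's default rolling policy is size-and-time based, so the pattern usually also needs an %i counter token; if rolling silently does nothing, check the startup log for a complaint about a missing %i.

logging.file.name=/var/logs/myservice.log
# daily pattern; %i is the per-day counter used when max-file-size forces an extra roll
logging.logback.rollingpolicy.file-name-pattern=/var/logs/myservice-%d{yyyy-MM-dd}.%i.log
# a small size lets you watch a rollover happen without waiting for EOD
logging.logback.rollingpolicy.max-file-size=100KB

Once you have confirmed the rollover behaviour, raise max-file-size (or drop it back to the default) so the files roll on the date boundary only.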

migration.conf file while installing the Splunk universal forwarder

Please let me know what exactly the migration.conf file created during installation of the Splunk UF is.
File path : /opt/splunkforwarder/etc/system/local/migration.conf
Thanks in Advance,
NVP
Have you looked at the contents of the file? It shows what Splunk did when a new version of the forwarder was installed. When the new version runs for the first time, a number of checks are run and files may be modified or removed. The migration.conf file lists each action that was performed.
It's a good idea to review this log after each upgrade, because it may identify local changes that override new features.

How can I resolve that issue?

IDescribableEntity not found
My requirement is to maintain logs for all sites.
E.g. every insertion, deletion, and edit of any field/table in the database must be saved in the log data. I tried the above code in my project, but something is missing.
You've forgotten to add a use statement:
use TheNameSpaceNameWhereYouCreatedTheInterface\IDescribableEntity;

Spring batch integration file lock access

I have a Spring Batch integration where multiple servers are polling a single file directory. This causes a problem where a file can be picked up and processed by more than one server. I have attempted to add an NIO lock on the file once a server has got it, but this locks the file so its contents can't be read for processing.
Is there a spring batch/integration solution to this problem or is there a way to rename the file as soon as it is picked up by a node?
Consider using FileSystemPersistentAcceptOnceFileListFilter with a shared MetadataStore: http://docs.spring.io/spring-integration/reference/html/system-management-chapter.html#metadata-store
That way, only one instance of your application will be able to pick up a given file.
Even if we found a solution for the NIO lock, you should understand that a lock means "do not touch until freed". Therefore, when one instance has done its work, another one would be ready to pick up the same file. I guess that isn't your goal.
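If it helps, here is a minimal Java-config sketch of that suggestion, assuming a Redis-backed shared store (RedisMetadataStore from spring-integration-redis); any shared ConcurrentMetadataStore works the same way, and the channel name, directory, and key prefix below are placeholders:

import java.io.File;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.redis.connection.RedisConnectionFactory;
import org.springframework.integration.annotation.InboundChannelAdapter;
import org.springframework.integration.annotation.Poller;
import org.springframework.integration.channel.DirectChannel;
import org.springframework.integration.config.EnableIntegration;
import org.springframework.integration.file.FileReadingMessageSource;
import org.springframework.integration.file.filters.FileSystemPersistentAcceptOnceFileListFilter;
import org.springframework.integration.metadata.ConcurrentMetadataStore;
import org.springframework.integration.redis.metadata.RedisMetadataStore;
import org.springframework.messaging.MessageChannel;

@Configuration
@EnableIntegration
public class SharedDirectoryPollingConfig {

    // Shared store: every server instance must point at the same Redis.
    @Bean
    public ConcurrentMetadataStore metadataStore(RedisConnectionFactory connectionFactory) {
        return new RedisMetadataStore(connectionFactory);
    }

    @Bean
    public MessageChannel filesChannel() {
        return new DirectChannel();
    }

    // A file is recorded in the shared store the first time any instance sees it,
    // so only that instance accepts it; the others skip the file.
    @Bean
    @InboundChannelAdapter(value = "filesChannel", poller = @Poller(fixedDelay = "5000"))
    public FileReadingMessageSource fileSource(ConcurrentMetadataStore metadataStore) {
        FileSystemPersistentAcceptOnceFileListFilter filter =
                new FileSystemPersistentAcceptOnceFileListFilter(metadataStore, "shared-dir:");
        FileReadingMessageSource source = new FileReadingMessageSource();
        source.setDirectory(new File("/shared/input"));
        source.setFilter(filter);
        return source;
    }
}

Because the store is shared across the servers, the "accept once" decision is cluster-wide rather than per JVM, which is what removes the duplicate processing without having to lock the file.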

logstash forwarder doesn't release the file handle

I am running the logstash forwarder to ship logs.
The forwarder, logstash, and elasticsearch are all on localhost.
I have one UI application whose log files are read by the shipper. While the forwarder is running, archiving of the log file doesn't work; logs keep being appended to the same file. I have configured the log file to archive every minute so I can see the change. As soon as I stop the forwarder, log file archiving starts working.
I guess the forwarder keeps holding the file handle, and that's why the file does not get archived.
Please help me on this.
regards,
Sunil
Running on Windows? There are known unfixed bugs.
See https://github.com/elasticsearch/logstash/issues/1944
for some kind of workaround.
