React Native ffmpeg-kit-react-native: how can I save the FFprobeKit output log?

I want to get the bitrate of the audio and store it in a variable. The command prints the expected output, but I don't know how to save it.
static getAudio(
  localFileName,
  videoURI,
  successCallback,
  errorCallback,
) {
  // Register the path where the uploaded file is cached and saved each second
  let outputPath = `${RNFS.CachesDirectoryPath}/${localFileName}.txt`;
  // const ffmpegCommand = `-ss 0 -i ${videoURI} -acodec copy -map 0:a:0 -vn -f rawvideo ${outputPath}`;
  // Works on Windows:
  const ffmpegCommand = `-select_streams v -show_entries packet=size:stream=duration -of compact=p=0:nk=1 ${videoURI}`;
  // Also works on Windows:
  // const ffmpegCommand = `ffprobe -v quiet -print_format json -show_format -show_streams ${videoURI}`;
  audio(ffmpegCommand, videoURI);
}
const audio = async (command, videoURI) => {
  try {
    // execute() resolves only after the session completes, so getOutput()
    // returns the full log; executeAsync() resolves before the command
    // finishes and may return partial output.
    const session = await FFprobeKit.execute(command);
    console.log('entered');
    const failStackTrace = await session.getFailStackTrace();
    console.log(failStackTrace);
    const output = await session.getOutput();
    console.log(output);
  } catch (error) {
    console.log(error);
  }
};
LOG 47402
LOG
LOG 6784
LOG
LOG 5131
LOG
LOG 3646
LOG
... (several hundred more packet-size lines omitted) ...
LOG
LOG 5127
LOG
LOG 5120
LOG
LOG 5.700000
This is the log output I got, exactly what I expect,
and I want to store it in a variable.
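Once the ffprobe output is captured as a string (e.g. via `await session.getOutput()` on a completed session), the bitrate can be computed from it. A minimal sketch, where `parseBitrate` is a hypothetical helper written for the compact output format shown above (packet sizes, one per line, then the stream duration as the last line):

```javascript
// Hypothetical helper (not part of ffmpeg-kit): parses the compact
// ffprobe output shown above -- one packet size per line, with the
// stream duration (seconds) as the final line -- and returns the
// average bitrate in bits per second.
function parseBitrate(ffprobeOutput) {
  const lines = ffprobeOutput
    .split('\n')
    .map((line) => line.trim())
    .filter((line) => line.length > 0);

  const duration = parseFloat(lines[lines.length - 1]);
  const totalBytes = lines
    .slice(0, -1)
    .reduce((sum, line) => sum + parseInt(line, 10), 0);

  return (totalBytes * 8) / duration;
}

// e.g. const bitrate = parseBitrate(await session.getOutput());
```

The same idea works for an audio stream by probing with `-select_streams a` instead of `-select_streams v`.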

Related

Multiline Spring Boot logs from CRI-O logs on OpenShift

Software Stack
Software
Version
OpenShift
4.4
Fluent-Bit
1.7.0-rc4
ElasticCloudStack
7.10.2
I have a DaemonSet for fluent-bit pods to read OpenShift logs from /var/log/containers on the worker nodes.
Logs are coming to Elastic and viewable on Kibana.
Everything works well until we have to parse a Java stack trace in the logs: each line is sent as a separate log entry.
How can I ensure that the stack trace is read as one?
I have the following CRI-O parser:
[PARSER]
    # http://rubular.com/r/tjUt3Awgg4
    Name        cri
    Format      regex
    Regex       ^(?<time>[^ ]+) (?<stream>stdout|stderr) (?<logtag>[^ ]*) (?<log>.*)$
    Time_Key    time
    Time_Format %Y-%m-%dT%H:%M:%S.%L%z
and the following Merge_Parser for Spring Boot log entries:
[PARSER]
    Name        springboot
    Format      regex
    Regex       /^(?<date>[0-9]+-[0-9]+-[0-9]+\s+[0-9]+:[0-9]+:[0-9]+.[0-9]+)\s+(?<log_level>[Aa]lert|ALERT|[Tt]race|TRACE|[Dd]ebug|DEBUG|[Nn]otice|NOTICE|[Ii]nfo|INFO|[Ww]arn?(?:ing)?|WARN?(?:ING)?|[Ee]rr?(?:or)?|ERR?(?:OR)?|[Cc]rit?(?:ical)?|CRIT?(?:ICAL)?|[Ff]atal|FATAL|[Ss]evere|SEVERE|EMERG(?:ENCY)?|[Ee]merg(?:ency)?)\s+(?<pid>[0-9]+)\s+---\s+\[(?<thread>.*)\]\s+(?<class_name>.*)\s+:\s+(?<message>.*)$/
    Time_Key    time
    Time_Format %Y-%m-%
The CRI-O log entries, with and without an exception, look like this:
2021-02-09T11:19:04.815514933+00:00 stdout F 2021-02-09 11:19:04.814 DEBUG 1 --- [nio-8080-exec-8] x.x.x.service.DocumentService : retrieved: []
2021-02-09T11:19:04.817387066+00:00 stdout F 2021-02-09 11:19:04.816 ERROR 1 --- [nio-8080-exec-4] x.x.x.exceptions.RestExceptionHandler : 422 Status Code - EntityNotFoundException - XXXXXXXXXXXXXXXXXXXXXXXXX
2021-02-09T11:19:04.817387066+00:00 stdout F
2021-02-09T11:19:04.817387066+00:00 stdout F xxx.xxx.microservices.exceptions.EntityNotFoundException: XXXXXXXXXXXXXXXXXXXXXXXXX
2021-02-09T11:19:04.817387066+00:00 stdout F at xxx.xxx.microservices.service.XXXXXXX.lambda$getXXXData$3(XYXYXYXYX.java:139) ~[classes!/:na]
2021-02-09T11:19:04.817387066+00:00 stdout F at java.base/java.util.Optional.orElseThrow(Optional.java:408) ~[na:na]
2021-02-09T11:19:04.817387066+00:00 stdout F at xxx.xxx.microservices.rest.internal.XXXXXXX.getXXXX(XXXXXXXXXXX.java:101) ~[classes!/:na]
2021-02-09T11:19:04.817387066+00:00 stdout F at jdk.internal.reflect.GeneratedMethodAccessor854.invoke(Unknown Source) ~[na:na]
2021-02-09T11:19:04.817387066+00:00 stdout F at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:na]
2021-02-09T11:19:04.817387066+00:00 stdout F at java.base/java.lang.reflect.Method.invoke(Method.java:566) ~[na:na]
2021-02-09T11:19:04.817387066+00:00 stdout F at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:190) ~[spring-web-5.2.7.RELEASE.jar!/:5.2.7.RELEASE]
I have read this article (https://coralogix.com/log-analytics-blog/parsing-multiline-logs-the-complete-guide/), but it only addresses the plain Spring Boot format, before the cluster metadata is added.
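For what it's worth, Fluent Bit 1.8 and later (newer than the 1.7.0-rc4 in the stack above) ships built-in multiline parsers for both the CRI runtime prefix and Java stack traces, which sidesteps hand-rolled Merge_Parser regexes. A minimal sketch of the tail input, assuming the standard container log path; whether `java` can be chained directly on the input like this, or needs the separate multiline filter, may depend on the exact version:

```
[INPUT]
    Name              tail
    Path              /var/log/containers/*.log
    Tag               kube.*
    multiline.parser  cri, java
```

Here `cri` strips the runtime prefix first, then `java` groups stack-trace continuation lines into a single record before the Kubernetes metadata is attached.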

rsyslog server generating log files named for IP address instead of access_log

I have a syslog-ng server configured to send all Apache log messages to a remote rsyslog server. Here is the pertinent part of my syslog-ng server's config:
source s_http {
    file("/var/log/httpd/access_log" flags(no-parse));
};
...
destination loghost { tcp("10.0.0.48" port(514)); };
...
log { source(s_http); destination(loghost); };
I was hoping to find the file /apps/log/my-web-server/access_log on the remote rsyslog server (10.0.0.48), but instead I find several files in /apps/log/my-web-server/ named for the IP addresses of the clients that hit my-web-server, with a .log extension.
[root@10.0.0.48]# pwd
/apps/log/my-web-server
[root@10.0.0.48]# ls -l
total 140
-rw-------. 1 root root 4862 Aug 14 16:39 10.0.0.97.log
-rw-------. 1 root root  193 Aug 14 15:45 10.0.0.201.log
Why aren't the log messages going into one file named access_log?
Update:
On the rsyslog server at 10.0.0.48 I see these lines in the /etc/rsyslog.conf
$template RemoteStore, "/apps/log/%HOSTNAME%/%PROGRAMNAME%.log"
$template RemoteStoreFormat, "%msg%\n"
:source, !isequal, "localhost" -?RemoteStore;RemoteStoreFormat
:source, isequal, "last" STOP
What does that mean? The template names each log file after %PROGRAMNAME%, and since syslog-ng forwards the raw access_log lines unparsed (flags(no-parse)), rsyslog ends up treating the first token of each line, the client IP, as the program name, hence one file per client.
I needed to change ...
source s_http {
    file("/var/log/httpd/access_log" flags(no-parse));
};
... to this ...
source s_http {
    file("/var/log/httpd/access_log" program-override("apache_access_log"));
};

Extract logs between time frame with non specific time stamp using bash

I want to fetch log lines between two timestamps, but I do not have the exact timestamps. If the exact timestamps appear in the log, I can fetch the range with sed (note the double quotes, so the shell expands the variables):
sed -n "/$StartTime/,/$EndTime/p" <filename>
My problem is that the StartTime and EndTime I fetch from my DB might not be present in the log file, so I have to fetch the log between times near the StartTime and EndTime, using >= and <= comparisons. I tried the following command but it does not work.
awk '$0>=st && $0<=et' st=$StartTime et=$EndTime <filename>
Sample input and output
Input
Time retrieved from DB
StartTime - 2017-11-02 10:20:00
EndTime - 2017-11-02 11:20:00
The time present in log
T1 - 2017-11-02 10:17:44
T2 - 2017-11-02 11:19:32
Output: Entire Log text between T1 & T2
Sample Log
2017-03-03 10:43:18,736 [main] WARN - ORACLE_HOSTNAME=xxxxxxxxxx[OVERRIDES:
xxxxxxxxxxxxxxxx]
2017-03-03 10:43:18,736 [main] WARN - NLS_DATE_FORMAT=DD-MON-YYYY
HH24:MI:SS [OVERRIDES: DD-MON-YYYY HH24:MI:SS]
2017-03-03 10:43:18,736 [main] WARN - xxxxUsername=MDMPIUSER [OVERRIDES: MDMPIUSER]
2017-03-03 10:43:18,736 [main] WARN - BUNDLE_GEMFILE=uri:classloader://installer/Gemfile [OVERRIDES: uri:classloader://installer/Gemfile]
2017-03-03 10:43:18,736 [main] WARN - TIMEOUT=900 [OVERRIDES: 900]
2017-03-03 10:43:18,736 [main] WARN - SHLVL=4 [OVERRIDES: 4]
2017-03-03 10:43:18,736 [main] WARN - HISTSIZE=1000 [OVERRIDES: 1000]
2017-03-03 10:43:18,736 [main] WARN - JAVA_HOME=/usr/java/jdk1.8.0_60/jre [OVERRIDES: /usr/java/jdk1.8.0_60/jre]
2017-03-03 10:43:20,156 [main] WARN - APP_PROPS=/home/xxx/conf/appProperties [OVERRIDES: /home/xxx/conf/appProperties]
You can try
awk -v start="$StartTime" -v end="$EndTime" '
function fonct(date)
{
    gsub(/-|,| |:/, "", date)
    return date
}
BEGIN {
    start = fonct(start)
    end   = fonct(end)
}
{
    a = fonct($1 $2)
    if (a >= start && a <= end) print $0
}' infile
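A quick way to verify the comparison logic is to run the same script against a couple of synthetic lines (the sample file path and timestamps below are made up for illustration):

```shell
# Two sample lines: one inside the time window, one outside it.
cat > /tmp/sample.log <<'EOF'
2017-03-03 10:43:18,736 [main] WARN - inside the window
2017-03-03 12:00:00,000 [main] WARN - outside the window
EOF

StartTime="2017-03-03 10:40:00"
EndTime="2017-03-03 11:00:00"

awk -v start="$StartTime" -v end="$EndTime" '
function fonct(date)
{
    # strip "-", ",", ":" and spaces so timestamps compare as digit strings
    gsub(/-|,| |:/, "", date)
    return date
}
BEGIN {
    start = fonct(start)
    end   = fonct(end)
}
{
    a = fonct($1 $2)
    if (a >= start && a <= end) print
}' /tmp/sample.log > /tmp/filtered.log

cat /tmp/filtered.log
```

Only the line inside the window is printed; the comparison works because stripping the separators leaves plain digit strings that order the same way lexicographically as the timestamps do chronologically.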

How do I add a header to a message using the header-enricher Spring Cloud Stream App Starter?

I'm trying to add a header with a key of "order_id" and a value based on a property in the payload to my messages. I then send the result to a log sink where I can inspect the headers after the header processor. Here's the stream:
stream create --name add-header-to-message-stream
--definition
":aptly-named-destination
> add-order_id-header: header-enricher
--header.enricher.headers='order_id=payload.order.id \\n fizz=\"buzz\"'
| log
--log.expression=headers"
I do not see keys of "order_id" or "fizz" in the headers map when I tail the log sink. I'm able to deploy the stream and run data through the pipeline with no errors. How do I add headers to my messages?
This works fine for me, but only with a single header...
dataflow:>stream create foo --definition "time --fixedDelay=5 |
header-enricher --headers='foo=payload.substring(0, 1)' |
log --expression=#root " --deploy
With result
2017-06-21 08:28:38.459 INFO 70268 --- [-enricher.foo-1] log-sink : GenericMessage [payload=06/21/17 08:28:38, headers={amqp_receivedDeliveryMode=PERSISTENT, amqp_receivedRoutingKey=foo.header-enricher, amqp_receivedExchange=foo.header-enricher, amqp_deliveryTag=1, foo=0, amqp_consumerQueue=foo.header-enricher.foo, amqp_redelivered=false, id=302f1d5b-ba90
I am told that this...
--headers='foo=payload.substring(0, 1) \n bar=payload.substring(1,2)'
...or this...
--headers='foo=payload.substring(0, 1) \u000a bar=payload.substring(1,2)'
should work, but I get a parse error...
Cannot find terminating ' for string time --fixedDelay=5 | header-enricher --headers='foo=payload.substring(0, 1)
bar=payload.substring(1,2)' | log --expression=#root
...I am reaching out to the shell/deployer devs and will provide an update if I have one.
I tested with a literal value (single header) too...
dataflow:>stream create foo --definition "time --fixedDelay=5 |
header-enricher --headers='foo=\"bar\"' |
log --expression=#root " --deploy
2017-06-21 08:38:17.684 INFO 70916 --- [-enricher.foo-1] log-sink : GenericMessage [payload=06/21/17 08:38:17, headers={amqp_receivedDeliveryMode=PERSISTENT, amqp_receivedRoutingKey=foo.header-enricher, amqp_receivedExchange=foo.header-enricher, amqp_deliveryTag=8, foo=bar, amqp_consumerQueue=foo.header-enricher.foo, amqp_redelivered=false, id=a92f4908-af13-53aa-205d-e25e204d04a3, amqp_consumerTag=amq.ctag-X51lhhRWBbEDVSyzp3rGmg, contentType=text/plain, timestamp=1498048697684}]

how to turn off tomcat5 'session expire' log messages?

I've been trying to turn off these excessive log messages without success.
INFO | jvm 1 | 2010/08/19 14:36:30 | DEBUG [ContainerBackgroundProcessor[StandardEngine[Catalina]]] (ManagerBase.java:677) - Start expire sessions StandardManager at 1282242990088 sessioncount 0
INFO | jvm 1 | 2010/08/19 14:36:30 | DEBUG [ContainerBackgroundProcessor[StandardEngine[Catalina]]] (ManagerBase.java:685) - End expire sessions StandardManager processingTime 0 expired sessions: 0
I added the following line to my app's WEB-INF/logging.properties file.
org.apache.catalina.session.ManagerBase.level=WARNING
Is this right? Is there somewhere else that I need to put it?
Try the following under Tomcat/conf/logging.properties:
org.apache.catalina.session.ManagerBase.level = INFO
And as mentioned, verify that this is not being overwritten in your custom log4j.properties file.
This has worked for our team with Tomcat v6.37 and log4j1.2.13.jar.
From the source code of ManagerBase.java, that seems to be the right setting
protected Log log = LogFactory.getLog(ManagerBase.class);
....
if (log.isDebugEnabled())
    log.debug("Start expire sessions " + getName() + " at " + timeNow
            + " sessioncount " + sessions.length);
I wonder if the logging.properties in TOMCAT_HOME/conf/ is overriding the one in WEB-INF.
Can you change it there as well?
I know this is old but I put this in my logback.xml and it stopped the messages:
<logger name="org.apache.catalina.session.ManagerBase">
    <level value="INFO" />
</logger>
