I have several configuration files in /etc/rsyslog.d, e.g., 01-templates.conf, 02-error-logs.conf, 03-system-logs.conf, and I have the following line in my /etc/rsyslog.conf file:
# Include all config files in /etc/rsyslog.d/
$IncludeConfig /etc/rsyslog.d/*.conf
What order do the /etc/rsyslog.d/*.conf files get loaded in? Where is this documented? I read the Rsyslog configuration page but did not see it.
According to this and this post to the rsyslog-users mailing list, the files are processed in alphabetical order, with the exception of some 7.2.x and 7.3.x versions, where a bug caused them to be read in reverse order.
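If you want to double-check the order on a given system, the shell sorts the same glob the same way, so listing it shows the effective include order (a quick sanity check, not an rsyslog feature), and rsyslogd can validate the combined configuration without a restart:
# Shows the files in the order the glob expands (alphabetical/collation order)
ls /etc/rsyslog.d/*.conf
# Validate the combined configuration without starting the daemon
rsyslogd -N1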
I am trying to generate a report in JMeter by merging the contents of .jtl files, and I am getting the error below:
File '/home/ajij/jmeter_tests_cli/merged.jtl' does not contain the field names header, ensure the jmeter.save.saveservice.* properties are the same as when the CSV file was created or the file may be read incorrectly when generating report
An error occurred: Error while processing samples: Consumer failed with message :Could not parse timeStamp <timeStamp> using format defined by property jmeter.save.saveservice.timestamp_format=ms on sample timeStamp,elapsed,label,responseCode,responseMessage,threadName,dataType,success,failureMessage,bytes,sentBytes,grpThreads,allThreads,URL,Latency,IdleTime,Connect
Note: if I run the 2 .jtl files independently, the report generates successfully.
Command used to generate the report from the JTL file:
../Documents/apache-jmeter-5.2/bin/jmeter.sh -g merged.jtl -o ./folder
JTL file content:
https://drive.google.com/file/d/1j6kZ7mUj0IbT6hWS0KR3BsZVK4t6wQzj/view?usp=sharing
Quick help would be appreciated!
As per JMeter's documentation:
The dashboard generator is a modular extension of JMeter. Its default behavior is to read and process samples from CSV files to generate HTML files containing graph views. It can generate the report at end of a load test or on demand.
As of the current latest stable version, JMeter 5.3, generating dashboards from .jtl files in XML format is not supported, so either re-run your test with the following JMeter property defined:
jmeter.save.saveservice.output_format=csv
or review the way you are "merging" the result files; if your merging tool supports CSV output format, go for it.
Did you try removing the XML tags? I haven't tested with the CSV you shared yet, but it might be enough to remove the XML tags and merge the files "manually", dropping the second header and the XML tags (see the sketch below). Remember that JMeter processes the requests by time, so it will order them by timestamp.
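For reference, a minimal way to merge two CSV-format .jtl files by hand is to keep the header row only from the first file (the file names here are placeholders):
# Keep the header from the first file, then append the others without theirs
head -n 1 results1.jtl > merged.jtl
tail -n +2 results1.jtl >> merged.jtl
tail -n +2 results2.jtl >> merged.jtl
The "Could not parse timeStamp <timeStamp>" error above is exactly what a leftover second header row produces: the literal header line gets read as a sample.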
I have a requirement where my ELK stack has to pick up each file in a folder as a single log event.
I have parent_folder/, inside which a folder is created for each run (run_folder/), and inside that a few types of log files are created. I need to push each file as a single log entry into Elasticsearch.
Folder Structure
parent_folder/run1/file1.log
parent_folder/run1/file2.err
parent_folder/run1/file3.diff
...
parent_folder/run2/file1.log
parent_folder/run2/file2.err
parent_folder/run2/file3.diff
Elasticsearch should have:
doc1{
message: the content of parent_folder/run1/file1.log
}
doc2{
message: the content of parent_folder/run1/file2.err
}
doc3{
message: the content of parent_folder/run2/file2.err
}
... so on
These files (like parent_folder/run2/file2.err) are written once and never changed or touched again, so there is no need to monitor them for changes.
Thanks
With Filebeat, you can make use of multiline patterns. Find a pattern that never matches in your log files and configure something like the below in the Filebeat configuration:
multiline.pattern: 'never_matching_pattern'
multiline.match: after
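Putting it together, a minimal filebeat.yml input sketch might look like this (the path, pattern, and limits are assumptions you would adapt):
filebeat.inputs:
  - type: log
    paths:
      - /path/to/parent_folder/*/*
    # The pattern never matches, so every line is appended to the first
    # one and the whole file becomes a single event
    multiline.pattern: 'never_matching_pattern'
    multiline.negate: false
    multiline.match: after
    # Raise the defaults (500 lines, 5s) so larger files are not split
    multiline.max_lines: 100000
    multiline.timeout: 10s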
Reference: https://discuss.elastic.co/t/filebeat-send-the-entire-logfile-as-a-single-message/118265
I am trying to configure NXlog to work with AlienVault based on the guide here
I installed the custom config file from AlienVault and modified only the destination IP. When I did this, I could not get the NXlog service to start. I then reinstalled the default config, but I still cannot get it to start.
I edited the file in Notepad, which I think should be safe; however, I have read here that the cause is possibly a UTF-8 BOM. I am not sure how to check whether there is one, but I do not believe there is, because I only used Notepad.
The first line in the config file looks like so:
define ROOT C:\Program Files (x86)\nxlog
The NXlog Log file with the errors is only displaying this error:
nxlog failed to start: Invalid keyword: define at C:\Program Files (x86)\nxlog\conf\nxlog.conf:1
Not very helpful. It seems to be choking on the very first word. Has anyone seen this before?
I'm pretty sure that's caused by the UTF-8 BOM in your config file. I suggest using and checking with an editor that can handle this; in hex mode you can confirm whether the file has a BOM or not.
The NXLog EE v4.0 can cope with the BOM properly, by the way.
As B0ti mentioned, my problem was caused by the BOM. I couldn't figure out how to fix this on Windows, so I downloaded the file to a Linux environment and fixed it there. To do so, follow these steps:
First I verified there was a BOM in place with the file command:
file filename.txt
This prints information about the file; if there is a BOM, you will see it mentioned.
Next I followed the answer here for removing the BOM. Basically, just run this on the Linux box:
sed '1s/^\xEF\xBB\xBF//' < orig.txt > new.txt
Then I transferred the new file back to the Windows box and all was right with the world!
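If you would rather confirm the BOM at the byte level, dumping the first few bytes works too (assuming hexdump is available; a UTF-8 BOM is the three bytes ef bb bf):
head -c 3 nxlog.conf | hexdump -C
After the sed command above has run, the same check should show the file starting with the word define instead.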
I have my Camel route configured to download all files from a specific FTP directory. This is all easy enough, and everything seems to be working fine. However, I run into errors when a file name contains a space, such as File 123.csv. I know I could target specific files with an escape character, but the files are dynamic in nature and change daily, so I will not know which files may or may not have spaces.
I figure I could just read all the file URIs and make adjustments from there, but I was wondering if there is a Camel-specific way to handle this.
Errors:
java.lang.IllegalArgumentException: Illegal character in path at index 60: hdfs://test.net/user/CamelTests/File Layout.csv
GenericFileOnCompletion - Rollback file strategy: org.apache.camel.component.file.strategy.GenericFileRenameProcessStrategy#fe8d1b for file: RemoteFile[File Layout.csv]
Camel Code
from("{{ftp.serverLP}}/Memo/Loss?username=ftp&password=pass")
.to("hdfs2://Test.net/user/CamelTests/?fileSystemType=HDFS")
.log("Downloaded file ${file:name} complete.");
Try changing the .to(..) to use a non-HDFS file system.
The error posted seems to indicate a problem with the destination to which the files are being copied (HDFS), not the FTP source.
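If you do need to keep the HDFS destination, one possible workaround (a sketch, untested against your setup) is to rewrite the file name header before the hdfs2 endpoint sees it, so the path no longer contains spaces:
from("{{ftp.serverLP}}/Memo/Loss?username=ftp&password=pass")
    .process(exchange -> {
        // Rewrite the standard CamelFileName header so the HDFS URI stays legal
        String name = exchange.getIn().getHeader(Exchange.FILE_NAME, String.class);
        exchange.getIn().setHeader(Exchange.FILE_NAME, name.replace(' ', '_'));
    })
    .to("hdfs2://Test.net/user/CamelTests/?fileSystemType=HDFS")
    .log("Downloaded file ${file:name} complete.");
Note this renames the file on the HDFS side (spaces become underscores); whether that is acceptable depends on your downstream consumers.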
I would like to delete a file after all the rows in the file have been processed.
My streams look like
source (file --fixedDelay=0 --outputType=text/plain --dir=XXX) |
splitter --expression=payload.split('\\n') |
transform -> filter -> sink
My files are stored in a directory that is being watched by the file module. I would like each file to be deleted after it has been processed.
Thanks.
Indeed, it is surprising that the file source does not have an option for deleting the file after processing; this can be confirmed by looking at the configuration file in xd/modules/source/file/config/file.xml (as of version 1.1.0).
While the file source does not have this option, the sftp source does, so you could use the sftp source instead. This will require an SSH server on the machine where Spring XD is installed. Does this help?
You may also want to add your own custom source module by configuring a file transformer. File transformers support a delete-files="true" option:
http://docs.spring.io/spring-integration/reference/html/files.html
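As a sketch of what such a custom module's configuration might contain (the directory and channel names are placeholders), a file inbound adapter feeding a file-to-string transformer that deletes each file once its contents have been read:
<int-file:inbound-channel-adapter id="filesIn"
        directory="/path/to/watched/dir" channel="files">
    <int:poller fixed-delay="1000"/>
</int-file:inbound-channel-adapter>
<int:channel id="files"/>
<!-- Reads the file contents into the payload, then deletes the file -->
<int-file:file-to-string-transformer input-channel="files"
        output-channel="output" delete-files="true"/>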