Rsyslog forwarding over HTTP - rsyslog

I would like rsyslog to forward log messages via HTTP to the service which will process them.
I don't see an exact HTTP-forwarding module for rsyslog, and I don't want to create another listener on another port just to handle incoming TCP connections, as would be required with the TCP output module.
Is this possible, or what are the alternatives for processing rsyslog messages with an HTTP handler?

Since rsyslog version 8.2202, there is the omhttp module.
Here is an example of what you'd need to implement in /etc/rsyslog.conf:
# include the omhttp module
module(load="omhttp")

# template for each individual message, not the format of the resulting batch
template(name="tpl_omhttp_forwarding" type="string" string="%msg%")

# action to send ALL log messages via HTTP
*.* {
    action(
        type="omhttp"
        server="192.1.1.1"
        serverport="443"
        template="tpl_omhttp_forwarding"
        batch="on"
        batch.format="jsonarray"
        batch.maxsize="10"
        action.resumeRetryCount="-1"
    )
}
You can find all of the action parameters for omhttp in the module documentation.
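For reference, here is a rough sketch (not from the rsyslog docs) of what the receiving service could look like, assuming the batches are posted as a JSON array to the root path of a plain-HTTP listener; the port, path, and class name are made up, and since the config above targets port 443, a real setup would likely have TLS terminated in front of such a handler:
import com.sun.net.httpserver.HttpServer;
import java.io.InputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class LogBatchReceiver {
    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/", exchange -> {
            // with batch.format="jsonarray", each POST body is a JSON array of
            // rendered messages, e.g. ["msg one","msg two"]
            try (InputStream in = exchange.getRequestBody()) {
                String batch = new String(in.readAllBytes(), StandardCharsets.UTF_8);
                System.out.println("received batch: " + batch);
            }
            // answer with 200 and no body so rsyslog counts the delivery as successful
            exchange.sendResponseHeaders(200, -1);
            exchange.close();
        });
        server.start();
    }
}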
NOTE:
Depending on the OS you're using, you may need to build it yourself or use the repositories here.
For some platforms there is no omhttp package because of missing or outdated dependencies.

There is a new output module called omhttp. I'm looking into it as well, but am having difficulty finding documentation.
https://github.com/rsyslog/rsyslog/issues/3024
Edit: Updated docs are here
https://www.rsyslog.com/doc/v8-stable/configuration/modules/omhttp.html#message-batching

Related

I have a problem sending downstream messages to a gateway

I'm using a Dragino DLOS8 gateway and a Dragino LT-22222-L end node. I wrote a script to read and show the values on my end node's inputs, but I couldn't control my relays. I found an example of a script (in a Dragino article titled Communication with ABP End Node) showing this function to control them (it controlled the digital outputs, but I changed it to relays), which is:
echo "${DEV_2},imme,hex,030101" > /var/iot/push/down
I even tried a more specific one:
echo "${DEV_1},imme,hex,030101,20,1,SF12,869525000,1" > /var/iot/push/down
The article indicates that I have to create a file in the directory /var/iot/push for downstream purposes. I tried using WinSCP and the command touch down, but the file was deleted a few seconds later. If anyone has used these devices or knows about this, please help me.
I had a similar problem with Dragino, also with device logfiles in the /var/iot/channels directory. I got information from Dragino support that those files are "consumed" by the MQTT and TCP processes and so are periodically deleted: I understand that the LoRaWAN, MQTT, or TCP application has to work with those files as soon as they are generated.
Notice that "imme" sends the downstream immediately to Class C devices; maybe "time" (downstream after receiving data from the node) is better for your application.

Download one file at a time through the same session in Apache Camel FTP

I want to implement the following use case with Apache Camel FTP:
On a remote location, 0 to n files are stored.
When I receive a command, I want to download one file (which one does not matter) as a byte array using FTP, if any files are available.
When the file is downloaded, I want to save it in a database as a blob.
Then I want to delete the stored/processed file on the remote location.
Wait for the next download command and, once received, go back to step 1.
The files have to be downloaded through the same FTP session.
My problem is that if I use a normal FTP route, it downloads all available files.
When I tell the route to only download one, I have to create a new route for the other files and I cannot reuse the FTP session.
Is there a way to implement this use case with Apache Camel FTP?
Camel-FTP doesn't consume all available files at once; it consumes them individually, one after another, meaning each file gets processed separately. If you need to process them in a specific order, you can try sorting by file name or modified date with the sortBy option.
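For example, a minimal sketch sorting by the last-modified timestamp (host, credentials, directory, and the target folder are placeholders):
import org.apache.camel.builder.RouteBuilder;

public class SortedFtpRoute extends RouteBuilder {
    @Override
    public void configure() {
        // files are consumed one by one, oldest first, based on their modified date
        from("ftp://user@host:21/directoryName?password=secret&sortBy=file:modified")
            .to("file:/tmp/inbox");
    }
}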
If you want to control when a file gets downloaded, i.e. when your command gets called, you can call the FTP consumer endpoint using pollEnrich.
Example:
// 1. Loads one file from the FTP server with a timeout of 3 seconds.
// 2. Logs the body and headers.
from("direct:example")
    .pollEnrich("ftp:host:port/directoryName", 3000)
    .to("log:loggerName?showBody=true&showHeaders=true");
You can call the direct endpoint with a ProducerTemplate obtained from the CamelContext, or change it to whatever consumer endpoint fits your use case.
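A minimal sketch of that, assuming the direct:example route above is registered in a plain DefaultCamelContext (the FTP endpoint details remain placeholders):
import org.apache.camel.CamelContext;
import org.apache.camel.ProducerTemplate;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.impl.DefaultCamelContext;

public class FetchOneFileOnCommand {
    public static void main(String[] args) throws Exception {
        CamelContext context = new DefaultCamelContext();
        context.addRoutes(new RouteBuilder() {
            @Override
            public void configure() {
                // every request to direct:example polls exactly one file (3 s timeout)
                from("direct:example")
                    .pollEnrich("ftp:host:port/directoryName", 3000)
                    .to("log:loggerName?showBody=true&showHeaders=true");
            }
        });
        context.start();

        // this is the "command": each call triggers one download
        ProducerTemplate template = context.createProducerTemplate();
        byte[] file = template.requestBody("direct:example", null, byte[].class);
        System.out.println(file == null ? "no file available" : file.length + " bytes downloaded");

        context.stop();
    }
}
camel-ftp also has a delete option for removing the file from the server after it has been processed, which may cover the delete step of your use case.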
If you need to use a dynamic URI, you can use simple to provide the URI for pollEnrich and also provide the timeout afterwards.
from("direct:example")
.pollEnrich()
.simple("ftp:host:port/directoryName?fileName=${headers.targetFile}")
.timeout(3000)
.to("log:loggerName?showBody=true&showHeaders=true");

send payara logs to graylog via syslog and set correct source

I have a graylog instance that's running a UDP-Syslog-Input on Port 1514.
It's working wonderfully well for all the system logs of the linux servers.
When I try to ingest payara logs though [1], the "source" of the message is set to "localhost" in graylog, while it's normally the hostname of the sending server.
This is suboptimal, because ideally I want the application logs in graylog with the correct source as well.
I googled around and found:
https://github.com/payara/Payara/blob/payara-server-5.2021.5/nucleus/core/logging/src/main/java/com/sun/enterprise/server/logging/SyslogHandler.java#L122
It seems like the syslog "source" is hard-coded into payara (localhost).
Is there a way to accomplish sending payara-logs with the correct "source" set?
I have nothing to do with the application server itself, I just want to receive the logs with the correct source (the hostname of the sending server).
example log entry in /var/log/syslog for payara
Mar 10 10:00:20 localhost [ INFO glassfish ] Bootstrapping Monitoring Console Runtime
I suspect I want the "localhost" in the above example set to the FQDN of the host.
Any ideas?
Best regards
[1]
logging.properties:com.sun.enterprise.server.logging.SyslogHandler.useSystemLogging=true
Try enabling "store full message" in the syslog input settings.
That will add the full_message field to your log messages and will contain the header, in addition to what you see in the message field. Then you can see if the source IP is in the UDP packet. If so, collect those messages via a raw/plaintext UDP input and the source should show correctly.
You may have to parse the rest of the message via an extractor or pipeline rule, but at least you'll have the source....
Well, this might not exactly be a good solution, but I tweaked the rsyslog template for graylog.
I deploy the rsyslog-config via Puppet, so I can generate "$YOURHOSTNAME-PAYARA" dynamically using the facts.
This way, I at least have the correct source set.
$template GRAYLOGRFC5424,"<%PRI%>%PROTOCOL-VERSION% %TIMESTAMP:::date-rfc3339% YOURHOSTNAME-PAYARA %APP-NAME% %PROCID% %MSGID% %STRUCTURED-DATA% %msg%\n"
if $msg contains 'glassfish' then {
    # forward via UDP using the custom template, then discard the message
    *.* @loghost.domain:1514;GRAYLOGRFC5424
    & ~
} else {
    *.* @loghost.domain:1514;RSYSLOG_SyslogProtocol23Format
}
The other thing we did is actually activate application logging through log4j and its syslog appender:
<Syslog name="syslog_app" appName="DEMO" host="loghost" port="1514" protocol="UDP" format="RFC5424" facility="LOCAL0" enterpriseId="">
    <LoggerFields>
        <KeyValuePair key="thread" value="%t"/>
        <KeyValuePair key="priority" value="%p"/>
        <KeyValuePair key="category" value="%c"/>
        <KeyValuePair key="exception" value="%ex"/>
    </LoggerFields>
</Syslog>
This way, we can ingest the glassfish server logs and the independent application logs into graylog.
The "LoggerFields" in log4j.xml appear to be key-value pairs for the "StructuredDataElements" according to RFC5424.
https://logging.apache.org/log4j/2.x/manual/appenders.html
https://datatracker.ietf.org/doc/html/rfc5424
That's the problem with UDP Syslog. The sender gets to set the source in the header. There is no "best answer" to this question. When the information isn't present, it's hard for Graylog to pass it along.
It sounds like you may have found an answer that works for you. Go with it. Using log4j solves two problems and lets you define the source yourself.
For those who face a similar issue, a simpler way to solve the source problem might be to use a static field. If you send the payara syslog messages to their own input, you can create a static field that could substitute for the source to identify traffic from that source. Call it "app_name" or "app_source" or something and use that field for whatever sorting you need to do.
Alternatively, if you have just one source for application messages, you could use a pipeline to set the value of the source field to the IP or FQDN of the payara server. Then it displays like all the rest.

Failover when reading from FTP sites using Camel

We need to download multiple files on an hourly basis from a vendor's FTP site. The vendor provides two FTP sites for fault tolerance, with both sites having identical files. I would like to set up a Camel route to download the files from FTP site A if it is available and, if not, try FTP site B. The following code is incorrect, but it may highlight what I am trying to achieve.
from("timer://timer1?fixedRate=true&period=60m")
.loadBalance()
.failover(-1, false, true)
.to("direct:ftp-symbolguides-1")
.to("direct:ftp-symbolguides-2")
.end();
from("direct:ftp-symbolguides-1")
.to("ftp://SiteA?localWorkDirectory=c:/temp&passiveMode=true&noop=true&idempotentKey=${file:name}-${file:size}&idempotentRepository=#idempotentRepository")
.to("file:/c:/temp/inbox");
from("direct:ftp-symbolguides-2")
.to("ftp://SiteB?password=publicftp&localWorkDirectory=c:/temp&passiveMode=true&noop=true&idempotentKey=${file:name}-${file:size}&idempotentRepository=#idempotentRepository")
.to("file:/c:/temp/inbox");
Does anyone have any thoughts on how I can achieve this?
When the connection fails, the FTP consumer by default produces a WARN message instead of throwing an exception. With the throwExceptionOnConnectionFailed property you can change it to throw an exception, and then handle that exception in the rollback() method of a PollingConsumerPollStrategy [1].
http://camel.apache.org/maven/current/camel-core/apidocs/org/apache/camel/spi/PollingConsumerPollStrategy.html
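As a rough sketch (the class name is made up, and it assumes the property above is enabled so a failed connect actually reaches rollback()), such a strategy could look like this; you would register it in the registry and point the FTP endpoint at it via the pollStrategy option (e.g. &pollStrategy=#ftpFailoverPollStrategy):
import org.apache.camel.Consumer;
import org.apache.camel.Endpoint;
import org.apache.camel.spi.PollingConsumerPollStrategy;

public class FtpFailoverPollStrategy implements PollingConsumerPollStrategy {

    @Override
    public boolean begin(Consumer consumer, Endpoint endpoint) {
        // returning true lets the poll proceed as usual
        return true;
    }

    @Override
    public void commit(Consumer consumer, Endpoint endpoint, int polledMessages) {
        // nothing special to do after a successful poll
    }

    @Override
    public boolean rollback(Consumer consumer, Endpoint endpoint,
                            int retryCounter, Exception cause) throws Exception {
        // called when the poll failed, e.g. the FTP connection could not be made;
        // here you could raise an alert or switch over to the mirror site,
        // for instance by starting a suspended route for site B
        System.err.println("Poll of " + endpoint.getEndpointUri()
                + " failed (attempt " + retryCounter + "): " + cause.getMessage());
        // returning false means the failed poll is not retried immediately
        return false;
    }
}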

How to receive a specific file through a BizTalk FTP receive port

My orchestration receives a message that contains a file name, and I want to pick that file up from my FTP server. I can configure an FTP receive port to receive all files from some folder on the FTP server, but how do I receive a file with a specific name?
I would rather recommend your solution of writing a custom .NET component that fetches the file from the FTP location (you can call that component from your expression shape).
Dynamically creating Receive Ports/Receive locations and later removing them won't scale and possibly will get you into serious trouble.
I'm not sure if this link helps - specifically the CreateFtpReceiveLocation method - i.e. programmatically adding a receive location (a pseudo-dynamic receive location).
I'm guessing that you can also set the FileMask on the Transport Properties of the 'dynamic' Location to the filename in the Custom Props of the TransportTypeData, and you would need to remove the Location once you are done with the file.
