Rsyslog Regex and DynaFile template - rsyslog

I'm using the template below to extract only the required fields from the firewall log, to save bandwidth and license cost in Splunk. It works when I save to a static file, but I need the file names to include a timestamp so that I can rotate the old logs. I'm trying to use a DynaFile, but I don't know how to use both templates for a single log.
Working template with static file:
template(name="clean" type="string"
string="%TIMESTAMP% %HOSTNAME% %msg:R,ERE,0,DFLT:type=\"([^\"]*)\"--end% %msg:R,ERE,0,DFLT:subtype=\"([^\"]*)\"--end% %msg:R,ERE,0,DFLT:level=\"([^\"]*)\"--end% %msg:R,ERE,0,DFLT:eventtime=[1-9]+--end% %msg:R,ERE,0,DFLT:srcip=(.*) srcport=[0-9]+--end% %msg:R,ERE,0,DFLT:srcintf=\"([^\"]*)\"--end%\n"
)
if $hostname == '192.168.0.1' then action(type="omfile" file="/var/log/firewall.log" template="clean")
How can I save the outcome of this template using a DynaFile? Thanks for your time.

The omfile module accepts the parameter "dynaFile=" instead of "file=" to specify a template for a dynamic filename.
If you just use %timestamp% in your filename, it will probably create a new file for each message, as the timestamp includes hours, minutes and seconds. One possibility is to convert the timestamp into the standard RFC 3339 format, and then take just the year-month-day part of that string, using a property replacer.
template(name="mydynafile" type="string" string="/var/log/my-%timestamp:1:10:date-rfc3339%.log")
if $hostname == '192.168.0.1' then action(type="omfile" dynaFile="mydynafile" template="clean")
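A minimal Python sketch of what the :1:10: range in that property replacer extracts (the timestamp string here is an illustrative example, not output from rsyslog):

```python
# Sketch of what %timestamp:1:10:date-rfc3339% yields: characters 1-10
# of the RFC 3339 form, i.e. the YYYY-MM-DD part.
ts = "2018-12-11T10:00:52+01:00"  # illustrative RFC 3339 timestamp
date_part = ts[0:10]              # rsyslog offsets are 1-based; Python's are 0-based
print(date_part)
```

So each day's messages land in one file such as /var/log/my-2018-12-11.log, which is what makes rotation by filename possible.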

Related

Give a name to a pdf printed on screen in Genexus

With a Genexus procedure, setting the call protocol to HTTP and using the output_file rule, you can create a report and show the user a PDF with basic Genexus tooling. My problem is that I can't set the name of this PDF: it ignores the parameter of the output_file rule, and if I try to save the PDF manually, it is named after the procedure.
Can I set the name of the PDF somehow? Better if I can pass it as a parameter.
Add this code to the procedure.
// &DocumentFriendlyName is varchar(100)
&HttpResponse.AddHeader(!"Content-Type", !"application/pdf")
&HttpResponse.AddHeader(!"Content-Disposition", !"attachment;filename=" + &DocumentFriendlyName + !".pdf")
If you don't want to download the PDF directly, create the PDF on the server, then use a &Window object to show it.
&Window.Url = &DownloadPdfUrl
&Window.Open()
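For reference, the Content-Disposition header is what carries the download filename in any HTTP stack; a minimal Python sketch of the same two headers the Genexus code sets (the function name is illustrative, not a Genexus or stdlib API):

```python
# Sketch: the Content-Disposition header names the downloaded file;
# &DocumentFriendlyName in the Genexus code above plays this role.
def pdf_headers(friendly_name: str) -> dict:
    return {
        "Content-Type": "application/pdf",
        "Content-Disposition": "attachment;filename=" + friendly_name + ".pdf",
    }

print(pdf_headers("MonthlyReport"))
```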
If you add the rule:
Output_file(&FileName, 'PDF');
doesn't it generate the file with the value of the variable &FileName?

Upload multiple Test cases at one time in ALM?

I'm trying to upload multiple Test cases in one go. How can I upload multiple Test cases at one time in ALM?
All flow files that you upload should be updated with the name attribute.
Make sure the src folder has a properties file named "multipleFlows.properties", or create it if it doesn't exist.
Update the multipleFlows.properties file with all the flow ids and flow XML paths that you would like to upload through ALMSync, as shown below.
For example, the multipleFlows.properties file should have the following format:
flow1_id=flow1_xml_path
flow2_id=flow2_xml_path
flow3_id=flow3_xml_path
flow4_id=flow4_xml_path
Open the Run Configuration ALMSync >> Arguments tab and update the arguments as
createTestCase flow_map multipleFlows
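As a sanity check, the flow-id-to-path mapping in that properties file can be sketched in Python (the file contents and paths here are made up for illustration; ALMSync itself does the real parsing):

```python
# Sketch: parse "flowN_id=flowN_xml_path" lines, the format the
# multipleFlows.properties file described above uses.
def parse_flow_map(text: str) -> dict:
    """Return a flow-id -> xml-path dict, skipping blanks and comments."""
    flows = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        flow_id, _, xml_path = line.partition("=")
        flows[flow_id.strip()] = xml_path.strip()
    return flows

sample = """\
flow1_id=flows/login.xml
flow2_id=flows/checkout.xml
"""
print(parse_flow_map(sample))
```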

Rsyslog omprog pass message to scripts

Specifically, I want to filter logs and send warning emails.
First I tried ommail, but unfortunately that module only supports mail servers that do not need authentication, and my mail server does.
So I tried omprog: I wrote a Python script that logs on to my mail server; it receives one parameter, which is the log line, and sends it as the mail body.
Then I hit the problem: I cannot pass the log to my script. If I try it like this, $msg is treated as a literal string.
if $fromhost-ip == "x.x.x.x" then {
action(type="omprog"
binary="/usr/bin/python3 /home/elancao/Python/sendmail.py $msg")
}
I looked at the example in the official documentation:
module(load="omprog")
action(type="omprog"
binary="/path/to/log.sh p1 p2 --param3=\"value 3\""
template="RSYSLOG_TraditionalFileFormat")
but in the sample, what they are passing is a literal string "p1", not a dynamic parameter.
Can you please help? Thanks a lot!
The expected use of omprog is for your program to read stdin and there it will find the full default RSYSLOG_FileFormat template data (with date, host, tag, msg). This is useful, as it means you can write your program so that it is started only once, and then it can loop and handle all messages as they arrive.
This cuts down on the overhead of restarting your program for each message, and makes it react faster. However, if you prefer, your program can exit after reading one line, and then rsyslog will restart it for the next message. (You may want to implement confirmMessages=on).
If you just want the msg part as data, you can use template=... in the action to specify your own minimal template.
If you really must have the msg as an argument, you can use the legacy filter syntax:
^program;template
This will run program once for each message, passing it as argument the output of the template. This is not recommended.
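A minimal Python sketch of an omprog helper along those lines: it loops over stdin, one formatted message per line, and mails each one. The SMTP host, port, credentials, and addresses are placeholders you would replace; this is a sketch of the stdin-loop pattern, not a drop-in script:

```python
#!/usr/bin/env python3
# Sketch of an omprog helper as described above: rsyslog starts the
# program once and writes one formatted message per line to stdin,
# so we loop instead of being restarted per message.
# SMTP host, port, credentials, and addresses are placeholders.
import smtplib
import sys
from email.message import EmailMessage

SMTP_HOST = "mail.example.com"
SMTP_PORT = 587
SMTP_USER = "alerts@example.com"
SMTP_PASS = "change-me"
MAIL_TO = "admin@example.com"

def build_mail(log_line: str) -> EmailMessage:
    # Wrap one log line in a message; the line becomes the mail body.
    msg = EmailMessage()
    msg["From"] = SMTP_USER
    msg["To"] = MAIL_TO
    msg["Subject"] = "rsyslog alert"
    msg.set_content(log_line)
    return msg

def main() -> None:
    for line in sys.stdin:  # one rsyslog message per line
        with smtplib.SMTP(SMTP_HOST, SMTP_PORT) as smtp:
            smtp.starttls()
            smtp.login(SMTP_USER, SMTP_PASS)
            smtp.send_message(build_mail(line.rstrip("\n")))

# rsyslog would run this via binary="/usr/bin/python3 /path/sendmail.py"
# and the deployed script would then call main(); left uncalled here.
```

With a minimal template (just %msg%) set on the action, each line the script reads is exactly the message text.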
If your omprog script is not running or not saving to a file, the problem is this:
rsyslog is sending the full message to that script on stdin, so you need to define or use a template,
and your script needs to read from stdin rather than from its arguments.
Example in Perl with omprog:
# my $input = join( '-', @ARGV );   # not working; I lost 5 hours of my life
my $input = <STDIN>;                # this is what you need
Hope this is what the Perl/Python/rsyslog community needs.

SSIS dynamic flat file connection to load daily file with date-time,minute,second timestamp

I have to load a daily CSV file from a network location; its name carries a date-time stamp, down to the minute and second, from when it was exported from the API and saved to the network location.
I am trying to make my package dynamic so it does not have to change when the file name changes every other day. I have tried using an expression in the flat file connection manager properties, but that is not working either.
My file name looks like following:
DS_All_users_with_additional_fields_2018_12_11_10_00.csv
which I have tried to solve using the following expression, but things get complicated if there is a delay in the CSV export and the minute and second change in the file name:
#[User::DataLoadDir]+"DS_All_users_with_additional_fields_"+(DT_STR,4,1252)YEAR( DATEADD( "dd", -1, getdate() ))+"_"+(DT_STR,4,1252)MONTH( DATEADD( "dd", -1, getdate() ))+"_"+(DT_STR,4,1252)DAY( DATEADD( "dd", 0, getdate() ))+"_10_00.csv"
Any suggestions how to solve this problem?
You can use a foreach loop file enumerator and apply a filespec expression of:
DS_All_users_with_additional_fields*.csv
The * serves as a wildcard and will pick up all files matching that string. You can work with this to make it flexible based on your needs. In this case, the job will scan for all files in a specific folder that match the above string. The matched name can then be assigned to a variable, which you can use to dynamically set the connection string.
I don't think you can add the * into the connection string itself.
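The filespec behaves like an ordinary shell wildcard; a quick Python sketch (using file names from the question, plus a deliberate non-match) of which names it would pick up:

```python
# Sketch: how the foreach enumerator's filespec wildcard behaves,
# checked with Python's shell-style matcher.
from fnmatch import fnmatch

filespec = "DS_All_users_with_additional_fields*.csv"
names = [
    "DS_All_users_with_additional_fields_2018_12_11_10_00.csv",
    "DS_All_users_with_additional_fields_2018_12_12_10_03.csv",
    "Other_report_2018_12_11.csv",
]
matches = [n for n in names if fnmatch(n, filespec)]
print(matches)  # the Other_report file is filtered out
```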
Update
To set a connection manager's connection string property, see the photo below. It is important to note that this solution changes the workflow. Your initial workflow told the connection manager which specific file to look for. By implementing a foreach loop, the job now searches for any and all files available in a specific folder path. Note: you will need to make sure that you include the full UNC path in the connection string variable (e.g., \\networkpath\filename.csv).
Are the files that you need to import the only files in that directory with a name that starts with DS_All_users_with_additional_fields_? If so, use a Script Task to find the most recent one and set the variable used in the connection string to this name. The following example uses LINQ to look for files that begin with the name you listed, then sorts them by the date they were created on, and returns the name of the first one. The Name property below will include the extension. You can also get the complete file path by changing this to the FullName property, in which case you could just use this value for the variable used by the flat file connection string, as opposed to concatenating it with the #[User::DataLoadDir] variable. This example does reference System.IO and System.Linq as specified below.
using System.IO;
using System.Linq;

string filePath = Dts.Variables["User::DataLoadDir"].Value.ToString();
DirectoryInfo di = new DirectoryInfo(filePath);

// Find the most recently created file whose name starts with the prefix
FileInfo mostRecentFile = (from f in di.GetFiles()
                               .Where(x => x.Name.StartsWith("DS_All_users_with_additional_fields_"))
                           orderby f.CreationTime descending
                           select f).First();

// The Name property below can be changed to FullName to get the complete file path
Dts.Variables["User::VariableHoldingFileName"].Value = mostRecentFile.Name;

BIRT: Specifying XML Datasource file as parameter does not work

Using BIRT designer 3.7.1, it's easy enough to define a report for an XML file data source; however, the input file name is written into the .rptdesign file as constant value, initially. Nice for the start, but useless in real life. What I want is start the BIRT ReportEngine via the genReport.bat script, specifying the name of the XML data source file as parameter. That should be trivial, but it is surprisingly difficult...
What I found out is this: Instead of defining the XML data source file as a constant in the report definition you can use params["datasource"].value, which will be replaced by the parameter value at runtime. Also, in BIRT Designer you can define the Report Parameter (datasource) and give it a default value, say "file://d:/sample.xml".
Yet, it doesn't work. This is the result of my Preview attempt in Designer:
Cannot open the connection for the driver: org.eclipse.datatools.enablement.oda.xml.
org.eclipse.datatools.connectivity.oda.OdaException: The xml source file cannot be found or the URL is malformed.
ReportEngine, started with 'genReport.bat -p "datasource=file://d:/sample.xml" xx.rptdesign' says nearly the same.
Of course, I have made sure that the XML file exists, and tried different spellings of the file URL. So, what's wrong?
What I found out is this: Instead of defining the XML data source file as a constant in the report definition you can use params["datasource"].value, which will be replaced by the parameter value at runtime.
No, it won't - at least, if you specify the value of &XML Data Source File as params["datasource"].value (instead of a valid XML file path) at design time then you will get an error when attempting to run the report. This is because it is trying to use the literal string params["datasource"].value for the file path, rather than the value of params["datasource"].value.
Instead, you need to use an event handler script - specifically, a beforeOpen script.
To do this:
Left-click on your data source in the Data Explorer.
In the main Report Design pane, click on the Script tab (instead of the Layout tab). A blank beforeOpen script should be visible.
Copy and paste the following code into the script:
this.setExtensionProperty("FILELIST", params["datasource"].value);
If you now run the report, you should find that the value of the parameter datasource is used for the XML file location.
You can find out more about parameter-driven XML data sources on BIRT Exchange.
Since this is an old thread but still useful, I'll add some info:
In the Edit Data Source dialog, add a URL so you have sample data to create your dataset.
Create your dataset.
Then remove the URL.
Add the beforeOpen script described above.
