I've been scouring the rsyslog documentation for a way to anonymize MySQL log data by removing quoted strings. I've successfully managed to detect messages with sensitive data using the :contains property filter, but I can't seem to find a way to replace them.
I've looked through the property options and the regex functionality. I believe I may be missing something, because none of those provide a straightforward way to find and replace.
AFAIK, there's currently no way to do a regex replace in rsyslog. The cleanest way I see to achieve what you need is to parse your logs with mmnormalize (more documentation can be found under liblognorm, the library mmnormalize uses). You can then access all of the parsed properties and put whatever you want in templates. Templates let you select which properties from the messages get written to MySQL.
The benefit of this solution is that mmnormalize should be faster than using regular expressions. The drawback is that you'll probably need a recent version of rsyslog (8.x) to get it working properly.
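A minimal sketch of the idea, assuming a made-up message format (the rulebase path, field names, and template below are illustrative, not taken from your setup):

    # /etc/rsyslog.d/mysql.rb -- liblognorm rulebase for an assumed message shape
    rule=: %user:word% ran query %query:rest%

    # rsyslog configuration
    module(load="mmnormalize")
    action(type="mmnormalize" rulebase="/etc/rsyslog.d/mysql.rb")

    # template that keeps the parsed user but drops the query text
    template(name="anon-msg" type="list") {
        property(name="$!user")
        constant(value=" ran a query (text omitted)")
    }

Your database output (ommysql or otherwise) would then use a template that references the parsed properties (e.g. $!user) rather than the raw msg, so the sensitive quoted strings never reach MySQL.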
Related
go-pg is a Golang library for PostgreSQL. In SQL one could update an entire column by applying a regular expression, e.g.:
update <some-table> set x = regexp_replace(x, '^.*\/[0-9]+(.*)$', '\1hello');
Problem
According to the README, one can perform a bulk update. However, no information regarding regular expressions was found in either the issue tracker or the documentation.
Question
Does this library support regexp_replace updates?
It does not support it as an ORM operation, but it does support plain SQL. I personally prefer not to run raw SQL like this, but there seems to be no other choice when using this library at the moment. One benefit is that the statement runs in the flow of the Go app; for example, once the file paths have been changed on disk, the database can be updated in a controlled way.
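A minimal sketch of what that could look like with go-pg's plain-SQL interface; the connection settings, table name (files), and column name (path) are assumptions for illustration:

    package main

    import (
        "fmt"

        "github.com/go-pg/pg/v10"
    )

    func main() {
        // Placeholder connection settings.
        db := pg.Connect(&pg.Options{
            Addr:     "localhost:5432",
            User:     "postgres",
            Database: "mydb",
        })
        defer db.Close()

        // No ORM helper for regexp_replace, so fall back to plain SQL.
        res, err := db.Exec(
            `UPDATE files SET path = regexp_replace(path, '^.*\/[0-9]+(.*)$', '\1hello')`,
        )
        if err != nil {
            panic(err)
        }
        fmt.Println("rows updated:", res.RowsAffected())
    }

Because it runs inside the Go program, you can do the filesystem rename first and only issue the UPDATE once that step succeeds.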
Is it possible to save a bunch of queries into a single JSON file to import in Kibana Console?
I know there's an option to save a single query[2], and the Kibana console is based on local storage, but I would like to load the queries based on parameters, such that changing the params (e.g. load_from=filename.json) would load a different set of queries.
For example, when I open http://localhost:5601/app/kibana#/dev_tools/console?load_from=filename.json, it should open the Kibana console with ES queries from the file.
EDIT: As a workaround, it's possible to do this with Postman API Client or similar API clients.
Solution:
EDIT 2 on 22/02/2022: Kibana Spaces is the answer. It lets you organize dashboards and other saved objects into meaningful categories[3]. Whenever you load http://localhost:5601/ it lets you choose the space you want to work with. Having multiple browser tabs with different saved spaces should work for most cases.
[2] https://www.elastic.co/guide/en/kibana/master/save-load-delete-query.html
[3] https://www.elastic.co/guide/en/kibana/master/xpack-spaces.html
Unfortunately, that's not possible yet.
Elastic is (supposedly) working on a new Kibana feature (tabbed console panes, #10095) that will provide better support for organizing code in the Dev Tools application. The issue has been open for a while and not much seems to be happening, so we'll see.
The release date of that feature is not known yet.
Is there a way to configure the names of the files exported from Logging?
Currently the exported file names include colons. These are invalid characters in a path element in Hadoop, so PySpark, for instance, cannot read these files. Obviously the easy solution is to rename the files, but this interferes with syncing.
Is there a way to configure the names, or change them to not include colons? Any other solutions are appreciated. Thanks!
https://github.com/apache/hadoop/blob/trunk/hadoop-common-project/hadoop-common/src/site/markdown/filesystem/introduction.md
At this time, there is no way to change the naming convention when exporting log files as this process is automated on the backend.
If you would like to request this feature in GCP, I would suggest creating a report in the Public Issue Tracker (PIT). That page allows you to report bugs and request new features to be implemented within GCP.
My Input clause produces a value A and another value B, and I want to send an email notification containing the result of A/B in its text. I had an early version that worked locally, where I used a Groovy script in a Transform clause so that a new variable ctx.payload.result held the result of A/B, but elastic.co will only let me use Expression scripts for security reasons. I also tried to resolve the expression inline in the email body's {{var}} tag, but apparently it doesn't resolve expressions.
Remember, I can't use Groovy to modify the payload. Any ideas?
Elastic Cloud allows users to enable Groovy scripts as well (they are disabled by default). From the cluster creation page in Cloud:
Elasticsearch can use scripts to implement flexible ranking, filtering, faceting and more. It is important to restrict their usage, as they enable arbitrary code execution. When enabling sandboxed scripts the Painless scripting language is enabled in 5.0. In the older versions "expression" and "mustache" languages will be enabled. When enabling all scripts, Groovy and any language provided by plugins will be available.
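Once Groovy is enabled for the cluster, a transform along these lines should work again; the payload field names (a, b), action name, and recipient below are made up for illustration:

    "transform": {
      "script": {
        "lang": "groovy",
        "inline": "return [ result: ctx.payload.a / ctx.payload.b ]"
      }
    },
    "actions": {
      "notify_ops": {
        "email": {
          "to": "ops@example.com",
          "subject": "A/B report",
          "body": "The computed ratio is {{ctx.payload.result}}"
        }
      }
    }

The transform replaces the payload with the map the script returns, so {{ctx.payload.result}} then resolves in the email body.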
I'm fairly new to Jenkins, but I am looking for a way to collect test results and feed them back into Jenkins.
What this means is: I have a bash script that collects metrics on a number of applications and checks whether or not certain files exist. I collect this data into a plain text file, basically as counters (1/5, 2/5, 5/10, etc.).
The output can be in whatever format I want, but I was wondering if there is a good, clean process that can take this data file and present it nicely inside the Jenkins web interface?
We also use Trac, so if there is a Trac plugin that can do something similar, that would be good too.
Best practice would be to escape the values and pass them as parameters to a parameterized Jenkins build, or to save/capture them in a file. Parameters are finicky and subject to URL encoding, so I would personally pass a file through shared storage such as S3. Your best bet is the Remote access API: https://wiki.jenkins-ci.org/display/JENKINS/Remote+access+API
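As a rough sketch of the parameter route via the Remote access API (the host, job name, parameter name, and credentials are placeholders):

    # Trigger a parameterized build, passing one URL-encoded counter value
    curl -X POST \
      "https://jenkins.example.com/job/metrics-report/buildWithParameters?PASSED=2%2F5" \
      --user myuser:myapitoken

For the file route, have the job archive the plain text file as a build artifact (or pull it from S3 in a build step) and render it with whatever post-build step or plugin suits you.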