Print/extract specific sections of changelog.md - shell

I need to extract/print specific sections of the following markdown changelog file using a shell script:
# Changelog
All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [Unreleased]
## [4.0.75 - 507] - 2019-06-24
### Fixed
- Changelog text
## [4.0.75 - 506] - 2019-06-21
### Fixed
- Changelog text
## [4.0.75 - 505] - 2019-06-17
### Fixed
- Changelog text
- Changelog text
- Changelog text
## [4.0.75 - 504] - 2019-06-11
### Added
- Changelog text
## [4.0.74 - 503] - 2019-05-29
### Added
- Changelog text
- Changelog text
## [4.0.73 - 502] - 2019-05-22
Examples of what I would like to achieve:
Input section 4.0.75 - 507
Desired output:
## [4.0.75 - 507] - 2019-06-24
### Fixed
- Changelog text
Input section 4.0.75
Desired output:
## [4.0.75 - 507] - 2019-06-24
### Fixed
- Changelog text
## [4.0.75 - 506] - 2019-06-21
### Fixed
- Changelog text
## [4.0.75 - 505] - 2019-06-17
### Fixed
- Changelog text
- Changelog text
- Changelog text
## [4.0.75 - 504] - 2019-06-11
### Added
- Changelog text
What would be the best solution using a shell script? I tried awk without success.
Thank you for your help.

Assuming your blank lines are indeed blank (e.g., they contain no whitespace), you could do:
awk 's{ print ""; print}; $0~v {print; s=1; next}; s=0' RS= v=4.0.74 input
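Since the file as pasted has no blank lines at all, here is an alternative sketch that does not depend on paragraph mode: it re-evaluates a print flag at every ## [ heading, so v can be either an exact build ("4.0.75 - 507") or a version prefix ("4.0.75"). The file name CHANGELOG.md is my assumption:
awk -v v="4.0.75" '
  /^## \[/ { p = (index($0, "## [" v) == 1) }  # flag sections whose header starts with v
  p                                            # print lines while the flag is set
' CHANGELOG.md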


GitLab CI: is it possible to keep common lines in one file and reference them in other files?

.gitlab-ci.yml
include:
  - local: file-1.gitlab-ci.yml
  - local: file-2.gitlab-ci.yml
file-1.gitlab-ci.yml
some-script: &some-script
  .......
job1:
  script:
    - *some-script
file-2.gitlab-ci.yml
some-script: &some-script
  .......
job2:
  script:
    - *some-script
Here some-script (shown in the example) is the same in both file-1 and file-2, but I have to maintain it in two places. Is it possible to keep some-script in a common file, say file-3, and use it from there in file-1 and file-2, so that any changes only need to be made in file-3?
Using the !reference tag could solve your problem.
You could move the some-script part to its own file-3 file.
.some-script:
  script:
    - ...
And then reference the script block from file-3 in file-1/file-2.
include:
  - local: file-3.gitlab-ci.yml
job1:
  script:
    - !reference [.some-script, script]
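file-2 would use the same include and reference, so both jobs consume the single definition in file-3; a sketch using the names above (note that !reference tags require GitLab 13.9 or later):
include:
  - local: file-3.gitlab-ci.yml
job2:
  script:
    - !reference [.some-script, script]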

YAML Modify Alias Sequence Elements

I'm working on a configuration file format for a program and I was wondering if it is possible to modify specific elements of a sequence defined in an alias.
For example,
# Example multi-model device configuration.
---
aliases:
  - &cisco_default
    vendor: cisco
    cmds:
      - terminal length 0                    # keep
      - show version | include Model number  # keep
      - show boot | include BOOT path-list   # change this one below
      - "dir flash: | include bin$"          # and this one
      - quit                                 # keep
config:
  - *cisco_default
  - <<: *cisco_default
    models:
      - c4500
      - c3650
    cmds:
      - show boot | include BOOT variable
      - "dir bootflash: | include bin$"
I am using Go to process and unmarshal the YAML into a struct. So, if this behavior is not possible with plain YAML, is there an easy way to modify the cmds sequence using Go's text templates or something similar? Also, I need to preserve the order of the commands.
Got a solution by aliasing the cmds. Since YAML's merge key (<<) can override individual keys of a mapping but not elements of a sequence, the commands are stored as a mapping with integer keys. Here is a working configuration that allows looping over the commands in order:
---
aliases:
  - &cisco_default
    vendor: cisco
    cmds: &cisco_cmds
      0: terminal length 0
      1: show version | include Model number
      2: show boot | include BOOT path-list
      3: "dir flash: | include bin$"
      4: quit
config:
  # Default Cisco configuration.
  - *cisco_default
  # Cisco 4500 and 3650 model configuration.
  - <<: *cisco_default
    models:
      - c4500
      - c3650
    cmds:
      <<: *cisco_cmds
      2: show boot | include BOOT variable
      3: "dir bootflash: | include bin$"

FileBeat Service is not starting due to yml configuration

This is my filebeat.yml file …
I am getting error 1053 whenever I start the filebeat service.
Maybe I am making some mistake in this file; please correct me where I am wrong.
###################### Filebeat Configuration Example #########################
# This file is an example configuration file highlighting only the most common
# options. The filebeat.full.yml file from the same directory contains all the
# supported options with more comments. You can use it as a reference.
#
# You can find the full configuration reference here:
# https://www.elastic.co/guide/en/beats/filebeat/index.html
#=========================== Filebeat prospectors =============================
filebeat.prospectors:
# Each - is a prospector. Most options can be set at the prospector level, so
# you can use different prospectors for various configurations.
# Below are the prospector specific configurations.
# Paths that should be crawled and fetched. Glob based paths.
paths:
- E:\ELK-STACK\logstash-tutorial-dataset.log
input_type: log
document_type: apachelogs
# document_type: apachelogs
#paths:
# - E:\ELK-STACK\mylogs.log
#fields: {log_type: mypersonal-logs}
#- C:\Logs\GatewayService\GatewayService-Processor.Transactions-20170810
# - C:\ECLIPSE WORKSPACE\jcgA1\jcgA1\logs-logstash.*
# Exclude lines. A list of regular expressions to match. It drops the lines that are
# matching any regular expression from the list.
#exclude_lines: ["^DBG"]
# Include lines. A list of regular expressions to match. It exports the lines that are
# matching any regular expression from the list.
#include_lines: ["^ERR", "^WARN"]
# Exclude files. A list of regular expressions to match. Filebeat drops the files that
# are matching any regular expression from the list. By default, no files are dropped.
#exclude_files: [".gz$"]
# Optional additional fields. These field can be freely picked
# to add additional information to the crawled log files for filtering
#fields:
# level: debug
# review: 1
### Multiline options
# Mutiline can be used for log messages spanning multiple lines. This is common
# for Java Stack Traces or C-Line Continuation
# The regexp Pattern that has to be matched. The example pattern matches all lines starting with [
#multiline.pattern: ^\[
# Defines if the pattern set under pattern should be negated or not. Default is false.
#multiline.negate: false
# Match can be set to "after" or "before". It is used to define if lines should be append to a pattern
# that was (not) matched before or after or as long as a pattern is not matched based on negate.
# Note: After is the equivalent to previous and before is the equivalent to to next in Logstash
#multiline.match: after
#================================ General =====================================
# The name of the shipper that publishes the network data. It can be used to group
# all the transactions sent by a single shipper in the web interface.
#name:
# The tags of the shipper are included in their own field with each
# transaction published.
#tags: ["service-X", "web-tier"]
# Optional fields that you can specify to add additional information to the
# output.
#fields:
# env: staging
#================================ Outputs =====================================
# Configure what outputs to use when sending the data collected by the beat.
# Multiple outputs may be used.
#-------------------------- Elasticsearch output ------------------------------
#output.elasticsearch:
# Array of hosts to connect to.
# hosts: ["localhost:9200"]
# Optional protocol and basic auth credentials.
#protocol: "https"
#username: "elastic"
#password: "changeme"
#----------------------------- Logstash output --------------------------------
output.logstash:
# The Logstash hosts
hosts: ["localhost:5043"]
# Optional SSL. By default is off.
# List of root certificates for HTTPS server verifications
#ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]
# Certificate for SSL client authentication
#ssl.certificate: "/etc/pki/client/cert.pem"
# Client Certificate Key
#ssl.key: "/etc/pki/client/cert.key"
#================================ Logging =====================================
# Sets log level. The default log level is info.
# Available log levels are: critical, error, warning, info, debug
#logging.level: debug
# At debug level, you can selectively enable logging only for some components.
# To enable all selectors use ["*"]. Examples of other selectors are "beat",
# "publish", "service".
#logging.selectors: ["*"]
Actually, what I am trying to do is use multiple logs by specifying document_type. If I remove document_type then it works, but why is document_type (which I see is deprecated in filebeat 5.5) or fields not working?
Please help.
You have a syntax error in your config file.
The filebeat.prospectors key wants an array value, but you are passing it a hash instead.
Plus, you have indentation problems.
This is a corrected version of your config file (without comments, for brevity):
filebeat.prospectors:
  -
    paths:
      - E:\ELK-STACK\logstash-tutorial-dataset.log
    input_type: log
    document_type: apachelogs

output.logstash:
  hosts: ["localhost:5043"]
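One extra check, my addition rather than part of the original answer: filebeat 5.x can validate the file from a console before the service is started, so YAML errors show up as readable messages instead of the generic Windows error 1053:
# Run from the Filebeat install directory (an assumed path layout);
# exits non-zero with a parse error if filebeat.yml is invalid.
filebeat.exe -c filebeat.yml -e -configtest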

Bash - Insert a string with newlines after text that looks like a block of text

I have a string ($pastestring) with newlines that I declared like this:
pastestring=$'\n##### Branch FREEZE enable/disable:\n Freeze: enable '
I have a document (with path $file). The document is rather big, but my task is: when I find text similar to this:
######## Branch Owner #########
Owner: James Jones
Mail: James#gmail.com
###############################
(so the number of # symbols can vary, and the Owner and Mail values can vary too)
I need to add a new line and paste the information I have into the document, so it should look like:
######## Branch Owner #########
Owner: James Jones
Mail: James#gmail.com
###############################
##### Branch FREEZE enable/disable
Freeze: enable
First of all, I don't really know how to make a pattern that will find text like the block above (I don't know how to write a pattern with newlines inside).
If I try to paste my text like this (in this case I only match the ############################### line):
sed -i "/###############################/a "${pastestring}"" $file
I get a lot of errors because of the newlines and special symbols inside the string:
sed: -e expression #1, char 35: expected \ after `a', `c' or `i'
Any help fixing this is appreciated!
UPDATE
@Inian
The content of $file looks like this:
######## Branch Owner #########
Owner: Jones Jane
Mail: Jones#gmail.com
###############################
##### Branch RELEASE enable/disable
Release: disable
##### Branch configuration enable/disable
Release: enable
##### Branch output enable/disable
Release: enable
I need to put my string in the file after:
######## Branch Owner #########
Owner: Jones Jane
Mail: Jones#gmail.com
###############################
It can be identified by the way it looks:
##### (some symbols #)
Owner: (some text)
Mail: (some text)
##### (some symbols #)
And I don't really know how to make a pattern for it.
The result should be:
######## Branch Owner #########
Owner: Jones Jane
Mail: Jones#gmail.com
###############################
##### Branch FREEZE enable/disable
Freeze: enable
##### Branch RELEASE enable/disable
Release: disable
##### Branch configuration enable/disable
Release: enable
##### Branch output enable/disable
Release: enable
UPDATE
@Inian, could you take a look again? I have just found another problem: your script pastes the text twice. The content of the file is:
######## Branch Owner #########
Owner: Jones Jane
Mail: Jones#gmail.com
###############################
##### Branch RELEASE enable/disable
Release: disable
##### Branch configuration enable/disable
Release: enable
##### Branch output enable/disable
Release: enable
##### Branch Contributors #####
user1
user2
user3
user4
yooseftal
markout
(and other users)
###############################
#### Code Review enable/disable
CR: enable
So the script pastes the text also after the Branch Contributors section.
If you don't mind using Awk for this, you can do it as below:
pastestring=$'\n##### Branch FREEZE enable/disable:\nFreeze: enable '
awk -v string="${pastestring}" '/Branch Owner/{print; flag=1; next} $0 ~ "^[#]*$" && flag && NF{print; print string; flag=0; next}1' "$file"
which produces an output as below:
######## Branch Owner #########
Owner: James Jones
Mail: James#gmail.com
###############################
##### Branch FREEZE enable/disable:
Freeze: enable
The key to the solution is the regex that identifies lines consisting only of # characters:
$0 ~ "^[#]*$"
which means: if the line contains only # characters, print that line and then print the required string. The extra condition && NF ensures the regex does not also match blank lines ("^[#]*$" would otherwise match an empty line), and /Branch Owner/{print; flag=1; next} ensures no other similar tags trigger the insertion.
The -v string="${pastestring}" part imports the variable from the bash context into the Awk context, since the two are separate.
In case you want to overwrite the file with the new contents, create a temporary file from the output of awk and redirect it back to the original file (the equivalent of your sed in-place editing):
awk -v string="${pastestring}" '/Branch Owner/{print; flag=1; next}$0 ~ "^[#]*$" && flag && NF{print; print string; flag=0; next}1' "$file" > temp && mv temp "$file"
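As an aside (my addition, not part of the original answer): with GNU awk 4.1 or later, the temporary file can be avoided using the inplace extension:
# Same program as above, but GNU awk rewrites the file directly.
gawk -i inplace -v string="${pastestring}" '/Branch Owner/{print; flag=1; next} $0 ~ "^[#]*$" && flag && NF{print; print string; flag=0; next}1' "$file"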
A Perl solution that performs an in-place replacement and checks for the Mail pattern:
perl -0777 -i -pe 's/(Mail:[^\n]*\n#+\n)/$1##### Branch FREEZE enable\/disable\nFreeze: enable\n/g' your_file
$ cat your_file
######## Branch Owner #########
Owner: Jones Jane
Mail: Jones#gmail.com
###############################
##### Branch FREEZE enable/disable
Freeze: enable
##### Branch RELEASE enable/disable
Release: disable
##### Branch configuration enable/disable
Release: enable
##### Branch output enable/disable
Release: enable

SaltStack: Reverse engineering where a file comes from

If you look at a host which was set up by SaltStack, it is sometimes like looking at a binary file with vi.
You have no clue how the config/file was created.
This makes troubleshooting errors hard. Reverse engineering where a file comes from takes too much time.
My goal: make it easy to get from the unix config file on the minion (created by salt) to the source this configuration came from. Like $Id$ in svn and cvs.
One idea a friend and I had:
The state file.managed should (optionally) add the source of the file.
Example:
My sls file contains this:
file_foo_bar:
  file.managed:
    - source:
      - salt://foo/bar
Then the created file should contain this comment:
# Source: salt://foo/bar
Of course this is not simple, since there are different ways to put comments into configuration files.
Is this feasible? Or is there a better solution for my goal?
Update
Usually I know what I did wrong and can find the root cause easily. The problem arises when several people work on a state tree.
This is a starting point: you can get the date and time of the modified file, when it's managed by Salt, by using Salt Pillar.
Let's call our variable salt_managed. Create a pillar file like the following:
{% set managed_text = 'Salt managed: File modified on ' + salt.cmd.run('date "+%Y-%m-%d %H:%M:%S"') %}
salt_managed: {{ managed_text | yaml_dquote }}
Then on the minion when you call the pillar you will get the following result:
$ salt-call pillar.get salt_managed
local:
    Salt managed: File modified on 2016-10-18 11:12:40
And you can use this by adding it at the top of your config files, for example like this:
{{ pillar.get('salt_managed') }}
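Note that the Jinja expression is only rendered if the managed file is served as a template; a minimal sketch, with hypothetical state and path names:
/etc/foo.conf:
  file.managed:
    - source: salt://foo/foo.conf.jinja
    - template: jinja
where the first line of foo.conf.jinja would be # {{ pillar.get('salt_managed') }}.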
Update:
I found a workaround that might be useful for someone. Let's say we have multiple states that could modify the same file. How can we know that state X is the one responsible for modifying that file? By doing the following steps:
1- I have created a state like this one:
Create a File:
  file.managed:
    - name: /path/to/foofile
    - source: salt://statedir/barfile

Add file header:
  file.prepend:
    - name: /path/to/foofile
    - text: "This file was managed by using this salt state {{ sls }}"
The content of barfile is:
This is a new file
2- Call the state from the minion and this will be the result:
$ salt-call state.sls statedir.test
local:
----------
          ID: Create a File
    Function: file.managed
        Name: /path/to/foofile
      Result: True
     Comment: File /path/to/foofile updated
     Started: 07:50:45.254994
    Duration: 1034.585 ms
     Changes:
              ----------
              diff:
                  New file
              mode:
                  0644
----------
          ID: Add file header
    Function: file.prepend
        Name: /path/to/foofile
      Result: True
     Comment: Prepended 1 lines
     Started: 07:50:46.289766
    Duration: 3.69 ms
     Changes:
              ----------
              diff:
                  ---
                  +++
                  @@ -1,1 +1,2 @@
                  +This file was managed by using this salt state statedir.test
                   This is a new file

Summary for local
------------
Succeeded: 2 (changed=2)
Failed:    0
------------
Total states run:     2
Currently the content of foofile is:
This file was managed by using this salt state statedir.test
This is a new file
