Appending to a configuration file - bash

I am creating a script which updates hosts in an application; the config file for each host looks like the one below. The script generates the hosts correctly, but I need to append a comma , to every closing } except the one for the last host.
I have tried numerous things, but the closest I have got is putting each host's content on a single line and running an IFS statement against it. I'm also not sure how best to approach this; can anyone advise?
{
  "cmd": "ssh user#webserver",
  "inTerminal": "new",
  "name": "webserver",
  "theme": "basic",
  "title": "Webserver",
}
Example of what I am trying to achieve:
{
  "cmd": "ssh user#webserver",
  "inTerminal": "new",
  "name": "webserver",
  "theme": "basic",
  "title": "Webserver",
},
{
  "cmd": "ssh user#db",
  "inTerminal": "new",
  "name": "db server",
  "theme": "basic",
  "title": "db",
},
{
  "cmd": "ssh user#mail",
  "inTerminal": "new",
  "name": "mail server",
  "theme": "basic",
  "title": "mail server",
}

You can do things like:
#!/bin/bash
for f in $(generate-host-list); do
    # read the whole file into $c (empty delimiter reads to end of input)
    read -r -d '' c < "$f"
    # prepend ",\n" for every block except the first
    list="$list${list+,
}$c"
done
echo "$list"
If you are just writing to a file, that can be simpler (no need for the read; just cat the file). Similarly, if you don't care about munging whitespace, you could do list="$list${list+,}$(cat "$f")". If you are using bash or some other shell, you can use non-portable things like += to clean it up.
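Putting those simplifications together, a minimal bash sketch of the same idea (generate-host-list is the same hypothetical command as above, and hosts.conf is just a placeholder output file):
#!/bin/bash
nl='
'
list=
for f in $(generate-host-list); do
    # cat the file and prepend ",\n" before every block except the first
    list+="${list:+,$nl}$(cat "$f")"
done
printf '%s\n' "$list" > hosts.conf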

You can do it like this:
sed '$q; s/^}$/},/' <in_file >out_file
The above sed command works as follows: first it checks whether you've reached the last
line, and if so it quits, printing that line unchanged. Otherwise, it checks whether the
only character on the line is }, and if so replaces it with },.
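Combined with a generator like the one in the previous answer, the same command could be applied to the concatenated blocks in one go (a sketch; generate-host-list and hosts.conf are the same hypothetical names as above, and the generated file names must not contain spaces):
generate-host-list | xargs cat | sed '$q; s/^}$/},/' > hosts.conf
Every closing brace on its own line gets a trailing comma except the very last one, which is printed unchanged by the $q rule.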

Related

How to extract data from a JSON file into a variable

I have the following JSON format; basically it is a huge file with several such entries.
[
  {
    "id": "kslhe6em",
    "version": "R7.8.0.00_BNK",
    "hostname": "abacus-ap-hf-test-001:8080",
    "status": "RUNNING",
  },
  {
    "id": "2bkaiupm",
    "version": "R7.8.0.00_BNK",
    "hostname": "abacus-ap-hotfix-001:8080",
    "status": "RUNNING",
  },
  {
    "id": "rz5savbi",
    "version": "R7.8.0.00_BNK",
    "hostname": "abacus-ap-hf-test-005:8080",
    "status": "RUNNING",
  },
]
I want to fetch all the hostname values that start with "abacus-ap-hf-test", without the ":8080" suffix, into a variable, and then use those values for further commands in a for loop, something like below. But I am a bit confused about how I can extract such information.
HOSTNAMES="abacus-ap-hf-test-001 abacus-ap-hf-test-005"
for HOSTNAME in $HOSTNAMES
do
    sh ./trigger.sh
done
Update the first line to this:
HOSTNAMES=$(grep -oP 'hostname": "\K(abacus-ap-hf-test[\w\d-]+)' json.file)
or, if you are sure that the hostname ends with :8080", try this:
HOSTNAMES=$(grep -oP '(?<="hostname": ")abacus-ap-hf-test[\w\d-]+(?=:8080")' json.file)
In both commands, abacus-ap-hf-test[\w\d-]+ is the part of the regex that matches the value itself; the surrounding strings only anchor the match to the hostname field so that the results are accurate.
Assuming you have valid JSON, you can get the hostname values using jq:
while read -r hname ; do printf "%s\n" "$hname" ; done < <(jq -r '.[].hostname' j.json)
Output:
abacus-ap-hf-test-001:8080
abacus-ap-hotfix-001:8080
abacus-ap-hf-test-005:8080
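To get only the values that start with abacus-ap-hf-test and without the :8080 suffix, as asked above, a possible jq filter is (a sketch, assuming the file is valid JSON):
HOSTNAMES=$(jq -r '.[].hostname
                   | select(startswith("abacus-ap-hf-test"))
                   | rtrimstr(":8080")' j.json)
for HOSTNAME in $HOSTNAMES
do
    sh ./trigger.sh
done
With the sample data this leaves HOSTNAMES containing abacus-ap-hf-test-001 and abacus-ap-hf-test-005.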

bash or powershell parsing JSON file content

I would like to manipulate the content of a JSON file.
I've tried with PowerShell and Linux bash, but I was unable to get what I want.
On Linux I was thinking of using the jq tool; it does obtain the data, but I cannot manipulate it.
jq '.[].pathSpec, .[].scope' jsonfilepath
Current output:
"file"
"file"
"/u01/app/grid/*/bin/oracle"
"/u01/app/oracle/product/*/db_1/bin/oracle"
My goal is to obtain something similar to:
scope pathSpec
Like:
file /u01/app/grid/*/bin/oracle
file /u01/app/oracle/product/*/db_1/bin/oracle
JSON file sample
[
  {
    "actions": [
      "upload",
      "detect"
    ],
    "deep": false,
    "dfi": true,
    "dynamic": true,
    "inject": false,
    "monitor": false,
    "pathSpec": "/u01/app/grid/*/bin/oracle",
    "scope": "file"
  },
  {
    "actions": [
      "upload",
      "detect"
    ],
    "deep": false,
    "dfi": true,
    "dynamic": true,
    "inject": false,
    "monitor": false,
    "pathSpec": "/u01/app/oracle/product/*/db_1/bin/oracle",
    "scope": "file"
  }
]
Do you have any idea how to get this kind of output in PowerShell and bash?
Thanks in advance.
Assuming a JSON input file named file.json:
In a Linux / Bash environment, use the following:
jq -r '.[] | .scope + " " + .pathSpec' file.json
In PowerShell, use the following (adapted from a comment by JohnLBevan):
(Get-Content -Raw file.json | ConvertFrom-Json) |
ForEach-Object { '{0} {1}' -f $_.scope, $_.pathSpec }
Note the (...) around the pipeline with the ConvertFrom-Json call: it is necessary in Windows PowerShell (but no longer in PowerShell (Core) 7+) to ensure that the parsed JSON array is enumerated in the pipeline, i.e. that its elements are sent one by one - see this post for more information.
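If you also want the scope pathSpec header line shown in the question, one possible variation on the jq command (a sketch using jq's @tsv filter) is:
jq -r '["scope", "pathSpec"], (.[] | [.scope, .pathSpec]) | @tsv' file.json
which prints a tab-separated table with a header row, one line per array element.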

json processing in .sh files

[
  {
    "id": "1636ea48-28b7-783a-48dd-5e041f10d9e6",
    "name": "Test_Component1",
    "desiredVersions": [],
    "children": false
  },
  {
    "id": "1636f939-136f-4609-ab93-238b1af193fe",
    "name": "Test_Component2",
    "desiredVersions": [],
    "children": false
  }
]
I am writing commands in the Execute Shell window in Jenkins. I have this JSON in a variable. I want to extract both id values so that further processing can be done in the next set of commands.
Using jq:
$ echo "$var" | jq '.[].id'
"1636ea48-28b7-783a-48dd-5e041f10d9e6"
"1636f939-136f-4609-ab93-238b1af193fe"
Is it a string? If so, you can use a regular expression to extract the id values, like:
(\"id\"\:\W)\"(.+)(\"\,)

Bad indentation of a sequence entry bitbucket pipelines

I currently have a step in bitbucket pipelines which does some stuff. The last step is to start an aws ecs task, like this:
- step:
    name: Migrate database
    script:
      - curl "https://s3.amazonaws.com/aws-cli/awscli-bundle.zip" -o "awscli-bundle.zip"
      - apt-get update
      - apt-get install -y unzip python
      - unzip awscli-bundle.zip
      - ./awscli-bundle/install -b ~/bin/aws
      - export PATH=~/bin:$PATH
      - aws ecs run-task --cluster test-cluster --task-definition test-task --overrides '{ "containerOverrides": [ { "name": "test-container", "command": [ "echo", "hello world" ], "environment": [ { "name": "APP_ENV", "value": "local" } ] } ] }' --network-configuration '{ "awsvpcConfiguration": { "subnets": ["subnet-xxxxxxx"], "securityGroups": ["sg-xxxxxxx"], "assignPublicIp": "ENABLED" }}' --launch-type FARGATE
This fails validation with the error:
Bad indentation of a sequence entry
Splitting the statement across multiple lines does not work either. What would be the correct approach here?
The issue is that you have a colon followed by a space inside the command, which causes the YAML parser to interpret the entry as a map rather than a string.
The easiest solution would be to move
aws ecs run-task --cluster test-cluster --task-definition test-task --overrides '{ "containerOverrides": [ { "name": "test-container", "command": [ "echo", "hello world" ], "environment": [ { "name": "APP_ENV", "value": "local" } ] } ] }' --network-configuration '{ "awsvpcConfiguration": { "subnets": ["subnet-xxxxxxx"], "securityGroups": ["sg-xxxxxxx"], "assignPublicIp": "ENABLED" }}' --launch-type FARGATE
into a script file and call it from Pipelines.
You could also remove all the spaces after any ':' characters, but given the amount of JSON there you'd likely run into the same issue again when modifying it, so the script file is probably the easier option here.
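For example, a minimal sketch of the script-file approach (the file name run-task.sh and its location are assumptions):
#!/bin/sh
# run-task.sh - keeps the JSON out of the YAML entirely
aws ecs run-task \
    --cluster test-cluster \
    --task-definition test-task \
    --overrides '{ "containerOverrides": [ { "name": "test-container", "command": [ "echo", "hello world" ], "environment": [ { "name": "APP_ENV", "value": "local" } ] } ] }' \
    --network-configuration '{ "awsvpcConfiguration": { "subnets": ["subnet-xxxxxxx"], "securityGroups": ["sg-xxxxxxx"], "assignPublicIp": "ENABLED" }}' \
    --launch-type FARGATE
The last entry of the script list then becomes - ./run-task.sh (after making the file executable), which contains no colon-space sequence for the YAML parser to trip over.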

Changing files (using sed) in Packer script leaves files unchanged

Currently I am looking into building a build pipeline using Packer and Docker.
This is my packer.json:
{
  "builders": [{
    "type": "docker",
    "image": "php:7.0-apache",
    "commit": true
  }],
  "provisioners": [
    {
      "type": "file",
      "source": "./",
      "destination": "/var/www/html/"
    },
    {
      "type": "shell",
      "inline": [
        "chown -R www-data:www-data /var/www/html",
        "sed '/<Directory \\/var\\/www\\/>/,/<\\/Directory>/ s/AllowOverride None/AllowOverride all/' /etc/apache2/apache2.conf",
        "sed '/<VirtualHost/,/<\\/VirtualHost>/ s/DocumentRoot \\/var\\/www\\/html/DocumentRoot \\/var\\/www\\/html\\/web/' /etc/apache2/sites-enabled/000-default.conf"
      ]
    }
  ]
}
The shell provisioner contains sed commands that change the AllowOverride and DocumentRoot settings in the Apache config.
When Packer runs this script everything appears to work and I get positive sed output, so sed itself seems to work fine. But in the resulting Docker image the files are unchanged.
Copying the files in the file provisioner is working fine.
What am I doing wrong?
It seems you're missing the -i (or --in-place) flag in your sed commands. Try with:
"sed -i <expression> <file>"
