How can I add variable content to a file?
sh "ssh root#${host} 'echo '$vari' > text.txt'"
This gives an empty file.
Without the variable it works:
sh "ssh root#${host} 'echo some text > text.txt'"
You can use the writeFile command offered by Jenkins:
writeFile file: "text.txt", text: YOUR_VARIABLE
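A minimal sketch of how that could fit into the pipeline from the question (vari, host and the scp target are placeholders; the file is written locally and then copied over):
node {
    def vari = "some text"        // placeholder content
    def host = "example.com"      // placeholder, stands in for ${host} from the question
    // write the variable's content to a file in the local workspace
    writeFile file: 'text.txt', text: vari
    // one possible way to get the file onto the remote machine afterwards
    sh "scp text.txt root@${host}:text.txt"
}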
Here are two ways to write the content of a variable to a file:
pipeline {
    agent any
    environment {
        VAR = "hello world!"
    }
    stages {
        stage('Write to file') {
            steps {
                sh 'echo "${VAR}" > test.txt'
                sh "echo ${VAR} >> test.txt"
                sh 'cat test.txt'
            }
        }
    }
}
Output:
[test] Running shell script
+ echo hello world!
[Pipeline] sh
[test] Running shell script
+ echo hello world!
[Pipeline] sh
[test] Running shell script
+ cat test.txt
hello world!
hello world!
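Why both lines work: in the single-quoted sh step the shell itself expands ${VAR}, while in the double-quoted step Groovy substitutes the value before the shell runs, which is why the trace shows echo hello world! both times. The difference starts to matter once the value contains shell metacharacters; a small sketch with a hypothetical VAR2:
pipeline {
    agent any
    environment {
        VAR2 = 'hello; touch injected.txt'   // hypothetical value containing a shell metacharacter
    }
    stages {
        stage('Quoting difference') {
            steps {
                // shell expansion: the quoted value is written to the file as-is, semicolon and all
                sh 'echo "${VAR2}" > test.txt'
                // Groovy interpolation: the shell is handed
                //   echo hello; touch injected.txt >> test.txt
                // and runs the second command as well
                sh "echo ${VAR2} >> test.txt"
                sh 'cat test.txt && ls injected.txt'
            }
        }
    }
}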
I have a trap handler that can neither access global variables nor receive variables passed via $*. It looks like this:
#!/usr/bin/env bash
set -euo pipefail
IFS=$'\n\t'
declare -a arr
finish_exit() {
    echo "* = $*"
    echo "arr = ${arr[*]}"
}
trap 'finish_exit "${arr[@]}"' EXIT
main() {
    arr+=("hello")
    arr+=("world")
}
main | tee -a /dev/null
The script prints ''.
If I remove the | tee -a ... snippet, the script prints 'hello\nworld' twice, as expected.
Now, how can I pipe the output to a logfile WITHOUT losing all context?
One solution would be to redirect everything at the call site, like so:
./myscript.sh >> /dev/null, but I think there should be a way to do this inside the script so I can log EVERY call, not just the ones run by cron.
Another solution I investigated was:
main() {
...
} >> /dev/null
But this will result in no output on the interactive shell.
Bonus karma for an explanation of why this subshell "erases" global variables before the trap function is called.
will make trap method not see global variables
The subshell does "see" global variables; it just does not execute the trap.
why this subshell will "erase" global variables
From https://www.gnu.org/savannah-checkouts/gnu/bash/manual/bash.html :
Command substitution, commands grouped with parentheses, and asynchronous commands are invoked in a subshell environment that is a duplicate of the shell environment, except that traps caught by the shell are reset to the values that the shell inherited from its parent at invocation.
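A short standalone demonstration of that behaviour (not specific to the script above):
#!/usr/bin/env bash
trap 'echo "EXIT trap (parent shell)"' EXIT
# the parenthesised group runs in a subshell: it still sees the parent's
# variables, but the EXIT trap above has been reset inside it, so the
# subshell exiting prints nothing extra
( exit 0 )
echo "back in the parent"
# the trap fires exactly once, when the parent shell itself exits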
how can I pipe the output to a logfile WITHOUT losing all context?
#!/bin/bash
exec 1> >(tee -a logfile)
trap 'echo world' EXIT
echo hello
or
{
    trap 'echo world' EXIT
    echo hello
} | tee -a logfile
And research: https://serverfault.com/questions/103501/how-can-i-fully-log-all-bash-scripts-actions and similar.
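Applied to the script from the question, the exec variant could look like this sketch (logfile is a placeholder path); nothing is wrapped in a pipeline, so main still runs in the main shell and the EXIT trap still sees arr:
#!/usr/bin/env bash
set -euo pipefail
IFS=$'\n\t'
# send a copy of stdout to the logfile; the script keeps running in the
# current shell, so globals and the EXIT trap are unaffected
exec 1> >(tee -a logfile)
declare -a arr
finish_exit() {
    echo "* = $*"
    echo "arr = ${arr[*]}"
}
trap 'finish_exit "${arr[@]}"' EXIT
main() {
    arr+=("hello")
    arr+=("world")
}
main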
The following script:
#!/usr/bin/env bash
set -euo pipefail
{
    set -euo pipefail
    IFS=$'\n\t'
    declare -a arr
    finish_exit() {
        echo "* = $*"
        echo "arr = ${arr[*]}"
    }
    trap 'finish_exit "${arr[@]}"' EXIT
    main() {
        arr+=("hello")
        arr+=("world")
    }
    main
} | tee -a /dev/null
outputs for me:
* = hello
world
arr = hello
world
I added set -o pipefail before the pipe to preserve the exit status.
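To see what that buys you, compare the exit status of a failing command piped through tee, with and without pipefail (runnable on its own in bash):
# without pipefail the pipeline reports tee's status and the failure is hidden
set +o pipefail
false | tee -a /dev/null
echo "status without pipefail: $?"    # prints 0
# with pipefail the last command to fail (here: false) determines the status
set -o pipefail
false | tee -a /dev/null
echo "status with pipefail: $?"       # prints 1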
Say I have a bash script and I want some variables to appear when sourced and others to only be accessible from within the script (both functions and variables). What's the convention to achieve this?
Let's say test.sh is your bash script.
What you can do is extract all the common items and put them in common.sh, which can then be sourced by other scripts.
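A minimal sketch of that split (common.sh and test.sh are the file names from above; what goes where is up to you):
common.sh:
# everything that other scripts should get when they source this file
shared_var="some value"
shared_func() { echo "available to every script that sources common.sh"; }

test.sh:
#!/usr/bin/env bash
# pull in the shared definitions, relative to this script's own location
. "$(dirname "$0")/common.sh"
# script-private items live only here
private_func() { echo "used only inside test.sh"; }
private_func
shared_func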
The BASH_SOURCE array helps you here:
Consider this script, source.sh
#!/bin/bash
if [[ ${BASH_SOURCE[0]} == "$0" ]]; then
    # this code is run when the script is _executed_
    foo=bar
    privFunc() { echo "running as a script"; }
    main() {
        privFunc
        publicFunc
    }
fi
# this code is run when the script is executed or sourced
answer=42
publicFunc() { echo "Hello, world!"; }
echo "$0 - ${BASH_SOURCE[0]}"
[[ ${BASH_SOURCE[0]} == "$0" ]] && main
Running it:
$ bash source.sh
source.sh - source.sh
running as a script
Hello, world!
Sourcing it:
$ source source.sh
bash - source.sh
$ declare -p answer
declare -- answer="42"
$ declare -p foo
bash: declare: foo: not found
$ publicFunc
Hello, world!
$ privFunc
bash: privFunc: command not found
$ main
bash: main: command not found
I have a script test.sh and I am trying to ssh and call that script's function in the same script:
#!/bin/sh
testme()
{
    echo "hello world"
}
ssh myserver "/opt/scripts/test.sh; testme"
But I keep getting testme command not found
What is the correct way of calling a function from a script after ssh?
If you use Bash on both sides, you can have it serialize the function for you:
#!/bin/bash
testme()
{
    echo "hello world"
}
ssh myserver "$(declare -f testme); testme"
If you need sh compatibility, this is not an option.
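If plain sh is required on both sides, one workaround is to send the function definition together with the call to the remote shell on stdin, for example with a here-document (sh -s simply makes the remote shell read its commands from stdin); a sketch:
#!/bin/sh
ssh myserver 'sh -s' <<'EOF'
testme()
{
    echo "hello world"
}
testme
EOF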
I am trying to create a multi-line file in a Jenkins pipeline script using the commands below.
sh "echo \"line 1\" >> greetings.txt"
sh "echo \"line 2\" >> greetings.txt"
echo "The contents of the file are"
sh 'cat greetings.text'
sh 'rm -rf greetings.txt'
Unfortunately, I am not able to create the file named greetings.txt. Can anyone please let me know where I am going wrong?
Results in Jenkins console:
[tagging] Running shell script
+ echo 'line 1'
[Pipeline] sh
[tagging] Running shell script
+ echo 'line 2'
[Pipeline] echo
The contents of the file are
[Pipeline] sh
[tagging] Running shell script
+ cat greetings.text
cat: greetings.text: No such file or directory
Any suggestions would be helpful.
Thanks!
This can be solved by using single quotes with sh, so you don't need escaping. It also helps to start the file with > (so each run begins with a fresh file) and append the remaining lines with >>:
pipeline {
    agent any
    stages {
        stage('write file') {
            steps {
                sh 'echo "line 1" > greetings.txt'
                sh 'echo "line 2" >> greetings.txt'
                echo "The contents of the file is"
                sh 'cat greetings.txt'
                sh 'rm -rf greetings.txt'
            }
        }
    }
}
Output:
[test] Running shell script
+ echo line 1
[Pipeline] sh
[test] Running shell script
+ echo line 2
[Pipeline] echo
The contents of the file is
[Pipeline] sh
[test] Running shell script
+ cat greetings.txt
line 1
line 2
[Pipeline] sh
[test] Running shell script
+ rm -rf greetings.txt
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
It's not finding a file named greetings.text because you didn't create one (a little typo in the extension in your cat line). Try sh 'cat greetings.txt', or, even better, adjust your script:
sh "echo \"line 1\" >> greetings.txt"
sh "echo \"line 2\" >> greetings.txt"
echo "The contents of the file are"
sh 'cat greetings.txt'
sh 'rm -rf greetings.txt'
If you want to use multiline commands, you can also use this syntax:
sh """
echo \"line 1\" >> greetings.txt
echo \"line 2\" >> greetings.txt
echo "The contents of the file are:"
cat greetings.txt
rm -rf greetings.txt
"""
From the last example, this should generate an output like:
Running shell script
+ echo 'line 1'
+ echo 'line 2'
+ echo 'The contents of the file are:'
The contents of the file are:
+ cat greetings.txt
line 1
line 2
+ rm -rf greetings.txt
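If the point is simply to get a multi-line file, a single sh step with a here-document (or the writeFile step mentioned earlier) avoids the repeated echo calls; a sketch:
sh '''
cat > greetings.txt <<'EOF'
line 1
line 2
EOF
cat greetings.txt
rm -f greetings.txt
'''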
I want to execute a cURL command, extract a JSON value (using jq), and save it in a variable in a Jenkins Pipeline. In a Freestyle project, under Build, when I select Execute Shell and give the commands below, I get a valid, successful output with all the values.
deployment_info=$(curl -H "Authorization: Basic a123=" "https://api.sample.com/v1")
rev_num=$(jq -r .environment[0].revision[0].name <<< "${deployment_info}" )
env_name=$(jq -r .environment[0].name <<< "${deployment_info}" )
api_name=$(jq -r .name <<< "${deployment_info}" )
org_name=$(jq -r .organization <<< "${deployment_info}" )
declare -r num1=1
pre_rev=$(expr "$rev_num" - "$num1")
echo $rev_num
echo $api_name
echo $org_name
echo $env_name
echo $pre_rev
Now I want to execute the same set of commands in a Pipeline. So this is my Pipeline:
def deployment_info
def rev_num
def env_name
def org_name
def api_name
def pre_rev
def num1
node {
    stage('Integration Tests') {
        sh "deployment_info=\$(curl --header 'Authorization: Basic abc123=' 'https://api.sample.com/v1')"
        sh "rev_num=\$(jq -r .environment[0].revision[0].name <<< \"${deployment_info}\")"
        sh "env_name=\$(jq -r .environment[0].name <<< \"${deployment_info}\" ) "
        sh "api_name=\$(jq -r .name <<< \"${deployment_info}\" ) "
        sh "org_name=\$(jq -r .organization <<< \"${deployment_info}\" )"
        sh "declare -r num1=1"
        sh "pre_rev=\$(expr \"$rev_num\" - \"$num1\")"
        sh "echo $rev_num"
        sh "echo $api_name"
        sh "echo $org_name"
        sh "echo $env_name"
        sh "echo $pre_rev"
    }
}
The cURL command gets executed and a valid JSON response is shown in the console, but after that I am getting this error:
[Pipeline] sh
[curlpip] Running shell script
++ jq -r '.environment[0].revision[0].name'
+ rev_num=null
[Pipeline] sh
[curlpip] Running shell script
++ jq -r '.environment[0].name'
+ env_name=null
[Pipeline] sh
[curlpip] Running shell script
++ jq -r .name
+ api_name=null
[Pipeline] sh
[curlpip] Running shell script
++ jq -r .organization
+ org_name=null
[Pipeline] sh
[curlpip] Running shell script
+ declare -r num1=1
[Pipeline] sh
[curlpip] Running shell script
++ expr null - null
expr: non-integer argument
+ pre_rev=
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code 2
Finished: FAILURE
Any help is appreciated.
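The ++ jq lines in the trace show jq running with empty input: every sh step starts a fresh shell, so the deployment_info set in the first step no longer exists when the later steps run, and each jq call prints null. A sketch of one way around this, assuming the placeholder URL and token from the question and a bash shell on the agent, is to do all of the shell work inside a single sh step (sh(..., returnStdout: true) can hand a single value back to Groovy when that is needed):
node {
    stage('Integration Tests') {
        // one shell for everything, so deployment_info is still set when jq runs;
        // URL and Authorization header are the placeholders from the question,
        // and <<< is a bash feature, as in the question's own trace
        sh '''
            deployment_info=$(curl -H "Authorization: Basic abc123=" "https://api.sample.com/v1")
            rev_num=$(jq -r .environment[0].revision[0].name <<< "${deployment_info}")
            env_name=$(jq -r .environment[0].name <<< "${deployment_info}")
            api_name=$(jq -r .name <<< "${deployment_info}")
            org_name=$(jq -r .organization <<< "${deployment_info}")
            pre_rev=$(expr "$rev_num" - 1)
            echo "$rev_num $api_name $org_name $env_name $pre_rev"
        '''
        // if a value is needed on the Groovy side instead, capture a step's stdout:
        // def deployment_info = sh(script: 'curl -H "Authorization: Basic abc123=" "https://api.sample.com/v1"', returnStdout: true).trim()
    }
}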