Terraform: looking for a simple way to use double quotation marks in provisioner commands

I need a simple way of using regular quotation marks " in the provisioner "remote-exec" block of my Terraform script. Only " will work for what I would like to do, and just trying \" doesn't work. What's the easiest way to have Terraform interpret my command literally? For reference, here is what I am trying to run:
provisioner "remote-exec" {
inline = [
"echo 'DOCKER_OPTS="-H tcp://0.0.0.0:2375 -H unix:///var/run/docker.sock"' > /etc/default/docker",
]
}

Escaping with backslashes works fine for me:
$ cat main.tf
resource "null_resource" "test" {
provisioner "local-exec" {
command = "echo 'DOCKER_OPTS=\"-H tcp://0.0.0.0:2375\"' > ~/terraform/37869163/output"
}
}
$ terraform apply .
null_resource.test: Creating...
null_resource.test: Provisioning with 'local-exec'...
null_resource.test (local-exec): Executing: /bin/sh -c "echo 'DOCKER_OPTS="-H tcp://0.0.0.0:2375"' > ~/terraform/37869163/output"
null_resource.test: Creation complete
Apply complete! Resources: 1 added, 0 changed, 0 destroyed.
...
$ cat output
DOCKER_OPTS="-H tcp://0.0.0.0:2375"

Related

How to call a variable of string with spaces in a terraform provisioner?

I am trying to run a Terraform provisioner which calls my Ansible playbook, and I am passing a public key as a variable from the user. When passing the public key, it doesn't take the entire key, just ssh-rsa, not the complete string.
I want to pass the complete string as "ssh-rsa Aghdgdhfghjfdh".
The provisioner in Terraform which I am running is:
resource "null_resource" "bastion_user_provisioner" {
provisioner "local-exec" {
command = "sleep 30 && ansible-playbook ../../../../ansible/create-user.yml --private-key ${path.module}/${var.project_name}.pem -vvv -u ubuntu -e 'username=${var.username}' -e 'user_key=${var.user_key}' -i ${var.bastion_public_ip}, -e 'root_shell=/bin/rbash' -e 'raw_password=${random_string.bastion_password.result}'"
}
}
If I run the playbook alone as:
ansible-playbook -i localhost create-user.yml --user=ubuntu --private-key=kkk000.pem -e "username=kkkkk" -e 'user_key='ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC+GWlljlLzW6DOEo"' -e root_shell="/bin/bash"
it works.
But I want the string to be in a Terraform variable which is passed to the provisioner.
I want to have the key copied to a file as
ssh-rsa AWRDkj;jfdljdfldkf'sd.......
and not just
ssh-rsa
You are getting bitten by the -e key=value splitting that goes on with the command-line --extra-vars interpretation. What you really want is to feed -e some JSON text, to stop it from trying to split on whitespace. That will also come in handy for sufficiently complicated random-string passwords, which would otherwise produce a very bad outcome when passed on the command line.
Thankfully, there is a jsonencode() function that will help you with that problem:
resource "null_resource" "bastion_user_provisioner" {
provisioner "local-exec" {
command = <<SH
set -e
sleep 30
ansible -vvv -i localhost, -c local -e '${jsonencode({
"username"="${var.username}",
"user_key"="${var.user_key}",
"raw_password"="${random_string.bastion_password.result}",
})}' -m debug -a var=vars all
SH
}
}

Escape double quotes in a Jenkins pipeline file's shell command

Below is a snippet from my Jenkins file -
stage('Configure replication agents') {
environment {
AUTHOR_NAME="XX.XX.XX.XX"
PUBLISHER_NAME="XX.XX.XX.XX"
REPL_USER="USER"
REPL_PASSWORD="PASSWORD"
AUTHOR_PORT="4502"
PUBLISHER_PORT="4503"
AUTHOR="http://${AUTHOR_NAME}:${AUTHOR_PORT}"
PUBLISHER="http://${PUBLISHER_NAME}:${PUBLISHER_PORT}"
S_URI= "${PUBLISHER}/bin/receive?sling:authRequestLogin=1"
}
steps {
sh 'curl -u XX:XX --data "status=browser&cmd=createPage&label=${PUBLISHER_NAME}&title=${PUBLISHER_NAME}&parentPath =/etc/replication/agents.author&template=/libs/cq/replication/templates/agent" ${AUTHOR}/bin/wcmcommand'
}
The above command, in Jenkins console, is printed as
curl -u XX:XX --data status=browser&cmd=createPage&label=XXXX&title=XXX&parentPath =/etc/replication/agents.author&template=/libs/cq/replication/templates/agent http://5XXXX:4502/bin/wcmcommand
Note how the double quotes "" are missing.
I need to preserve the double quotes after --data in this command. How do I do it?
I tried using forward slashes but that didn't work.
Cheers
To expand on my comment, a quick test revealed it's the case.
You need to escape twice: once the quote for the shell with a backslash, and once that backslash with another backslash for Groovy itself.
node() {
sh 'echo "asdf"'
sh 'echo \"asdf\"'
sh 'echo \\"asdf\\"'
}
Result
[Pipeline] {
[Pipeline] sh
+ echo asdf
asdf
[Pipeline] sh
+ echo asdf
asdf
[Pipeline] sh
+ echo "asdf"
"asdf"
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
After long time of struggling and googling, this is what has worked for me on similar use case:
sh("ssh root#my.server.com \"su user -c \\\"mkdir ${newDirName}\\\"\"")
Update: How I think it gets interpreted
1] sh extension strips first escaping (\" becomes " and \\ becomes \, first and last " are not part of input)
ssh root@my.server.com "su user -c \"mkdir ${newDirName}\""
2] ssh command strips second level of escaping (\" becomes ", while outer " also not part of input)
su user -c "mkdir ${newDirName}"
I had double quotes inside the variable, so escaped single quotes worked for me:
sh "git commit -m \'${ThatMayContainDoubleQuotes}\'"
I needed the output to have a trailing \\ so I had to do something like this:
echo 'key1 = \\\\"__value1__\\\\"' > auto.file
File looks like
cat auto.file
key1 = \\"__value1__\\"
Dependent Script
export value1="some-value"
var=${value1}
# Read in template one line at the time, and replace variables
tmpfile=$(mktemp)
sed -E 's/__(([^_]|_[^_])*)__/${\\1}/g' auto.file > ${tmpfile}
while read auto
do
eval echo "$auto"
done < "${tmpfile}" > autoRendered.file
rm -f ${tmpfile}
Rendered File looks like
cat autoRendered.file
key1 = "some-value"
For anyone who comes looking for a fix to a similar issue with quoting numbers during helm install/upgrade: you can use --set-string instead of --set.
Ref: https://helm.sh/docs/chart_best_practices/values/#consider-how-users-will-use-your-values
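As a minimal sketch (the release, chart, and value names here are hypothetical): with --set, a value like 1.20 would be parsed as the number 1.2, while --set-string keeps it exactly as typed.
# hypothetical names; --set-string prevents 1.20 from being coerced to the number 1.2
helm upgrade --install my-release ./mychart --set-string image.tag=1.20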

Escape chars in Terraform local exec provisioner

I want to chain Terraform and Ansible using the local-exec provisioner.
However, since this requires passing input from Terraform to Ansible, I am stuck with the following complex command:
provisioner "local-exec" {
command = 'sleep 60; ansible-playbook -i ../ansible/inventory/ ../ansible/playbooks/site.yml --extra-vars "rancher_server_rds_endpoint="${aws_db_instance.my-server-rds.endpoint}" rancher_server_elastic_ip="${aws_eip.my-server-eip.public_ip}""'
}
which keeps returning an
illegal char
error; any suggestions about escaping correctly?
If the ansible-playbook command was to run directly in the shell it would be:
ansible-playbook -i inventory playbooks/site.yml --extra-vars "my_server_rds_endpoint=my-server-db.d30ikkj222.us-west-1.rds.amazonaws.com rancher_server_elastic_ip=88.148.17.236"
(paths differ)
Terraform syntax states that:
Strings are in double-quotes.
So you need to replace single quotes with double ones, and then escape quotes inside, for example:
provisioner "local-exec" {
command = "sleep 60; ansible-playbook -i ../ansible/inventory/ ../ansible/playbooks/site.yml --extra-vars \"rancher_server_rds_endpoint='${aws_db_instance.my-server-rds.endpoint}' rancher_server_elastic_ip='${aws_eip.my-server-eip.public_ip}'\""
}
The only way I know, that will work for any special characters in variables, is to use environment, for example:
provisioner "local-exec" {
command = join(
" ", [
"sleep 60;",
"ansible-playbook -i ../ansible/inventory/",
"../ansible/playbooks/site.yml",
"--extra-vars",
"rancher_server_rds_endpoint=\"$RANCHER_SERVER_RDS_ENDPOINT\"",
"rancher_server_elastic_ip=\"$RANCHER_SERVER_ELASTIC_IP\""
]
)
environment = {
RANCHER_SERVER_RDS_ENDPOINT = aws_db_instance.my-server-rds.endpoint
RANCHER_SERVER_ELASTIC_IP = aws_eip.my-server-eip.public_ip
}
}

Accessing Shell variable from within Jenkins Pipeline

I am trying the lines below in my Jenkins Pipeline. I am assigning the variable IMAGE_NAME in a shell step and trying to access it in the Jenkins Pipeline script, but I am not able to do that. Any idea how to do that?
stage('Build: Get Image') {
steps {
echo 'Getting docker image'
sh "IMAGE_NAME=`grep -ri \"Successfully built\"
$BUILD_FILE_NAME | awk \'{print \$3}\'`"
echo "Image Name is:$IMAGE_NAME"
}
}
You can define it as env variable:
env.some_var = 'AAAA'
And print with:
sh "echo ${env.some_var}"
proxy_host = 'abc.com'
stage('Docker Up') {
steps{
script{
sh("""
echo ${proxy_host}
""")
}
}
}
The catch here is to use double quotes (") to execute the shell script. I tested it and it works fine.

Jenkins: Pipeline sh bad substitution error

A step in my pipeline uploads a .tar to an Artifactory server. I am getting a Bad substitution error when passing in env.BUILD_NUMBER, but the same command works when the number is hard-coded. The script is written in Groovy through Jenkins and is running in the Jenkins workspace.
sh 'curl -v --user user:password --data-binary ${buildDir}package${env.BUILD_NUMBER}.tar -X PUT "http://artifactory.mydomain.com/artifactory/release-packages/package${env.BUILD_NUMBER}.tar"'
returns the errors:
[Pipeline] sh
[Package_Deploy_Pipeline] Running shell script
/var/lib/jenkins/workspace/Package_Deploy_Pipeline#tmp/durable-4c8b7958/script.sh: 2:
/var/lib/jenkins/workspace/Package_Deploy_Pipeline#tmp/durable-4c8b7958/script.sh: Bad substitution
[Pipeline] } //node
[Pipeline] Allocate node : End
[Pipeline] End of Pipeline
ERROR: script returned exit code 2
If I hard-code a build number in place of ${env.BUILD_NUMBER}, I get no errors and the code runs successfully.
sh 'curl -v --user user:password --data-binary ${buildDir}package113.tar -X PUT "http://artifactory.mydomain.com/artifactory/release-packages/package113.tar"'
I use ${env.BUILD_NUMBER} within other sh commands within the same script and have no issues in any other places.
This turned out to be a syntax issue. Wrapping the command in ''s caused ${env.BUILD_NUMBER} to be passed instead of its value. I wrapped the whole command in "s and escaped the nested quotes. Works fine now.
sh "curl -v --user user:password --data-binary ${buildDir}package${env.BUILD_NUMBER}.tar -X PUT \"http://artifactory.mydomain.com/artifactory/release-packages/package${env.BUILD_NUMBER}.tar\""
In order to pass Groovy parameters into bash scripts in Jenkins pipelines (which sometimes causes bad substitution errors), you have two options:
The triple double quotes way [ " " " ]
OR
the triple single quotes way [ ' ' ' ]
In triple double quotes you can render a normal Groovy parameter using ${someVariable}; if it's an environment variable, use ${env.someVariable}; if it's a parameter injected into your job, use ${params.someVariable}.
example:
def YOUR_APPLICATION_PATH= "${WORKSPACE}/myApp/"
sh """#!/bin/bash
cd ${YOUR_APPLICATION_PATH}
npm install
"""
In triple single quotes things get a little bit tricky: you can pass the parameter through an environment variable and use it with "\${someVariable}", or concatenate the Groovy parameter using ''' + someVariable + '''.
examples:
def YOUR_APPLICATION_PATH= "${WORKSPACE}/myApp/"
sh '''#!/bin/bash
cd ''' + YOUR_APPLICATION_PATH + '''
npm install
'''
OR
pipeline{
agent { node { label "test" } }
environment {
YOUR_APPLICATION_PATH = "${WORKSPACE}/myapp/"
}
continue...
continue...
continue...
sh '''#!/bin/bash
cd "\${YOUR_APPLICATION_PATH}"
npm install
'''
//OR
sh '''#!/bin/bash
cd "\${env.YOUR_APPLICATION_PATH}"
npm install
'''
Actually, you seem to have misunderstood the env variable. In your sh block, you should access ${BUILD_NUMBER} directly.
Reason/Explanation: env represents the environment inside the script. This environment is used/available directly to anything that is executed, e.g. shell scripts.
Please also pay attention not to write anything to env.*, but use withEnv{} blocks instead.
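A minimal sketch of what the shell script itself can look like under that advice, assuming it runs inside a Jenkins sh step (the file name in the echo is purely illustrative):
#!/bin/bash
# BUILD_NUMBER is exported into the step's environment by Jenkins,
# so plain shell expansion works; no Groovy-style env.BUILD_NUMBER is needed here
echo "Uploading package${BUILD_NUMBER}.tar"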
Usually the most common cause of the
Bad substitution
error is using sh instead of bash.
Especially when using Jenkins, if you're using Execute shell, make sure your command starts with a shebang, e.g. #!/bin/bash -xe or #!/usr/bin/env bash.
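As a minimal illustration (the variable is hypothetical): the ${var^^} uppercase expansion below is bash-only, so running the same script with a plain sh (e.g. dash) typically fails with a Bad substitution error.
#!/usr/bin/env bash
# prints HELLO under bash; running this with "sh script.sh" typically reports "Bad substitution"
greeting="hello"
echo "${greeting^^}"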
I can definitely tell you, it's all about sh shell and bash shell. I fixed this problem by specifying #!/bin/bash -xe as follows:
node {
stage("Preparing"){
sh '''#!/bin/bash -xe
colls=( col1 col2 col3 )
for eachCol in ${colls[@]}
do
echo $eachCol
done
'''
}
}
I had this same issue when working on a Jenkins Pipeline for Amazon S3 Application upload.
My script was like this:
pipeline {
agent any
parameters {
string(name: 'Bucket', defaultValue: 's3-pipeline-test', description: 'The name of the Amazon S3 Bucket')
string(name: 'Prefix', defaultValue: 'my-website', description: 'Application directory in the Amazon S3 Bucket')
string(name: 'Build', defaultValue: 'public/', description: 'Build directory for the application')
}
stages {
stage('Build') {
steps {
echo 'Running build phase'
sh 'npm install' // Install packages
sh 'npm run build' // Build project
sh 'ls' // List project files
}
}
stage('Deploy') {
steps {
echo 'Running deploy phase'
withCredentials([[$class: 'AmazonWebServicesCredentialsBinding', accessKeyVariable: 'AWS_ACCESS_KEY_ID', credentialsId: 'AWSCredentials', secretKeyVariable: 'AWS_SECRET_ACCESS_KEY']]) {
sh 'aws s3 ls' // List AWS S3 buckets
sh 'aws s3 sync "${params.Build}" s3://"${params.Bucket}/${params.Prefix}" --delete' // Sync project files with AWS S3 Bucket project path
}
}
}
}
post {
success {
echo 'Deployment to Amazon S3 succeeded'
}
failure {
echo 'Deployment to Amazon S3 failed'
}
}
}
Here's how I fixed it:
Seeing that it's an interpolation of variables, I had to change the single quotation marks (' ') in this line of the script:
sh 'aws s3 sync "${params.Build}" s3://"${params.Bucket}/${params.Prefix}" --delete' // Sync project files with AWS S3 Bucket project path
to double quotation marks (" "):
sh "aws s3 sync ${params.Build} s3://${params.Bucket}/${params.Prefix} --delete" // Sync project files with AWS S3 Bucket project path
So my script looked like this afterwards:
pipeline {
agent any
parameters {
string(name: 'Bucket', defaultValue: 's3-pipeline-test', description: 'The name of the Amazon S3 Bucket')
string(name: 'Prefix', defaultValue: 'my-website', description: 'Application directory in the Amazon S3 Bucket')
string(name: 'Build', defaultValue: 'public/', description: 'Build directory for the application')
}
stages {
stage('Build') {
steps {
echo 'Running build phase'
sh 'npm install' // Install packages
sh 'npm run build' // Build project
sh 'ls' // List project files
}
}
stage('Deploy') {
steps {
echo 'Running deploy phase'
withCredentials([[$class: 'AmazonWebServicesCredentialsBinding', accessKeyVariable: 'AWS_ACCESS_KEY_ID', credentialsId: 'AWSCredentials', secretKeyVariable: 'AWS_SECRET_ACCESS_KEY']]) {
sh 'aws s3 ls' // List AWS S3 buckets
sh "aws s3 sync ${params.Build} s3://${params.Bucket}/${params.Prefix} --delete" // Sync project files with AWS S3 Bucket project path
}
}
}
}
post {
success {
echo 'Deployment to Amazon S3 succeeded'
}
failure {
echo 'Deployment to Amazon S3 failed'
}
}
}
That's all.
I hope this helps.
I was having an issue with ${env.MAJOR_VERSION} not showing up in the name of the jar artifact, so I approached it by adding an environment step to the Jenkinsfile.
pipeline {
agent any
environment {
MAJOR_VERSION = 1
}
stages {
stage('build') {
steps {
sh 'ant -f build.xml -v'
}
}
}
post {
always{
archiveArtifacts artifacts: 'dist/*.jar', fingerprint: true
}
}
}
That got the issue solved, and the bad substitution error no longer showed up in my Jenkins build output; so the environment step plays a major role in the Jenkinsfile.
The suggestion from @avivamg didn't work for me; here is the syntax which works for me:
sh "python3 ${env.WORKSPACE}/package.py --product productname " +
"--build_dir ${release_build_dir} " +
"--signed_product_dir ${signed_product_dir} " +
"--version ${build_version}"
I got a similar issue, but my use case is a little different:
steps{
sh '''#!/bin/bash -xe
VAR=TRIAL
echo $VAR
if [ -d /var/lib/jenkins/.m2/'\${params.application_name}' ]
then
echo 'working'
echo ${VAR}
else
echo 'not working'
fi
'''
}
}
Here I'm trying to declare a variable inside the script and also use a parameter from outside.
After trying multiple ways, the following script worked:
stage('cleaning com/avizva directory'){
steps{
sh """#!/bin/bash -xe
VAR=TRIAL
echo \$VAR
if [ -d /var/lib/jenkins/.m2/${params.application_name} ]
then
echo 'working'
echo \${VAR}
else
echo 'not working'
fi
"""
}
}
Changes made:
Replaced triple single quotes with triple double quotes.
Whenever I want to refer to a local shell variable, I used the escape character:
$VAR --> \$VAR
This caused the error Bad Substitution:
pipeline {
agent any
environment {
DOCKER_IMAGENAME = "mynginx:latest"
DOCKER_FILE_PATH = "./docker"
}
stages {
stage('DockerImage-Build') {
steps {
sh 'docker build -t ${env.DOCKER_IMAGENAME} ${env.DOCKER_FILE_PATH}'
}
}
}
}
This fixed it: replace ' with " in the sh command:
pipeline {
agent any
environment {
DOCKER_IMAGENAME = "mynginx:latest"
DOCKER_FILE_PATH = "./docker"
}
stages {
stage('DockerImage-Build') {
steps {
sh "docker build -t ${env.DOCKER_IMAGENAME} ${env.DOCKER_FILE_PATH}"
}
}
}
}
The Jenkins script is failing inside the "sh" command line, e.g.:
sh 'npm run build' <-- fails, referring to package.json
It needs to be changed to:
sh 'npm run ng build....'
because ng on the $PATH is not found by package.json.
