Groovy shell script doesn't recognize export command - bash

I'm running a Groovy shell script where I'm trying to set the proxy before running the aws command:
export http_proxy=http://proxy.url.com:8099
aws s3 ls
But I'm getting this error:
Caught: java.io.IOException: Cannot run program "export": error=2, No such file or directory
java.io.IOException: Cannot run program "export": error=2, No such file or directory
at com.capitalone.cep.lensOps.run(lensOps.groovy:13)
The export command works fine when I run it in bash, so what should I do in Groovy to get it to work?

export is a shell builtin, not an external program, which is why Groovy cannot launch it as a process. To run a program with a certain value in its environment, you can instead use env:
env http_proxy=http://proxy.url.com:8099 aws s3 ls
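If you would rather keep the export, another option is to run the whole thing through a shell, where the builtin is available. A minimal sketch of that form, reusing the proxy URL from the question (when launching this from Groovy, the sh, -c, and the quoted command string would need to be passed as three separate arguments so the string isn't split apart):
sh -c 'export http_proxy=http://proxy.url.com:8099; aws s3 ls'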

Related

Unix export command path

I want to run a single-line command via the NiFi ExecuteStreamCommand processor. I want to run a gsutil command, and before doing that I want to export the environment variable GOOGLE_APPLICATION_CREDENTIALS.
So the command would be:
export GOOGLE_APPLICATION_CREDENTIALS='/temp/abc.json'
However, NiFi needs the path of the command. When I checked which export on the server, I did not get a path:
[user@server1 ~]$ which export
/usr/bin/which: no export in (/opt/teradata/client/14.10/tbuild/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin:/home/user/.local/bin:/home/user/bin:/usr/local/google-cloud-sdk/bin/)
If it's a builtin command, how do I get NiFi to run it?
how do I get NiFi to run it?
Use env to run a command with a modified environment:
/usr/bin/env GOOGLE_APPLICATION_CREDENTIALS='/temp/abc.json' gsutil
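In ExecuteStreamCommand that single line would typically be split across the processor's properties. A rough sketch, assuming a concrete gsutil subcommand such as ls; since no shell is involved here, the quotes around the path are not needed:
Command Path: /usr/bin/env
Command Arguments: GOOGLE_APPLICATION_CREDENTIALS=/temp/abc.json;gsutil;ls
with the arguments separated by whatever the processor's Argument Delimiter property is set to (; by default).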

Unable to execute shell scripts on Cygwin (mintty terminal) using Jenkins

I'm unable to execute shell scripts on Cygwin (mintty terminal) using a Jenkins pipeline, but I can execute the same scripts directly in Cygwin. Cygwin is installed on the Jenkins Windows agent; is there anything I am missing in the agent's configuration? Also, my shell scripts are not executing in Cygwin when run from Jenkins; they execute in the Windows command prompt instead. Can someone please advise how to make Jenkins use the Cygwin terminal to execute shell scripts instead of the command prompt? Thank you.
Below is the Jenkins pipeline showing how I call test.sh (test.sh is on the Jenkins Windows agent):
stage('Import') {
    steps {
        script {
            sh "/home/kumar/test.sh"
        }
    }
}
I see the error below when I execute the build:
/home/kumar/test.sh
FIND: Invalid switch
mv: cannot stat '/home/kumar/TEST_FILES/*': No such file or directory
Below is the find command used in the script:
find /cygdrive/d/Jenkins/workspace/sam-org/testing/app_name/src/abc -name "*.dsx" -exec cp "{}" /home/kumar/ITEST_FILES \;
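"FIND: Invalid switch" is the message Windows' own FIND.EXE prints, which suggests the script is picking up the Windows command rather than Cygwin's find. One hedged workaround, assuming the script really is running under Cygwin's bash but with Windows directories earlier in PATH, is to call the Cygwin binary by its full path:
/usr/bin/find /cygdrive/d/Jenkins/workspace/sam-org/testing/app_name/src/abc -name "*.dsx" -exec cp "{}" /home/kumar/ITEST_FILES \;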

How to run command after source env shell inside bash script

I am trying to run a command after setting up an environment. This command runs a python script which depends on the environment.
I have the following code:
#!/bin/bash
source ~/some/linux/env/shell
python test.py
However, the "python test.py" only runs after I exit the env shell.
I want to be able to run the "python test.py" inside this new shell env.
First, add a shebang line so the script runs with the Python interpreter:
#!/usr/bin/env python
Then give the script execute permission:
chmod a+x test.py
Now you can run it from the command line.
./test.py
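Putting that together with the original wrapper, a minimal sketch (assuming ~/some/linux/env/shell only sets and exports variables rather than launching an interactive shell) would be:
#!/bin/bash
# set up the environment, then run the script in the same shell
source ~/some/linux/env/shell
./test.py
If the sourced file actually starts a new interactive shell, the wrapper will sit at that line until the shell exits, which matches the behaviour described in the question.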

Shell command doesn't function, but echo'ing the command and manually running it works?

I'm trying to run the following lines of shell script:
var_files=$(var_file_selector)
echo ${var_files}
terraform apply ${var_files} deploy/$1
Where var_files resolves to "deploy/vars/vars.tfvars". When I run the script, I get the following error:
invalid value "\"deploy/vars/vars.tfvars\"" for flag -var-file: Error reading "deploy/vars/vars.tfvars": open "deploy/vars/vars.tfvars": no such file or directory
However, if I echo out the whole command:
echo terraform apply ${var_files} deploy/$1
I get:
terraform apply -var-file="deploy/vars/vars.tfvars" deploy/cluster
Which I can run manually from the terminal (in the same working directory that I'm running the script from) and it works just fine. What am I not understanding here?
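One way to see what the shell is actually passing, as opposed to what echo shows, is to print each resulting word separately. A small sketch using the names from the question:
var_files=$(var_file_selector)
printf '<%s> ' terraform apply ${var_files} deploy/$1; echo
If var_file_selector emits literal quote characters (e.g. -var-file="deploy/vars/vars.tfvars"), those quotes remain part of the argument after word splitting; echo just makes them look like normal shell quoting, so terraform ends up looking for a file whose name literally contains the quotes.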

run .sh script via Jenkins to execute aws command error

My problem is that I am trying to execute a shell script via Jenkins to copy files produced by MSBuild to an AWS S3 bucket.
I added a new "Execute shell" build step and set it to run the script with the command sh publishS3.sh, but nothing happens and the files don't appear in the S3 bucket.
My Jenkins runs on a local Windows server.
When I execute the script by typing sh publishS3.sh in the Jenkins local directory, everything is fine and the files are copied to the S3 bucket successfully, but when I run it from Jenkins nothing happens. My publishS3.sh script is:
#!/bin/bash
aws s3 cp Com.VistaDraft.Common.dll s3://download.vistadraft.com/MVP
I tried to check what output I get after execution by appending > output.txt to the command, but Jenkins generates an empty file. If I do the same locally, I get a message that the files were copied to S3 successfully. I set the Jenkins shell executable path to C:\Program Files\Git\git-bash.exe, and I use git-bash.exe locally too. Does anyone know where the problem is? Please advise.
You could try adding -ex to the first line of the script so you can see what it's doing, which makes debugging easier:
#!/bin/bash -ex
# rest of script
Make sure the aws tool is in the PATH of the environment where Jenkins runs your script. It might help if you specify the full path to the command.
You could put which aws in your script to see what's going on.
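Putting those suggestions together, a debug version of publishS3.sh might look like this (a sketch; if PATH turns out to be the problem, replace the plain aws with the full path that which aws reports on the agent):
#!/bin/bash -ex
which aws    # confirm which aws binary, if any, Jenkins actually sees
aws s3 cp Com.VistaDraft.Common.dll s3://download.vistadraft.com/MVP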
