Redirecting shell output in Jenkins to a file - shell

I have a job in Jenkins which executes many Python scripts in a shell step:
#!/bin/bash -x
mkdir -p $WORKSPACE/validation/regression
rm -f $WORKSPACE/validation/regression/*.latest
cd $WORKSPACE/PythonTests/src/
# Execute test cases
python tests.py 031 > $WORKSPACE/validation/regression/TP031output_b$BUILD_NUMBER.log
python tests.py 052 > $WORKSPACE/validation/regression/TP052output_b$BUILD_NUMBER.log
python tests.py 060 > $WORKSPACE/validation/regression/TP060output_b$BUILD_NUMBER.log
My intention is that each script's output (which I can see in my terminal when I execute the scripts manually) is stored in a log file via that classic redirection.
It used to work, but now it just creates empty files, and I can't figure out what has changed since then.
Any hint?

This worked for me in a Jenkins "Execute shell" build step:
[command] 2>&1 | tee [outputfile]
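A minimal, self-contained sketch of the pattern: 2>&1 merges stderr into stdout, and tee writes the combined stream to the file while still printing it, so the output shows up both in the Jenkins console and in the log.

```shell
# Stand-in command producing output on both streams; tee duplicates
# the merged stream to the console and to the log file.
logfile=out.log
{ echo "normal output"; echo "error output" >&2; } 2>&1 | tee "$logfile"
```

Applied to the question's job this would become, e.g., `python tests.py 031 2>&1 | tee $WORKSPACE/validation/regression/TP031output_b$BUILD_NUMBER.log`.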

Related

sed shell with jenkins deployment

I'm working on something at the moment, and just now I'm beginning to wonder whether what I'm working on is even possible.
I want to SSH from Jenkins and run a shell script that uses variables from an rc file stored in a git repository. (The shell script and the rc file are in the same repo.)
Nothing I have tried works, and I'm starting to wonder if it's even possible.
Here is my local command, but I get the same output on Jenkins:
docker exec -it test-container bash 'sed -f <(printf "s/${DOMAIN}\.%s/www.&.${DOMAIN_SUFFIX_STAGE}/g\n" ${LANG_KEYS}) /var/www/foo/sed/test.txt > /var/www/foo/sed/new-2.txt'
No matter what I do I get this error
bash: sed -f <(printf "s/${DOMAIN}\.%s/www.&.${DOMAIN_SUFFIX_STAGE}/g\n" ${LANG_KEYS}) /var/www/foo/sed/test.txt > /var/www/foo/sed/new-2.txt: No such file or directory
And yes I can confirm that the directory is there
Here's an easier way to reproduce your problem:
$ bash "echo Hello"
bash: echo Hello: No such file or directory
This happens because the expected syntax is bash yourfile. The string you are passing is not a useful filename, so it fails.
To run a string argument as a command, you can use bash -c commandstring:
$ bash -c "echo Hello"
Hello
This makes bash interpret the parameter as a shell command to execute, instead of a filename to open.
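A runnable local analogue of the fix; the docker line itself is left as a comment, since it needs the container from the question.

```shell
# Without -c, bash treats the argument string as a filename and fails
# with "No such file or directory"; with -c it runs the string as a
# command.
bash -c 'printf "hello\n"'
# The question's command needs the same change (sketch, not tested here):
# docker exec -it test-container bash -c 'sed -f <(printf "s/${DOMAIN}\.%s/www.&.${DOMAIN_SUFFIX_STAGE}/g\n" ${LANG_KEYS}) /var/www/foo/sed/test.txt > /var/www/foo/sed/new-2.txt'
```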

process does not log when run as background

I want to run this command inside a docker container ( ubuntu:18.04 image ):
(cd inse/; sh start.sh > log.txt 2>&1 ;) &
but when I run it, nothing is logged to log.txt. When I run it this way:
(cd inse/; sh start.sh > log.txt 2>&1 ;)
It locks the foreground (as it should), and when I kill it I see that log.txt is filled with the log output, which means it works correctly.
Why is this behaviour happening?
The contents of start.sh is:
#!/usr/bin/env sh
. venv/bin/activate;
python3 main.py;
UPDATE:
Actually this command is not the container's entry point; I run it in another shell, but inside a long-running container (a testing container).
Using nohup, no success:
(cd inse/; nohup sh start.sh | tee log.txt;) &
I think this problem is related to the subshell `()` concept in sh. It seems it does not let output go anywhere when run in the background.
UPDATE 2:
Even this does not work:
sh -c "cd inse/; sh start.sh > log.txt 2>&1 &"
UPDATE 3:
Not even this:
sh -c "cd inse/; sh start.sh > log.txt 2>&1;" &
I found what was causing the problem.
It was Python's buffered output: with stdout redirected to a file, Python buffers it, so nothing reaches log.txt until the buffer flushes.
I should have used Python's unbuffered output:
python -u blahblah
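A self-contained sketch of the fix, with a throwaway main.py standing in for the real script:

```shell
# Python buffers stdout when it is redirected to a file, so a
# backgrounded process that is killed may leave the log empty.
# -u (or the PYTHONUNBUFFERED=1 environment variable) makes Python
# flush every line immediately.
cat > main.py <<'EOF'
print("started")
EOF
python3 -u main.py > log.txt 2>&1 &
wait
cat log.txt
```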
Try the commands below, and please check that you have full access to the folder where log.txt is created. Use a CMD/RUN step in the Dockerfile to run start.sh:
CMD /inse/start.sh > log.txt 2>&1 ;
OR
RUN /inse/start.sh > log.txt 2>&1 ;

Print all script output to file from within another script

English is not my native language, please accept my apologies for any language issues.
I want to execute a script (bash / sh) through CRON, which will perform various maintenance actions, including backup. This script will execute other scripts, one for each function. And I want the entirety of what is printed to be saved in a separate file for each script executed.
The problem is that each of these other scripts executes commands like duplicity, certbot, and maldet, among others. The echo commands in each script are printed to the file, but the output of the duplicity, certbot, and maldet commands is not!
I want to avoid having to put "| tee --append" or another command on each line. But even doing this on each line, the subscripts do not save to the log file. Ideally, the parent script could specify which file each script prints to.
Does not work:
sudo bash /duplicityscript > /path/log
or
sudo bash /duplicityscript >> /path/log
sudo bash /duplicityscript | sudo tee --append /path/log > /dev/null
or
sudo bash /duplicityscript | sudo tee --append /path/log
Using exec (like this):
exec > >(tee -i /path/log)
sudo bash /duplicityscript
exec > >(tee -i /dev/null)
Example:
./maincron:
sudo ./duplicityscript > /myduplicity.log
sudo ./maldetscript > /mymaldet.log
sudo ./certbotscript > /mycertbot.log
./duplicityscript:
echo "Exporting Mysql/MariaDB..."
{dump command}
echo "Exporting postgres..."
{dump command}
echo "Start duplicity data backup to server 1..."
{duplicity command}
echo "Start duplicity data backup to server 2..."
{duplicity command}
In the log file, this will print:
Exporting Mysql/MariaDB...
Exporting postgres...
Start duplicity data backup to server 1...
Start duplicity data backup to server 2...
In the example above, the echo lines in each script are saved to the log file, but the output of the duplicity and dump commands is printed on the screen and not in the log file.
I did some googling, and I even found this topic, but I could not adapt it to my needs.
It's fine if the output is also printed on the screen, as long as it ends up in the file in its entirety.
Try adding 2>&1 at the end of the line to capture stderr as well; it should help. Or run the script with sh -x to see what is causing the issue.
Hope this helps
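If the missing output is on stderr (backup tools like duplicity commonly log there), one exec line at the top of the parent script can capture everything, child scripts included. A sketch, with /tmp/maincron.log standing in for the real log path:

```shell
#!/bin/bash
# One-time redirection for the whole script: stdout and stderr of
# every following command (and of child scripts) go through tee,
# so they appear both on screen and in the log.
exec > >(tee -a /tmp/maincron.log) 2>&1
echo "Exporting Mysql/MariaDB..."      # stdout: screen + log
echo "simulated duplicity output" >&2  # stderr: captured too
```

This avoids putting "| tee" on every line, since the redirection is inherited by the sub-scripts the parent launches.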

Jenkins Running shell scripts in reverse?

My jenkins server has a problem of running shell commands in reverse order.
I specify the commands to run
copy over a file to another server
run the update script
For example,
$nohup scp -i .ssh/blah -o StrictHostKeyChecking=no foo.txt tomcat@foo.coo.com:/tmp/FOO.txt &> /dev/null
$nohup ssh -t -t -n -i .ssh/blah -o StrictHostKeyChecking=no tomcat@foo.coo.com '/home/tomcat/bin/update.sh /tmp/FOO.txt.war'
instead the jenkins output console would show:
running update.sh
copying over the file
The same problem also occurs when I pair the two commands into one with &&, and it happens with all my jobs on Jenkins.
I'm currently running Jenkins 1.469 on a Tomcat 6 server.
Any help would be appreciated, thanks!
EDIT:
I'm running these commands as batch tasks for each job. The problem doesn't seem to be Jenkins itself, as this ran correctly:
[workspace] $ /bin/sh -xe /tmp/tomcat6-tomcat6-tmp/hudson8724999678434432030.sh
+ echo 1
1
+ echo 2
2
+ echo 3
3
+ echo 4
4
The use of &> to redirect both stdout and stderr is a feature of the bash shell. If you want to use bash-specific features, you need to let Jenkins know the build step should be executed using bash.
This can be done in two ways:
1) Change the default shell in Jenkins global configuration or
2) The first line of your build step must start with #!/bin/bash ...
Note that /bin/sh is not always a symlink to /bin/bash.
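A sketch of why this matters here: under plain sh, `&>` is not a redirection operator, so a line using it is parsed as a background command plus a separate redirection, which is exactly how steps can appear to run out of order.

```shell
#!/bin/bash
# Under bash, &> sends both stdout and stderr to the file, so err.log
# ends up containing the error message. Under plain sh the same line
# parses as `ls /nonexistent &` followed by `> err.log`: the command
# is put in the background and err.log is just truncated empty.
ls /nonexistent &> err.log
grep -c "nonexistent" err.log   # 1 under bash: err.log holds the error line
```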

Difference between piping a file to sh and calling a shell file

This is what I was trying to do:
$ wget -qO- www.example.com/script.sh | sh
which quietly downloads the script and prints it to stdout, which is then piped to sh. Unfortunately this doesn't quite work: it fails to wait for user input at various points, and there are a few syntax errors as well.
This is what actually works:
$ wget -qOscript www.example.com/script.sh && chmod +x ./script && ./script
But what's the difference?
I'm thinking maybe piping the file doesn't execute it the same way, but rather executes each line individually; I'm new to this kind of thing, though, so I don't know.
When you pipe to sh, stdin of that shell/script is the pipe. Thus the script cannot take e.g. user input from the console. When you run the script normally, stdin is the console, where you can enter input.
You might try telling the shell to be interactive:
$ wget -qO- www.example.com/script.sh | sh -i
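Another option is bash-specific process substitution, which hands the script text to the shell on a separate file descriptor, leaving stdin free for the script's own prompts. A sketch with a local file standing in for the wget pipe:

```shell
# `cat script.sh` plays the role of `wget -qO- www.example.com/script.sh`:
# the script arrives via <(...) rather than via stdin, so stdin stays
# connected to the terminal for any `read` prompts in the script.
echo 'echo "script ran, stdin untouched"' > script.sh
bash <(cat script.sh)
# with the real download: bash <(wget -qO- www.example.com/script.sh)
```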
I had the same issue, and after tinkering and googling, this is what worked for me:
wget -O - www.example.com/script.sh | sh
