How to fix hanging Ubuntu on Windows associated with batch programming - bash

I have a batch file that contains this:
bash -c "shell/rsync_A.sh"
bash -c "shell/rsync_B.sh"
Each of the shell scripts looks like this:
rsync_A.sh:
rsync --info=progress2 -rptz --delete -e "ssh -i /root/.ssh/[MY_CERT].pem" [MY_REMOTE_UBUNTU_ON_AWS]:[MY_REMOTE_FOLDER1] [MY_LOCAL_DESTINATION_FOLDER1]
rsync --info=progress2 -rptz --delete -e "ssh -i /root/.ssh/[MY_CERT].pem" [MY_REMOTE_UBUNTU_ON_AWS]:[MY_REMOTE_FOLDER2] [MY_LOCAL_DESTINATION_FOLDER2]
rsync_B.sh:
rsync --info=progress2 -rptz --delete -e "ssh -i /root/.ssh/[MY_CERT].pem" [MY_REMOTE_UBUNTU_ON_AWS]:[MY_REMOTE_FOLDER3] [MY_LOCAL_DESTINATION_FOLDER3]
The problem is that bash always, without fail, hangs when I run the batch file. The first rsync command always seems to run fine, the second always fails (whether inside the same sh file or a different one).
By "hangs" I mean that I see a blinking cursor but no bash prompt and there is no way to get out of it without restarting the entire system (lxssmanager hangs when attempting to restart).
Everything always runs 100% fine when I enter bash and run the shell scripts, but as soon as I get batch involved it breaks.

I have no idea why or how... but the solution was to uninstall BitDefender.

Related

Running multiple commands from a bash script without losing control

I want to run these two commands in a loop:
for i in $(cat input); do
    winpty kubectl exec -it "$i" -n image -c podname -- sh
    # 2nd command
done
When I run the .sh file, the first command works fine, but after that nothing happens. Can anybody help with this? I am running it through Git Bash on a Windows machine.
I'm a bash rookie, but maybe it's because of the lack of a defined -d directory for unzipped files?
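If the goal is to run a fixed command in each pod rather than an interactive shell, dropping -t avoids the loop stalling on an open shell session. A minimal sketch, assuming one pod name per line in the file input (the namespace "image" is the asker's placeholder; 'date' is a stand-in command):
#!/bin/bash
# one pod name per line in the file 'input'
while read -r pod; do
    # no -t/-i: run the command and return, instead of opening an
    # interactive shell that blocks the loop until you exit it
    kubectl exec -n image "$pod" -- sh -c 'date' </dev/null
    # 2nd command goes here
done < input
The </dev/null guards the loop's stdin (the pod list) from being consumed by anything run inside the loop.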

Calling rsync in bash from Windows cmd

I am trying to run rsync from a batch file. The command is
SET CMD="rsync -P -rptz --delete -e 'ssh -i /root/.ssh/CERTIFICATE.pem' SOURCE_ADDRESS /mnt/c/Users/MYNAME/IdeaProjects/PROJECT/SUBFOLDER/SUBFOLDER/SUBFOLDER/SUBFOLDER/LASTFOLDER"
bash %CMD%
This works fine if I run the command after typing bash, but when I run the command from cmd with the bash prefix, it says No such file or directory.
Additionally, when playing around and trying to debug, bash ends up hanging, i.e. if I open bash I get no prompt, just a blinking cursor.
Any help is appreciated.
To run a command with bash, you need to use the -c option:
bash -c "%CMD%"
Without it, the first non-option argument is treated as the name of a shell script file to run; the quoted rsync command line is not a file, which causes the error.
From the bash man page: "If arguments remain after option processing, and neither the -c nor the -s option has been supplied, the first argument is assumed to be the name of a file containing shell commands."
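A trivial pair of commands (not from the thread) makes the difference visible:
bash -c "echo hello"   # -c: the argument is a command string; prints "hello"
bash "echo hello"      # no -c: bash looks for a file named "echo hello"
                       # and fails with "No such file or directory"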
Note that cmd on Windows is not DOS, even though they share a few similar commands; beyond those, they are vastly different.

Print all script output to file from within another script

English is not my native language; please accept my apologies for any language issues.
I want to execute a script (bash/sh) through cron, which will perform various maintenance actions, including backups. This script will execute other scripts, one per function, and I want everything they print to be saved to a separate file for each script executed.
The problem is that each of these other scripts executes commands like "duplicity", "certbot", and "maldet", among others. The "echo" commands in each script are printed to the file, but the output of the "duplicity", "certbot", and "maldet" commands is not!
I want to avoid having to put "| tee --append" or another command on every line. But even doing this on each line, the "subscripts" do not write to the log file. Ideally, the parent script could specify to which file each script prints.
Does not work:
sudo bash /duplicityscript > /path/log
or
sudo bash /duplicityscript >> /path/log
or
sudo bash /duplicityscript | sudo tee --append /path/log > /dev/null
or
sudo bash /duplicityscript | sudo tee --append /path/log
Using exec (like this):
exec > >(tee -i /path/log)
sudo bash /duplicityscript
exec > >(tee -i /dev/null)
Example:
./maincron:
sudo ./duplicityscript > /myduplicity.log
sudo ./maldetscript > /mymaldet.log
sudo ./certbotscript > /mycertbot.log
./duplicityscript:
echo "Exporting Mysql/MariaDB..."
{dump command}
echo "Exporting postgres..."
{dump command}
echo "Start duplicity data backup to server 1..."
{duplicity command}
echo "Start duplicity data backup to server 2..."
{duplicity command}
In the log file, this will print:
Exporting Mysql/MariaDB...
Exporting postgres...
Start duplicity data backup to server 1...
Start duplicity data backup to server 2...
In the example above, the "echo" lines in each script are saved to the log file, but the output of the duplicity and dump commands is printed to the screen, not to the file.
I did some googling, and even saw this topic, but I could not adapt it to my needs.
It is no problem if the output is also printed to the screen, as long as all of it ends up in the file.
Try adding 2>&1 at the end of the line; it should help. Or run the script with sh -x to see what is causing the issue.
Hope this helps.
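As a concrete sketch of that suggestion, applied to the maincron example above (paths are the question's own; tools like duplicity and certbot typically send much of their output to stderr, which a plain > redirect does not capture):
sudo ./duplicityscript > /myduplicity.log 2>&1
sudo ./maldetscript > /mymaldet.log 2>&1
sudo ./certbotscript > /mycertbot.log 2>&1
If the output should also stay visible on screen, the same idea works with tee:
sudo ./duplicityscript 2>&1 | sudo tee --append /myduplicity.log
For the same reason, the exec attempt from the question would need a 2>&1 as well, e.g. exec > >(tee -i /path/log) 2>&1, to catch stderr.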

SSH: Send remote command to local background

So I have a problem similar to how to send ssh job to background.
I have a Windows C# program automated to execute tcpdump on a remote Linux OS using http://sshnet.codeplex.com/. I'm trying to execute tcpdump on the remote Linux machine and leave it running after I disconnect.
I've been doing a lot of debugging using plink, but cannot seem to achieve the desired result. I've tried:
plink root@10.5.1.1 bash -c "tcpdump -i eth0 -w test.cap"
but it holds the SSH client until I Ctrl+C (not going to work for an automated solution). I've also tried variations of:
plink root@10.5.1.1 bash -c "tcpdump -i eth0 -w test.cap &"
but either the command is not executed at all (test.cap does not exist) or is terminated immediately (test.cap contains 1 line). During testing, I've left a ping going, so the capture should have something...
The previously mentioned link solves the problem with screen, but the remote Linux OS is not configurable and does not have screen. Any suggestions are welcome.
In the latter case, your tcpdump process is probably being aborted when you disconnect. Try:
plink root@10.5.1.1 bash -c "nohup tcpdump -i eth0 -w test.cap &"
See the man page for nohup. You may also want to redirect stdout and stderr to a file or /dev/null to keep nohup from writing its output to nohup.out:
plink root@10.5.1.1 bash -c "nohup tcpdump -i eth0 -w test.cap >/dev/null 2>&1 &"
I had a similar problem while starting a remote application. This pattern worked for me on Debian servers:
ssh root@server "nohup /usr/local/bin/app -c cfg & exit"
Addition: in another test the above didn't work, i.e. the command didn't start on the remote server. Adding a command that returns successfully before the exit seems to work:
ssh root@server "nohup /usr/local/bin/otherapp & w; exit"
I had a similar situation:
(On a Windows machine) I wanted to create an MS batch script to open an SSH connection to a Raspberry Pi and execute a local script in the background.
I found that combining both Raj's and fahd's answers did the trick for me:
My MS batch script:
plink -load "raspberry Pi" -t -m startCommand.txt
The content of startCommand.txt is as follows:
nohup /home/pi/myscript >/dev/null 2>&1 &
w
exit
The ">/dev/null 2>&1 " is important!
I found out (the hard way) that the RPi's SDcard kept getting full by an extremely large nohup.out file (and with a full SDcard, the RPi couldn't even login properly)
reasoning:
I used the -load to load a saved session in PuTTY (i do this because i am authenticating with public/private keys instead of passwords, but this should be the same as simply typing in the host)
then -t (as recommended by Raj)
then -m to load a list of commands in that file
without the parameter "-t" and without the "w" and "exit", my batch script would just run, not execute 'myscript' and close again.
I had the same issue. I had a script in which I had nohup tcpdump .... & . I could not use ssh to run it, as it died when the ssh session finished. The solution I came up with was super simple: I just added sleep 5 to the end of my script, and it works just fine. It seems tcpdump needs a few seconds to go to the background safely before you exit, even with nohup.
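A minimal sketch of that workaround (the interface name and capture path are made-up placeholders):
#!/bin/bash
nohup tcpdump -i eth0 -w /tmp/test.cap >/dev/null 2>&1 &
sleep 5   # give tcpdump a few seconds to settle before the session exits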
I had the same problem, and found that the "-t" option seems to be important to nohup; nohup wasn't taking effect without it.
ssh -t user@remote 'nohup tcpdump -i any -w /tmp/somefile &>/dev/null & sleep 2'
I think that I've nailed it, at least on IBM AIX.
I'm using
ssh -tq user@host "/path/start-tcpdump.ksh"
(authentication is done by public key).
I was having inconsistent results using a simple "nohup tcpdump .... &"; sometimes it worked, sometimes it did not, and sometimes it even blocked and I had to disconnect the session.
So far, this is working OK. I can't really say WHY it is working, but it is...
This is my start-tcpdump.ksh:
#!/usr/bin/ksh
HOST=$(uname -n)
FILTER="port not 22"   # skip our own ssh traffic
# run each capture in a background subshell, silencing tcpdump's output
(tcpdump -i en1 -w $HOST-en1.cap $FILTER >/dev/null 2>&1) &
sleep 2
(tcpdump -i en2 -w $HOST-en2.cap $FILTER >/dev/null 2>&1) &
sleep 2
exit 0

Difference between piping a file to sh and calling a shell file

This is what I was trying to do:
$ wget -qO- www.example.com/script.sh | sh
which quietly downloads the script and prints it to stdout, which is then piped to sh. Unfortunately this doesn't quite work: it fails to wait for user input at various points, and hits a few syntax errors.
This is what actually works:
$ wget -qOscript www.example.com/script.sh && chmod +x ./script && ./script
But what's the difference?
I'm thinking maybe piping the file doesn't execute it as a file, but rather executes each line individually; I'm new to this kind of thing, though, so I don't know.
When you pipe to sh, stdin of that shell/script will be the pipe, so the script cannot take user input from the console. When you run the script normally, stdin is the console, where you can enter input.
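A two-line experiment (a made-up script that reads input) shows this:
printf 'read ans\necho "you typed: $ans"\n' > ask.sh
sh ask.sh         # stdin is the terminal: read waits for your input
cat ask.sh | sh   # stdin is the pipe: read consumes the script's own
                  # next line instead of waiting, so nothing is printed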
You might try telling the shell to be interactive:
$ wget -qO- www.example.com/script.sh | sh -i
I had the same issue, and after tinkering and googling, this is what worked for me:
wget -O - www.example.com/script.sh | sh
