macOS Terminal bug: can't see typed characters after executing an ffmpeg command inside a shell script - bash

I execute from Terminal:
sh 1.sh
1.sh contents:
#!/bin/bash
S1=$(ffmpeg -i correct.wav -af silencedetect=noise=-50dB:d=0.1 -f null - 2>&1 | grep silence_duration -m 1 | awk '{print $NF}')
echo $S1
After execution I can't see any typed characters in macOS Terminal. Only typing reset blindly fixes the problem, but that clears all previous output.
How can I modify the code to fix this bug? I guess the problem is somewhere in the 2>&1.
Screenshot of the bug when pressing Enter several times after script execution:
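For what it's worth, a likely culprit is that ffmpeg reads keyboard commands from the terminal's stdin and can leave echo switched off when run from a script. Here is a hedged sketch of the usual fix, the -nostdin flag (guarded so it is a no-op where ffmpeg is not installed):

```shell
#!/bin/bash
# Sketch of the usual fix: -nostdin stops ffmpeg from grabbing the
# terminal's stdin (it listens there for keyboard commands and can
# leave echo disabled). Guarded so the script is a no-op without ffmpeg.
if command -v ffmpeg >/dev/null 2>&1; then
    S1=$(ffmpeg -nostdin -i correct.wav \
            -af silencedetect=noise=-50dB:d=0.1 -f null - 2>&1 \
        | grep -m 1 silence_duration \
        | awk '{print $NF}')
    echo "$S1"
fi
# If the terminal is already broken, `stty sane` restores echo
# without clearing previous output the way `reset` does.
```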

Related

Syntax error: unexpected end of file. Bash

I want to set up a teamspeak bot, and I have this script to start this.
#!/bin/bash
if [ $1 = 'stop' ]
then
echo stop >> /root/ts3bot/tmp/log.txt
date >>/root/ts3bot/tmp/log.txt
echo ======================
screen -S bot -X quit
fi
if [ $1 = 'start' ]
then
echo start >> /root/ts3bot/tmp/log.txt
date >> /root/ts3bot/tmp/log.txt
echo ======================
screen -dmS bot php core.php
ps ax | grep -v grep | grep -v -i SCREEN | grep links >> /root/ts3bot/tmp/log.txt
fi
<here is an extra blank line>
but when I type bash bot.sh it says syntax error: unexpected end of file
I don't know what I did wrong :/ the chmod is set on 755
Thanks!
I suspect you may have copied this shell script from a Microsoft Windows box over to a Linux or Unix server. If so, the problem might be that you have DOS/Windows line endings, which can cause unpredictable results in scripts.
To check the script for bad line endings on a Linux or Unix server, you can dump the file (sort of like a hex dump) by typing the following at the shell prompt:
$ od -c bot.sh | less
And look for \n or \r or \r\n. If lines appear to have a \r at the end, then you've found the problem.
To FIX this line-ending problem, you can use a tool like dos2unix if it's installed on your system. If you don't have dos2unix but you're on a Linux server, you may be able to do this instead:
$ sed -i 's/\r//' bot.sh
to convert the file.
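As a self-contained illustration (using tr, which behaves the same on GNU and BSD systems, unlike the in-place flag of sed), here is the detect-and-strip cycle on a throwaway file:

```shell
# Create a file with DOS line endings, detect the stray \r bytes with
# od, and strip them with tr.
printf 'echo hello\r\necho world\r\n' > crlf_demo.sh
if od -c crlf_demo.sh | grep -q '\\r'; then
    tr -d '\r' < crlf_demo.sh > crlf_demo.unix.sh
    mv crlf_demo.unix.sh crlf_demo.sh
fi
bash crlf_demo.sh    # now runs cleanly: prints hello, then world
```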
Lastly ... see the first line of the script, #!/bin/bash? Because of that, you don't need to run this with bash bot.sh, you can just execute it directly with ./bot.sh.

Cannot redirect the output of hwclock -r command

I am implementing a shell script and I want to analyse the output shown by hwclock -r (--show) command which displays the RTC time and date.
To do that I tried things like: hwclock -r | grep -v "grep" | grep "error" > /dev/null
to see if an error happened while reading RTC registers.
The problem is that the output is only and always forwarded to the console. I tried to forward the output to a file and then analyse its content, and I also tried to use the tee -a command to direct the output to both the console and a file, but with no success.
Is there a solution, or an explanation of what is happening with the hwclock -r command?
Thank you in advance.
I just solved it by forwarding the error messages to a file and then doing the analysis.
hwclock -r 2> file.txt; grep -v "grep" | grep "error" > /dev/null will do the job.
You omitted file.txt in the first grep.
If you just want to check for "error", a not-too-old bash will also do it in a shorter way:
hwclock -r |& grep error >/dev/null
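Since hwclock needs root and a real RTC, here is a hedged sketch of the same idea with a stand-in command: capture stderr to a file and also check the exit status, which is usually more reliable than grepping for the word "error".

```shell
# Run a command with stderr captured to a file; check its exit status.
# `true` stands in for `hwclock -r`, which needs root and an RTC.
check_rtc() {
    "$@" 2> rtc_err.txt
}
if ! check_rtc true; then
    echo "RTC read failed:"
    cat rtc_err.txt
fi
if grep -qi error rtc_err.txt; then
    echo "error text found on stderr"
fi
```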

Running vi within a bash script and executing vi commands to edit another file

So I've made a script which is collecting data from many different files:
#!/bin/bash
mkdir DATAPOOL"$1"
grep achi *out>runner
grep treat *out>>runner
cat runner | grep Primitive *gout | grep '= '|awk '{print $1,$6}' > CellVolume"$1".txt
cat runner | grep ' c ' *gout | grep 'Angstrom '|awk '{print $1,$3}' > Cellc"$1".txt
cat runner | grep 'Final energy ' *gout |awk '{print $1,$5}' > CellEnergy"$1".txt
etc etc
cat runner |awk '{print "~/xtlanal",$1," > ",$1}' >runner2
vi runner2
:1,$s/gout:/xtl/
:1,$s/gout:/dat/
:wq
source runner2
grep Summary *dat | grep 'CAT-O ' |awk '{print $1,$6}' > AVE_NaO_"$1".txt
mv *txt DATAPOOL"$1"
So I end up with all the required text files when it is run without the vi part, and so I know it all works. Furthermore, when I run it with the vi commands, it just stops at the vi command; I can then manually enter the 3 commands and end up with the correct results. What I'm struggling with is that I can't get vi to run the commands on its own, so I could execute the file multiple times in different directories without having to enter the commands manually time and time again.
Any help would be greatly appreciated.
Cheers
something like this as a bash script:
#!/bin/bash
vi filename.txt -c ':g/^/m0' -c ':wq'
where -c executes a command. Here the command reverses the lines in a text file. Once done, :wq saves and exits. (See man vi for more about -c.)
If you don't want to type -c twice, you can do it this way:
vi -c "g/^/m0 | wq" filename.txt
For scripted editing tasks, you can use ed instead of vi:
ed runner2 <<'END'
1,$s/gout:/xtl/
1,$s/gout:/dat/
w
q
END
For global line-oriented search and replace, sed is a good choice:
sed -i 's/gout:/xtl/; s/gout:/dat/' runner2
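One subtlety worth knowing: the two substitutions are applied in order on each line, so the second s/// picks up the next remaining occurrence. A quick demonstration:

```shell
# The first s/// consumes the first "gout:" on the line, so the second
# s/// replaces the next one.
printf 'run1.gout: run1.gout:\n' | sed 's/gout:/xtl/; s/gout:/dat/'
# prints: run1.xtl run1.dat
```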
Tested on VIM - Vi IMproved 8.0 (2016 Sep 12, compiled Apr 10 2018 21:31:58)
The vi -c "g/^/m0 | wq" filename.txt may appear to work, but it actually doesn't!
Typing vi -c "g/^/m0 | wq" filename.txt will result in vi writing and quitting before any major changes are made to the file (the pipe in this situation makes vi attempt to execute the wq for each matched line, forcing it to quit before the intended operation completes).
To see a demonstration, try typing it without the q and watch how slowly it works, writing line by line:
vi -c "g/^/m0 | w" filename.txt
The more efficient way is to use -c as B. Kocis states, or to use +.
As B. Kocis stated:
#!/bin/bash
vi filename.txt -c ':g/^/m0' -c ':wq'
or
vi filename.txt +g/^/m0 +wq

Can bash -v output be redirected?

Starting bash with the -v option produces long output on the console:
$ bash -v
source ~/Dropbox/bin/tim_functions.sh
#!/bin/bash
...several hundred more lines
I would like to capture the output to a file to make it easier to browse through, but I have tried bash -v 2>&1 > out_bash.txt and bash -v | tee out_bash.txt and cannot capture the information on the terminal screen within a file. It is as if the verbose output is neither stderr nor stdout. How can this be?
Can anyone suggest a way to capture the output of bash -v ?
bash -v 2>&1 > out_bash.txt
is not what you want, it should be
bash -v >out_bash.txt 2>&1
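The reason the order matters is that redirections are processed left to right: 2>&1 duplicates whatever stdout points at in that moment. A small demonstration with a helper function that writes to both streams (the names here are made up for the demo):

```shell
both() { echo to-stdout; echo to-stderr >&2; }

both 2>&1 > wrong.txt   # stderr duplicated the OLD stdout (the terminal),
                        # so wrong.txt only captures to-stdout
both > right.txt 2>&1   # stdout goes to the file first, then stderr
                        # follows it, so right.txt captures both lines
```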
I poked around and found this http://www.commandlinefu.com/commands/view/3310/run-a-bash-script-in-debug-mode-show-output-and-save-it-on-a-file
On the website they use
bash -x test.sh 2>&1 | tee out.test, but I tested it with
bash -v test.sh 2>&1 | tee out.test and it worked fine.
you can also use the exec command in the script to redirect all output:
#!/bin/bash
exec >> out.txt 2>> out.txt
set -x
set -v
echo "testing debug of shell scripts"
ls
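A variant of this exec trick, with a single 2>&1 so both streams share one file descriptor (appending to the same file through two separately opened descriptors can interleave unpredictably):

```shell
#!/bin/bash
# Redirect everything, including the set -x trace on stderr, into one
# file through a single shared descriptor.
exec >> demo_out.txt 2>&1
set -x
echo "captured"
# demo_out.txt now holds the trace line "+ echo captured" and "captured"
```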
After reading other helpful answers, I believe this issue has to do with how bash sends the verbose information to the tty, which is somehow different from stderr or stdout. It can be caught with the following workaround:
$ screen -L
$ bash -v
$ exit #from the bash session
$ exit #from the screen session
This results in a screenlog.0 file being generated containing the output.
The bash -v output of interest was on a mac running 10.7.3 (Lion) with
$ bash --version
GNU bash, version 3.2.48(1)-release (x86_64-apple-darwin11)
Copyright (C) 2007 Free Software Foundation, Inc.
Another 10.6.8 mac I tried had a less (interesting/verbose) output, despite a similar .bashrc file.
You can use,
bash -v 2>&1 | tee file.txt
or
bash -v 2>&1 | grep search_string
Have you tried wrapping your child bash in a subshell?
( bash -v ) 2>&1 > out_bash.txt

How execute bash script line by line?

If I use the bash -x option, it shows every line, but the script executes normally.
How can I execute it line by line? Then I can see whether each step does the correct thing, or abort and fix the bug. The effect would be the same as putting a read after every line.
You don't need to put a read on every line; just add a trap like the following to your bash script, and it has the effect you want, e.g.:
#!/usr/bin/env bash
set -x
trap read debug
< YOUR CODE HERE >
Works, just tested it with bash v4.2.8 and v3.2.25.
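A non-interactive cousin of the same trick, in case you want to preview the stepping without pressing Enter for every command: the DEBUG trap fires before each command, and the bash variable BASH_COMMAND holds the command about to run (the script name here is made up for the demo):

```shell
# Write a tiny script whose DEBUG trap announces each command on stderr
# before it runs, then execute it.
cat > step_demo.sh <<'EOF'
#!/usr/bin/env bash
trap 'echo "next: $BASH_COMMAND" >&2' DEBUG
x=41
x=$((x + 1))
echo "x=$x"
EOF
bash step_demo.sh
```

stdout shows only x=42; the next: lines arrive on stderr, so they are easy to silence with 2>/dev/null.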
IMPROVED VERSION
If your script is reading content from files, the above will not work. A workaround could look like the following example.
#!/usr/bin/env bash
echo "Press CTRL+C to proceed."
trap "pkill -f 'sleep 1h'" INT
trap "set +x ; sleep 1h ; set -x" DEBUG
< YOUR CODE HERE >
To stop the script you would have to kill it from another shell in this case.
ALTERNATIVE1
If you simply want to wait a few seconds before proceeding to the next command in your script the following example could work for you.
#!/usr/bin/env bash
trap "set +x; sleep 5; set -x" DEBUG
< YOUR CODE HERE >
I'm adding set +x and set -x within the trap command to make the output more readable.
The BASH Debugger Project is "a source-code debugger for bash that follows the gdb command syntax."
If your bash script is really a bunch of one-off commands that you want to run one by one, you could do something like this, which runs each command as you increment a variable LN corresponding to the line number you want to run. This makes it super easy to just run the last command again, and then you increment the variable to go to the next command.
Assuming your commands are in a file "it.sh", run the following, one by one.
$ cat it.sh
echo "hi there"
date
ls -la /etc/passwd
$ $(LN=1 && cat it.sh | head -n$LN | tail -n1)
"hi there"
$ $(LN=2 && cat it.sh | head -n$LN | tail -n1)
Wed Feb 28 10:58:52 AST 2018
$ $(LN=3 && cat it.sh | head -n$LN | tail -n1)
-rw-r--r-- 1 root wheel 6774 Oct 2 21:29 /etc/passwd
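A variation on the same idea (file name made up for the demo): sed -n "${LN}p" selects the line, and eval runs it. Unlike the bare $( ... ) form, eval honours the quoting inside the selected line, so echo "hi there" prints hi there rather than "hi there".

```shell
# Pick line $LN out of the script and execute it with eval.
printf '%s\n' 'echo "hi there"' 'date' > it_demo.sh
LN=1
eval "$(sed -n "${LN}p" it_demo.sh)"
# prints: hi there
```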
Have a look at bash-stepping-xtrace.
It allows stepping xtrace.
xargs can filter lines:
cat .bashrc | xargs -0 -l -d \\n bash
-0 Treat input as raw (no escaping)
-l Run the command once per line (not the default, for performance reasons)
-d \\n Use newline as the line separator
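A plain while read loop is a more transparent alternative to the xargs invocation, and makes it easy to stop at the first failing line (the file name is made up for the demo):

```shell
# Run a script one line at a time, stopping at the first failure.
printf '%s\n' 'echo one' 'echo two' 'false' 'echo three' > lines_demo.sh
while IFS= read -r line; do
    bash -c "$line" || { echo "stopped at: $line"; break; }
done < lines_demo.sh
# prints: one, two, then "stopped at: false"
```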
