Script not working like the command line - bash

I've created a simple bash script that does the following:
#!/usr/bin/env bash
cf ssh "$1"
When I run the command from the CLI like cf ssh myapp, it runs as expected, but when I run the script like
. myscript.sh myapp
I get the error: App not found
I don't understand what the difference is. I've provided the app name after I invoke the script, so what could be missing here?
Update
When I run the script with the following contents it works. Any idea why the "$1" is not working?
#!/usr/bin/env bash
cf ssh myapp

When you do this:
. myscript.sh myapp
You don't run the script, but you source the file named in the first argument. Sourcing means reading the file, so it's as if the lines in the file were typed on the command line. In your case what happens is this:
myscript.sh is treated as the file to source and the myapp argument is ignored.
This line:
#!/usr/bin/env bash
is treated as a comment and skipped.
This line:
cf ssh "$1"
is read as it stands. "$1" takes the value of $1 in the calling shell. Possibly, and most likely in your case, it's blank.
Now you should know why it works as expected when you source this version of your script:
#!/usr/bin/env bash
cf ssh myapp
There's no $1 to resolve, so everything goes smoothly.
To run the script and be able to pass arguments to it, you need to make the file executable and then execute it (as opposed to sourcing it). You can execute the script, for example, this way:
./script.bash arg1 arg2
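Applied to your case, a minimal sketch assuming the script above is saved as myscript.sh in the current directory:
chmod +x myscript.sh      # make the file executable (one-time step)
./myscript.sh myapp       # run it; inside the script, $1 is now "myapp", so it executes: cf ssh myapp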

Related

Is there a way to execute a bash script in Cygwin without explicitly typing bash my_script_name.sh?

I have a bash script that contains an AWK program:
#!/bin/bash
awk -v ARPT_IDENT=$1 '
BEGIN { FS=OFS="\t" }
$1 == ARPT_IDENT { print }' ANAV.TXT
I named the script file: select-ANAV-for-ARPT_IDENT.sh
I can execute the script at a Cygwin command line like this:
bash select-ANAV-for-ARPT_IDENT.sh US01017
That works fine.
But, but, but, ...
I would really like to execute the script by just specifying the name of the script:
select-ANAV-for-ARPT_IDENT.sh US01017
Is there a way to execute a bash script in Cygwin without explicitly typing bash ...?
Note: I did try this: chmod +x select-ANAV-for-ARPT_IDENT.sh
And then executed it:
select-ANAV-for-ARPT_IDENT.sh US01017
But bash gave this error message:
-bash: select-ANAV-for-ARPT_IDENT.sh: command not found
Either specify the path to the script, or save it to a directory that's present in the $PATH variable.
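For example, either of these works; the script is assumed to start out in the current directory, and the ~/bin location below is only an illustration:
./select-ANAV-for-ARPT_IDENT.sh US01017      # explicit path: run it from the directory that contains it

# or keep scripts in a directory that is on $PATH (~/bin here is just an assumption):
mkdir -p ~/bin
mv select-ANAV-for-ARPT_IDENT.sh ~/bin/
export PATH="$PATH:$HOME/bin"                # add this line to ~/.bashrc to make it permanent
select-ANAV-for-ARPT_IDENT.sh US01017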

Dot (source) command doesn't work in script, but works in terminal

Script file simplified for experimenting:
#!/bin/sh
if test -f /home/vl/docker-test/envvars; then . /home/vl/docker-test/envvars; fi
envvars file content:
export APACHE_RUN_USER=www-data
Nothing happens after running the script: no output, no error.
Checking whether env contains the variable from envvars shows that it doesn't:
$ env | grep -i apache
output is empty.
But:
$ if test -f /home/vl/docker-test/envvars; then . /home/vl/docker-test/envvars; fi
$ env | grep -i apache
APACHE_RUN_USER=www-data
What am I doing wrong in my script?
. applies within the current running environment. In the first case, that's the script (and goes away when the script is done). In the second case, it's the shell. If you want the script to influence the shell it runs in, then you need to . it into the shell, not run it as a script.
So if your script is bring-in-vars, you are currently doing something like this (running it as a child):
./bring-in-vars
And you need to be doing this (sourcing it into the current shell):
. ./bring-in-vars
Children cannot, by design, modify their parents.
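You can see the difference from an interactive shell. A quick sketch, reusing the bring-in-vars name and the envvars content from the question:
printf 'export APACHE_RUN_USER=www-data\n' > bring-in-vars
chmod +x bring-in-vars
./bring-in-vars              # runs in a child process; its environment vanishes when it exits
env | grep -i apache         # no output
. ./bring-in-vars            # sourced: the export happens in the current shell
env | grep -i apache         # APACHE_RUN_USER=www-data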

Bindsym does not execute i3wm command

This shortcut does not work in i3wm. It's supposed to show the window list of open apps.
Nothing visible happens when the keyboard shortcut is pressed.
bindsym $mod+space exec bash -c "/home/george/./dmenu-i3-window-jumper.sh"
However the script runs fine from terminal.
The bash code for the script:
https://github.com/minos-org/minos-desktop-tools/blob/master/tools/dmenu-i3-window-jumper
This is a two-sided issue.
First, some small config stuff:
I think you've got an extra dot in there; ./ in that context just represents the folder preceding it (i.e. /home/george).
You can use the $HOME variable as a stand-in for your home folder; i3 will pick it up.
I would argue there is really no need for the bash -c, since your file already has a .sh extension and a #!/bin/sh shebang on its first line; you just need to give it execute permission with chmod +x and it will run on its own.
So, in short, you need to run
chmod +x /home/george/dmenu-i3-window-jumper.sh
so the script can be run without calling bash directly,
and your bindsym could be simplified to
bindsym $mod+space exec "$HOME/dmenu-i3-window-jumper.sh"
And then there is the script stuff:
You see, around line 44 the script checks whether stdin is attached to a terminal; if it isn't, it tries to add the input coming from a pipe or a file to the argument list:
if [ ! -t 0 ]; then
    # add input coming from pipe or file to $#
    set -- "${#}" $(cat)
fi
This seems to be the main problem, since you're not running the command in a terminal and you're not giving it a file either.
Your options are, A: change the if so that it always passes an empty string to the argument array:
if [ ! -t 0 ]; then
    # add input coming from pipe or file to $#
    set -- "${#}" ""
fi
or B: create a dummy file with touch ~/dummy and then pass it to the script in the bindsym:
bindsym $mod+space exec "$HOME/dmenu-i3-window-jumper.sh < $HOME/dummy"
Both seem to work fine on my setup, good luck!
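If you want to check option B from a terminal before reloading i3, you can reproduce the "stdin is not a terminal" situation by redirecting stdin from the dummy file (a sketch, assuming the same paths as above):
touch ~/dummy
"$HOME/dmenu-i3-window-jumper.sh" < "$HOME/dummy"   # stdin is a file, not a terminal, just like under the i3 binding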

shell script: write stderr & stdout to a file

I know this has been asked many times, but I can't find a suitable answer for my case.
I cron'd a backup script using rsync and would like to see all output, errors or not, from all the script's commands. I must put the redirection inside the script itself, and I do not want to see any output in my shell.
I have been trying with no success. Below is part of the script.
#!/bin/bash
.....
BKLOG=/mnt/backup_error_$now.txt
# Log everything to log file
# something like
exec 2>&1 | tee $BKLOG
# OR
exec &> $BKLOG
I have been adding all kinds of exec | tee $BKLOG at the beginning of the script, adding &> and 2>&1 at various parts of the command line, but all failed. I either get an empty log file or an incomplete one. I need to see in the log file what rsync has done, and the error if the script failed before syncing.
Thank you for your help. My shell is zsh, so any solution in zsh is welcome.
To redirect all stdout/stderr to a file, place these lines at the top of your script:
BKLOG=/mnt/backup_error_$now.txt
exec &> "$BKLOG"

Problem with bash script

I'm using this bash script:
for a in `sort -u $HADOOP_HOME/conf/slaves`; do
rsync -e ssh -a "${HADOOP_HOME}/conf" ${a}:"${HADOOP_HOME}"
done
for a in `sort -u $HBASE_HOME/conf/regionservers`; do
rsync -e ssh -a "${HBASE_HOME}/conf" ${a}:"${HBASE_HOME}"
done
When I call this script directly from the shell, there are no problems and it works fine. But when I call it from another script, although it does its job, I get this message at the end:
sort: open failed: /conf/slaves: No such file or directory
sort: open failed: /conf/regionservers: No such file or directory
I have set $HADOOP_HOME and $HBASE_HOME in /etc/profile and the script does its job right, but I don't understand why it prints this message at the end.
Are you sure it's doing it right? When you call this script from the shell, it acts as an interactive shell, which reads and sources /etc/profile and ~/.bash_profile if they exist. When you call it from another script, it runs as non-interactive and won't source those files. If you want a non-interactive shell to source a file, you can do so by setting the BASH_ENV environment variable.
#!/bin/bash
export BASH_ENV=/etc/profile
./call/to/your/HADOOP/script.sh
Everything points to those variables not being defined when your script runs.
You should ensure that they are set for your script. Before the first loop, place the line:
echo "[${HADOOP_HOME}] [${HBASE_HOME}]"
and make sure that doesn't output "[] []" (or even one "[]").
Additionally, put a set -x line at the top of the script: this makes the shell print each line before executing it, so you can see exactly what's being done.
Keep in mind that some shells don't pass on environment variables to subshells unless you explicitly export them (setting them is not enough).
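For example, the definitions need the export keyword so that child processes (like your script) inherit them; the install paths below are placeholders, not from the question:
# in /etc/profile (or in the calling script), with placeholder paths:
export HADOOP_HOME=/usr/local/hadoop
export HBASE_HOME=/usr/local/hbase

# once exported, the sort commands in the script resolve to real paths:
sort -u "$HADOOP_HOME/conf/slaves"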
