I want commands to accept their command-line flags (options) after the command-line arguments, the way GNU getopt reorders them automatically.
macOS uses BSD getopt, which lacks that behaviour. I want to tweak bash so that, when I execute a command, a script first parses the flags and arguments, reorders them, and then executes the reordered command.
In this way, both
ls -lh /tmp
ls /tmp -lh
will work in my Mac's terminal.
You can't safely write a general purpose tool to do the job of reordering arguments on a command line unless you know what the optstring argument to the getopt() function looks like for each command. Consider:
make something -f makefile
cp something -f makefile
In the first command, you have to move both the -f and makefile to the front to canonicalize the command invocation. In the second, you must only move the -f; if you move the following file name too, you rewrite the command and destroy your data.
Of course, you also have to know what the getopt_long() argument strings look like if the command takes long-form --force or --file=makefile style arguments too.
Frankly, you'd do better to use POSIXLY_CORRECT in your environment on Linux and forget about the insidious flexibility it provides, and learn to write your options before your arguments at all times. Your code will work across all Unix-like machines better if you do that.
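If you take that advice, a minimal sketch (assuming bash and a GNU userland) is to export the variable from your shell startup file:
# In ~/.bashrc or ~/.profile: stop GNU getopt from permuting arguments,
# so options must always come before operands, as on BSD/macOS.
export POSIXLY_CORRECT=1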
You could install GNU software in some directory other than /bin and /usr/bin (e.g. /usr/gnu/bin) and then ensure that you place /usr/gnu/bin on your PATH ahead of the system directories. There are pre-built systems like Fink, too. However, that won't help with tools from Apple that don't have analogues from GNU. And there's a 'danger' that shell scripts you write will not be portable to other Macs that don't have the same setup that you do.
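If you do go that route, the PATH adjustment is a one-liner in your startup file; /usr/gnu/bin here is just the example prefix from above:
# In ~/.bashrc: put the GNU tools ahead of the BSD ones that ship with macOS.
export PATH="/usr/gnu/bin:$PATH"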
I have my own scripts directory in my $PATH.
I do that by adding this to my .bashrc:
PATH=$PATH:~/home/user/myownscripts
In that directory I have several scripts, but I can only use one of them: the first one I created. Any script I create after that doesn't work either.
I can only call the first script I created.
The ls command returns the following output:
first_script second_script third_script
And first_script is bold and green
Why? And how can I fix this problem?
The second and third scripts are not executable. Use ls -l (which shows more detail about each file) to inspect their permissions, then run the following command to make them executable:
chmod +x second_script third_script
If you run ls -l again, you should notice that they now have the x bit set in their file permissions.
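For what it's worth, a hypothetical ls -l listing after the chmod (sizes, dates and owner made up) would look roughly like this, with an x in each permission string:
-rwxr-xr-x 1 user user 120 Jan  1 12:00 first_script
-rwxr-xr-x 1 user user  98 Jan  1 12:00 second_script
-rwxr-xr-x 1 user user  77 Jan  1 12:00 third_script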
This is a basic and fundamental aspect of Unix systems, and I'd suggest reading a book or tutorial on shell programming on a Unix-like system.
I've seen in a number of places, including recommendations on this site (What is the preferred Bash shebang?), to use #!/usr/bin/env bash in preference to #!/bin/bash. I've even seen one enterprising individual suggest using #!/bin/bash was wrong and bash functionality would be lost by doing so.
All that said, I use bash in a tightly controlled test environment where every drive in circulation is essentially a clone of a single master drive. I understand the portability argument, though it is not necessarily applicable in my case. Is there any other reason to prefer #!/usr/bin/env bash over the alternatives and, assuming portability were a concern, is there any reason using it could break functionality?
#!/usr/bin/env searches PATH for bash, and bash is not always in /bin, particularly on non-Linux systems. For example, on my OpenBSD system, it's in /usr/local/bin, since it was installed as an optional package.
If you are absolutely sure bash is in /bin and will always be, there's no harm in putting it directly in your shebang—but I'd recommend against it because scripts and programs all have lives beyond what we initially believe they will have.
The standard location of bash is /bin, and I suspect that's true on all systems. However, what if you don't like that version of bash? For example, I want to use bash 4.2, but the bash on my Mac is at 3.2.5.
I could try reinstalling bash in /bin but that may be a bad idea. If I update my OS, it will be overwritten.
However, I could install bash in /usr/local/bin/bash, and setup my PATH to:
PATH="/usr/local/bin:/bin:/usr/bin:$HOME/bin"
Now, if I specify bash, I don't get the old cruddy one at /bin/bash, but the newer, shinier one at /usr/local/bin. Nice!
Except my shell scripts have that #!/bin/bash shebang. Thus, when I run my shell scripts, I get that old and lousy version of bash that doesn't even have associative arrays.
Using /usr/bin/env bash will use the version of bash found in my PATH. If I setup my PATH, so that /usr/local/bin/bash is executed, that's the bash that my scripts will use.
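A quick way to check which bash a PATH lookup would pick (the /usr/local path here is just the example from above):
$ type -a bash
bash is /usr/local/bin/bash
bash is /bin/bash
The first entry listed is the one #!/usr/bin/env bash will run.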
It's rare to see this with bash, but it is a lot more common with Perl and Python:
Certain Unix/Linux releases that focus on stability are sometimes way behind with these two scripting languages. Not long ago, RHEL's Perl was at 5.8.8 -- an eight-year-old version of Perl! If you wanted to use more modern features, you had to install your own version.
Programs like Perlbrew and Pythonbrew allow you to install multiple versions of these languages. They depend upon scripts that manipulate your PATH to get the version you want. Hard coding the path means I can't run my script under brew.
It wasn't that long ago (okay, it was long ago) that Perl and Python were not standard packages included in most Unix systems. That meant you didn't know where these two programs were installed. Was it under /bin? /usr/bin? /opt/bin? Who knows? Using #! /usr/bin/env perl meant I didn't have to know.
And Now Why You Shouldn't Use #! /usr/bin/env bash
When the path is hard-coded in the shebang, I have to run with that interpreter. Thus, #!/bin/bash forces me to use the default installed version of bash. Bash features are very stable (compare that with trying to run a Python 2.x script under Python 3.x), so it's very unlikely that my particular bash script will fail, and since my bash script is probably used on this system and on other systems, a non-standard version of bash may have undesired effects. It is very likely I want to make sure the stable, standard version of bash is used with my shell script. Thus, I probably want to hard-code the path in my shebang.
There are a lot of systems that don't have Bash in /bin, FreeBSD and OpenBSD just to name a few. If your script is meant to be portable to many different Unices, you may want to use #!/usr/bin/env bash instead of #!/bin/bash.
Note that this does not hold true for sh; for Bourne-compliant scripts I exclusively use #!/bin/sh, since I think pretty much every Unix in existence has sh in /bin.
For invoking bash it is a little bit of overkill, unless you have multiple bash binaries, such as your own in ~/bin, but that also means your code depends on $PATH containing the right entries.
It is handy for things like python though. There are wrapper scripts and environments which lead to alternative python binaries being used.
But nothing is lost by using the exact path to the binary as long as you are sure it is the binary you really want.
#!/usr/bin/env bash
is definitely better because it looks up the bash executable via your PATH environment variable.
Go to your Linux shell and type
env
It will print all your environment variables.
Then, back in your shell, type
echo $BASH
It will print the path of the bash you are running, which is the path you should use when building the shebang line for your script.
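For instance, on a typical Linux box the output might look like this (hypothetical but common):
$ echo $BASH
/bin/bash
so the matching shebang would be #!/bin/bash.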
I would prefer wrapping the main program in a script like the one below, which checks all of the bash installations on the system. It gives more control over which version gets used.
#!/usr/bin/env bash
# This script just chooses an appropriate bash installed
# on the system and uses it to execute main_program.
readonly DESIRED_VERSION="5"
declare all_bash_installed_on_this_system
bash="${BASH}"   # default to the bash we are already running
if [ "${BASH_VERSINFO}" -ne "${DESIRED_VERSION}" ]
then
    found=0
    all_bash_installed_on_this_system="$(
        awk -F'/' '$NF == "bash" {print}' /etc/shells
    )"
    for bash in $all_bash_installed_on_this_system
    do
        versinfo="$( "$bash" -c 'echo "${BASH_VERSINFO}"' )"
        [ "${versinfo}" -eq "${DESIRED_VERSION}" ] && { found=1 ; break ; }
    done
    if [ "${found}" -ne 1 ]
    then
        echo "bash ${DESIRED_VERSION} not available"
        exit 1
    fi
fi
"$bash" main_program "$@"
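Hypothetical usage, assuming you save the wrapper as run_main.sh in the same directory as main_program:
chmod +x run_main.sh
./run_main.sh input.txt output.txt   # the arguments are forwarded via "$@"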
Normally, a #!/path/to/command line causes that command to be prepended to the invoking script's path when the script is executed. For example,
# file.sh
#!/usr/bin/bash
echo hi
Running ./file.sh starts a new process, and the script is executed as /usr/bin/bash ./file.sh.
Now
# file.sh
#!/usr/bin/env bash
echo hi
will get executed as /usr/bin/env bash ./file.sh. Quoting from the man page, env is described as:
env - run a program in a modified environment
So env will look up the command bash in the PATH environment variable and run it in an environment that can be modified by passing NAME=VALUE pairs to env.
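A quick illustration of that NAME=VALUE behaviour (the variable name is arbitrary):
$ env GREETING=hello bash -c 'echo "$GREETING"'
hello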
You can test this with other scripts using different interpreters like python, etc.
#!/usr/bin/env python
# python commands
Your question is biased because it assumes that #!/usr/bin/env bash is superior to #!/bin/bash. This assumption is not true, and here's why:
env is useful in two cases:
when there are multiple versions of the interpreter that are incompatible.
For example python 2/3, perl 4/5, or php 5/7
when the location depends on the PATH, for instance with a python virtual environment.
But bash doesn't fall under any of these two cases because:
bash is quite stable, especially on modern systems like Linux and BSD which form the vast majority of bash installations.
there's typically only one version of bash installed under /bin.
This has been the case for the past 20+ years, only very old unices (that nobody uses any longer) had a different location.
Consequently going through the PATH variable via /usr/bin/env is not useful for bash.
Add to this three good reasons to use #!/bin/bash:
for system scripts (when not using sh) for which the PATH variable may not contain /bin.
For example, cron defaults to a very strict PATH of /usr/bin:/bin, which is fine here, but other contexts/environments may not include /bin for some peculiar reason.
when the user has screwed up their PATH, which is very common with beginners.
for security when for example you're calling a suid program that invokes a bash script. You don't want the interpreter to be found via the PATH variable which is entirely under the user's control!
Finally, one could argue that there is one legitimate use case of env to spawn bash: when one needs to pass extra environment variables to the interpreter using #!/usr/bin/env -S VAR=value bash.
But this is not a thing with bash because when you're in control of the shebang, you're also in control of the whole script, so just add VAR=value inside the script instead and avoid the aforementioned problems introduced by env with bash scripts.
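In other words, a minimal sketch of that alternative (VAR and its value are placeholders):
#!/bin/bash
# Instead of `#!/usr/bin/env -S VAR=value bash`, set the variable in the script itself.
export VAR=value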
I have made a few Python scripts, but is there an easier way to run them? I am using Cygwin.
python "C:\Users\Desk\Dropbox\scripts\wsort.py" > data11414_unsorted.txt < data11414_sorted.txt
I want something like this (not typing the path name or "python"):
wsort > data11414_unsorted.txt < data11414_sorted.txt
where wsort is a link to my real wsort.py
Add a shebang to the script:
#!/bin/python
then invoke like this
wsort.py > data11414_unsorted.txt < data11414_sorted.txt
First, your question has a Windows-style path (backslashes, beginning with C:) rather than a Cygwin path (/cygdrive/c/Users/Desk/Dropbox/scripts/wsort.py). That implies you're not actually using Cygwin, or if you are, you're ignoring a bunch of warnings.
The below assumes you're using Cygwin Bash (which should be what you get if you start Cygwin Terminal from the Start Menu) and Cygwin Python (which you've installed using Cygwin's setup.exe, not a Windows Python installer). If you're not, you're making life more difficult for yourself than you need to.
With that out of the way, there are a number of steps you need to take:
First, make the script executable. Use the chmod command for that, from a Cygwin Bash shell:
chmod +x /cygdrive/c/Users/Desk/Dropbox/scripts/wsort.py
Second, tell the system how to execute it. Add the following line to the top of the script:
#!/bin/python
(That's a "shebang". Python sees it as a comment, so doesn't do anything with it, but Cygwin and other Linux-like systems will use that line to see which program to run the script with. In this case, Python.)
Third, make sure your line endings are correct. Cygwin expects Linux line endings and will fail without them. This may not be a problem, but there's no harm in doing this. Run the following command:
dos2unix /cygdrive/c/Users/Desk/Dropbox/scripts/wsort.py
At this point, you'll be able to call the script by specifying the full path to it in Cygwin. You can't yet run it without specifying where the script is explicitly.
The fourth step is making sure the script is "in your path", ie in one of the folders where Cygwin looks for scripts to run. There are lots of ways to do this, but the most sensible is probably to just add your scripts directory to your path. The following command will add your scripts directory to your path whenever you start a new Cygwin session:
echo 'PATH="/cygdrive/c/Users/Desk/Dropbox/scripts:$PATH"' >>~/.bashrc
You will need to restart your Cygwin terminal for that to take effect, however.
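Alternatively, instead of restarting, you can reload the file into your current session; this is plain Bash, nothing Cygwin-specific:
source ~/.bashrc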
At that point, you'll be able to run the script in Cygwin just by typing wsort.py (and thus use it with redirections and so forth as in your question).
Finally, to be able to call it simply as wsort, there's a number of options. The obvious one is just renaming the file. More usefully (and without copying the file or doing anything liable to break with Dropbox syncing things), try creating an alias:
echo 'alias wsort=wsort.py' >>~/.bashrc
Again, you'll need to restart your Cygwin terminal for that to take effect.
Maybe use an alias?
alias wsort="Command_Used"
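A hypothetical concrete version for this case (the path is the Cygwin form of the one in the question):
alias wsort='python /cygdrive/c/Users/Desk/Dropbox/scripts/wsort.py'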
I have a script that I'm running on two computers. The first line of the code looks like this:
#!/usr/bin/gnuplot -p
On the second computer, I want to use a version of GNUplot stored at $HOME/bin/gnuplot.
$HOME/bin is included in the second computer's $PATH variable, so I tried starting the script with a generic call to GNUplot:
#!gnuplot -p
but this didn't work because the script then searched for ./gnuplot.
Is there a way to make the header line the same for both computers? I'd prefer that the solution be somehow contained to that one line, without the need for the script to differentiate between the computers or for me to tell it which computer it is on.
#!/usr/bin/env gnuplot makes it more portable.
#!/usr/bin/env gnuplot is a more portable way of expressing the interpreter, as noted here and here.
However, this introduces a problem you may not expect.
#!/usr/bin/gnuplot -p
makes a persistent GNUplot window, but running
#!/usr/bin/env gnuplot -p
which would seem like the obvious portable alternative, doesn't work.
Some reasons for this are discussed here, but there doesn't really seem to be a way around it: arguments and /usr/bin/env don't really get along.
In the particular case of the -p flag, it is best to use
#!/usr/bin/env gnuplot
and then include the line
set term x11 persist
in your GNUplot script, which will do the same thing.
The obvious solution is
#!/usr/bin/env gnuplot
See this question and my answer for some discussion of the advantages and drawbacks of this method.
But in your case, you need to pass a -p argument to gnuplot -- and the #! line doesn't handle that syntax. You can use
#!/usr/bin/gnuplot -p
but if you try:
#!/usr/bin/env gnuplot -p
it will probably try to execute something named gnuplot -p, which of course doesn't exist (it passes "gnuplot -p" as a single argument to /usr/bin/env).
A solution I've used is to write an installation script that modifies the #! line for each system. Figuring out how to do that is left as an exercise.
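For the curious, a minimal sketch of such an install step (plot-script.gp is just a placeholder name):
#!/bin/sh
# Rewrite the first line of the plot script to point at whatever gnuplot
# this machine has on its PATH, keeping the -p flag.
gnuplot_path="$(command -v gnuplot)" || exit 1
printf '#!%s -p\n' "$gnuplot_path" > plot-script.gp.new
tail -n +2 plot-script.gp >> plot-script.gp.new
mv plot-script.gp.new plot-script.gp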
If your home directory has the same full path name on both computers, you can do this:
#!/home/yourname/bin/gnuplot -p
On the computer where gnuplot lives in $HOME/bin, this does just what you want. On the other one, where you want to use /usr/bin/gnuplot, create $HOME/bin/gnuplot as a symbolic link to /usr/bin/gnuplot:
ln -s /usr/bin/gnuplot ~/bin/gnuplot
If you don't want to mess with your $HOME/bin directory on both machines, or if your $HOME has a different full path name, pick some other directory. For example, you might install it in /tmp/yourname/bin on both machines:
mkdir -p /tmp/yourname/bin
ln -s /usr/bin/gnuplot /tmp/yourname/bin/gnuplot # on one machine, OR
ln -s ~/bin/gnuplot /tmp/yourname/bin/gnuplot # on the other
and use:
#!/tmp/yourname/bin/gnuplot -p
Sorry if the title is bewildering. Let me illustrate with an example.
In Linux, the following two commands work equally well:
ls -lh /tmp
ls /tmp -lh
But on my Mac, the second one doesn't work.
I was wondering whether it's because they use different ls implementations (GNU vs. BSD), or whether there's some difference in the shell.
More importantly, how can I tweak the Mac's terminal so that the second one works, not only for ls but for other commands as well?
You cannot tweak the shell, the difference is in GNU vs BSD tools as you suspected. You might try to compile the GNU toolchain on your Mac, but I'm not really sure it would be a good idea, as the OS might rely on these tools and incompatible changes might have unpredictable consequences. You could try compiling them to a /usr/local/ prefix and only use them in your own shell, but still — proceed with extreme caution.
It's the difference in versions of ls. The shell doesn't know the difference between dash options and other arguments; all are passed to the command, which determines their meaning.
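You can see this on the Mac: once BSD ls hits the first non-option argument (/tmp), it stops option parsing and treats -lh as a file name, so you get something roughly like this (followed by a plain listing of /tmp; the exact wording varies by macOS version):
$ ls /tmp -lh
ls: -lh: No such file or directory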