ZSH arguments after file name

I switched from bash to zsh and I was wondering if there is a way to put arguments after the file name, like in bash.
Example:
cp dir1 dir2 -r
Thank you

This depends only on the command, not on the shell. The shell passes the arguments in the order they're given and gives no special treatment to arguments beginning with a -.

zsh has some expanded/different features in the area of globbing and tab completion (two of the primary reasons folks switch to zsh). Both of these provide interesting ways to add command-line parameters. Is that what you are asking about?
Note also that most commands are not affected by the shell you choose: ls, awk, grep, vim, etc. Obviously, things like alias and function, which are shell constructs, are potentially different.
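For example (a rough sketch; the behaviour below depends on the cp implementation, not on bash or zsh):

# GNU coreutils reorder options and operands, so this works the same in bash and zsh:
cp dir1 dir2 -r

# With POSIXLY_CORRECT set, or with many BSD/macOS utilities, option parsing stops
# at the first operand, so -r is taken as a literal operand and the command fails:
POSIXLY_CORRECT=1 cp dir1 dir2 -r

# The portable form puts the options first and works everywhere, in any shell:
cp -r dir1 dir2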

Related

Read file and run command in zsh

So I generally create job files with a list of commands in them. Then I execute a job file like so:
cat jobFile | while read a; do $a; done
Which always works in bash. However, I've just started working on a Mac, which apparently uses zsh, and this command fails with "no such file" etc. I've tested the job file by running a few lines from it manually, so it should be fine.
I've found questions on zsh read, but they tend to be about reading in from variables, e.g. $a=('a' 'b' 'c') or echo $a.
Thank you for your answers!
In bash, unquoted parameter expansions always undergo word-splitting, so if a="foo bar", then $a expands to two words, foo and bar. As a command, this means running the command foo with an argument bar.
In zsh, parameter expansions do not undergo word-splitting by default, which means the same expansion $a would produce a single word, foo bar, treated as the name of the command to execute.
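A minimal illustration of the difference, assuming a job line like echo hello:

a='echo hello'
# bash word-splits the unquoted expansion: it runs `echo` with the argument `hello`
bash -c 'a="echo hello"; $a'      # prints: hello
# zsh keeps the expansion as one word and looks for a command literally named "echo hello"
zsh -c 'a="echo hello"; $a'       # error: command not found: echo hello
# zsh can be told to split explicitly with the = expansion flag
zsh -c 'a="echo hello"; ${=a}'    # prints: hello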
In either case, relying on parameter expansion to "parse" a shell command is fragile; in addition to word-splitting, the expansion is subject to pathname expansion (globbing), and you are limited to simple commands and their arguments. No pipes, lists (&&, ||), or redirections allowed, as everything will be treated as the command name and a sequence of arguments.
What you want in both shells is to simply treat your job file as a shell script, which can be executed in the current shell using the . command:
. jobFile
Why are you executing it in such a cumbersome way? Assuming jobFile is a file holding a sequence of bash commands, you can simply run it as
bash jobFile
If it contains a sequence of zsh commands, you can likewise run it as
zsh jobFile
If you follow this approach, however, I would reflect in the name of the job file which shell it is intended for, e.g.
bash jobFile.bash
zsh jobFile.zsh
and, if you write a job file so that it is supposed to be compatible with either shell, I would name it jobFile.sh.

prevent script injection when spawning command line with input arguments from external source

I've got a Python script that wraps a bash command-line tool, which gets its variables from an external source (environment variables). Is there any way to perform some sort of escaping to prevent a malicious user from executing bad code in one of those parameters?
for example if the script looks like this
/bin/sh
/usr/bin/tool ${VAR1} ${VAR2}
and someone set VAR2 as follows
export VAR2=123 && \rm -rf /
so the script may not treat VAR2 as pure input and may instead execute the rm command.
Is there any way to make the variable non-executable and take the string as-is to the command line tool as input ?
The correct and safe way to pass the values of variables VAR1 and VAR2 as arguments to /usr/bin/tool is:
/usr/bin/tool -- "$VAR1" "$VAR2"
The quotes prevent any special treatment of separator or pattern matching characters in the strings.
The -- should prevent the variable values being treated as options if they begin with - characters. You might have to do something else if tool is badly written and doesn't accept -- to terminate command line options.
See Quotes - Greg's Wiki for excellent information about quoting in shell programming.
Shellcheck can detect many cases where quotes are missing. It's available as either an online tool or an installable program. Always use it if you want to eliminate many common bugs from your shell code.
The curly braces in the line of code in the question are completely redundant, as they usually are. Some people mistakenly think that they act as quotes. To understand their use, see When do we need curly braces around shell variables?.
I'm guessing that the /bin/sh in the question was intended to be a #! /bin/sh shebang. Since the question was tagged bash, note that #! /bin/sh should not be used with code that includes Bashisms. /bin/sh may not be Bash, and even if it is Bash it behaves differently when invoked as /bin/sh rather than /bin/bash.
Note that even if you forget the quotes the line of code in the question will not cause commands (like rm -rf /) embedded in the variable values to be run at that point. The danger is that badly-written code that uses the variables will create and run commands that include the variable values in unsafe ways. See should I avoid bash -c, sh -c, and other shells' equivalents in my shell scripts? for an explanation of (only) some of the dangers.
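A quick way to see the effect of the quotes, using printf as a harmless stand-in for the tool:

VAR2='123 && rm -rf /'
# Quoted: the whole value arrives as a single argument; nothing in it is executed.
printf '[%s]\n' "$VAR2"    # prints: [123 && rm -rf /]
# Unquoted (in bash): the value is split into separate words. Still nothing is executed
# here, but this is how badly written downstream code ends up building dangerous commands.
printf '[%s]\n' $VAR2      # prints: [123] [&&] [rm] [-rf] [/] on separate lines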
To best avoid injections, consider switching to [T]csh.
Unlike Bourne shells, the C shell is "limited", thus pushing one to take different, safer paths when writing scripts. The "limitations" imposed by the C shell make it one of the most reliable shells to work with.
(E.g.: nesting is minimal to impossible, thus preventing injections at all costs; there are better ways to achieve what one wants.)

Create shell alias with semi-colon character

I've noticed that I have a tendency to mistype ls as ;s, so I decided that I should just create an alias so that instead of throwing an error, it just runs the command I mean.
However, knowing that the semi-colon character has a meaning in shell scripts/commands, is there any way to create an alias using the semi-colon character? I've tried the following to no avail:
alias ;s=ls
alias ";s"=ls
alias \;=ls
Is it possible to use the semi-colon as a character in a shell alias? And how do I do so in ZSH?
First and foremost: Consider solving your problem differently - fighting the shell grammar this way is asking for trouble.
As far as I can tell, while you can define such a command - albeit not with an alias - the only way to call it is quoted, e.g. as \;s - which defeats the purpose; read on for technical details.
An alias won't work: while zsh allows you to define it (which, arguably, it shouldn't), the very mechanism that would be required to call it - quoting - is also the very mechanism that bypasses aliases and thus prevents invocation.
You can, however, define a function (zsh only) or a script in your $PATH (works in zsh as well as in bash, ksh, and dash), as long as you invoke it quoted (e.g., as \;s or ';s' or ";s"), which defeats the purpose.
For the record, here are the command definitions, but, again, they can only be invoked quoted.
Function (works in zsh only; place in an initialization file such as ~/.zshrc):
';s'() { ls "$@"; }
Executable script ;s (works in dash, bash, ksh and zsh; place in a directory in your $PATH):
#!/bin/sh
ls "$@"
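For completeness, this is roughly how the invocation behaves; only the quoted forms reach the function or script:

\;s -l      # runs the ';s' function or script with -l
';s' -l     # same
;s -l       # still rejected: the parser reports a parse/syntax error near ';'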

CMake's execute_process and arbitrary shell scripts

CMake's execute_process command seems to only let you, well, execute a process - not an arbitrary line you could feed a command shell. The thing is, I want to use pipes, file descriptor redirection, etc. - and that does not seem to be possible. The alternative would be very painful for me (I think)...
What should I do?
PS - CMake 2.8 and 3.x answer(s) are interesting.
You can execute any shell script, using your shell's support for taking in a script within a string argument.
Example:
execute_process(
  COMMAND bash "-c" "echo -n hello | sed 's/hello/world/;'"
  OUTPUT_VARIABLE FOO
)
will result in FOO containing world.
Of course, you would need to escape quotes and backslashes with care. Also remember that running bash would only work on platforms which have bash - i.e. it won't work on Windows.
execute_process command seems to only let you, well, execute a process - not an arbitrary line you could feed a command shell.
Yes, exactly this is written in the documentation for that command:
All arguments are passed VERBATIM to the child process. No intermediate shell is used, so shell operators such as > are treated as normal arguments.
I want to use pipes
Different COMMANDs within the same execute_process invocation are actually piped:
Runs the given sequence of one or more commands with the standard output of each process piped to the standard input of the next.
file descriptor redirection, etc. - and that does not seem to be possible.
For complex things, just prepare a separate shell script and run it using execute_process. You can pass variables from CMake to this script using its parameters, or with a preliminary configure_file.
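As a sketch of that approach (the script name and its arguments are made up for illustration), the shell-specific plumbing lives in a small script that execute_process simply runs:

#!/bin/bash
# run_pipeline.sh (hypothetical helper). CMake would invoke it with something like
#   execute_process(COMMAND bash ${CMAKE_CURRENT_SOURCE_DIR}/run_pipeline.sh in.txt out.txt)
# Pipes and redirections live here, where the shell handles them natively.
set -eu
input="$1"
output="$2"
grep -v '^#' "$input" | sort | uniq -c > "$output"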
I needed to pipe two commands one after the other and actually learned that each COMMAND of the execute_process is piped already. So at least that much is resolved by simply adding commands one after the other:
execute_process(
  COMMAND echo "Hello"
  # CMake passes single quotes literally, so quote the sed expression with double quotes
  COMMAND sed -e "s/H/h/"
  OUTPUT_VARIABLE GREETINGS
  OUTPUT_STRIP_TRAILING_WHITESPACE)
Now the variable GREETINGS is set to hello.
If you indeed need a lot of file redirection (as you stated), you probably want to write an external script and then execute that script from CMakeLists.txt. It's really difficult to get all the escaping right in CMake.
If you can simplify your scripts to one command generating a file, then another handling that file, etc. then you can always use the INPUT_FILE and OUTPUT_FILE options. Or pass a filename to your command for the input.
It's often much cleaner to handle one file at a time. Although I understand that some commands may need multiple sources and destinations.

'which' command is incorrect

I have a shell script in my home directory called "echo". I added my home directory to my path, so that this echo would replace the other one.
To do this, I used: export PATH=/home/me:$PATH
When I do which echo, it shows the one I want. /home/me/echo
But when I actually do something like echo asdf it uses the system echo.
Am I doing something wrong?
which is an external command, so it doesn't have access to your current shell's built-in commands, functions, or aliases. In fact, at least on my system, /usr/bin/which is a shell script, so you can examine it and see how it works.
If you want to know how your shell will interpret a command, use type rather than which. If you're using bash, type -a will print all possible meanings in order of precedence. Consult your shell's documentation for details.
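For example (the paths are illustrative, assuming /home/me comes first in PATH):

type -a echo       # bash; zsh's type -a behaves similarly
# echo is a shell builtin
# echo is /home/me/echo
# echo is /bin/echo
which echo
# /home/me/echo    (which only searches PATH, so it never sees the builtin)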
For most shells, built-in commands take precedence over commands in your $PATH. The whole point of having a built-in echo, for example, is that it's faster than loading /bin/echo into memory.
If you want your own echo command to override the shell's built-in echo, you can define it as a shell function.
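A minimal sketch of such a function (placed in ~/.bashrc or ~/.zshrc; the path is the one from the question):

# A function shadows the builtin of the same name and forwards all arguments.
echo() { /home/me/echo "$@"; }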
On the other hand, overriding the built-in echo command doesn't strike me as a good idea in the first place. If it behaves the same as the built-in echo, there's not much point. If it doesn't, then it could break scripts that use echo expecting it to work a certain way. If possible, I suggest giving your command a different name. If it's an enhanced version of echo, you could even call it Echo.
It is likely using the shell's builtin.
If you want the one in your PATH, you can do
`which echo` asdf
From this little article that explains the rules, here's a list in descending order of precedence:
Aliases
Shell functions
Shell builtin commands
Hash tables
PATH variable
echo is a shell builtin command (at least in bash) and PATH has the lowest priority. I guess you'll need to create a function or an alias.
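For instance (a sketch; note that non-interactive bash scripts do not expand aliases by default, so a function is usually the more robust choice):

alias echo='/home/me/echo'     # aliases sit at the top of the precedence list above
echo asdf                      # now runs /home/me/echo instead of the builtin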
