I can't force KSH to pass \" as a parameter to command.
What I want to be executed LITERALLY is:
command \"foo bar\"
The examples below are executed in ksh with set -x enabled beforehand, so that each command is echoed in the exact form in which it will be executed (after it is parsed by ksh):
[oracle@localhost sf_vm]$ ksh
$ set -x
What I tried and got is:
$ command \"foo bar\"
+ command '"foo' 'bar"'
not what I need: missing \ and additional ' and "
$ command \\\"foo bar\\\"
+ command '\"foo' 'bar\"'
no luck again: additional ' outside and between foo and bar
$ command "\\\"foo bar\\\""
+ command '\"foo bar\"'
almost right but still unwanted ' on the outside
$ command '\"foo bar\"'
+ command '\"foo bar\"'
again almost right but still ' on the outside
The actual use case is that I'm passing the USERID parameter to the Oracle expdp command in the format
expdp USERID=/ AS SYSDBA, which needs to be enclosed in escaped double quotes like so:
expdp USERID=\"/ AS SYSDBA\"
I know I can work around this by using a parameter file (PARFILE) for expdp, but it's very inconvenient in my case (automated export).
I'm using ksh93, if that makes any difference:
[oracle@localhost sf_vm]$ ll /etc/alternatives/ksh
lrwxrwxrwx. 1 root root 10 Oct 15 2018 /etc/alternatives/ksh -> /bin/ksh93
Let's start from your use case.
The following command, which you assert works
expdp USERID=\"/ AS SYSDBA\"
evaluates to a rather unintuitive argument list, as you can see below (note that this particular code only works for argument lists that don't contain literal newlines anywhere):
$ printf '%s\n' expdp USERID=\"/ AS SYSDBA\" | jq -Rnc '[inputs]'
["expdp","USERID=\"/","AS","SYSDBA\""]
...which is to say, you're passing expdp three arguments beyond the 0th/first one: USERID="/ is the first argument, AS is the second argument, and SYSDBA" is the third argument. The shell is correct when its set -x implementation prints this argument list as expdp 'USERID="/' AS 'SYSDBA"', as that argument list evaluates identically to the one above.
Frankly, I doubt very much that this is actually correct/intended usage for any tool. It would make more sense if your backslashes weren't escaped, and your intended usage were instead expdp USERID="/ AS SYSDBA", which would evaluate to (in JSON syntax) ["expdp", "USERID=/ AS SYSDBA"].
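The same printf/jq trick confirms this (again assuming no literal newlines anywhere in the arguments):
$ printf '%s\n' expdp USERID="/ AS SYSDBA" | jq -Rnc '[inputs]'
["expdp","USERID=/ AS SYSDBA"]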
OK, thanks to your comments about set -x, which I had interpreted wrongly, I actually got it working with command \"foo bar\": despite what the set -x output would suggest, it correctly passes "foo bar", quotes included, as a command parameter.
As it turned out, my actual issue was the syntax in the parameter value itself.
For future reference, the following expdp command syntax is correct in ksh when the value being passed contains spaces (in this case, a userid with AS SYSDBA):
expdp USERID=\"sys/password#SID as sysdba\" DIRECTORY=EXPDP DUMPFILE=HR.COUNTRIES.dmp TABLES=HR.COUNTRIES ...
Related
I have the following curl command in a file.
curl \
--request POST \
https://oauth2.googleapis.com/tokeninfo?id_token=eyJhbGci
How do I execute the command in bash/zsh shell within vim?
I tried to do :!<C-r>" (Ctrl + r then ") but it says
zsh:1: no match found: https://oauth2.googleapis.com/tokeninfo?id_token=eyJhbGci
shell returned 1
? is a glob character; zsh is looking for a file named https://oauth2.googleapis.com/tokeninfo + a single character + id_token=eyJhbGci and reporting that there are no matches.
Escape or quote it, any one of
https://oauth2.googleapis.com/tokeninfo\?id_token=eyJhbGci
'https://oauth2.googleapis.com/tokeninfo?id_token=eyJhbGci'
"https://oauth2.googleapis.com/tokeninfo?id_token=eyJhbGci"
or use setopt +o nomatch to make zsh behave like the default in other shells (if no matches, continue with the argument untouched).
Preferably just quote it.
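For illustration, here is the same failure and fix in an interactive zsh with default options:
$ print https://oauth2.googleapis.com/tokeninfo?id_token=eyJhbGci
zsh: no match found: https://oauth2.googleapis.com/tokeninfo?id_token=eyJhbGci
$ print 'https://oauth2.googleapis.com/tokeninfo?id_token=eyJhbGci'
https://oauth2.googleapis.com/tokeninfo?id_token=eyJhbGci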
Alternatively, you can use :help :w_c to pass arbitrary lines to an external command.
Select three lines and execute via sh:
vjj
:'<,'>w !sh
Execute current line:
:.w !sh
Execute whole buffer:
:%w !sh
Execute given range:
:12,34w !sh
See :help :range.
Yesterday I asked a similar question about escaping double quotes in env variables, but it didn't solve my problem (probably because I didn't explain it well enough), so I would like to be more specific.
I'm trying to run a script (which I know is written in Perl), although I have to use it as a black box because of a permissions issue (so I don't know how the script works). Let's call this script script_A.
I'm trying to run a basic command in Shell: script_A -arg "date time".
If I run it from the command line, it works fine, but if I try to use it from a bash script or Perl script (for example, using the system operator), it will take only the first part of the string which was sent as an argument. In other words, it will fail with the following error: '"date' is not valid.
Example to specify a little bit more:
If I run from the command line (works fine):
> script_A -arg "date time"
If I run from (for example) a Perl script (fails):
my $args = $ENV{SOME_ENV}; # Assume that SOME_ENV has '-arg "date time"'
my $cmd = "script_A $args";
system($cmd");
I think that the problem comes from the environment variable, but I can't use the one quote while defining the env variable. For example, I can't use the following method:
setenv SOME_ENV '-arg "date time"'
Because it fails with the following error: '"date' is not valid.
Also, I tried to use the following method:
setenv SOME_ENV "-arg '"'date time'"'"
But now the env variable will contain:
echo $SOME_ENV
> -arg 'date time' # should be -arg "date time"
Another note: using \" fails in the shell (I tried it).
Any suggestions on how to locate the reason for the error and how to solve it?
The $args, obtained from %ENV as you show, is a string.
The problem is in what happens to that string as it is manipulated before arguments are passed to the program, which needs to receive the strings -arg and date time.
If the program is executed in a way that bypasses the shell, as in your example, then the whole -arg "date time" is passed to it as its first argument. This is clearly wrong, as the program expects -arg and then another string for its value (date time).
If the program were executed via the shell, which happens when there are shell metacharacters in the command line (not the case in your example), then the shell would break the string into words, except for the quoted part; this is how it works from the command line. That can be enforced with
system('/bin/tcsh', '-c', $cmd);
This is the most straightforward fix, but I can't honestly recommend involving the shell just for argument parsing. Also, you are then in the game of layered quoting and escaping, which can get rather involved and tricky. For one, if things aren't right, the shell may end up breaking the command into the words -arg, "date, and time"
How you set the environment variable works
> setenv SOME_ENV '-arg "date time"'
> perl -wE'say $ENV{SOME_ENV}' #--> -arg "date time" (so it works)
which, I believe, has always worked this way in [t]csh.
Then, in a Perl script: parse this string into the -arg and date time strings, and have the program executed in a way that bypasses the shell (if the shell isn't needed by the command):
my @args = $ENV{SOME_ENV} =~ /(\S+)\s+"([^"]+)"/; #"
my @cmd = ('script_A', @args);
system(@cmd) == 0 or die "Error with system(@cmd): $?";
This assumes that SOME_ENV's first word is always the option's name (-arg) and that all the rest is always the option's value, under quotes. The regex extracts the first word, as consecutive non-space characters, and after spaces everything in quotes.† These are the program's arguments.
In the system LIST form, the program that is the first element of the list is executed without using a shell, and the remaining elements are passed to it as arguments. Please see system for more on this, and also for the basics of how to investigate failure by looking into the $? variable.
It is in principle advisable to run external commands without the shell. However, if your command needs the shell, then make sure that the string is escaped just right to preserve the quotes.
Note that there are modules that make it easier to use external commands. A few, from simple to complex: IPC::System::Simple, Capture::Tiny, IPC::Run3, and IPC::Run.
I must note that that's an awkward environment variable; is there any way to organize things otherwise?
† To make this work for non-quoted arguments as well (-arg date) make the quote optional
my @args = $ENV{SOME_ENV} =~ /(\S+)\s+"?([^"]+)/;
where I now left out the closing (unnecessary) quote for simplicity
I am trying to edit my .bashrc file with a custom function to launch xwin. I want it to be able to open in multiple windows, so I decided to make a function that accepts 1 parameter: the display number. Here is my code:
function test(){
a=$(($1-0))
"xinit -- :$a -multiwindow -clipboard &"
}
The reason I created the variable "a" to hold the input is that I suspected the input was being read in as a string and not a number. I was hoping that subtracting 0 from the input would convert the string into an integer, but I'm not actually sure whether it does. Now, when I call
test 0
I am given the error
-bash: xinit -- :0 -multiwindow -clipboard &: command not found
How can I fix this? Thanks!
Because the entire quoted command is acting as the command itself:
$ "ls"
a b c
$ "ls -1"
-bash: ls -1: command not found
Get rid of the double quotation marks surrounding your xinit:
xinit -- ":$a" -multiwindow -clipboard &
In addition to the double-quotes bishop pointed out, there are several other problems with this function:
test is a standard, and very important, command. Do not redefine it! If you do, you risk having some script (or sourced file, or whatever) run:
if test $num -eq 5; then ...
Which will fire off xinit on some random window number, then continue the script as if $num was equal to 5 (whether or not it actually is). This way lies madness.
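A minimal sketch of the hazard, with an echo standing in for the real xinit call:
test() { echo "would launch xinit on display :$1"; }
num=7
if test "$num" -eq 5; then echo "num is 5"; fi
# would launch xinit on display :7
# num is 5
The if no longer compares anything: it calls the redefined function, and since the function's exit status is success, the then-branch runs even though num is 7.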
As chepner pointed out in a comment, bash doesn't really have an integer type. To it, an integer is just a string that happens to contain only digits (and maybe a "-" at the front), so converting to integer is a no-op. But what you might want to do is check whether the parameter got left off. You can either check whether $1 is empty (e.g. if [[ -z "$1" ]]; then echo "Usage: ..." >&2 etc), or supply a default value with e.g. ${1:-0} (in this case, "0" is used as the default).
Finally, don't use the function keyword. bash tolerates it, but it's nonstandard and doesn't do anything useful.
So, here's what I get as the cleaned-up version of the function:
launchxwin() {
xinit -- ":${1:-0}" -multiwindow -clipboard &
}
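Usage would then look like this (the display numbers are just examples):
launchxwin       # uses the default display :0
launchxwin 1     # uses display :1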
That happens because bash interprets everything inside quotes as a string. A command is an array of strings, of which the first element is a binary file or an internal shell command. Subsequent strings in the array are taken as arguments.
When you type:
"xinit -- :$a -multiwindow -clipboard &"
the shell thinks that everything you wrote is the name of a single command. Depending on the command/program you run, the rest of the arguments may legitimately be a single string, but mostly you use quotes only when passing an argument that has spaces inside, like:
mkdir "My Documents"
That creates a single directory named My Documents. Also, you could escape the spaces like this:
mkdir My\ Documents
But remember, "$" is a special character like "\". It gets interpreted by the shell as a variable. "$a" will be substituted by its value before executing. If you use a simple quote ('$a') it will not be interpreted by the shell.
Also, "&" is a special character that executes the command in background. You should probably pass it outside the quotes also.
Consider this little shell script.
# Save the first command line argument
cmd="$1"
# Execute the command specified in the first command line argument
out=$($cmd)
# Do something with the output of the specified command
# Here we do a silly thing, like make the output all uppercase
echo "$out" | tr -s "a-z" "A-Z"
The script executes the command specified as the first argument, transforms the output obtained from that command and prints it to standard output. This script may be executed in this manner.
sh foo.sh "echo select * from table"
This does not do what I want. It may print something like the following,
$ sh foo.sh "echo select * from table"
SELECT FILEA FILEB FILEC FROM TABLE
if fileA, fileB, and fileC are present in the current directory.
From a user perspective, this command is reasonable. The user has quoted the * in the command line argument, so the user doesn't expect the * to be globbed. But my script astonishes the user by using this argument in a command substitution which causes globbing of * as seen in the above output.
I want the output to be the following instead.
SELECT * FROM TABLE
The entire text in cmd actually comes from command line arguments to the script so I would like to preserve any * symbol present in the argument without globbing them.
I am looking for a solution that works for any POSIX shell.
One solution I have come up with is to disable globbing with set -o noglob just before the command substitution. Here is the complete code.
# Save the first command line argument
cmd="$1"
# Execute the command specified in the first command line argument
set -o noglob
out=$($cmd)
# Do something with the output of the specified command
# Here we do a silly thing, like make the output all uppercase
echo "$out" | tr -s "a-z" "A-Z"
This does what I expect.
$ sh foo.sh "echo select * from table"
SELECT * FROM TABLE
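One detail worth noting: set -o noglob stays in effect for the rest of the script, so if later code relies on globbing it has to be re-enabled after the substitution (POSIX also spells this option set -f / set +f):
set -o noglob   # or: set -f
out=$($cmd)
set +o noglob   # re-enable globbing; or: set +f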
Apart from this, is there any other concept or trick (such as a quoting mechanism) I need to be aware of to disable globbing only within a command substitution, without having to use set -o noglob?
I am not against set -o noglob. I just want to know if there is another way. You know, globbing can be disabled for normal command line arguments just by quoting them, so I was wondering if there is anything similar for command substitution.
If I understand correctly, you want the user to provide a shell command as a command-line argument, which will be executed by the script, and is expected to produce an SQL string, which will be processed (upper-cased) and echoed to stdout.
The first thing to say is that there is no point in having the user provide a shell command that the script just blindly executes. If the script applied some kind of modification/preprocessing of the command before it executed it then perhaps it could make sense, but if not, then the user might as well execute the command himself and pass the output to the script as a command-line argument, or via stdin.
But that being said, if you really want to do it this way, then there are two things that need to be said. Firstly, this is the proper form to use:
out=$(eval "$cmd");
A fairly advanced understanding of the shell grammar and expansion rules would be required to fully understand the rationale for using the above syntax, but basically executing $cmd and executing eval "$cmd" have subtle differences that render the $cmd form inappropriate for executing a given shell command string.
Just to give some detail that will hopefully clarify the above point, there are seven kinds of expansion that are performed by the shell in the following order when processing input: (1) brace expansion, (2) tilde expansion, (3) parameter and variable expansion, (4) arithmetic expansion, (5) command substitution, (6) word splitting, and (7) pathname expansion. Notice that variable expansion happens somewhat in the middle of that sequence, and thus the variable-expanded shell command (which was provided by the user) will not receive the benefit of the prior expansion types. Other issues are that leading variable assignments, pipelines, and command list tokens will not be executed correctly under the $cmd form, because they are parsed and processed prior to variable expansion (actually prior to all expansions) as well.
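A small demonstration of that last point; the command string below, with a leading assignment and a ; list, is made up for illustration:
cmd='x=1; echo "$x"'
$cmd          # after word splitting the shell runs a command literally named 'x=1;': command not found
eval "$cmd"   # parsed as fresh shell input: assigns x=1, then prints 1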
By running the command through eval, properly double-quoted, you ensure that the full shell parsing/processing/execution algorithm will be applied to the shell command string that was given by the user of your script.
The second thing to say is this: If you try the above proper form in your script, you will find that it has not solved your problem. You will still get SELECT FILEA FILEB FILEC FROM TABLE as output.
The reason is this: Since you've decided you want to accept an arbitrary shell command from the user of your script, it is now the user's responsibility to properly quote all metacharacters that may be embedded in that piece of code. It does not make sense for you to accept a shell command as a command-line argument, but somehow change the processing rules for shell commands so that certain metacharacters will no longer be metacharacters when the given shell command is executed. Actually, you could do something like that, perhaps using set -o noglob as you discovered, but then that must become a contract between the script and the user of the script; the user must be made aware of exactly what the precise processing rules will be when the command is executed so that he can properly use the script.
Under this design, the user could call the script as follows (notice the extra layer of quoting for the shell command string evaluation; could alternatively backslash-escape just the asterisk):
$ sh foo.sh "echo 'select * from table'";
I'd like to return to my earlier comment about the overall design; it doesn't really make sense to do it this way. It makes more sense to take the text-to-process itself, not a shell command that is expected to produce the text-to-process.
Here is how that could be done:
## take the text-to-process via a command-line argument
sql="$1";
## process and echo it
echo "$sql"| tr a-z A-Z;
(I also removed the -s option of tr, which really doesn't make sense here.)
Notice that the script is simpler now, and usage is also simpler:
$ sh foo.sh 'select * from table';
Can someone explain this to me, please?
$ set -x
$ export X="--vendor Bleep\ Bloop"; echo $X
+ export 'X=--vendor Bleep\ Bloop'
+ X='--vendor Bleep\ Bloop'
+ echo --vendor 'Bleep\' Bloop
--vendor Bleep\ Bloop
$
Specifically, why does the echo line insert ' characters that I didn't ask for, and why does it leave the string looking unterminated?
Understanding Shell Expansions
Bash performs shell expansions in a set order. The -x flag allows you to see the intermediate results of the steps that Bash takes as it tokenizes and expands the words that compose the input line.
In other words, the output is operating as designed. Unless you're trying to debug tokenization, word-splitting, or expansion, the intermediate results shouldn't really matter to you.
(Good question)
the ' chars aren't really there.
I would describe what you see as the -x feature's attempt to disambiguate how it is keeping your string intact. The + sign at the front of the separate line with echo in it shows you that this is shell debug/trace output.
Note that the final output is exactly like your assignment, i.e. X=...
IHTH
Your confusion seems to arise mostly from this: + echo --vendor 'Bleep\' Bloop. It is printed like that because Bash is showing what the expansion of X looks like. In other words, $X expands to the independent "words" --vendor, Bleep\, and Bloop on the command line. Bleep\ is a single word, and to show that the \ is a literal character rather than an escape for the following space, the trace output preserves the \ by quoting the word as 'Bleep\'. If these are meant to be parameters to a different command, I would suggest doing either:
export X='--vendor "Bleep Bloop"'
or
export X="--vendor \"Bleep Bloop\""
but I'm not 100% sure either works. If you want to store parameters to a command you could do:
# optional:
# declare -a ARGS
ARGS=('--vendor' '"Bleep Bloop"')
And then use them as:
echo "${ARGS[@]}"
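When the array is expanded with quotes, each element stays a separate argument. Here is a sketch with a hypothetical printargs helper to make the argument boundaries visible (note that this version stores the value without the literal inner quotes, assuming the goal is the two arguments --vendor and Bleep Bloop):
printargs() { printf '<%s>\n' "$@"; }   # hypothetical helper: prints each argument on its own line
ARGS=(--vendor "Bleep Bloop")
printargs "${ARGS[@]}"
# <--vendor>
# <Bleep Bloop>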
This code
echo --vendor 'Bleep\' Bloop
produces the exact same output as
echo "--vendor Bleep\ Bloop"
Bash is only reinterpreting your code into its own form via the debug/trace option.
The reasons for this are probably historical and not worth worrying about.
When you set -x you are telling Bash to print its interpretation of every command you put in.
So when you put in
export X="--vendor Bleep\ Bloop"
Bash sees it as
export 'X=--vendor Bleep\ Bloop'
and prints as such.