Calling a shell command from AppleScript with quotes

This seems like it should be simple, but I'm pulling out my remaining hair trying to get it to work. In a shell script I want to run some AppleScript code that defines a string, then pass that string (containing a single quote) to a shell command that calls PHP's addslashes function, to return a string with that single quote escaped properly.
Here's the code I have so far - it's returning a syntax error.
STRING=$(osascript -- - <<'EOF'
set s to "It's me"
return "['test'=>'" & (do shell script "php -r 'echo addslashes(\"" & s & "\");") & "']"
EOF)
echo -e $STRING
It's supposed to return this:
['test'=>'It\'s me']

First, when asking a question like this, please include what's happening, not just what you're trying to do. When I try this, I get:
42:99: execution error: sh: -c: line 0: unexpected EOF while looking for matchin
sh: -c: line 1: syntax error: unexpected end of file (2)
(which is actually two error messages, with one partly overwriting the other.) Is that what you're getting?
If it is, the problem is that the inner shell command you're creating has quoting issues. Take a look at the AppleScript snippet that tries to run a shell command:
do shell script "php -r 'echo addslashes(\"" & s & "\");"
Since s is set to It's me, this runs the shell command:
php -r 'echo addslashes("It's me");
Which has the problem that the apostrophe in It's me is acting as a close-quote for the string that starts 'echo .... After that, the double-quote in me"); is seen as opening a new quoted string, which doesn't get closed before the end of the "file", causing the unexpected EOF problem.
The underlying problem is that you're trying to pass a string from AppleScript to shell to php... but each of those has its own rules for parsing strings (with different ideas about how quoting and escaping work). Worse, it looks like you're doing this so you can get an escaped string (following which set of escaping rules?) to pass to something else... This way lies madness.
I'm not sure what the real goal is here, but there has to be a better way; something that doesn't involve a game of telephone with players that all speak different languages. If not, you're pretty much doomed.
BTW, there are a few other dubious shell-scripting practices in the script:
Don't use all-caps variable names in shell scripts. There are a bunch of all-caps variables that have special meanings, and if you accidentally use one of those for something else, weird results can happen.
Put double-quotes around all variable references in scripts, to avoid them getting split into multiple "words" and/or expanded as shell wildcards. For example, if the variable string was set to "['test'=>'It\'s-me']", and you happened to have files named "t" and "m" in the current directory, echo -e $string will print "m t" because those are the files that match the [] pattern.
Don't use echo with options and/or to print strings that might contain escapes, since different versions treat these things differently. Some versions, for example, will print the "-e" as part of the output string. Use printf instead. The first argument to printf is a format string that tells it how to format all of the rest of the arguments. To emulate echo -e "$string" in a more reliable form, use printf '%b\n' "$string".
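For instance, here's a quick, self-contained demonstration of the last two points (the scratch directory, filenames, and string are only illustrative, not from the original script):
cd "$(mktemp -d)" || exit
string="['test'=>'It\\'s-me']"   # value contains [...] -- a valid glob pattern
touch t m                        # two files whose names match that pattern
echo -e $string                  # unquoted: globbing kicks in, likely prints "m t"
printf '%b\n' "$string"          # quoted + printf: prints ['test'=>'It\'s-me']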

To complement Gordon Davisson's helpful answer with a pragmatic solution:
Shell strings cannot contain \0 (NUL) characters, but the following sed command emulates all other escaping that PHP's (oddly named) addslashes function performs (\-escaping ', " and \ instances):
string=$(osascript <<'EOF'
set s to "It's me\\you and we got 3\" of rain."
return "['test'=>'" & (do shell script "sed 's/[\"\\\\'\\'']/\\\\&/g' <<<" & quoted form of s) & "']"
EOF
)
printf '%s\n' "$string"
yields
['test'=>'It\'s me\\you and we got 3\" of rain.']
Note the use of quoted form of, which is crucial for passing a string from AppleScript to a do shell script shell command with proper quoting.
Also note how the closing here-doc delimiter, EOF, is on its own line to ensure that it is properly recognized. In Bash 3.2.57, as used on macOS 10.12 (also when called as /bin/sh, which is what do shell script does), this isn't strictly necessary, but Bash 4.x would rightfully complain about a closing EOF) with warning: here-document at line <n> delimited by end-of-file (wanted 'EOF')
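If you want to sanity-check that the sed command really mimics addslashes, here is a small test you can run directly in bash, outside of AppleScript (it assumes php is on your PATH and is not part of the original answer):
s='It'\''s me\you and we got 3" of rain.'        # the same sample string, in shell quoting
php -r 'echo addslashes($argv[1]), "\n";' "$s"   # -> It\'s me\\you and we got 3\" of rain.
sed 's/["\\'\'']/\\&/g' <<<"$s"                  # -> same output, via the sed emulation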

Related

Bash script: any way to collect remainder of command line as a string, including quote characters?

The following simplified version of a script I'll call logit just appends a timestamp and its arguments to a text file, so I can keep track of time like this:
$ logit Started work on default theme
But bash expansion gets confused by quotes of any kind. What I'd like is to do things like
$ logit Don't forget a dark mode
But when that happens, of course, shell expansion rules cause a burp:
quote>
I know this works:
# Yeah yeah I can enclose it in quotes but I'd prefer not to
$ logit "Don't forget a dark mode"
Is there any way to somehow collect the remainder of the command line before bash gets to it, without having to use quotes around my command line?
Here's a minimal working version of the script.
#!/bin/bash
log_file=~/log.txt
now=$(date +"%T %r")
echo "${now} ${@:1}" >> $log_file
Is there any way to somehow collect the remainder of the command line before bash gets to it, without having to use quotes around my command line?
No. There is no "before bash gets into it" time. Bash reads the input you are typing, Bash parses the input you are typing, there is nothing in between or "before". There is only Bash.
You can: use a different shell or write your own. Note that quote parsing like the shell's is very common, so it may be better for you to understand it and get used to it.
You can use a backslash (\) before the single quote:
$ logit Don\'t forget a dark mode
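If typing backslashes gets old, another option (just a sketch, not from either answer above) is to have logit read the message from standard input when it gets no arguments, so the shell never parses the quotes at all:
#!/bin/bash
log_file=~/log.txt
now=$(date +"%T %r")
if [ $# -gt 0 ]; then
    msg="$*"          # message passed as arguments (still subject to shell quoting)
else
    IFS= read -r msg  # one raw line from stdin; quotes need no escaping here
fi
printf '%s %s\n' "$now" "$msg" >> "$log_file"
With that variant, running logit with no arguments and then typing Don't forget a dark mode logs the line verbatim.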

How to use a pure string as an argument for python program through bash terminal

I am trying to give an argument to my python program through the terminal.
For this I am using the lines:
import sys
something = sys.argv[1]
I now try to put in a string like this through the bash terminal:
python my_script.py 2m+{N7HiwH3[>!"4y?t9*y#;/$Ar3wF9+k$[3hK/WA=aMzF°L0PaZTM]t*P|I_AKAqIb0O4# cm=sl)WWYwEg10DDv%k/"c{LrS)oVd§4>8bs:;9u$ *W_SGk3CXe7hZMm$nXyhAuHDi-q+ug5+%ioou.,IhC]-_O§V]^,2q:VBVyTTD6'aNw9:oan(s2SzV
This returns a bash error because some of the characters in the string are bash special characters.
How can I use the string exactly as it is?
You can put the raw string into a file, for example like this, with cat and a here document.
cat <<'EOF' > file.txt
2m+{N7HiwH3[>!"4y?t9*y#;/$Ar3wF9+k$[3hK/WA=aMzF°L0PaZTM]t*P|I_AKAqIb0O4# cm=sl)WWYwEg10DDv%k/"c{LrS)oVd§4>8bs:;9u$ *W_SGk3CXe7hZMm$nXyhAuHDi-q+ug5+%ioou.,IhC]-_O§V]^,2q:VBVyTTD6'aNw9:oan(s2SzV
EOF
and then run
python my_script.py "$(< file.txt)"
You can also use the text editor of your choice for the first step if you prefer that.
If this is a recurring task that you have to perform from time to time, you can make your life easier with a little alias in your shell:
alias escape='read -r string ; printf "Copy this:\n%q\n" "${string}"'
It is using printf "%q" to escape your input string.
Run it like this:
escape
2m+{N7HiwH3[>!"4y?t9*y#;/$Ar3wF9+k$[3hK/WA=aMzF°L0PaZTM]t*P|I_AKAqIb0O4# cm=sl)WWYwEg10DDv%k/"c{LrS)oVd§4>8bs:;9u$ *W_SGk3CXe7hZMm$nXyhAuHDi-q+ug5+%ioou.,IhC]-_O§V]^,2q:VBVyTTD6'aNw9:oan(s2SzV
Copy this:
2m+\{N7HiwH3\[\>\!\"4y\?t9\*y#\;/\$Ar3wF9+k\$\[3hK/WA=aMzF°L0PaZTM\]t\*P\|I_AKAqIb0O4#\ cm=sl\)WWYwEg10DDv%k/\"c\{LrS\)oVd§4\>8bs:\;9u\$\ \*W_SGk3CXe7hZMm\$nXyhAuHDi-q+ug5+%ioou.\,IhC\]-_O§V\]\^\,2q:VBVyTTD6\'aNw9:oan\(s2SzV
You can use the escaped string directly in your shell, without additional quotes, like this:
python my_script.py 2m+\{N7HiwH3\[\>\!\"4y\?t9\*y#\;/\$Ar3wF9+k\$\[3hK/WA=aMzF°L0PaZTM\]t\*P\|I_AKAqIb0O4#\ cm=sl\)WWYwEg10DDv%k/\"c\{LrS\)oVd§4\>8bs:\;9u\$\ \*W_SGk3CXe7hZMm\$nXyhAuHDi-q+ug5+%ioou.\,IhC\]-_O§V\]\^\,2q:VBVyTTD6\'aNw9:oan\(s2SzV
In order to make life easier, shells like bash do a little bit of extra work to help users pass the correct arguments to the programs they instruct it to execute. This extra work usually results in predictable argument arrays getting passed to programs.
Often, though, this extra help results in unexpected arguments getting passed to programs, and sometimes in the execution of undesired additional commands. In this case, it ended up causing Bash to emit an error.
In order to turn off this extra work, Bash allows users to indicate where arguments should begin and end by surrounding them with quotation marks. Bash supports both single quotes (') and double quotes (") to delimit arguments. As a last resort, if a string may contain both single and double quotes (or double quotes are required but aren't aggressive enough), Bash allows you to indicate that a special or whitespace character should be part of the adjacent argument by preceding it with a backslash (\).
If this method of escaping arguments is too cumbersome, it may be worth simplifying your program's interface by having it consume this data from a file instead of a command line argument. Another option is to create a program that loads the arguments from a more controlled location (like a file) and directly execs the target program with the desired argument array.
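As a sketch of that last idea (the wrapper name and the file arg.txt are hypothetical), a tiny launcher that reads the raw string from a file and execs the Python program with it as a single argument:
#!/bin/bash
# run_my_script: pass the raw contents of arg.txt to my_script.py untouched.
arg=$(< arg.txt)                 # command substitution only strips trailing newlines
exec python my_script.py "$arg"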

Script takes only first part of double quotes

Yesterday I asked a similar question about escaping double quotes in env variables, although it didn't solve my problem (probably because I didn't explain it well enough), so I would like to be more specific.
I'm trying to run a script (which I know is written in Perl), although I have to use it as a black box because of a permissions issue (so I don't know how the script works). Let's call this script script_A.
I'm trying to run a basic command in the shell: script_A -arg "date time".
If I run it from the command line, it works fine, but if I try to use it from a bash or Perl script (for example using the system function), it will take only the first part of the string which was sent as an argument. In other words, it will fail with the following error: '"date' is not valid.
Example to specify a little bit more:
If I run from the command line (works fine):
> script_A -arg "date time"
If I run from (for example) a Perl script (fails):
my $args = $ENV{SOME_ENV}; # Assume that SOME_ENV has '-arg "date time"'
my $cmd = "script_A $args";
system($cmd);
I think that the problem comes from the environment variable, but I can't make the quoting work while defining the env variable. For example, I can't use the following method:
setenv SOME_ENV '-arg "date time"'
Because it fails with the following error: '"date' is not valid.
Also, I tried to use the following method:
setenv SOME_ENV "-arg '"'date time'"'"
Although now the env variable will contain:
echo $SOME_ENV
> -arg 'date time' # should be -arg "date time"
Another note: using \" fails in the shell (I tried it).
Any suggestions on how to locate the reason for the error and how to solve it?
The $args, obtained from %ENV as you show, is a string.
The problem is in what happens to that string as it is manipulated before arguments are passed to the program, which needs to receive the strings -arg and date time.
If the program is executed in a way that bypasses the shell, as your example is, then the whole -arg "date time" is passed to it as its first argument. This is clearly wrong, as the program expects -arg and then another string for its value (date time).
If the program were executed via the shell, which is what happens when there are shell metacharacters in the command line (not the case in your example), then the shell would break the string into words, except for the quoted part; this is how it works from the command line. That can be enforced with
system('/bin/tcsh', '-c', $cmd);
This is the most straightforward fix, but I can't honestly recommend involving the shell just for argument parsing. Also, you are then in the game of layered quoting and escaping, which can get rather involved and tricky. For one, if things aren't right the shell may end up breaking the command into the words -arg, "date, time"
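A quick bash illustration of that last failure mode (the variable content mirrors the question; the snippet is not part of the original answer):
args='-arg "date time"'
printf '<%s> ' $args; echo       # unquoted expansion word-splits: <-arg> <"date> <time">
The quotes stored inside the variable are plain data, not shell syntax, so they survive the split, which is exactly why the program ends up seeing the invalid token "date.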
The way you set the environment variable works:
> setenv SOME_ENV '-arg "date time"'
> perl -wE'say $ENV{SOME_ENV}' #--> -arg "date time" (so it works)
which, I believe, has always worked this way in [t]csh.
Then, in a Perl script, parse this string into -arg and date time strings, and have the program executed in a way that bypasses the shell (if the shell isn't needed by the command):
my @args = $ENV{SOME_ENV} =~ /(\S+)\s+"([^"]+)"/;
my @cmd = ('script_A', @args);
system(@cmd) == 0 or die "Error with system(@cmd): $?";
This assumes that SOME_ENV's first word is always the option's name (-arg) and that all the rest is always the option's value, under quotes. The regex extracts the first word, as consecutive non-space characters, and after spaces everything in quotes.† These are program's arguments.
In the system LIST form, the program that is the first element of the list is executed without using a shell, and the remaining elements are passed to it as arguments. Please see system for more on this, and also for the basics of how to investigate failure by looking into the $? variable.
It is in principle advisable to run external commands without the shell. However, if your command needs the shell, then make sure that the string is escaped just right to preserve quotes.
Note that there are modules that make it easier to use external commands. A few, from simple to complex: IPC::System::Simple, Capture::Tiny, IPC::Run3, and IPC::Run.
I must note that that's an awkward environment variable; any way to organize things otherwise?
† To make this work for non-quoted arguments as well (-arg date) make the quote optional
my @args = $ENV{SOME_ENV} =~ /(\S+)\s+"?([^"]+)/;
where I now left out the closing (unnecessary) quote for simplicity

Bash "command not found" error

I am trying to edit my .bashrc file with a custom function to launch xwin. I want it to be able to open in multiple windows, so I decided to make a function that accepts 1 parameter: the display number. Here is my code:
function test(){
a=$(($1-0))
"xinit -- :$a -multiwindow -clipboard &"
}
The reason why I created a variable "a" to hold the input is because I suspected that the input was being read in as a string and not a number. I was hoping that taking the step where I subtract the input by 0 would convert the string into an integer, but I'm not actually sure if it will or not. Now, when I call
test 0
I am given the error
-bash: xinit -- :0 -multiwindow -clipboard &: command not found
How can I fix this? Thanks!
Because the entire quoted command is acting as the command itself:
$ "ls"
a b c
$ "ls -1"
-bash: ls -1: command not found
Get rid of the double quotation marks surrounding your xinit:
xinit -- ":$a" -multiwindow -clipboard &
In addition to the double-quotes bishop pointed out, there are several other problems with this function:
test is a standard, and very important, command. Do not redefine it! If you do, you risk having some script (or sourced file, or whatever) run:
if test $num -eq 5; then ...
Which will fire off xinit on some random window number, then continue the script as if $num was equal to 5 (whether or not it actually is). This way lies madness.
As chepner pointed out in a comment, bash doesn't really have an integer type. To it, an integer is just a string that happens to contain only digits (and maybe a "-" at the front), so converting to an integer is a non-operation. But what you might want to do is check whether the parameter got left off. You can either check whether $1 is empty (e.g. if [[ -z "$1" ]]; then echo "Usage: ..." >&2 etc.), or supply a default value with e.g. ${1:-0} (in this case, "0" is used as the default).
Finally, don't use the function keyword. bash tolerates it, but it's nonstandard and doesn't do anything useful.
So, here's what I get as the cleaned-up version of the function:
launchxwin() {
xinit -- ":${1:-0}" -multiwindow -clipboard &
}
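Hypothetical usage, assuming the function has been sourced from your .bashrc:
launchxwin          # uses the default display :0
launchxwin 1        # starts another server on display :1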
That happens because bash interprets everything inside quotes as a single string. A command is an array of strings whose first element is a binary file or an internal shell command; subsequent strings in the array are taken as arguments.
When you type:
"xinit -- :$a -multiwindow -clipboard &"
the shell thinks that everything you wrote is the command name. Depending on the command/program you run, the rest of the arguments can be passed as a single string. But mostly you use quotes only if you are passing an argument that has spaces inside, like:
mkdir "My Documents"
That creates a single directory named My Documents. Alternatively, you could escape the space like this:
mkdir My\ Documents
But remember, "$" is a special character, like "\". It gets interpreted by the shell as introducing a variable: "$a" will be substituted by its value before executing. If you use single quotes ('$a') it will not be interpreted by the shell.
Also, "&" is a special character that executes the command in the background. You should put it outside the quotes as well.
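A brief illustration of those two points (not part of the original answer):
a=0
echo "display :$a"   # double quotes: $a is expanded  -> display :0
echo 'display :$a'   # single quotes: no expansion    -> display :$a
sleep 5 &            # an unquoted trailing & runs the command in the background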

Line feed is being removed from echo when called in double-quotes

I'm trying to populate a shell variable called $recipient which should contain a value followed by a new-line.
$ set -x # force bash to show commands as it executes them
I start by populating $user, which is the value that I want to be followed by the newline.
$ user=user@xxx.com
+ user=user@xxx.com
I then call echo $user inside a double-quoted command substitution. The echo statement should create a newline after $user, and the double-quotes should preserve the newline.
$ recipient="$(echo $user)"
++ echo user@xxx.com
+ recipient=user@xxx.com
However when I print $recipient, I can see that the newline has been discarded.
$ echo "'$recipient'"
+ echo ''\''user@xxx.com'\'''
'user@xxx.com'
I've found the same behaviour under bash versions 4.1.5 and 3.1.17, and also replicated the issue under dash.
I tried using "printf" rather than echo; this didn't change anything.
Is this expected behaviour?
Command substitution removes trailing newlines. From the standard:
The shell shall expand the command substitution by executing command in a subshell environment (see Shell Execution Environment) and replacing the command substitution (the text of command plus the enclosing "$()" or backquotes) with the standard output of the command, removing sequences of one or more <newline> characters at the end of the substitution. Embedded <newline> characters before the end of the output shall not be removed; however, they may be treated as field delimiters and eliminated during field splitting, depending on the value of IFS and quoting that is in effect. If the output contains any null bytes, the behavior is unspecified.
You will have to explicitly add a newline. Perhaps:
recipient="$user
"
There's really no reason to use a command substitution here. (Which is to say that $(echo ...) is almost always a silly thing to do.)
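Another option (a sketch, not from the original answer) is ANSI-C quoting, which keeps the assignment on one line in bash, ksh, and zsh:
user=user@xxx.com
recipient="${user}"$'\n'           # $'\n' is a literal newline character
printf '%s' "$recipient" | od -c   # the trailing \n in the dump shows it was kept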
All shell versions react the same way; this is nothing new in scripting.
The newline at the end of your original assignment is not included in the variable's value. It only "terminates" the current command and signals the shell to process it.
Maybe user="user@xxx.com\n" will work, but without context about why you want this, just know that people usually keep variable values separate from formatting "tools" like the newline.
IHTH.
