I'm trying to pass a bash list into an applescript variable, but it's not working through the conventional method.
I have a list that looks something like [Terminal, Utilities, Finder, Launchpad]. I pass the bash list into an applescript variable by stating:
set applescriptList to "$bashList"
This works with any other data structure I've tried to pass into an AppleScript variable via osascript. However, when I log the first item in the list, instead of returning Terminal in this example, it returns the letter "T". Similarly, logging the second item returns the letter "e" instead of Utilities, so it seems each individual character is being treated as an item in the list.
Does anybody have any experience passing a bash list into an applescript variable? Thanks.
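For context, the items of a plain AppleScript string are its characters, which matches the symptom described; a minimal check (assuming osascript on macOS):
$ osascript -e 'item 1 of "Terminal Utilities Finder Launchpad"'
T
So the interpolated "$bashList" is arriving as one flat string, not as a list.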
Given this:
bashList=(Terminal Utilities Finder Launchpad)
then you can do this:
osascript -e "set ASlist to the paragraphs of \"$(printf '%s\n' "${bashList[#]}")\""
or, somewhat simpler to read given all the quotes that need escaping above:
osascript <<OSA
set ASlist to the paragraphs of "$(printf '%s\n' "${bashList[@]}")"
OSA
Alternatively, if you know that each item in your bashList consists of only a single word, that removes the need for printf altogether:
osascript <<OSA
set ASlist to the words of "${bashList[@]}"
OSA
The result returned to stdout in bash after running any of these examples will look like this:
Terminal, Utilities, Finder, Launchpad
which is what I would expect. The equivalent AppleScript output would have looked like this:
{Terminal, Utilities, Finder, Launchpad}
to indicate that ASlist had been assigned a list of values, denoted by the curly braces. However, this list is poorly translated into bash output, so the curly braces get lost and it isn't easy to distinguish a list from a string containing a few commas.
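A quick way to confirm that the items really are whole names rather than characters is to return one of them (reusing the same bashList as above):
osascript <<OSA
set ASlist to the paragraphs of "$(printf '%s\n' "${bashList[@]}")"
return item 1 of ASlist
OSA
which prints Terminal rather than the single letter T.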
I assume by 'bash list' you mean multi-line text, in which case use paragraphs of bashList to split it at line breaks.
Is it possible to echo a text and keep all quotes and double quotes in place?
I want to write a function to copy the text currently written in the terminal (completely with quotes).
Since I am on OSX I have to use pbcopy:
pb() { echo "$@" | pbcopy; }
But pb osascript -e 'tell Application "iTerm" to display dialog "Job finished"' copies
osascript -e tell Application "iTerm" to display dialog "Job finished"
rather than
osascript -e 'tell Application "iTerm" to display dialog "Job finished"'.
The shell is removing the outer single quotes before pb ever sees the argument(s). Pass a single argument
pb "osascript -e 'tell Application \"iTerm\" to display dialog \"Job finished\"'"
to pb, and define it as
pb () {
printf '%s\n' "$1" | pbcopy
}
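As a quick sanity check (assuming pbpaste is available to read the clipboard back):
$ pb "osascript -e 'tell Application \"iTerm\" to display dialog \"Job finished\"'"
$ pbpaste
osascript -e 'tell Application "iTerm" to display dialog "Job finished"'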
It would probably be just as easy to use a here document, though, rather than define a function that feeds its argument to pbcopy:
$ pbcopy <<'EOF'
osascript -e 'tell Application "iTerm" to display dialog "Job finished"'
EOF
Slightly more typing, but no need to nest so many quotes.
Is it possible to echo a text and keep all quotes and double quotes in place? I want to write a function to copy the text currently written in the terminal (completely with quotes).
Let's explore what you mean by "currently written in the terminal". If I understand correctly, you want to provide arbitrary input to a shell command at invocation time. In other words, you have a bit of text that you want to add into your copy buffer, and you want to send it to the stdin of pbcopy to do so.
As a solution to this particular problem, a shell function is a poor fit. A shell function is invoked with arguments that are subject to shell interpretation, so you'd have to escape them carefully both when defining pb and when invoking it. The strings can be escaped, but it's inconvenient: several special characters need escaping inside a double-quoted string, and a ' can't be escaped at all inside a single-quoted string.
Let's explore some other options.
$ pbcopy <<< "this is a simple one-line string directly from the command line. Since it's an argument to pbcopy it needs to be escaped."
$ pbpaste
this is a simple one-line string directly from the command line. Since it's an argument to pbcopy it needs to be escaped.
Here we tell the shell to provide the text to pbcopy's standard input. We still have to escape the string, but we don't have to pass it anywhere else or re-quote it to make it a valid shell argument.
Or we can provide multi-line string data to pbcopy without having to enquote it at all, using this special here-doc syntax:
$ pbcopy <<-'-my-chosen-delimiter'
> Since this string's delimiter is single quoted,
> no interpolation will occur. That means " double quotes
> have no meaning, nor does ' single quotes, $dollar signs
> or other such meaningful bash syntaxes.
> -my-chosen-delimiter
$ pbpaste
Since this string's delimiter is single quoted,
no interpolation will occur. That means " double quotes
have no meaning, nor does ' single quotes, $dollar signs
or other such meaningful bash syntaxes.
I would have thought that bash would be more powerful than that.
Well, on one hand, I think this is a very good opportunity to compare and contrast command line arguments (which are inherently positional by design and thus must be parsed and split by, usually, whitespace between arguments) to input and output streams expressed with | pipelines. I/O streams are designed to hold arbitrary data; it's not bash's fault that you wanted to make one into a shell-parsed variable list. It's not the considerable power of bash you're observing here, it's the functional limit of your bash knowledge.
But on the other hand, you're kind of right. The concessions to the user-interactive command line interface, substantial historical constraints to achieve backwards compatibility, and many valid design considerations made bash what it is. I for one find it and its ilk to be by far the most powerful user interface to a computer. But I wouldn't use it to assemble a complex application because, frankly, it's syntactically difficult. So don't expect bash to be something it's not. If you don't want to understand its quirky and esoteric irregularities, stick with something more recent and more pedantic like Python, Go, Ruby, Node, or whatever non-Unix-centric people run these days :P
I have an AppleScript processing paths in the alias form and converting them to posix.
These paths will be processed by a shell script in the second step of this automator workflow.
The problem is that the path contains spaces.
I don't know how to deal with this, so I have this subroutine to replace each space with a backslash-escaped space (\ ).
on replaceChars(this_text, search_string, replacement_string)
set saveTID to AppleScript's text item delimiters
set AppleScript's text item delimiters to the search_string
set the item_list to every text item of this_text
set AppleScript's text item delimiters to the replacement_string
set this_text to the item_list as string
set AppleScript's text item delimiters to saveTID
return this_text
end replaceChars
In the AppleScript part I do
-- convert the path from the alias form to posix
set fileInputPosix to (the POSIX path of thePath)
-- replace " " with "\ "
set quotedForm to replaceChars(fileInputPosix, space, "\\ ")
At this point the paths appear to be OK, something like
'/Users/myUser/Desktop/my\ video\ file.png'
but when I pass this path to the shell script it will not accept that.
In a very similar situation to your other question, you seem to be creating a problem that doesn't need solving by choosing to use an AppleScript action to receive file paths that ultimately end up being passed into a shell script action. It appears the only reason for doing this is to have the AppleScript "process" the file paths as alias file references, converting them into posix paths for the purpose of using them as command-line arguments in a shell script.
This is not at all necessary. Even if you decide an AppleScript is necessary for other reasons, Automator will often convert incompatible data types to an appropriate format when passing them between actions†, so an AppleScript alias reference can be happily sent onto a shell script action, which will automagically receive them as posix file paths:
In the above screenshot, you can see that no processing or manipulation was performed by me at any stage, and both scripting actions return the correct file path one would expect for the language.
Conclusion
▸ Remove your AppleScript actions, and simply send the files directly through to the shell script as arguments, enclosing your command-line argument variables in quotes as @vadian has already demonstrated.
▸ If you use an AppleScript for other reasons, that's fine. Just leave the alias references exactly as they are, and send them on afterwards to the shell script, and treat them the same way as just described.
†File paths sent in the other direction—from a shell script to an AppleScript—won't, however, magically become alias references. A posix path is little more than a piece of text, and since AppleScript handles text perfectly well, there's no expectation on it to try to coerce it into something different; this would need to be done explicitly by the scripter, if necessary.
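If you ever do need that explicit reverse coercion, a minimal sketch (the path shown is hypothetical, and the file must exist for the alias coercion to succeed):
osascript <<'OSA'
set p to "/Users/myUser/Desktop/my video file.png" -- hypothetical posix path
set f to (POSIX file p) as alias -- explicit coercion from path text to an alias
OSA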
Just quote the paths in the shell line and delete the search and replace part
/usr/local/bin/ffmpeg -i "$1" -vf "select=eq(n\,$3)" -vframes 1 "$2"
You can quote the whole string or use quoted form of to intelligently quote the string, and you can also escape individual characters as you are trying to do, but in that case you will need to escape each escape character, for example:
quoted form of "/Users/myUser/Desktop/my video file.png"
"/Users/myUser/Desktop/my\\ video\\ file.png"
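For reference, here's what quoted form of actually produces when run through osascript from a shell; the single quotes it adds are exactly what the shell needs:
$ osascript -e 'quoted form of "/Users/myUser/Desktop/my video file.png"'
'/Users/myUser/Desktop/my video file.png'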
I am trying to give an argument to my python program through the terminal.
For this I am using the lines:
import sys
something = sys.argv[1]
I now try to put in a string like this through the bash terminal:
python my_script.py 2m+{N7HiwH3[>!"4y?t9*y#;/$Ar3wF9+k$[3hK/WA=aMzF°L0PaZTM]t*P|I_AKAqIb0O4# cm=sl)WWYwEg10DDv%k/"c{LrS)oVd§4>8bs:;9u$ *W_SGk3CXe7hZMm$nXyhAuHDi-q+ug5+%ioou.,IhC]-_O§V]^,2q:VBVyTTD6'aNw9:oan(s2SzV
This returns a bash error because some of the characters in the string are bash special characters.
How can I use the string exactly as it is?
You can put the raw string into a file, for example like this, with cat and a here document.
cat <<'EOF' > file.txt
2m+{N7HiwH3[>!"4y?t9*y#;/$Ar3wF9+k$[3hK/WA=aMzF°L0PaZTM]t*P|I_AKAqIb0O4# cm=sl)WWYwEg10DDv%k/"c{LrS)oVd§4>8bs:;9u$ *W_SGk3CXe7hZMm$nXyhAuHDi-q+ug5+%ioou.,IhC]-_O§V]^,2q:VBVyTTD6'aNw9:oan(s2SzV
EOF
and then run
python my_script.py "$(< file.txt)"
You can also use the text editor of your choice for the first step if you prefer that.
If this is a recurring task that you have to perform from time to time, you can make your life easier with a little alias in your shell:
alias escape='read -r string ; printf "Copy this:\n%q\n" "${string}"'
It is using printf "%q" to escape your input string.
Run it like this:
escape
2m+{N7HiwH3[>!"4y?t9*y#;/$Ar3wF9+k$[3hK/WA=aMzF°L0PaZTM]t*P|I_AKAqIb0O4# cm=sl)WWYwEg10DDv%k/"c{LrS)oVd§4>8bs:;9u$ *W_SGk3CXe7hZMm$nXyhAuHDi-q+ug5+%ioou.,IhC]-_O§V]^,2q:VBVyTTD6'aNw9:oan(s2SzV
Copy this:
2m+\{N7HiwH3\[\>\!\"4y\?t9\*y#\;/\$Ar3wF9+k\$\[3hK/WA=aMzF°L0PaZTM\]t\*P\|I_AKAqIb0O4#\ cm=sl\)WWYwEg10DDv%k/\"c\{LrS\)oVd§4\>8bs:\;9u\$\ \*W_SGk3CXe7hZMm\$nXyhAuHDi-q+ug5+%ioou.\,IhC\]-_O§V\]\^\,2q:VBVyTTD6\'aNw9:oan\(s2SzV
You can use the escaped string directly in your shell, without additional quotes, like this:
python my_script.py 2m+\{N7HiwH3\[\>\!\"4y\?t9\*y#\;/\$Ar3wF9+k\$\[3hK/WA=aMzF°L0PaZTM\]t\*P\|I_AKAqIb0O4#\ cm=sl\)WWYwEg10DDv%k/\"c\{LrS\)oVd§4\>8bs:\;9u\$\ \*W_SGk3CXe7hZMm\$nXyhAuHDi-q+ug5+%ioou.\,IhC\]-_O§V\]\^\,2q:VBVyTTD6\'aNw9:oan\(s2SzV
In order to make life easier, shells like bash do a little bit of extra work to help users pass the correct arguments to the programs they instruct it to execute. This extra work usually results in predictable argument arrays getting passed to programs.
Oftentimes, though, this extra help results in unexpected arguments being passed to programs, and sometimes even in the execution of undesired additional commands. In this case it simply caused bash to emit an error.
In order to turn off this extra work, Bash allows users to indicate where arguments should begin and end by surrounding them with quotation marks. Bash supports both single quotes (') and double quotes (") for delimiting arguments. As a last resort, if a string may contain both single and double quotes (or double quotes are required but still leave expansions like $ active), Bash lets you mark a special or whitespace character as part of the adjacent argument by preceding it with a backslash (\).
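For example, the difference is easy to see with a couple of hypothetical invocations:
python my_script.py 'single quotes keep "double quotes" and $dollar signs literal'
python my_script.py "inside double quotes a literal \" needs a backslash, and \$ stops expansion"
Note that the string in the question happens to contain a ' itself, so for that exact string single quotes alone won't do, and you'd still need one of the escaping approaches above.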
If this method of escaping arguments is too cumbersome, it may be worth simplifying your program's interface by having it consume this data from a file instead of a command line argument. Another option is to create a program that loads the arguments from a more controlled location (like a file) and directly execs the target program with the desired argument array.
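For the second option, a minimal wrapper sketch (args.txt is a hypothetical file holding the raw string):
#!/usr/bin/env bash
# read the raw string from args.txt and exec the target program with it as a single argument
exec python my_script.py "$(< args.txt)"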
I'm giving a second try to the fish shell. One thing that really annoys me is the "new" behaviour of Ctrl+w shortcut.
Consider following situation:
$ vim ~/.config/fish/config.fish
...having the cursor at the end of the line.
When you press Ctrl+w, following happens:
in bash: ~/.config/fish/config.fish is deleted
in fish: only config.fish is deleted
How can I make fish delete words that are separated by spaces only?
"\cw" (in fish's notation) is bound to "backward-kill-path-component" (which bind \cw will tell you).
If you wish, you can bind it to something else, including input functions like "backward-kill-word" or any fish script - bind \cw backward-kill-word or bind \cw "commandline -rt ''" (which will remove the entire current token) or bind \cw backward-kill-bigword. See the bind documentation or bind --help for more information.
The difference between "word" and "bigword" here is that "word" will only go to the next non-word-character, which can be a "." or "/" or "-", among others, while "bigword" will truly go to the next whitespace character.
Note that the "bigword" functions have only been introduced in fish 2.3.0.
You can try these incantations in an interactive shell. If you decide to make it permanent, you'll need to add them to a function called fish_user_key_bindings.
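For example, a minimal sketch of that function (typically saved as ~/.config/fish/functions/fish_user_key_bindings.fish so fish autoloads it):
function fish_user_key_bindings
    bind \cw backward-kill-bigword
end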
I'm working on a bash script (exclusively used on OS X) that the user executes by double-clicking the file. I replaced the .sh ending with .command so it's natively opened by the Terminal, after which a small description text tells the user what to do. The script then waits for input, which I would love to handle just like command-line arguments, giving me the possibility to access the single "words" with $1, $2, $3, ... and to use the "shift" command inside the script.
I found out that the "read" command can save an input to a variable, but accessing the default variable $REPLY will give me the entire input as a string. Using the code
read var1 var2 var3
gets me one step closer by splitting the input like I want, but only for a fixed number of variables (in this case 3). In theory, the user input can be a hundred arguments or more, which is why it feels stupid to create so many variables. I could also parse the default $REPLY variable and separate it by spaces, but I feel like there must be an easier way to handle user input just like command-line arguments.
Thank you for your time.
This will not handle quotes, but should otherwise work:
doStuff() {
    echo first of $# args is $1
}
read VARS
doStuff $VARS
This works because variable expansion takes place on $VARS before calling doStuff, and the expanded command will be something like doStuff foo bar baz 42 23 fin.
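A quick interactive check (typing the words at the read prompt):
$ read VARS
foo bar baz 42 23 fin
$ doStuff $VARS
first of 6 args is foo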
In bash, one option would be to use the -a switch to read the input into an array:
read -ra args
Then you can access each argument like "${args[0]}", "${args[1]}", etc.
I've also used the -r switch as it handles input that contains backslashes in a more sensible way (i.e. it doesn't try to do anything clever for you, it just leaves the strings as they are).
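If you then want the $1, $2, ... and shift behaviour from the question, one way (a sketch) is to reset the positional parameters from the array:
read -ra args
set -- "${args[@]}"   # $1, $2, ... and shift now behave as they do for real command-line arguments
echo "$# words entered; first is $1"
shift
echo "next is $1"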
If you wanted to handle each argument one by one, another option would be to use a loop:
while read -r arg
do
# whatever with each argument
done
The loop would continue, reading one argument (line) at a time, until the user pressed Ctrl-D.