'>' Terminal mode activated when I accidentally type an extra apostrophe - terminal

What is this mode? Can someone please enlighten me? Thanks.
I looked on the interwebs and couldn't find diddly squat about what mode it is.

Unless they're escaped, quotes (') are used to surround a string and prevent the shell from parsing it. When you put a line break after an opening quote, you're telling the shell that you want to input a multi-line string. The > is the continuation prompt for that string on another line.
Let's see an example that illustrates this:
mureinik#computer /tmp
$ echo 'this is the first line
> and this is the second
> and now we will terminate the string on the third, with a closing quote -> '
this is the first line
and this is the second
and now we will terminate the string on the third, with a closing quote ->
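If you end up at this > prompt by accident, you can either type a closing quote and press Enter to finish the string, or press Ctrl-C to abandon the whole command:
$ echo 'oops
> '
oops

$ echo 'oops
> ^C
$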

Related

Nested double quotes in parameter expansion

I was surprised that the following is a valid Parameter Expansion. Notice there are unescaped double quotes within double quotes:
result="${var1#"$var2"}"
Can someone please parse this for me?
There are double quotes nested in curly brackets which is OK.
But none of them is needed in this case.
result=${var1#$var2}
works the same even for values containing spaces and newlines.
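For instance, even with a space in the value being trimmed (a quick sketch):
var1='hello world foo'
var2='hello '
result=${var1#$var2}
echo "$result"    # -> world foo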
The answer is that they get parsed separately. Let's take a simplified tour of the string.
result="${var1#"$var2"}" doesn't actually need any quotes in this case, but look over the string anyway...
result="...
The parser says: meh, it's an assignment, I know what to do with this; I'll ignore these quotes, they aren't hurting anything, but now I have to find the terminating match. It then reads the value after the quote, byte by byte, looking for the terminating double quote. This starts a new context-1.
result="${...
Once it sees the open curly, it knows that the terminating quote cannot happen until it sees the matching closing curly. It starts a new context-2.
result="${var1#"...
Seeing a new double quote in this subcontext makes this one the opening quote of a new internal context-3.
result="${var1#"$var2"...
When it sees this double-quote it matches it to the previous one, closing context-3, dropping back into context-2.
result="${var1#"$var2"}...
This close-curly allows it to close the still-open context-2, dropping back into context-1.
result="${var1#"$var2"}"
And finding this now-closing double-quote allows it to close context-1. The following newline may be used as a terminating character for the entire term, so it can be evaluated and assigned.
Backslash-quoting the internal double quotes, for example, would have added literal quote characters to the pattern used for the prefix trim, which would likely have made it fail.
$: var1=aaa
$: var2=a
$: result="${var1#"$var2"}"
$: echo $result # does what you want/expect
aa
$: result="${var1#\"$var2\"}" # does NOT
$: echo $result
aaa
When you do it without the quotes, the parser knows this is an assignment and handles the values a little differently (as mentioned in the comments), generally treating them as if they were quoted.
$: result=${var1#$var2}
$: echo $result
aa
This means it doesn't have to deal with context-1 or context-3, and only has the curlies to worry about. The end result is the same.
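The one case where the inner quotes do change the result is when the value being trimmed contains glob characters: unquoted, the expansion is treated as a pattern; quoted, it is matched literally. A quick sketch:
$: var1=foo.bar.txt
$: var2='*.'
$: echo "${var1#$var2}"     # unquoted: '*.' acts as a glob pattern
bar.txt
$: echo "${var1#"$var2"}"   # quoted: '*.' is matched literally, so nothing is removed
foo.bar.txt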
Better?

Why is my line read not working as expected in bash?

I have the following code:
script_list=$'1 a\n2\n3'
old_IFS="${IFS}"
IFS=$'\n'
for line in "${script_list}"; do
echo "__${line}__";
done
IFS="${old_IFS}"
which should produce:
__1 a__
__2__
__3__
However, instead it gives me:
__1 a
2
3__
I double quoted "${script_list}" because it can contain spaces (see: https://stackoverflow.com/a/10067297). But I believe this is also where the problem lies, because when I remove the double quotes around it, it works as expected.
What am I missing?
edit:
As Cyrus suggested, I ran the code at ShellCheck and it tells me:
$ shellcheck myscript
Line 5:
for line in "${script_list}"; do
^-- SC2066: Since you double quoted this, it will not word split, and the loop will only run once.
Is it safe to simply remove the double quotes or do I need to be careful with that?
I double quoted "${script_list}" because it can contain spaces
The quotes are only there to prevent the shell from word-splitting the string at spaces. However, you explicitly told the shell (by setting IFS) that the field separator is now a newline, not a space, so an unquoted expansion here will split on newlines and leave the spaces alone. Hence, remove the quotes.
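With the quotes removed (and everything else unchanged), the loop splits on newlines and behaves as intended:
script_list=$'1 a\n2\n3'
old_IFS="${IFS}"
IFS=$'\n'
for line in ${script_list}; do   # unquoted, so the shell splits on IFS (newline)
echo "__${line}__";
done
IFS="${old_IFS}"
which now prints:
__1 a__
__2__
__3__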

Bash script " & " symbol creating an issues

I have a tag
<string name="currencysym">$</string>
in string.xml, and I want to change the $ symbol dynamically. I used the command below, but it didn't work:
currencysym=&#x20B9;
sed -i '' 's|<string name="currencysym">\(.*\)<\/string>|<string name="currencysym">'"<!\[CDATA\[${currencysym}\]\]>"'<\/string>|g'
Getting OUTPUT:
<string name="currencysym"><![CDATA[<string name="currencysym">$</string>#x20B9;]]></string>
" & " Has Removed...
But I need:
<string name="currencysym"><![CDATA[₹]]></string>
Using an XML parser/tool to handle XML is the first choice.
Instead of <..>\(.*\)<..>, better use <..>\([^<]*\)<..> in case you have many tags on one line.
& in the replacement has a special meaning: it stands for the whole match (\0) of the pattern. That's why you see <...>..</..> appear in your output. If you want it to be literal, you should escape it -> \&
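For example, in a replacement & expands to whatever the pattern matched, while \& inserts a literal ampersand:
$ echo 'price: 100' | sed 's|100|<&>|'
price: <100>
$ echo 'price: 100' | sed 's|100|\&#x20B9;100|'
price: &#x20B9;100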
First problem is the line
currencysym=&#x20B9;
This actually reads as "assign empty to currencysym and start no process in the background":
In bash you can set an environment variable (or variables) for just one run of a process by doing VAR=value command. This is how currencysym= is being interpreted.
The & symbol means "start the process in the background", except there is no command specified, so nothing happens.
Everything after # is interpreted as a comment, so #x20B9; is simply ignored by Bash.
Also, ; is a command separator like &, except it means "run in the foreground". It has no effect here because it is inside the comment.
You have to either escape &, # and ;, or just put your string into single quotes: currencysym=\&\#x20B9\; or currencysym='&#x20B9;'.
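For instance, with single quotes the whole value survives intact:
$ currencysym='&#x20B9;'
$ printf '%s\n' "$currencysym"
&#x20B9;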
Now on top of that, & has a special meaning in sed, so you will need to escape it before using it in the sed command. You can do this directly in the definition like currencysym=\\\&\#x20B9\; or currencysym='\&#x20B9;', or you can do it in your call to sed using builtin bash functionality. Instead of accessing ${currencysym}, reference ${currencysym/&/\&}.
You should use double quotes around variables in your sed command to ensure that your variables are expanded, but you should not put exclamation marks inside double quotes without escaping them (an interactive bash would attempt history expansion).
Finally, you do not need to capture the original currency symbol since you are going to replace it. You should make your pattern more specific though since the * quantifier is greedy and will go to the last closing tag on the line if there is more than one:
sed 's|<string name="currencysym">[^<]*</string>|<string name="currencysym"><![CDATA['"${currencysym/&/\&}"']]></string>|' test.xml
Yields
<string name="currencysym"><![CDATA[₹]]></string>
EDIT
As @fedorqui points out, you can use this example to show off correct use of capture groups. You could capture the parts that you want to repeat exactly (the tags), and place them back into the output as-is:
sed 's|\(<string name="currencysym">\)[^<]*\(</string>\)|\1<![CDATA['"${currencysym/&/\&}"']]>\2|' test.xml
sed -i '' 's|\(<string name="currencysym">\)[^<]*<|\1<![CDATA[\&#x20B9;]]><|g' YourFile
The group you keep in the buffer is the wrong one in your code; here I keep the first part, not the &.
Grouping .* is not the same as "everything until the first <", which is what you need, especially with the g option, which means several occurrences could match; in that case everything between the first string name and the last one becomes the middle part (your group).
Be careful with a bare & (not escaped): in the replacement part it means "the whole matched search pattern".

BASH function for escaping spaces in filenames before opening them

I've been trying to write a function for my bash profile for quite some time now.
The problem I'm trying to overcome is I'm usually provided with file paths that include spaces and it's a pain having to go through and escape all the spaces before I try to open it up in terminal.
e.g.
File -> /Volumes/Company/Illustrators/Website Front Page Design.ai
What I'm trying to end up with is '/Volumes/Company/Illustrators/Website\ Front\ Page\ Design.ai' being opened from my terminal.
So far I've managed to escape the spaces out, but I then get the error "The file ..... does not exist."
My code so far is
function opn { open "${1// /\\ }";}
Any help would be very much appreciated.
The important thing to understand is the difference between syntax and literal data.
When done correctly, escaping is syntax: It's read and discarded by the shell. That is, when you run
open "File With Spaces"
or
open File\ With\ Spaces
or even
open File" "With\ Spaces
...the quoting and escaping is parsed and removed by the shell, and the actual operating system call that gets executed is this:
execv("/usr/bin/open", "open", "File With Spaces")
Note that there aren't any backslashes (or literal quotes) in that syscall's arguments! If you put literal backslashes in your data, then you cause this to be run:
/* this is C syntax, so "\\" is a single-character backslash literal */
execv("/usr/bin/open", "open", "File\\ With\\ Spaces")
...and unless there's a file with backslashes in its name, that just doesn't work, giving the "file does not exist" error you report.
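A quick way to see the difference is to print the arguments the shell actually passes (printf prints one received argument per line):
$ printf '[%s]\n' File\ With\ Spaces
[File With Spaces]
$ printf '[%s]\n' "File\\ With\\ Spaces"
[File\ With\ Spaces]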
So -- just call open with your name in quotes:
open "$1"
...there's no need for an opn wrapper.
Spaces are problematic in filenames because they're part of bash's default IFS (Internal Field Separator), which is used to separate tokens in a command line. That means that by default, when you call a command with an unquoted argument containing spaces, the command will receive several arguments (4, in your example) rather than 1 containing the spaces.
I'm guessing you called your opn function in the same way, thus resulting in only the first part of your path as $1.
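You can check this with a tiny stand-in function (purely illustrative) that just counts its arguments:
opn() { echo "got $# argument(s); first is: $1"; }
opn /Volumes/Company/Illustrators/Website Front Page Design.ai
got 4 argument(s); first is: /Volumes/Company/Illustrators/Website
opn "/Volumes/Company/Illustrators/Website Front Page Design.ai"
got 1 argument(s); first is: /Volumes/Company/Illustrators/Website Front Page Design.ai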
Hopefully, the fix is easy: enclose your path in quotes so that bash does not interpret the spaces. By doing this, the need for your opn function disappears: open "/Volumes/Company/Illustrators/Website Front Page Design.ai" should work just fine.

Take in escaped input in Ruby command line app

I'm writing a Ruby command line application in which the user has to enter a "format string" (much like Date.strptime/strftime's strings).
I tried taking them in as arguments to the command, eg
> torque "%A\n%d\n%i, %u"
but it seems that bash actually removes all backslashes from input before it can be processed (plus a lot of trouble with spaces). I also tried the highline gem, which has some more advanced input options, but that automatically escapes backslashes "\" -> "\\" and provides no alternate options.
My current solution is to do a find-and-replace: "\\n" -> "\n". This would take care of the problem, but it also seems hacky and awful.
I could have users write the string in a text file (complicated for the user) or treat some other character, like "&&", as a newline (still complicated for the user).
What is the best way for users to input escaped characters on the command line?
(UPDATE: I checked the documentation for strptime/strftime, and the format strings for those functions replace newline characters with "%n", tabs with "%t", etc. So for now I'm doing that, but any other suggestions are welcome)
What you're looking for is using single quotes instead of double quotes.
Thus:
> torque '%A\n%d\n%i, %u'
Any string quoted in single quotes (e.g. 'like this') does not go through any expansions and is used as-is.
More details can be found in the Quoting section of man bash.
From man bash:
Enclosing characters in single quotes preserves the literal value of each character within the quotes. A single quote may not occur between single quotes, even when preceded by a backslash.
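You can see the difference from an unquoted invocation by printing the arguments that actually arrive (printf prints one received argument per line):
$ printf '[%s]\n' %A\n%d\n%i, %u
[%An%dn%i,]
[%u]
$ printf '[%s]\n' '%A\n%d\n%i, %u'
[%A\n%d\n%i, %u]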
# Ruby: read a line and re-evaluate it as a double-quoted string, so escapes like \n are interpreted
p eval("\"#{gets.chomp}\"")
Example use:
\n\b # Input by the user from the keyboard
"\n\b" # Value from the script
%A\n%d\n%i, %u # Input by the user from the keyboard
"%A\n%d\n%i, %u" # Value from the script
