Detection of non-ASCII quotes - bash

Yet again I managed to cut and paste something from a source, modify it, and execute it, only to hit an error that made the issue truly opaque: the non-ASCII double quotes had been generated by an intermediary application that wanted to be "fancy". That derailed my thought process for a few minutes while I studied the command line for the changes I had made, since those are usually the cause of problems.
Is there a script, terminal mode, or some other solution that inspects bash/zsh input and gives a nice clear error message if non-ASCII characters are detected on the command line? Granted, it will need an override, but in the usual case non-ASCII characters are not wanted on the command line.
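As a rough illustration of what such a check could look like in bash (the function name is mine and purely illustrative; zsh would use a preexec hook instead), a DEBUG trap can warn whenever the command about to run contains non-ASCII characters:
# Warn before executing any command line containing non-ASCII characters,
# e.g. "smart quotes" pasted from another application.
warn_non_ascii() {
  if [[ $BASH_COMMAND == *[![:ascii:]]* ]]; then
    printf 'warning: non-ASCII character(s) in: %s\n' "$BASH_COMMAND" >&2
  fi
}
trap warn_non_ascii DEBUG
This only warns; with shopt -s extdebug, a non-zero return from the trap could also block the command, which would give the overridable rejection asked about.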

Related

Why does a curl command on Windows generate "/bin/bash: line <N>: $'\r': command not found" errors?

I am trying to install Operator Lifecycle Manager (OLM) — a tool to help manage the Operators running on your cluster — from the official documentation, but I keep getting the error below. What could possibly be wrong?
This is the result from the command:
curl -sL https://github.com/operator-framework/operator-lifecycle-manager/releases/download/v0.22.0/install.sh | bash -s v0.22.0
/bin/bash: line 2: $'\r': command not found
/bin/bash: line 3: $'\r': command not found
/bin/bash: line 5: $'\r': command not found
: invalid option6: set: -
set: usage: set [-abefhkmnptuvxBCEHPT] [-o option-name] [--] [-] [arg ...]
/bin/bash: line 7: $'\r': command not found
/bin/bash: line 9: $'\r': command not found
/bin/bash: line 60: syntax error: unexpected end of file
I've tried removing the existing curl and installing another version, but the issue persists. Most solutions online are for Linux users, and they all point to Windows path settings and file issues.
I haven't found one that tackles installing a file using curl.
I'll gladly accept any help.
Using PowerShell on Windows, you must explicitly ensure that the stdout lines emitted by curl.exe are separated with Unix-format LF-only newlines, \n, when PowerShell passes them on to bash, given that bash, like other Unix shells, doesn't recognize Windows-format CRLF newlines, \r\n:
The simplest way to avoid the problem is to call via cmd /c:
cmd /c 'curl -sL https://github.com/operator-framework/operator-lifecycle-manager/releases/download/v0.22.0/install.sh | bash -s v0.22.0'
cmd.exe's pipeline (|) (as well as its redirection operator, >), unlike PowerShell's (see below), acts as a raw byte conduit, so it simply streams whatever bytes curl.exe outputs to the receiving bash call, unaltered.
Fixing the problem on the PowerShell side requires more work, and is inherently slower:
(
  (
    curl -sL https://github.com/operator-framework/operator-lifecycle-manager/releases/download/v0.22.0/install.sh
  ) -join "`n"
) + "`n" | bash -s v0.22.0
Note: `n is a PowerShell escape sequence that produces a literal LF character, analogous to \n in certain bash contexts.
Note:
It is important to note that, as of PowerShell 7.2.x, passing raw bytes through the pipeline is not supported: external-program stdout output is invariably decoded into .NET strings on reading, and re-encoded based on the $OutputEncoding preference variable when writing to an(other) external program.
See this answer for more information, and GitHub issue #1908 for potential future support for raw byte streaming between external programs and on redirection to a file.
That is, PowerShell invariably interprets output from external programs, such as curl.exe, as text, and sends it line by line through the pipeline, as .NET string objects (the PowerShell pipeline in general conducts (.NET) objects).
Note that these lines (strings) do not have a trailing newline themselves; that is, the information about what specific newline sequences originally separated the lines is lost at that point (PowerShell itself recognizes CRLF and LF newlines interchangeably).
However, if the receiving command is also an external program, PowerShell adds a trailing platform-native newline to each line, which on Windows is a CRLF newline - this is what caused the problem.
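A quick way to see this in action, assuming the same bash that the failing command already invokes is on PATH (cat -A, available in Git Bash and WSL, marks a carriage return as ^M and a line end as $):
'one','two' | bash -c 'cat -A'
one^M$
two^M$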
By collecting the lines in an array up front, using (...), they can be sent as a single, LF-separated multi-line string, using the -join operator, as shown above.
Note that PowerShell appends a trailing platform-native newline to this single, multi-line string too, but a stray \r\n at the very end of the input is in effect ignored by bash, assuming that the last true input line ends in \n, which is what the extra + "`n" at the end of the expression ensures.
However, there are scenarios where this trailing CRLF newline does cause problems - see this answer for an example and workarounds via the platform-native shell.
To start off, I need to be clear about a few things:
Based on the tags on the question, I see we are in PowerShell rather than a Linux/Unix or even a Windows cmd shell.
In spite of this, we are using Unix curl (probably curl.exe), and not the PowerShell alias for Invoke-WebRequest. We know this because of the -sL argument; if PowerShell were using the alias, we'd see a completely different error.
Next, I need to talk briefly about line endings. Instead of just a single LF (\n) character, as seen on Unix/Linux and expected by bash, Windows by default uses the two-character CR/LF pair (\r\n) for line endings.
With all that background out of the way, I can now explain what's causing the problem. It's this single pipe character:
|
This is a PowerShell pipe, not a Unix pipe, so the operation puts the output of the curl program on the PowerShell pipeline in order to send it to the bash interpreter. Each line is an individual item on the pipeline, and as such no longer includes any original line breaks. The PowerShell pipeline will "correct" this before calling bash by using the default line ending for the system, which in this case is the CR/LF pair (\r\n) used by Windows. Now when bash tries to interpret the input, it sees an extra \r character after every line and doesn't know what to do with it.
The trick is that most of what we might do in PowerShell to strip out those extra characters will still get sent through another pipe after we're done. We could tell curl to write the file to disk without ever using a pipe, and then tell bash to run the saved file (sketched at the end of this answer), but that's awkward, extra work, and slower.
But we can do a little better. PowerShell by default treats each line returned by curl as a separate item on the pipeline. We can "trick" it into putting one big item on the pipeline instead, using the -join operator. That gives us one big string that goes on the pipeline as a single element. It will still end up with an extra \r character at the very end, but by the time bash sees it the script will have done its work.
Code to make this work is found in the other answer, and they deserve all the credit for the solution. The purpose of my post is to do a better job of explaining what's going on: why we have a problem, and why the solution works, since I had to read through that answer a couple of times to really get it.
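For completeness, here is a minimal sketch of the save-to-disk route mentioned above (curl.exe is named explicitly so no PowerShell alias gets involved, and it assumes a bash on PATH, e.g. from Git for Windows or WSL). Because curl.exe writes the file itself, PowerShell's pipeline never touches, and therefore never re-encodes, the bytes:
curl.exe -sL -o install.sh https://github.com/operator-framework/operator-lifecycle-manager/releases/download/v0.22.0/install.sh
bash install.sh v0.22.0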

how to use quote marks in a for loop in windows command line

Beginner's question here so I'd be grateful for a baby explanation.
I'm trying to create a concatenation text file that lists the files (with paths) in a certain folder, with the word "file" appended to the beginning of each line, as well as quotation marks. I want the text file to look like this:
file 'file:DriveLetter:\path\filename1.mp3'
file 'file:DriveLetter:\path\filename2.mp3'
etc
The command I'm running is as follows:
(for %i in (*.mp3) do @echo file 'file:%cd%\%i') > mylist.txt
But I receive the following error
%i') was unexpected at this time.
However, if I use double quotes instead of single, the command works. But this causes problems in my next step, which is to use ffmpeg to concatenate the files - it refuses to read the double quote marks.
Any advice is much appreciated. I'm open to an alternative method.
I just tried it and everything works fine. I believe that some character has been altered. Are you sure you're typing your command directly in your command prompt, rather than writing it in some editor (which may make automatic modifications, such as converting quotes) and copying/pasting it into a command prompt afterwards?
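For the next step the asker mentions, this is a sketch of the usual ffmpeg concat-demuxer invocation (the output file name is illustrative; mylist.txt is the list generated above, and -safe 0 is needed because the entries use absolute paths):
ffmpeg -f concat -safe 0 -i mylist.txt -c copy output.mp3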

In a shell script, colon (:) is being treated as an operator for variable creation

I have following snippet:
host="https://example.com"
port="80"
url="${host}:${port}"
echo $url
the output is:
:80ps://example.com
How can I escape the colon here? I also tried:
url="${host}\:${port}"
but it did not work.
Expected output is:
https://example.com:80
You've most likely run into what I call the Linefeed-Limbo.
If I copy the code you provided from StackOverflow and run it on my machine (bash version 4.4.19(1)), then it outputs correctly
user#host:~$ cat script.sh
host="https://example.com"
port="80"
url="${host}:${port}"
echo $url
user#host:~$ bash script.sh
https://example.com:80
What is Linefeed-Limbo?
Different operating systems use different ASCII symbols to represent when a new line occurs in a text, such as a script. This Wikipedia article gives a good introduction.
As you can see, Unix and Unix-like systems use the single character \n, also called a "line feed". Windows, as well as some other systems, uses \r\n, i.e. a "carriage return" followed by a "line feed".
What happens is that when you write a script on Windows in an editor such as Notepad, every line you write ends in \r\n, e.g. host="https://example.com"\r\n. When you copy this file to Linux, Linux interprets the \r as if it were part of the script, since only \n is considered a newline. And indeed, when I change my newline style to DOS-style, I get exactly the output you get.
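A quick way to confirm this, assuming the file(1) utility is available (the exact wording of its output may vary):
user#host:~$ file script.sh
script.sh: ASCII text, with CRLF line terminators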
How can I fix this?
You have several options to fix this issue.
Converting the script (with dos2unix)
Since all you need to do is replace every instance of \r\n with \n, you could use any text-editing software you want. However, if you like simple solutions, then dos2unix (and its sister unix2dos) might be what you're looking for:
user#host:~$ dos2unix script.sh
dos2unix: converting file script.sh to Unix format...
That's it. Run your file now and you will see it behaves well.
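If dos2unix isn't installed, the same conversion can be done with other standard tools; as a sketch, GNU sed can strip the carriage returns in place:
sed -i 's/\r$//' script.sh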
Encoding the source-file correctly
By using a more advanced text editor such as Notepad++, you can define which style of newline you would like to use.
By changing the newline-type to whichever system you intend to run your script on, you will not run into any problems like this anymore.
Bonus round: Why does it output :80ps://example.com?
To understand why your output is like this, you have to look at what your script is doing, and what \r means.
Try thinking of your terminal as an old-fashioned typewriter. Returning the carriage means you start writing on the left again. Making a "new line" means sliding the paper. These two things are separate, and I think that's why some systems decided to use these two characters as a logical "new line".
But I digress. Let's look at the first line, host="https://example.com"\r.
What this means when printed is "print https://example.com, then put the carriage back at the start". When you then print :80\r, it doesn't start after ".com"; it starts at the beginning of the line, because that's where you (unknowingly) told the cursor to go. It then overwrites the first few characters, resulting in ":80ps://example.com" being written. Keep in mind that after 80, you again placed a carriage-return symbol, so any new text you wrote would end up overwriting the beginning again.
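The effect is easy to reproduce directly in a terminal; this printf is purely an illustration of the overwriting described above, not part of the original script:
printf 'https://example.com\r:80\n'
:80ps://example.com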
It works for me; try removing the carriage returns from the variables and then try again:
new_host=$(echo "$host" | tr -d '\r')
new_port=$(echo "$port" | tr -d '\r')
new_url="${new_host}:${new_port}"
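Echoing the cleaned variable (a small usage addition, not part of the original snippet) should then show the expected result:
echo "$new_url"
https://example.com:80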

Cygwin wraps text back on to the same line, causing text to be overwritten

I have cygwin installed on my Windows 7 box and I have been running into a problem where when I type a command it will occasionally be wrapped back onto the same line, deleting the bash prompt. Here is an example:
The command in question is command "201" (4 lines from the bottom). I included the others for context.
The text of the command I was typing was
git commit -m "Forced LF line endings."
(Note: I am posting this with mostly git commands, but the problem occurs with any command. I have not noticed a pattern yet.)
It jumped to the start of the line and started to overwrite my prompt.
When I push the up arrow (to view the history) the result is even weirder:
(Note the cursor is many characters past the end of the command.)
When I try to backspace the cursor from that position, I can only go back this far:
Then when I go up into the history from that backspaced line, I get this:
The command starts from the end of the text that is displayed. (This is consistent for the entire history) But when I go up in the history to the faulty git commit ... it displays as it did before with the overwritten text but when I go past it, it deletes a line of the prompt and displays the previous entry in the history the same way it was doing it before (a la image 2).
When I was creating my PS1 variable I had odd output like this, but I have since closed my brackets and such and don't think that is causing the issue. However, if you would like to see my .bash_profile (which sets the PS1), feel free to see it on GitHub. It is really short.
I have tried searching for the issue and can only find a few cygwin email archives about the line wrapping in xterm, but no solutions.
PS: As I was pushing the latest .bash_profile in order to link it, I ran into the problem again: when I typed git add .bash_profile and hit Enter, it ran the command but returned the cursor to the start of the same command instead of printing a new prompt.
Then, when I was writing another commit line, it did the same as in the first image, but it blacked out the rest of the line (it wrapped the line, but overwrote the entire line and not just the first few characters).
See http://manpages.ubuntu.com/manpages/lucid/man1/bash.1.html#contenttoc26:
\[
begin a sequence of non-printing characters, which could be used to embed a terminal control sequence into the prompt
\]
end a sequence of non-printing characters
If you don't enclose the non-printing characters (e.g. color sequences) in your prompt with these, their length is counted as part of the prompt's length, eventually resulting in the symptoms that you describe.
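As a sketch, a colored prompt with the escapes correctly bracketed might look like this (the colors and layout are illustrative, not taken from the linked .bash_profile):
PS1='\[\e[32m\]\u@\h\[\e[0m\]:\[\e[34m\]\w\[\e[0m\]\$ '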
It doesn't occur frequently to me. When it does, I just type in 'kill -WINCH $$' into the cygwin terminal and it fixes the problem. link to source

How can I add a vertical space in 'Terminal' after each command?

I've just started using Terminal (the CLI for Mac OS X).
When I run a command, get some information back, run another command, get more info etc., it is hard (on the eyes) to find a certain point on the screen (e.g. the output for the command before last).
Is there a way of adding vertical empty space at the end of each command's output, or after each command that produces no output?
Each new command that you enter is preceded by a "prompt", and these can be customized (though the exact way to customize depends on the shell). Since you mention Mac OS X I'm assuming you are using the default bash shell, in which case the absolute simplest way to add a blank line is like this: PROMPT_COMMAND=echo. You can run that command to try it out, or add it to a startup file (like .profile in your home folder) to have it done automatically each time.
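For example, to make it permanent (assuming bash reads ~/.profile at login, as the answer above suggests):
echo 'PROMPT_COMMAND=echo' >> ~/.profile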
If you use Bash 4.4 and you want a blank line after your prompt, you could set the PS0 prompt to a newline:
PS0="\n"
Now, this will be inserted every time you run a command:
$ echo "Hello"
Hello
I was wondering this too. I've looked at the menu options in Terminal and most of the control characters one can type, and nothing does this on a keystroke. You can, however, enter an echo command; on its own it leaves a single blank line below it before the next prompt. echo \n will add an extra blank line to that, echo \n\n two extra, i.e. 3 blank lines, etc. (you can also do echo;echo;echo for the same effect).
You can create a shell alias like alias b='echo;echo' (I couldn't seem to get the \n notation to work in an alias); then entering b at a prompt will leave a double blank line, which is not bad. Then you have to figure out how to save aliases in your .profile script.
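A sketch of the same idea using printf, which interprets \n itself, so the alias doesn't need multiple echo calls (the alias name b is just the example above; add the line to your .profile to keep it):
alias b='printf "\n\n"'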
I tried making an alias for the command ' ', i.e. the space character, which I thought you could type as \ (that's a backslash followed by a space, then Return to execute it; Stack Overflow doesn't format this well), but the bash shell doesn't seem to allow an alias with that name. It probably wouldn't allow a function named that either (similar to alias), though I didn't check.
I often use the fish shell, and I found that it does allow a function with that name! Created with function ' '; echo \n; end and indeed it works; at the shell prompt, typing the command \ (again backslash space) leaves a double blank line.
Cool, but... I tried saving this function using funcsave ' ' (how you save functions in fish - no messing with startup scripts!) and afterwards the function no longer works :^( This is probably a bug in the fish shell. It's in active development right now, though; I think I'll report this as a bug, since I would kind of like this to work myself.
One could also send Apple a feature request through their bug reporter for an Insert Blank Line menu/keyboard command in Terminal. If someone pays attention to your request it might be implemented in a year maybe.
I wanted to solve exactly the same, and for anyone interested in doing the same, I used what tripleee said in his comment here - I created a .bash_profile (see details here) with the line export PS1="\n\n$ ".
Hopefully that helps someone else too!
