How does sed interpret the following statement?

Salute to all devs,
Excuse me for asking this question, but I am a bit of a noob when it comes to shell scripting; I am still learning (please bear with me).
OK, long story short... I came across this shell command in someone's script:
sed -e 1,\$s/a/${b} myfile > myfile_1
What does the 1 stand for? And does the "\" escape the "$" character?
Thank you devs.

It depends on where the line is executed.
Assuming it is executed in bash or ksh or something similar, the line runs sed with the script 1,$s/a/ plus the contents of the shell variable b appended. The 1,$ part is an address range: line 1 through the last line ($ means the last line in a sed address), i.e. every line. The backslash does escape the $, but from the shell rather than from sed: without it, the shell would try to expand $s as a variable; with it, sed receives a literal $. Apparently, the value of b is expected to terminate the s instruction, otherwise sed will complain.
It's impossible to tell what the script will do without knowing the value of b -- it may simply complete the s instruction, for substituting something for a on every line, but it could contain any number of additional instructions.
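For example (a hypothetical value for b, just to illustrate), if b ends with the / that closes the s instruction, the command substitutes on every line:
b='hello/'
sed -e 1,\$s/a/${b} myfile > myfile_1    # after expansion, sed sees the script: 1,$s/a/hello/
# the first 'a' on each line of myfile becomes 'hello' in myfile_1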

Related

Execute command that results from execution of a script whose name is in a variable

When posting this question originally, I totally misworded it, obtaining another, reasonable but different question, which was correctly answered here.
The following is the correct version of the question I originally wanted to ask.
In one of my Bash scripts, there's a point where I have a variable SCRIPT which contains the /path/to/an/exe which, when executed, outputs a line to be executed.
What my script ultimately needs to do is execute that line. Therefore the last line of the script is
$($SCRIPT)
so that $SCRIPT is expanded to /path/to/an/exe, and $(/path/to/an/exe) executes the executable and gives back the line to be executed, which is then executed.
However, running shellcheck on the script generates this error:
In setscreens.sh line 7:
$($SCRIPT)
^--------^ SC2091: Remove surrounding $() to avoid executing output.
For more information:
https://www.shellcheck.net/wiki/SC2091 -- Remove surrounding $() to avoid e...
Is there a way I can rewrite that $($SCRIPT) in a more appropriate way? eval does not seem to be of much help here.
If the script outputs a shell command line to execute, the correct way to do that is:
eval "$("$SCRIPT")"
$($SCRIPT) would only happen to work if the command can be completely evaluated using nothing but word splitting and pathname expansion, which is generally a rare situation. If the program instead outputs e.g. grep "Hello World" or cmd > file.txt then you will need eval or equivalent.
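A minimal sketch of the difference (emit.sh and greetings.txt are hypothetical stand-ins, not from the original question):
# hypothetical stand-in for the exe behind $SCRIPT: it prints a command containing quotes
cat > emit.sh <<'EOF'
#!/bin/sh
echo 'grep "Hello World" greetings.txt'
EOF
chmod +x emit.sh
SCRIPT=./emit.sh

$($SCRIPT)            # broken: grep receives the arguments '"Hello' and 'World"'
eval "$("$SCRIPT")"   # correct: the quoting is parsed, grep searches for 'Hello World'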
You can keep it simple by loading the command to be executed into the shell's positional parameters and executing those:
set -- $("$SCRIPT")
and now run the result obtained from the output of SCRIPT by doing the following on the command line:
"$@"
This works when the output of SCRIPT contains multiple words, e.g. custom flags that need to be passed. Since this runs in your current interactive shell, make sure the command being run is not vulnerable to code injection. As one extra step of caution, you can run the command within a sub-shell, so that your parent environment is not affected, by doing ( "$@" ; )
Or use a shellcheck disable=SCnnnn directive to disable the warning and take the occasion to comment on the explicit intention, rather than evading detection by hiding behind an intermediate variable or an arguments array.
#!/usr/bin/env bash
# shellcheck disable=SC2091 # Intentional execution of the output
"$("$SCRIPT")"
Disabling shellcheck with a comment clarifies the intent and signals that the questionable code is not an error, but an informed implementation design choice.
You can do it in two steps:
command_from_SCRIPT=$($SCRIPT)
$command_from_SCRIPT
and it's clean under shellcheck.

Echo-ing an environment variable returns string literal rather than environment variable value

I have two bash scripts. The first listens to a pipe "myfifo" for input and executes the input as a command:
fifo_name="myfifo"
[ -p $fifo_name ] || mkfifo $fifo_name;
while true
do
if read line; then
$line
fi
done <"$fifo_name"
The second passes a command 'echo $SET_VAR' to the "myfifo" pipe:
command='echo $SET_VAR'
command_to_pass="echo $command"
$command_to_pass > myfifo
As you can see, I want to pass 'echo $SET_VAR' through the pipe. In the listener process, I've set a $SET_VAR environment variable. I expect the output of the command 'echo $SET_VAR' to be 'var_value', which is the value of the environment variable SET_VAR.
Running the first (the listener) script in one bash process and then passing a command via the second in another process gives the following result:
$SET_VAR
I expected "var_value" to be printed. Instead, the literal string $SET_VAR is printed. Why is this the case?
Before I get to the problem you're reporting, I have to point out that your loop won't work. The while true part (without a break somewhere in the loop) will run forever. It'll read the first line from the file, loop, try to read a second line (which fails), loop again, try to read a third line (also fails), loop again, try to read a fourth line, etc... You want the loop to exit as soon as the read command fails, so use this:
while read line
do
# something I'll get to
done <"$fifo_name"
The other problem you're having is that the shell expands variables (i.e. replaces $var with the value of the variable var) partway through the process of parsing a command line, and when it's done that it doesn't go back and re-do the earlier parsing steps. In particular, if the variable's value included something like $SET_VAR it doesn't go back and expand that, since it's just finished the bit where it expands variables. In fact, the only thing it does with the expanded value is split it into "words" (based on whitespace), and expand any filename wildcards it finds -- no variable expansions happen, no quote or escape interpretation, etc.
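A quick way to see this single-pass behavior at an interactive prompt (using SET_VAR from the question):
SET_VAR=var_value
line='echo $SET_VAR'
$line    # prints: $SET_VAR -- the value is word-split, but never expanded again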
One possible solution is to tell the shell to run the parsing process twice, with the eval command:
while read line
do
eval "$line"
done <"$fifo_name"
(Note that I used double-quotes around "$line" -- this prevents the word splitting and wildcard expansion I mentioned from happening before eval goes through the normal parsing process. If you think of your original code half-parsing the command in $line, without double-quotes it gets one and a half-parsed, which is weird. Double-quotes suppress that half-parsing stage, so the contents of the variable get parsed exactly once.)
However, this solution comes with a big warning, because eval has a well-deserved reputation as a bug magnet. eval makes it easy to do complex things without quite understanding what's going on, which means you tend to get scripts that work great in testing, then fail incomprehensibly later. And in my experience, when eval looks like the best solution, it probably means you're trying to solve the wrong problem.
So, what're you actually trying to do? If you're just trying to execute the lines coming from the fifo as shell commands, then you can use bash "$fifo_name" to run them in a subshell, or source "$fifo_name" to run them in the current shell.
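A minimal sketch of the source approach (reusing the fifo name from the question):
fifo_name="myfifo"
[ -p "$fifo_name" ] || mkfifo "$fifo_name"
# read commands from the fifo and run them in the current shell,
# with full parsing (variable expansion, quoting, redirections)
source "$fifo_name"
(source will keep reading until the writing end of the fifo is closed.)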
BTW, the script that feeds the fifo:
command='echo $SET_VAR'
command_to_pass="echo $command"
$command_to_pass > myfifo
Is also a disaster waiting to happen. Putting commands in variables doesn't work very well in the shell (I second chepner's recommendation of BashFAQ #50: I'm trying to put a command in a variable, but the complex cases always fail!), and putting a command to print another command in a variable is just begging for trouble.
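If all you need is to send that fixed command text, a less fragile way to feed the fifo (just a sketch) is to write the literal text directly, with no intermediate variables:
printf '%s\n' 'echo $SET_VAR' > myfifo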
bash, by its nature, reads commands from stdin. You can simply run:
bash < myfifo

Bash - Special characters on command line [duplicate]

I'm looking for a way (other than ".", '.', \.) to use bash (or any other Linux shell) while preventing it from parsing parts of the command line. The problem seems to be unsolvable:
How to interpret special characters in command line argument in C?
In theory, a simple switch would suffice (e.g. -x ... telling that the string ... won't be interpreted), but it apparently doesn't exist. I wonder whether there is a workaround, hack, or idea for solving this problem. The original problem is a script/alias for a program taking YouTube URLs (which may contain special characters: &, etc.) as arguments. This problem is even more difficult: expanding "$1" while preventing the shell from interpreting the expanded string -- essentially, expanding "$1" without interpreting its result.
Use a here-document:
myprogram <<'EOF'
https://www.youtube.com/watch?v=oT3mCybbhf0
EOF
If you wrap the starting EOF in single quotes, bash won't interpret any special chars in the here-doc.
Short answer: you can't do it, because the shell parses the command line (and interprets things like "&") before it even gets to the point of deciding your script/alias/whatever is what will be run, let alone the point where your script has any control at all. By the time your script has any influence in the process, it's far too late.
Within a script, though, it's easy to avoid most problems: wrap all variable references in double-quotes. For example, rather than curl -o $outputfile $url you should use curl -o "$outputfile" "$url". This will prevent the shell from applying any parsing to the contents of the variable(s) before they're passed to the command (/other script/whatever).
But when you run the script, you'll always have to quote or escape anything passed on the command line.
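For instance (myscript.sh is a hypothetical wrapper), quoting at the call site is what keeps the & in a URL from being parsed as a control operator:
# unquoted: the shell backgrounds everything before '&' and treats t=42
# as a variable assignment in the current shell
myscript.sh https://www.youtube.com/watch?v=oT3mCybbhf0&t=42
# quoted: the whole URL arrives intact in "$1"
myscript.sh 'https://www.youtube.com/watch?v=oT3mCybbhf0&t=42'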
Your spec still isn't very clear. As far as I know the problem is you want to completely reinvent how the shell handles arguments. So… you'll have to write your own shell. The basics aren't even that difficult. Here's pseudo-code:
while true:
    print prompt
    read input
    command = (first input)
    args = (argparse (rest input))
    child_pid = fork()
    if child_pid == 0: // We are inside child process
        exec(command, args) // See variety of `exec` family functions in POSIX
    else: // We are inside parent process and child_pid is actual child pid
        wait(child_pid) // See variety of `wait` family functions in POSIX
Your question basically boils down to how that "argparse" function is implemented. If it's just an identity function, then you get no expansion at all. Is that what you want?
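For the identity-function case, a toy version can even be sketched in bash itself (purely illustrative, not a real shell):
#!/usr/bin/env bash
# toy REPL: split the input line on whitespace only, then execute it.
# No variable expansion, no globbing, no quote processing.
while IFS= read -r -p 'toysh> ' input; do
    [ -z "$input" ] && continue
    read -r -a argv <<< "$input"   # "identity" argparse: whitespace split only
    "${argv[@]}"                   # run the command with literal arguments
done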

ZSH/Shell variable assignment/usage

I use ZSH for my terminal shell, and whilst I've written several functions to automate specific tasks, I've never really attempted anything that requires the functionality I'm after at the moment.
I've recently re-written a blog using Jekyll and I want to automate the production of blog posts and finally the uploading of the newly produced files to my server using something like scp.
I'm slightly confused about the variable bindings/usage in ZSH; for example:
DATE= date +'20%y-%m-%d'
echo $DATE
correctly outputs 2011-08-23 as I'd expect.
But when I try:
DATE= date +'20%y-%m-%d'
FILE= "~/path/to/_posts/$DATE-$1.markdown"
echo $FILE
It outputs:
2011-08-23
blog.sh: line 4: ~/path/to/_posts/-.markdown: No such file or directory
And when run with what I'd want the blog title to be (ignoring the fact that the string needs to be manipulated to make it more URL-friendly and that the path/to route doesn't exist), i.e. blog "blog title", it outputs:
2011-08-23
blog.sh: line 4: ~/path/to/_posts/-blog title.markdown: No such file or directory
Why is $DATE printing above the call to print $FILE rather than the string being included in $FILE?
Two things are going wrong here.
Firstly, your first snippet is not doing what I think you think it is. Try removing the second line, the echo. It still prints the date, right? Because this:
DATE= date +'20%y-%m-%d'
Is not a variable assignment - it's an invocation of date with an auxiliary environment variable (the general syntax is VAR_NAME=VAR_VALUE COMMAND). You mean this:
DATE=$(date +'20%y-%m-%d')
Your second snippet will still fail, but differently. Again, you're using the invoke-with-environment syntax instead of assignment. You mean:
# note the lack of a space after the equals sign
FILE="~/path/to/_posts/$DATE-$1.markdown"
I think that should do the trick.
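Putting the two fixes together (and, as the next answer explains, leaving the ~ outside the quotes so it still expands), a corrected sketch of the snippet:
DATE=$(date +'20%y-%m-%d')                  # command substitution, no space after =
FILE=~/"path/to/_posts/$DATE-$1.markdown"   # tilde unquoted so it expands to $HOME
echo "$FILE"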
Disclaimer
While I know bash very well, I only started using zsh recently; there may be zshisms at work here that I'm not aware of.
Learn about what a shell calls 'expansion'. There are several kinds, and the order of word expansion is as follows:
tilde expansion
parameter expansion
command substitution
arithmetic expansion
pathname expansion, unless set -f is in effect
quote removal, always performed last
Note that tilde expansion is only performed when the tilde is not quoted; viz.:
$ FILE="~/.zshrc"
$ echo $FILE
~/.zshrc
$ FILE=~/.zshrc
$ echo $FILE
/home/user42/.zshrc
And there must be no spaces around the = in variable assignments.
Since you asked in a comment where to learn shell programming, there are several options:
Read the shell's manual page: man zsh
Read the specification of the POSIX shell, http://pubs.opengroup.org/onlinepubs/009695399/utilities/xcu_chap02.html, especially if you want to run your scripts on different operating systems (and you will find yourself in that situation one fine day!)
Read books about shell programming.
Hang out in the Usenet newsgroup comp.unix.shell, where a lot of shell wizards answer questions.

KornShell (ksh) wraparound

Okay, I am sure this is simple but it is driving me nuts. I recently went to work on a program where I have had to step back in time a bit and use Redhat 9. When I'm typing on the command line from a standard xterm running KornShell (ksh), and I reach the end of the line the screen slides to the right (cutting off the left side of my command) instead of wrapping the text around to a new line. This makes things difficult for me because I can't easily copy and paste from the previous command straight from the command line. I have to look at the history and paste the command from there. In case you are wondering, I do a lot of command-line awk scripts that cause the line to get quite long.
Is there a way to force the command line to wrap instead of shifting visibility to the right side of the command I am typing?
I have pored through the man page options with no luck.
I'm running:
XFree86 4.2.99.903(174)
KSH 5.2.14.
Thanks.
Did you do man ksh?
You want to do a set -o multiline.
Excerpt from man ksh:
multiline:
The built-in editors will use multiple lines on the screen for
lines that are longer than the width of the screen. This may not
work for all terminals.
eval $(resize) should do it.
If possible, try to break the command into multiple lines by adding a trailing \, i.e.:
$ mycommand -a foo \
-f bar \
-c dif
The simple answer is:
$ set -o multiline
ksh earlier than 5.12, like the ksh shipped with NetBSD 6.1, doesn't have this option. You will have to turn off the current Interactive Input Line Editing mode, which is usually emacs:
$ set +o emacs
This turns off a lot of features altogether, like tab completion or using the Up-arrow key to recall the previous command.
If you decide to get used to emacs mode somehow, remember that ^a goes to the beginning of the line (the "Home" key won't work) and ^e goes to the end.
I don't know of a way of forcing the shell to wrap, but I would ask why you'd be writing lines that long. With awk scripts, I simply wrap the script in single quotes, and then break the lines where I want. It only gets tricky if you need single quotes in the script -- and diabolical if you need both single and double quotes. Actually, the rule is simple enough: use single quotes to wrap the whole script, and when you want a single quote in the script, write '\''. The first quote terminates the previous single-quoted string; the backslash-single quote yields a single quote; and the last single quote starts a new single quoted string. It really gets hairy if you need to escape those characters for an eval or something similar.
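A small illustration of that '\'' rule (a hypothetical one-liner):
awk 'BEGIN { print "it'\''s quoted" }'
# the shell concatenates three pieces: 'BEGIN { print "it' + \' + 's quoted" }'
# awk receives the program:            BEGIN { print "it's quoted" }
And breaking a longer script across lines inside the single quotes:
awk '
    BEGIN { FS=":" }
    { print $1 }
' /etc/passwd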
The other question is - why not launch into an editor. Since I'm a die-hard vim nutcase (ok - I've been using vi for over 20 years, so it is easier for me than the alternatives), I have Korn shell set to vi mode (set -o vi), and can do escape-v to launch the editor on whatever I've typed.
This is kind of a pragmatic answer, but when that's an issue for me I usually do something like:
strings ~/.history | grep COMMAND
or
strings ~/.history | tail
(The history file has a little bit of binary data in it, hence 'strings'.)
