I have a variable "one" which contains the following:
avi,mkw,dvd,cd
I'm trying to dynamically create directories that would look like this:
type-avi
type-mkw
type-dvd
type-cd
I have tried to achieve the wanted result with the following code:
mkdir type-{"$one"}
but instead of creating 4 directories, it created one directory called
type-{avi,mkw,dvd,cd}
I suppose this is the wrong method. If so, how can I dynamically create directories with "suffixes" stored in a variable?
Use an array instead of your string variable for this.
IFS=, read -a onearr <<<"$one"
mkdir "${onearr[#]/#/type-}"
Or if you don't need the $one string in the first place just create the array manually.
onearr=(avi mkw dvd cd)
mkdir "${onearr[#]/#/type-}"
If you aren't worried about spaces or anything else in the values of $one, can trust your input to be "safe" and not exploitative, and can't use read, then you could use this to create the array instead (but it is flat out a worse solution).
onearr=($(tr , ' ' <<<"$one"))
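To illustrate why it is worse, with a hypothetical value: any whitespace inside a value becomes a word boundary, and glob characters may match files in the current directory:
one='a b,c*'
onearr=($(tr , ' ' <<<"$one"))
printf '[%s]\n' "${onearr[@]}"
# prints [a] and [b] as separate elements, and c* may expand to filenames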
A way to do this without reading into the shell, in a traditional tools pipeline approach:
echo "$one" |
tr ',' '\n' |
sed "s/^/mkdir 'type-/; s/$/'/" |
sh -x
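For illustration, the text the sed stage emits (which sh -x then executes) looks like this:
mkdir 'type-avi'
mkdir 'type-mkw'
mkdir 'type-dvd'
mkdir 'type-cd'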
Your original attempt was very close. To make it work, you can use the shell eval command:
eval mkdir type-{$one}
or
echo mkdir type-{"$one"} | bash
In either case, the effect causes bash to re-evaluate the line.
I personally would not recommend this approach for these reasons:
eval can be a security risk, and it is so rarely used that maintainers will have to do a double-take.
Brace expansion is a bash-style shell extension, and while I love bash, I write all shell scripts to run with the POSIX /bin/sh.
These will not handle unusual characters in filenames, such as spaces.
The eval causes the shell to re-evaluate the string after the variable substitution has been performed. To gain more understanding of these topics, see "Brace Expansion" and also the eval command, both in the bash man page.
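A small demo of the two passes (assuming the same $one value as above):
one=avi,mkw,dvd,cd
# Pass 1: the shell expands $one, producing the text: mkdir type-{avi,mkw,dvd,cd}
# Pass 2: eval re-parses that text; brace expansion now sees the commas,
#         so mkdir receives four separate arguments.
eval mkdir type-{$one}
ls -d type-*    # type-avi  type-cd  type-dvd  type-mkw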
I have a situation where a Bash script runs and parses a user-supplied JSON file using jq. Since it's supplied by the user, it's possible for them to include values in the JSON to perform an injection attack.
I'd like to know if there's a way to overcome this. Please note, the setup of: 'my script parsing a user-supplied JSON file' cannot be changed, as it's out of my control. Only thing I can control is the Bash script.
I've tried using jq with and without the -r flag, but in each case, I was successfully able to inject.
Here's what the Bash script looks like at the moment:
#!/bin/bash
set -e
eval "INCLUDES=($(cat user-supplied.json | jq '.Include[]'))"
CMD="echo Includes are: "
for e in "${INCLUDES[@]}"; do
CMD="$CMD\\\"$e\\\" "
done
eval "$CMD"
And here is an example of a sample user-supplied.json file that demonstrates an injection attack:
{
"Include": [
"\\\";ls -al;echo\\\""
]
}
The above JSON file results in the output:
Includes are: ""
, followed by a directory listing (an actual attack would probably be something far more malicious).
What I'd like instead is something like the following to be outputted:
Includes are: "\\\";ls -al;echo\\\""
Edit 1
I used echo as an example command in the script, which probably wasn’t the best example, as then the solution is simply not using eval.
However, the actual command that will be needed is dotnet test, and each array item from Includes needs to be passed as an option using /p:<Includes item>. What I was hoping for was a way to globally neutralise injection regardless of the command, but perhaps that's not possible, i.e., the technique you go for depends heavily on the actual command.
You don't need to use eval for dotnet test either. Many bash extensions not present in POSIX sh exist specifically to make eval usage unnecessary; if you think you need eval for something, you should provide enough details to let us explain why it isn't actually required. :)
#!/usr/bin/env bash
# ^^^^- Syntax below is bash-only; the shell *must* be bash, not /bin/sh
include_args=( )
IFS=$'\n' read -r -d '' -a includes < <(jq -r '.Include[]' user-supplied.json && printf '\0')
for include in "${includes[@]}"; do
include_args+=( "/p:$include" )
done
dotnet test "${include_args[#]}"
To speak a bit to what's going on:
IFS=$'\n' read -r -d '' -a arrayname reads up to the next NUL character in stdin (-d specifies a single character to stop at; since C strings are NUL-terminated, the first character in an empty string is a NUL byte), splits on newlines, and puts the result into arrayname.
The shorter way to write this in bash 4.0 or later is readarray -t arrayname, but that doesn't have the advantage of letting you detect whether the program generating the input failed: because we have && printf '\0' attached to the jq command, the NUL terminator this read expects is present only if jq succeeds, so the read's exit status reflects success only if jq reported success as well.
< <(...) is redirecting stdin from a process substitution, which is replaced with a filename which, when read from, returns the output of running the code ....
The reason we can set include_args+=( "/p:$include" ) and have it be exactly the same as include_args+=( /p:"$include" ) is that the quotes are read by the shell itself and used to determine where to perform string-splitting and globbing; they're not persisted in the generated content (and thus later passed to dotnet test).
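A tiny demo of that point, with a made-up value:
include_args=( )
include='value with spaces'
include_args+=( "/p:$include" )
printf '[%s]\n' "${include_args[@]}"
# prints [/p:value with spaces] -- one element, and the quotes are not part of it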
Some other useful references:
BashFAQ #50: I'm trying to put a command in a variable, but the complex cases always fail! -- explains in depth why you can't store commands in strings without using eval, and describes better practices to use instead (storing commands in functions; storing commands in arrays; etc).
BashFAQ #48: Eval command and security issues -- Goes into more detail on why eval is widely frowned on.
You don't need eval at all.
INCLUDES=( $(jq '.Include[]' user-supplied.json) )
echo "Includes are: "
for e in "${INCLUDES[@]}"; do
echo "$e"
done
The worst that can happen is that the unquoted command substitution may perform word-splitting or pathname expansion where you don't want it (which is a problem in your original as well), but there's no possibility for arbitrary command execution.
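If you can rely on bash 4.0+ and your values never contain newlines, a sketch of a variant that avoids the word-splitting and globbing issue as well:
readarray -t INCLUDES < <(jq -r '.Include[]' user-supplied.json)
Each array element then holds one Include entry verbatim, spaces and glob characters included.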
I am using tcsh (contract requirement, cannot change to bash etc.), but am having a problem building up a command based on various conditions for the different pieces.
Some names changed to protect the innocent...
The program name (new or old) is really chosen earlier on by a preprocessor, and is hardcoded by the time this shell script gets run:
set myCMDline = newProgName
set myCMDlineTmpFile = "/tmp/myCMDlineTmpScriptFile.csh"
set bsubQname = "typical"
set bsubResources = "span[hosts=1]"
set myCMDline = "bsub -q $bsubQname -n 8 -R \"$bsubResources\" $myCMDline"
($myCMDline)
Now, I have tried several variations of the above, all not working for one reason or another. The closest I get, I think, is a complaint about mismatched double-quotes, even when backslash-escaping them.
When I do an echo of $myCMDline, it looks OK, but the execution of the same must somehow be different...
set bsubResources = '"span[hosts=1]"' #double-quotes inside, single-quotes outside
set myCMDline = "bsub -q $bsubQname -n 8 -R $bsubResources $myCMDline"
.
set bsubResources = "span[hosts=1]" #double-quotes inside, single-quotes outside
set myCMDline = 'bsub -q $bsubQname -n 8 -R "$bsubResources" $myCMDline'
.
set bsubResources = "span[hosts=1]" #double-quotes inside, single-quotes outside
set myCMDline = "bsub -q $bsubQname -n 8 -R '$bsubResources' $myCMDline"
etc.
I have also tried dumping to a separate temp script file to source, but that file contains the $variable names, not their resolved values as I would prefer, since I am doing set, not setenv, and prefer not to put these into environment variables.
First, I could not echo the "#!/bin/csh -f" line; the shell seems to try to execute it rather than echo it redirected into the temp script file, and dies.
rm -f $myCMDlineTmpFile
echo "#!/bin/csh -f > $myCMDlineTmpFile
echo "$myCMDline" >> $myCMDlineTmpFile
($myCMDlineTmpFile)
Then I tried multi-line echo, which is where I am seeing the local variable names go into the file rather than their contents:
/bin/cat > $myCMDlineTmpFile <<EOF
#!/bin/csh -f
$myCMDline
EOF
source $myCMDlineTmpFile
And then I am trying to instead use eval:
eval `echo "$myCMDline &" `
with and without the backticks etc, but complains about unknown variables for the queue name, resources etc.
Adding this echo always shows what I want the command line to be, between the >>> and <<<:
echo "DEBUG - myCMDline= >>>$myCMDline<<<"
Please help me solve this puzzle...
set myCMDline = "bsub -q $bsubQname -n 8 -R \"$bsubResources\" $myCMDline"
($myCMDline)
This won't work because csh considers this as a single string, so it treats the whole string as one big program name. You have to define an array instead:
set myCMDline = (bsub -q $bsubQname -n 8 -R "$bsubResources" $myCMDline:gaq)
($myCMDline:gaq)
Explanation: The :gaq modifier quotes all strings in the list and keeps each list element intact. This is quite similar to "$@" in bash.
This is documented in History Substitution
g Apply the following modifier once to each word.
a (+) Apply the following modifier as many times as possible to a single word. `a' and `g' can be used together to apply a modifier globally. In the current implementation, using the `a' and `s' modifiers together can lead to an infinite loop. For example, `:as/f/ff/' will never terminate. This behavior might change in the future.
q Quote the substituted words, preventing further substitutions.
This is relevant due to the text in variable substitution:
The `:' modifiers described under History substitution, except for `:p', can be applied to the substitutions above. More than one may be used. (+) Braces may be needed to insulate a variable substitution from a literal colon just as with History substitution (q.v.); any modifiers must appear within the braces.
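To make the "$@" comparison concrete, here is an illustrative bash sketch (not csh code) of the same idea, building the command as a list whose elements stay intact:
cmd=(bsub -q typical -n 8 -R "span[hosts=1]" newProgName)
printf '[%s]\n' "${cmd[@]}"   # one bracketed word per line; span[hosts=1] stays one word
# "${cmd[@]}"                 # would run the command, passing each element as one argument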
I'm trying to use enscript to print PDFs from Mutt, and hitting character encoding issues. One way around them seems to be to just use sed to replace the problem characters: sed -ir 's/[“”]/"/g' {input}
My test input file is this:
“very dirty”
we’re
I'm hoping to get "very dirty" and we're but instead I'm still getting
â\200\234very dirtyâ\200\235
weâ\200\231re
I found a nice little post on printing to PDFs from Mutt that I used as a starting point. I have a bash script that I point to from my .muttrc with set print_command="$HOME/.mutt/print.sh" -- the script currently reads about like this:
#!/bin/bash
input="$1" pdir="$HOME/Desktop" open_pdf=evince
# Straighten out curly quotes
sed -ir 's/[“”]/"/g' $input
sed -ir "s/[’]/'/g" $input
tmpfile="`mktemp $pdir/mutt_XXXXXXXX.pdf`"
enscript --font=Courier8 $input -2r --word-wrap --fancy-header=mutt -p - 2>/dev/null | ps2pdf - $tmpfile
$open_pdf $tmpfile >/dev/null 2>&1 &
sleep 1
rm $tmpfile
It does a fine job of creating a PDF (and works fine if you give it a file as an argument) but I can't figure out how to fix the curly quotes.
I've tried a bunch of variations on the sed line:
input=sed -r 's/[“”]/"/g' $input
$input=sed -ir "s/[’]/'/g" $input
Per the suggestion at Can I use sed to manipulate a variable in bash? I also tried input=$(sed -r 's/[“”]/"/g' <<< $input) and I get an error: "Syntax error: redirection unexpected"
But none manages to actually change $input -- what is the correct syntax to change $input with sed?
Note: I accepted an answer that resolved the question I asked, but as you can see from the comments there are a couple of other issues here. enscript is taking in a whole file as a variable, not just the text of the file. So trying to tweak the text inside the file is going to take a few extra steps. I'm still learning.
On Editing Variables In General
BashFAQ #21 is a comprehensive reference on performing search-and-replace operations in bash, including within variables, and is thus recommended reading. On this particular case:
Use the shell's native string manipulation instead; this is far higher performance than forking off a subshell, launching an external process inside it, and reading that external process's output. BashFAQ #100 covers this topic in detail, and is well worth reading.
Depending on your version of bash and configured locale, it might be possible to use a bracket expression (i.e. [“”], as your original code did). However, the most portable approach is to treat “ and ” separately, which will work even without multi-byte character support available.
input='“hello ’cruel’ world”'
input=${input//'“'/'"'}
input=${input//'”'/'"'}
input=${input//'’'/"'"}
printf '%s\n' "$input"
...correctly outputs:
"hello 'cruel' world"
On Using sed
To provide a literal answer -- you almost had a working sed-based approach in your question.
input=$(sed -r 's/[“”]/"/g' <<<"$input")
...adds the missing syntactic double quotes around the parameter expansion of $input, ensuring that it's treated as a single token regardless of how it might be string-split or glob-expanded.
But All That May Not Help...
The below is mentioned because your test script is manipulating content passed on the command line; if that's not the case in production, you can probably disregard the below.
If your script is invoked as ./yourscript “hello * ’cruel’ * world”, then information about exactly what the user entered is lost before the script is started, and nothing you can do here will fix that.
This is because $1, in that scenario, will only contain “hello; ’cruel’ and world” are in their own argv locations, and the *s will have been replaced with lists of files in the current directory (each such file substituted as a separate argument) before the script was even started. Because the shell responsible for parsing the user's command line (which is not the same shell running your script!) did not recognize the quotes as valid at the time when it ran this parsing, by the time the script is running, there's nothing you can do to recover the original data.
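A quick throwaway diagnostic (hypothetical script name argv.sh) makes this visible:
#!/usr/bin/env bash
i=0
for arg in "$@"; do
  i=$((i + 1))
  printf 'argv[%d] = [%s]\n' "$i" "$arg"
done
Running ./argv.sh “hello ’cruel’ world” shows the words already split apart, with the curly quotes embedded in them; nothing the script does afterwards can undo that.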
Abstract: The way to use sed to change a variable is explored, but what you really need is a way to use and edit a file. It is covered ahead.
Sed
The (two) sed lines could be handled with this (note that -i is not used: we are operating on a value, not a file):
input='“very dirty”
we’re'
sed 's/[“”]/\"/g;s/’/'\''/g' <<<"$input"
But it should be faster (for small strings) to use the internals of the shell:
input='“very dirty”
we’re'
input=${input//[“”]/\"}
input=${input//[’]/\'}
printf '%s\n' "$input"
$1
But there is an underlying problem with your script: you are trying to clean input received from the command line. You are using $1 as the source of the string. Once somebody writes:
./script “very dirty”
we’re
That input is lost. It is broken into the shell's tokens, and "$1" will be “very only.
But I do not believe that is what you really have.
file
However, you are also saying that the input comes from a file. If that is the case, then read it in with:
input="$(<infile)" # not $1
sed 's/[“”]/\"/g;s/’/'\''/g' <<<"$input"
Or, if you don't mind editing (changing) the file, do this instead:
sed -i 's/[“”]/\"/g;s/’/'\''/g' infile
input="$(<infile)"
Or, if you are clear and certain that what is being given to the script is a filename, like:
./script infile
You can use:
infile="$1"
sed -i 's/[“”]/\"/g;s/’/'\''/g' "$infile"
input="$(<"$infile")"
Other comments:
Quote your variables.
Do not use the very old `…` syntax, use $(…) instead.
Do not use variables in UPPER case, those are reserved for environment variables.
And (unless you actually meant sh) use a shebang (first line) that targets bash.
The command enscript most definitely requires a file, not a variable.
Maybe you should use evince to open the PS file directly; there is no need for the step that makes a PDF, unless you know you really need it.
I believe it is better to use a file to store the output of enscript and ps2pdf.
Do not hide the errors printed by the commands until everything is working as desired; then, just call the script as:
./script infile 2>/dev/null
Or as required to make it less verbose.
Final script.
If you call the script with the name of the file that enscript is going to use, something like:
./script infile
Then, the whole script will look like this (runs in both bash and sh):
#!/usr/bin/env bash
Usage(){ echo "$0; This script requires a source file"; exit 1; }
[ $# -lt 1 ] && Usage
[ ! -e "$1" ] && Usage
infile="$1"
pdir="$HOME/Desktop"
open_pdf=evince
# Straighten out curly quotes
sed -i 's/[“”]/\"/g;s/’/'\''/g' "$infile"
tmpfile="$(mktemp "$pdir"/mutt_XXXXXXXX.pdf)"
outfile="${tmpfile%.*}.ps"
enscript --font=Courier10 "$infile" -2r \
--word-wrap --fancy-header=mutt -p "$outfile"
ps2pdf "$outfile" "$tmpfile"
"$open_pdf" "$tmpfile" >/dev/null 2>&1 &
sleep 5
rm "$tmpfile" "$outfile"
I want to make a script that takes a file path as an argument and cds into its folder.
Here is what I made:
#!/bin/bash
#remove the file name, and change every space into \space
shorter=`echo "$1" | sed 's/\/[^\/]*$//' | sed 's/\ /\\\ /g'`
echo $shorter
cd $shorter
I actually have 2 questions (I am a relative newbie to shell scripts):
How could I make the cd "persistent" ? I want to put this script into /usr/bin, and then call it from wherever in the filesystem. Upon return of the script, I want to stay in the $shorter folder. Basically, if pwd was /usr/bin, I could make it by typing . script /my/path instead of ./script /my/path, but what if I am in an other folder ?
The second question is trickier. My script fails whenever there is a space in the given argument. Although $shorter is exactly what I want (for instance /home/jack/my\ folder/subfolder), cd fails with the error /usr/bin/script : line 4 : cd: /home/jack/my\: no file or folder of this type. I think I have tried everything; using things like cd '$shorter' or cd "'"$shorter"'" doesn't help. What am I missing?
Thanks a lot for your answers
in your .bashrc add the following line:
function shorter() { cd "${1%/*}"; }
% means remove the shortest match of the pattern from the end
/* is the pattern
Then in your terminal:
$ . ~/.bashrc # to refresh your bash configuration
$ type shorter # to check if your new function is available
shorter is a function
shorter ()
{
cd "${1%/*}"
}
$ shorter ./your/directory/filename # this will move to ./your/directory
The first part:
The change of directory won't be “persistent” beyond the lifetime of your script, because your script runs in a new shell process. You could, however, use a shell alias or a shell function. For example, you could embed the code in a shell function and define it in your .bash_profile or other source location.
mycdfunction () {
cd /blah/foo/"$1"
}
As for the “spaces in names” bit:
The general syntax for referring to a variable in Bourne shells is: "$var" — the "double quotes" tell the shell to expand any variables inside of them, but to group the outcome as a single parameter.
Omitting the double quotes around $var tells the shell to expand the variable, but then split the results into parameters (“words”) on whitespace. This is how the shell splits up parameters, normally.
Using 'single quotes' causes the shell to not expand any contents, but group the parameters together.
You can use \ (backslash-blank) to escape a space when you're typing (or in a script), but that's usually harder to read than using 'single quotes' or "double quotes"…
Note that the expansion phase includes: $variables wild?cards* {grouping,names}with-braces $(echo command substitution) and other effects.
          | expansion  | no expansion
----------+------------+--------------------
grouping  | " "        | ' '
splitting | (no punc.) | (not easily done)
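A short demonstration of those rows, using a hypothetical helper function:
args() { echo "$# args"; printf '[%s]\n' "$@"; }
var='a  b'
args "$var"   # 1 args, [a  b]   -- grouping, with expansion
args $var     # 2 args, [a] [b]  -- expansion plus splitting
args '$var'   # 1 args, [$var]   -- grouping, no expansion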
For the first part, there is no need for the shorter variable at all. You can just do:
#!/bin/bash
cd "${1%/*}"
Explanation
Most shells, including bash, have what is called Parameter Expansion; it is very powerful and efficient, as it allows you to manipulate variables natively within the shell in cases that would normally require a call to an external binary.
Two common examples of where you can use Parameter Expansion over an external call would be:
${var%/*} # replaces dirname
${var##*/} # replaces basename
See this FAQ on Parameter Expansion to learn more. In fact, while you're there, you might as well go over the whole FAQ.
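For instance, with a made-up path:
path='/home/jack/my folder/subfolder/file.txt'
echo "${path%/*}"    # /home/jack/my folder/subfolder  (like dirname)
echo "${path##*/}"   # file.txt                        (like basename)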
When you put your script inside /usr/bin you can call it from anywhere. To deal with whitespace in the shell, just put the target between double quotes "".
Well here is a demo:
#!/bin/bash
#you could use dirname, but that's not appropriate here
#shorter=$(dirname "$1")
#use parameter expansion instead (much better)
shorter=${1%/*}
echo "$shorter"
An alternate way to do it, since you have dirname on your Mac:
#!/bin/sh
cd "$(dirname "$1")"
Since you mentioned in the comments that you wanted to be able to drag files into a window and cd to them, you might want to make your script allow file or directory paths as arguments:
#!/bin/sh
[ -f "$1" ] && set "$(dirname "$1")" # convert a file to a directory
cd "$1"
I am reading another developer's script and ran across something I don't quite understand. Please help.
typeset -u DOC_RET_CODE=`grep ^${PRNT_JOB_NAME}${SEQ_NUM} ${INPUT_FILE} |cut -c273-276`
if [ "${DOC_RET_CODE}" = "GOOD" ]
I looked up typeset -u, and it seems like it generates a read-only variable, but I'm not sure what it is doing there. For grep, I usually pipe in input, like ls | grep test, but grep by itself like this I am not so sure about. I know cut -c273-276 cuts 4 characters out, from positions 273 to 276. So what exactly does this script do?
The back-tick command (which would be better enclosed in $(...)) is grepping for a line starting with the print job name and sequence number from the input file, and then the 'cut' command is collecting columns 273-276 (4 characters). The upper-case version of this value (typeset -u) is assigned to $DOC_RET_CODE. The test line checks whether the document return code is GOOD and does something (not shown) if it is ... and maybe something else if the status is not good.
> help typeset
typeset: typeset [-aAfFgilrtux] [-p] name[=value] ...
Set variable values and attributes.
Obsolete. See `help declare'.
> help declare
declare: declare [-aAfFgilrtux] [-p] [name[=value] ...]
Set variable values and attributes.
…
Options which set attributes:
-u to convert NAMEs to upper case on assignment
In other words, this is making everything (the result of the grep|cut pipe) uppercase to avoid a tr a-z A-Z and allow a simple comparison against GOOD.
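A minimal demonstration of the attribute (works in bash and ksh):
typeset -u DOC_RET_CODE
DOC_RET_CODE="good"
echo "$DOC_RET_CODE"   # prints GOOD -- converted to upper case on assignment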
For your other question, grep is being run against a filename, ${INPUT_FILE}. You can run that command as-is (after manually substituting the variables).
It's not by itself; it's passed the argument ${INPUT_FILE}, and it will read that file instead of its standard input. The "useless use of cat" version would be cat ${INPUT_FILE} | grep ....
Note that, per the earlier answer, bash documents typeset as obsolete in favor of declare. typeset is, however, largely compatible between ksh, bash, and zsh.