I have a bash function that fetches values (using curl and cut) and creates a file name from them. Now I want to support a second naming scheme that needs a different set of parameters.
Example:
#!/bin/bash
TEMPLATE="%02i. %s.txt"
foo() {
a="Imagine these to"
b="be set dynamically"
c="42"
filename="$(printf "$TEMPLATE" "$c" "$a")"
# second: filename="$a - $b.txt"
# or: filename="$(printf "%s - %s.txt" "$a" "$b")"
echo "$filename"
# generate file
}
# actual script loops over:
foo
One of the values is a number that should be padded with leading zeros if required, hence the printf in the current implementation.
Is there a way to implement this with just setting a different template globally? This would require that the template can access parameters by index or at least skip some of them.
If not, what are my alternatives? The template is to be chosen by command line parameter and does not change after initialization.
What does not work:
bash man page suggests that zero length output is not possible (to skip values)
C's printf man page mentions a "%m$" construct, which apparently is not supported in bash
the function itself generates the values, so it cannot receive the full filename as parameter
If you need to skip an argument, you can use %.s. Examples:
$ printf "%.s%s\n" "Bonjour" "Hello"
Hello
$ printf "%s%.s\n" "Bonjour" "Hello"
Bonjour
AFAIK, you can't access arguments by index.
If you need to store the formatted string in a variable, please don't use a subshell as in:
$ variable=$(printf "%.s%s" "Bonjour" "Hello")
$ echo "$variable"
Hello
Instead, use the -v option to printf (type help printf to have the details) as in:
$ printf -v variable "%.s%s" "Bonjour" "Hello"
$ echo "$variable"
Hello
Also, if your template can come from user input, I would advise you to add -- just before it (this ends the command's options, just in case a user wants to start a template with a dash). Hence I would replace your line
filename="$(printf "$TEMPLATE" "$c" "$a")"
with
printf -v filename -- "$TEMPLATE" "$c" "$a"
Finally, all-uppercase variable names are conventionally reserved for environment and shell variables, so using them for ordinary script variables is considered bad bash practice.
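Putting the pieces together, here is a sketch of how the original foo could support both naming schemes with a single globally chosen template. The alt flag and the fixed argument order (c, a, b) are assumptions for illustration; the key idea is that every template receives the same argument list and uses %.s to discard the values it doesn't need:

```shell
#!/bin/bash
# Sketch: choose the template once at startup; all templates take the
# arguments in the fixed order (c, a, b) and skip unused ones with %.s.
if [[ ${1-} == alt ]]; then
  template='%.s%s - %s.txt'   # skip c, use a and b
else
  template='%02i. %s.txt%.s'  # use c and a, skip b
fi

foo() {
  local a="Imagine these to" b="be set dynamically" c="42"
  local filename
  printf -v filename -- "$template" "$c" "$a" "$b"
  echo "$filename"
}

foo  # default template: 42. Imagine these to.txt
```

Because %.s can only skip arguments, not reorder them, the fixed argument order must be chosen so that every scheme's values appear in the order its template consumes them.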
Related
Given the following Bash shell script excerpt:
# The intent is to take the PATH env variable, break it up into its parts, making them
# appear to be command line args (i.e., `$1`, `$2`, ...), and then for this example, just
# echo the parts in space delimited form, but we can imagine that we may want to do other
# things with them - this is just sample usage
# Important Requirement/Constraint
# ================================
# Please do not alter the "PATH to $1, $2, $3, ..." portion of the answer or replace the
# Bash ".." range construct with the output of the "seq" command exec'd in a subshell.
# Preferably, the answer should simply consist of the simplification of the last line of
# code - the "eval eval ..." . Also, please don't simplify by collapsing the whole thing
# to just echo "$@" since we may want to work with only some of the parts, and not
# necessarily the first parts, of the path. That is to say that the 1 and $# in the
# {1..$#} range could be replaced with other shell variables or expr., potentially
# Test case
PATH=/usr/local/bin:/usr/bin:/bin
# The code being examined follows
# Set ':' as the input field separator of the path
IFS=: # Or, more appropriately if in a function: local IFS=:
# Parse the PATH environment variable and break it up into its components
set $PATH
# This is the line we want to simplify, if possible, without losing functionality of
# course (see the comment that follows for details)
eval eval echo '\'$(eval 'echo "\${1..$#}"')
# Some notes and explanations regarding the functionality and underlying intent of the
# preceding line:
# - We start by dynamically creating the following construct: ${1..3}
# since $# is 3 for our example
# - Use Bash to expand that construct to: $1 $2 $3
# these vars contain the parsed parts of the PATH
# - Finally, display the three parts of the PATH using echo: echo $1 $2 $3
# - This causes the following text to be sent to STDOUT:
# /usr/local/bin /usr/bin /bin
So, can the eval eval... line in the preceding code be simplified, but still produce the desired output, which for the above example is:
/usr/local/bin /usr/bin /bin
I am thinking along the lines of a solution that would replace some of the echo commands with input/output redirection (perhaps) or maybe a reordering/collapsing of sorts that would lead to the need for fewer eval commands than are used in the example.
but still produce the desired output,
/usr/local/bin /usr/bin /bin
Just:
echo "${PATH//:/ }"
The intent is to take the PATH env variable, break it up into its parts, making them
appear to be command line args (i.e., $1, $2, ...), and then for this example, just
echo the parts in space delimited form, but we can imagine that we may want to do other
things with them - this is just sample usage
I do not trust unquoted shell expansions.
IFS=':' read -ra patharr <<<"$PATH"
set -- "${patharr[@]}"
IFS=' '; printf "%s\n" "${patharr[*]}"
echo "${PATH}" | tr ':' '\n' > stack
count=1
echo "#!/bin/sh-" | tr '-' '\n' >> stack2
while read line
do
echo "path${count}=${line}" >> stack2
count=$(($count+1))
done < stack
source stack2
Now you've got every section of the path in its own named variable.
Sticking close to the original, you can do
IFS=:
set $PATH
echo "$@"
If you don't want to change IFS and PATH, you can do
set $(sed 's/[^=]*=//;s/:/ /g' <<< ${PATH})
echo "$@"
I am trying to Concatenate variables with Strings in my bash script, the variables are being read independently but whenever I try to concatenate them, it doesn't recognize the variable values.
ex-
echo $CONFIG_PROTOCOL (Prints the variable value, HTTP)
echo $CONFIG_PROTOCOL'://'$CONFIG_SERVER_SOURCE:$CONFIG_PORT'/api/creation/objects/export?collection.ids='$sf
The above echo with the URL prints out /api/creation/objects/export?collection.ids=value_1, while it should print out http://localhost:8080/api/creation/objects/export?collection.ids=value_1
Any inputs will be appreciated.
This happens because $CONFIG_PORT has a trailing carriage return. Here's a MCVE:
CONFIG_PROTOCOL="http"
CONFIG_SERVER_SOURCE="example.com"
CONFIG_PORT="8080"
sf="42"
# Introduce bug:
CONFIG_PORT+=$'\r'
echo $CONFIG_PROTOCOL'://'$CONFIG_SERVER_SOURCE:$CONFIG_PORT'/api/creation/objects/export?collection.ids='$sf
When executed, this prints:
/api/creation/objects/export?collection.ids=42
When you comment out the buggy line, you get:
http://example.com:8080/api/creation/objects/export?collection.ids=42
In both cases, echo will appear to show the correct values because echo is a tool for showing text to humans and not useful for showing the underlying data. printf '%q\n' "$CONFIG_PORT" will instead show it in an unambiguous format:
$ echo $CONFIG_PORT
8080
$ printf '%q\n' "$CONFIG_PORT"
$'8080\r'
The best way to fix this is to ensure that whatever supplies the value does so correctly. But the easiest way is to just strip them:
echo $CONFIG_PROTOCOL'://'$CONFIG_SERVER_SOURCE:$CONFIG_PORT'/api/creation/objects/export?collection.ids='$sf | tr -d '\r'
echo "$CONFIG_PROTOCOL://$CONFIG_SERVER_SOURCE:$CONFIG_PORT/api/creation/objects/export?collection.ids=$sf"
Try the above echo statement.
Tried to keep my code as simple as possible:
1: What are the rules for using echo within a while loop?
All my $a and some of my $word variables are echoed, but not my echo kk. Why?
2: What is the scope of my count variable? Why is it not working within my while loop? Can I extend the variable to make it global?
3: When I use the grep in the final row, the $word variable only prints the first word of the matching rows, while if I remove the grep line at the end, $word functions as intended and prints all the words.
count=1
while read a; do
((count=count+1))
if [ $count -le 2 ]
then
echo $a
echo kk
for word in $a; do
echo $word
done
fi
done < data.txt | grep Iteration
Use Process Substitution
In a comment, you say:
I thtought I was using grep on data.txt (sic)
No. Your current pipeline passes the loop's results through grep, not the source file. To do that, you need to rewrite your redirection to use process substitution. For example:
count=1
while read a; do
((count=count+1))
if [ $count -le 2 ]
then
echo $a
echo kk
for word in $a; do
echo $word
done
fi
done < <(fgrep Iteration data.txt)
@CodeGnome answered your question, but there are other problems with your script that will come back to bite you at some point (see https://unix.stackexchange.com/questions/169716/why-is-using-a-shell-loop-to-process-text-considered-bad-practice for discussions on some of them, and also google quoting shell variables). Just don't do it. Shell scripts are just for sequencing calls to tools, and the UNIX tool for manipulating text is awk. In this case, all you'd need to do the job robustly, portably and efficiently would be:
awk '
/Iteration/ {
if (++count <= 2) {
print
print "kk"
for (i=1; i<=NF; i++) {
print $i
}
}
}' data.txt
and of course it'd be more efficient still if you just stop reading the input when count hits 2:
awk '
/Iteration/ {
print
print "kk"
for (i=1; i<=NF; i++) {
print $i
}
if (++count == 2) {
exit
}
}' data.txt
To complement CodeGnome's helpful answer with an explanation of how your command actually works and why it doesn't do what you want:
In Bash's grammar, an input redirection such as < data.txt is part of a single command, whereas |, the pipe symbol, chains multiple commands, from left to right, to form a pipeline.
Technically, while ... done ... < data.txt | grep Iteration is a single pipeline composed of 2 commands:
a single compound command (while ...; do ...; done) with an input redirection (< data.txt),
and a simple command (grep Iteration) that receives the stdout output from the compound command via its stdin, courtesy of the pipe.
In other words:
only the contents of data.txt are fed to the while loop as input (via stdin),
and whatever stdout output the while loop produces is then sent to the next pipeline segment, the grep command.
By contrast, it sounds like you want to apply grep to data.txt first, and only send the matching lines to the while loop.
You have the following options for sending a command's output to another command:
Note: The following solutions use a simplified while loop for brevity - whether a while command is single-line or spans multiple lines is irrelevant.
Also, instead of using input redirection (< data.txt) to pass the file content to grep, data.txt is passed as a filename argument.
Option 1: Place the command whose output to send to your while loop first in the pipeline:
grep 'Iteration' data.txt | while read -r a; do echo "$a"; done
The down-side of this approach is that your while loop then runs in a subshell (as all segments of a pipeline do by default), which means that variables defined or modified in your while command won't be visible to the current shell.
In Bash v4.2+, you can fix this by running shopt -s lastpipe, which tells Bash to run the last pipeline segment - the while command in this case - in the current shell instead.
Note that lastpipe is a nonstandard bash extension to the POSIX standard.
(To try this in an interactive shell, you must first turn off job control with set +m.)
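A minimal sketch of the lastpipe behavior in a script (where job control is already off), showing a loop variable that survives the pipeline:

```shell
#!/bin/bash
shopt -s lastpipe   # run the last pipeline segment in the current shell

n=0
printf '%s\n' a b c | while read -r line; do
  n=$((n + 1))      # with lastpipe, this increment survives the loop
done
echo "$n"           # prints 3; without lastpipe it would print 0
```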
Option 2: Use a process substitution:
Loosely speaking, a process substitution <(...) allows you to present command output as the content of a temporary file that cleans up after itself.
Since <(...) expands to the temporary file's (FIFO's) path, and read in the while loop only accepts stdin input, input redirection must be applied as well: < <(...):
while read -r a; do echo "$a"; done < <(grep 'Iteration' data.txt)
The advantage of this approach is that the while loop runs in the current shell, and any variable definitions or modifications therefore remain in scope after the command completes.
The potential down-side of this approach is that process substitutions are a nonstandard bash extension to the POSIX standard (although ksh and zsh support them too).
Option 3: Use a command substitution inside a here-document:
Using the command first in the pipeline (option 1) is a POSIX-compliant approach, but doesn't allow you to modify variables in the current shell (and Bash's lastpipe option is not POSIX-compliant).
The only POSIX-compliant way to send command output to a command that runs in the current shell is to use a command substitution ($(...)) inside a double-quoted here-document:
while read -r a; do echo "$a"; done <<EOF
$(grep 'Iteration' data.txt)
EOF
Streamlining your code and making it more robust:
The rest of your code has some non-obvious pitfalls that are worth addressing:
Double-quote your variable references (e.g., echo "$a" instead of echo $a), unless you specifically want word-splitting and globbing (filename expansion) applied to the values; word splitting and globbing are two kinds of shell expansions.
Similarly, don't use for to iterate over an (of necessity unquoted) variable reference (don't use for word in $a, in your case), unless you want globbing applied to the individual words - see what happens when you run a='one *'; for word in $a; do echo "$word"; done
You could turn globbing off beforehand (set -f) and back on after (set +f), but it's better to use read -ra words ... to read the words into an array first, and then safely iterate over the array elements with for word in "${words[@]}"; ... - note the "..." around the array variable reference.
Always use -r with read; without it, rarely used \-preprocessing is applied, which will "eat" embedded \ chars.
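A quick illustration of what read without -r does to backslashes:

```shell
#!/bin/bash
# Without -r, read treats backslash as an escape character and consumes it.
IFS= read -r raw    <<< 'C:\temp\new'   # -r: backslashes kept literal
IFS= read    cooked <<< 'C:\temp\new'   # no -r: \t and \n lose their backslash
printf '%s\n' "$raw"     # C:\temp\new
printf '%s\n' "$cooked"  # C:tempnew
```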
If we heed the advice above, apply a few additional tweaks, and use a process substitution to feed grep's output to the while loop, we get:
count=1
while read -r a; do # Note the -r
if (( ++count <= 2 )); then
echo "$a"
# Split $a safely into words and store the words in
# array variable ${words[@]}.
read -ra words <<<"$a" # Note the -a to read into an *array*.
# Loop over the words (elements of the array).
# Note: To simply print the words, you could use
# `printf '%s\n' "${words[@]}"` instead of the loop.
for word in "${words[@]}"; do
echo "$word"
done
fi
done < <(grep 'Iteration' data.txt)
Note: As written, you don't need a loop at all, because the if branch only runs on the 1st iteration; afterwards, the loop keeps reading input without doing anything.
Finally, as a general alternative for larger input sets, consider Ed Morton's helpful answer, which is much faster due to using awk to process your input file, whereas looping in shell code is generally slow.
I want to make sure my script will work when the user uses a syntax like this:
script.sh firstVariable < SecondVariable
For some reason I can't get this to work.
I want $1=firstVariable
And $2=SecondVariable
But for some reason my script thinks only firstVariable exists?
This is a classic X-Y problem. The goal is to write a utility in which
utility file1 file2
and
utility file1 < file2
have the same behaviour. It seems tempting to find a way to somehow translate the second invocation into the first one by (somehow) figuring out the "name" of stdin, and then using that name the same way as the second argument would be used. Unfortunately, that's not possible. The redirection happens before the utility is invoked, and there is no portable way to get the "name" of an open file descriptor. (Indeed, it might not even have a name, in the case of other_cmd | utility file1.)
So the solution is to focus on what is being asked for: make the two behaviours consistent. This is the case with most standard utilities (grep, cat, sort, etc.): if the input file is not specified, the utility uses stdin.
In many unix implementations, stdin does actually have a name: /dev/stdin. In such systems, the above can be achieved trivially:
utility() {
utility_implementation "$1" "${2:-/dev/stdin}"
}
where utility_implementation actually does whatever is required to be done. The syntax of the second argument is normal default parameter expansion; it represents the value of $2 if $2 is present and non-empty, and otherwise the string /dev/stdin. (If you leave out the : so that it is "${2-/dev/stdin}", then it won't do the substitution if $2 is present and empty, which might be better.)
Another way to solve the problem is to ensure that the first syntax becomes the same as the second syntax, so that the input is always coming from stdin even with a named file. The obvious simple approach:
utility() {
if (( $# < 2 )); then
utility_implementation "$1"
else
utility_implementation "$1" < "$2"
fi
}
Another way to do this uses the exec command with just a redirection to redirect the shell's own stdin. Note that we have to do this inside a subshell ((...) instead of {...}) so that the redirection does not apply to the shell which invokes the function:
utility() (
if (( $# > 1 )); then exec < "$2"; fi
# implementation goes here. $1 is file1 and stdin
# is now redirected to $2 if $2 was provided.
# ...
)
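As a concrete sketch of this pattern (count_payload and its line-counting body are made up for illustration), both invocation styles then behave identically:

```shell
#!/bin/bash
# Hypothetical utility: $1 is a label, $2 (or stdin) supplies the data.
count_payload() (
  if (( $# > 1 )); then exec < "$2"; fi
  # From here on, stdin is the data source either way.
  wc -l
)

f=$(mktemp)
printf 'a\nb\n' > "$f"
count_payload label "$f"      # reads the named file
count_payload label < "$f"    # reads via stdin; same result
rm -f "$f"
```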
To make the content of stdin the final argument to the script (so if you pass one argument plus < file, the file's content becomes the second argument), you can use the below:
#!/bin/bash
##read loop to read in stdin
while read -r line
do
## This just checks if the variable is empty, so a newline isn't appended on the front
[[ -z $Vars ]] && Vars="$line" && continue
## Appends every line read to variable
Vars="$Vars"$'\n'"$line"
## While read loop using stdin
done < /dev/stdin
##Set re-sets the arguments to the script to the original arguments and then the new argument we derived from stdin
set -- "$@" "$Vars"
## Echo the new arguments
echo "$@"
Script doesn't work when I want to use standard input when there are no arguments (files) passed. Is there any way how to use stdin instead of a file in this code?
I tried this:
if [ ! -n $1 ] # check if argument exists
then
$1=$(</dev/stdin) # if not use stdin as an argument
fi
var="$1"
while read line
do
... # find the longest line
done <"$var"
For a general case of wanting to read a value from stdin when a parameter is missing, this will work.
$ echo param | script.sh
$ script.sh param
script.sh
#!/bin/bash
set -- "${1:-$(</dev/stdin)}" "${@:2}"
echo $1
Just substitute bash's specially interpreted /dev/stdin as the filename:
VAR=$1
while read blah; do
...
done < "${VAR:-/dev/stdin}"
(Note that bash will actually use that special file /dev/stdin if built for an OS that offers it, but since bash 2.04 will work around that file's absence on systems that do not support it.)
pilcrow's answer provides an elegant solution; this is an explanation of why the OP's approach didn't work.
The main problem with the OP's approach was the attempt to assign to positional parameter $1 with $1=..., which won't work.
The LHS is expanded by the shell to the value of $1, and the result is interpreted as the name of the variable to assign to - clearly, not the intent.
The only way to assign to $1 in bash is via the set builtin.
The caveat is that set invariably sets all positional parameters, so you have to include the other ones as well, if any.
set -- "${1:-/dev/stdin}" "${@:2}" # "${@:2}" expands to all remaining parameters
(If you expect only at most 1 argument, set -- "${1:-/dev/stdin}" will do.)
The above also corrects a secondary problem with the OP's approach: the attempt to store the contents rather than the filename of stdin in $1, since < is used.
${1:-/dev/stdin} is an application of bash parameter expansion that says: return the value of $1, unless $1 is undefined (no argument was passed) or its value is the empty string ("" or '' was passed). The variation ${1-/dev/stdin} (no :) would only return /dev/stdin if $1 is undefined (if it contains any value, even the empty string, that value would be returned).
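The distinction is easy to see with a small helper function (show is not part of the original answer, just an illustration):

```shell
#!/bin/bash
# ${1:-...} substitutes when $1 is unset OR empty; ${1-...} only when unset.
show() { echo "${1:-colon-default} / ${1-plain-default}"; }

show            # colon-default / plain-default  ($1 unset: both substitute)
show ""         # colon-default /                (empty: only ':-' substitutes)
show value      # value / value
```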
If we put it all together:
# Default to filename '/dev/stdin' (stdin), if none was specified.
set -- "${1:-/dev/stdin}" "${@:2}"
while read -r line; do
... # find the longest line
done < "$1"
But, of course, the much simpler approach would be to use ${1:-/dev/stdin} as the filename directly:
while read -r line; do
... # find the longest line
done < "${1:-/dev/stdin}"
or, via an intermediate variable:
filename=${1:-/dev/stdin}
while read -r line; do
... # find the longest line
done < "$filename"
Variables are assigned a value by Var=Value and that variable is used by e.g. echo $Var. In your case, that would amount to
1=$(</dev/stdin)
when assigning the standard input. However, variable names are not allowed to start with a digit character, so this fails. See the question bash read from file or stdin for ways to solve this.
Here is my version of script:
#!/bin/bash
file=${1--} # POSIX-compliant; ${1:--} can be used as well.
while IFS= read -r line; do
printf '%s\n' "$line"
done < <(cat -- "$file")
If no file is given as an argument, the script reads from standard input (cat interprets - as stdin).
See more examples: How to read from file or stdin in bash? on Stack Overflow.