How do I pipe into printf? [duplicate]

This question already has answers here:
Piping not working with echo command [duplicate]
(4 answers)
How to pass command output as multiple arguments to another command
(5 answers)
Closed 1 year ago.
I'm going nuts trying to understand what the problem is with this simple example (zsh or bash):
echo -n "6842" | printf "%'d"
The output is 0... why though? I'd like the output to be 6,842.
Thanks in advance; I've had no luck for an hour now using Google trying to figure this out!

printf doesn't read the values it formats from standard input; it takes them directly as command-line arguments. For example, this works:
$ printf "%'d" 6842
6,842
You can convert the output of a command into command-line arguments using command substitution:
$ printf "%'d" $(echo -n 6842)
6,842
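A side note, not from the original answers: the ' flag takes its thousands separator from the locale, so in the C/POSIX locale no commas are printed at all. Forcing a locale makes the example reproducible (a sketch, assuming the en_US.UTF-8 locale is installed):
$ LC_NUMERIC=en_US.UTF-8 printf "%'d\n" 6842
6,842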
If you want to invoke printf inside a pipeline, you can use xargs to read input and execute printf with the appropriate arguments:
echo -n "6842" | xargs printf "%'d"

printf does not format data passed to it on standard input; it takes a set of arguments, the first of which is the format, and the remainder are the values to display.
Luckily, this is exactly what xargs is for; to quote the manual:
xargs - build and execute command lines from standard input
So instead of piping to printf directly, you can pipe to xargs, and tell it to run printf for you with the given arguments. In short:
echo -n "6842" | xargs printf "%'d"

Related

How to run bash for each line in command output [duplicate]

This question already has answers here:
How can I loop over the output of a shell command?
(4 answers)
Closed 2 months ago.
I am building a bash script that is supposed to put each line of the output of one command into a variable and then run some commands on that. I am doing it like this:
for i in `cmd`
do
echo i=$i
lang=$(echo $i | cut -d '"' -f 6)
echo lang=$lang
#some stuff
done
My problem is that for splits on both spaces and newlines when creating the different $i values, and I want it to split on newlines only, because every line may contain a couple of spaces and I want each line handled as a whole regardless.
I googled but found nothing that really helped, only suggestions to use xargs, which doesn't help me because I need to run not just one command but a couple, after creating some variables and running some if statements that decide which command, if any, to run.
If you want to read cmd's output line by line, you can do it using a while loop and bash's built-in read command:
cmd | while IFS= read -r i
do
echo "i=${i}"
lang="$(echo "${i}" | cut -d '"' -f 6)"
echo "lang=${lang}"
#some stuff
done
Use " around a variable's de-reference to avoid problems with spaces inside it's value.

How can I create a filename using bash command parameter? [duplicate]

This question already has answers here:
How to use > in an xargs command?
(4 answers)
Closed 2 years ago.
I'm trying to create a filename using a command parameter, but I'm not sure how to go about doing it. This is what I was trying:
echo "AZ" |xargs date >> $1.txt
I'm trying to create a file named AZ.txt with the date in it.
If you must use xargs, then you'll have to wrap the rest of it in a shell script to delay execution of the redirection until you have constructed the filename:
echo AZ | xargs sh -c 'date >> "$1".txt' sh
The trailing "sh" is to force the xargs argument "AZ" into $1 instead of $0
If the name of the file comes from some command, you can use command substitution:
date > "$(echo AZ)".txt

How to safely echo all arguments of a script? [duplicate]

This question already has answers here:
Bash: echo string that starts with "-"
(4 answers)
Closed 2 years ago.
I am writing a bash script that must echo all of its arguments, which is surprisingly difficult to do.
The naive implementation looks like this:
#!/bin/bash
echo "$#"
However, that fails with input such as:
> ./script.sh -n -e -v -e -r
-v -e -r>
How can I make this more robust, such that the above results in:
> ./script.sh -n -e -v -e -r
-n -e -v -e -r
>
The echo command's behavior may differ between systems. The safest way is to use printf:
printf '%s\n' "$*"
According to posix:
It is not possible to use echo portably across all POSIX systems unless both -n (as the first argument) and escape sequences are omitted.
The printf utility can be used portably to emulate any of the traditional behaviors of the echo utility ...
Using printf instead of echo:
#!/bin/bash
printf "%s " "$#"
printf "\n"
You can use printf as well:
#!/bin/bash
printf "%s\n" "$*"
This one adds a space at the beginning but is rather simple:
#!/bin/bash
echo "" "$#"

Correct way to assign regex replace to a variable in bash [duplicate]

This question already has answers here:
Send string to stdin
(6 answers)
Command not found error in Bash variable assignment
(5 answers)
How do I set a variable to the output of a command in Bash?
(15 answers)
Closed 3 years ago.
I am trying to assign the result of a regex substitution to a variable in bash. For instance, when I run
echo $initialvar | perl -pe 's/.+(Some_Dir\/)(.+)/\2/'
I get the desired output. How would I assign the resulting output to newvar?
I've tried:
newvar= "$($initialvar | perl -pe 's/.+(Some_Dir\/)(.+)/\2/')"
newvar= echo $initialvar | perl -pe 's/.+(Some_Dir\/)(.+)/\2/'
but neither of them works.
Your question seems to consist of two challenges: one is assigning the output of a command to a Bash variable, and the other is piping the content of a variable to the standard input of a Perl program.
One
foo=$(bar)
runs bar and saves its output in the Bash variable $foo.
The other
Any of
echo "$foo" | bar
bar < <(echo "$foo")
bar <<< "$foo"
runs bar with the content of $foo piped to its standard input.
Combining the two
baz=$(bar <<< "$foo")
sets $baz to the value produced by bar having the content of $foo sent to its standard input.
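Applied to the one-liner from the question, that combination looks like this (same substitution as in the question):
newvar=$(perl -pe 's/.+(Some_Dir\/)(.+)/\2/' <<< "$initialvar")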
Secondly, a few Perl-related suggestions peripherally related to your question:
I might instead use perl -nE:
-n will loop through each line like -p, but won't print by default.
-E will evaluate like -e, but with experimental features like say enabled.
You can avoid escaping the slash in Some_Dir/ by using another separator, e.g. s!...!...! or m!...!. And since you're just printing "Some_Dir/" when it matches, you might as well skip the s// replacement and print it directly:
perl -nE 'say (m!Some_Dir/! ? "Some_Dir/" : $_)'
So:
newvar=$(perl -nE 'say (m!Some_Dir/! ? "Some_Dir/" : $_)' <<< "$initialvar")
You can use either of the following:
newvar=$(echo "$initialvar" | perl -pe 's/.+(Some_Dir\/)(.+)/\2/')
newvar=`echo "$initialvar" | perl -pe 's/.+(Some_Dir\/)(.+)/\2/'`
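If you would rather not spawn Perl at all, bash's own regex matching can do the same extraction; a sketch, assuming Some_Dir/ occurs only once in the value:
if [[ $initialvar =~ Some_Dir/(.+) ]]; then
    newvar="${BASH_REMATCH[1]}"   # everything after Some_Dir/
fi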

awk command does not work properly when assigning output to variable [duplicate]

This question already has answers here:
How do I set a variable to the output of a command in Bash?
(15 answers)
Closed 5 years ago.
I'm trying to convert spaces to underscores in a file name; my script looks like this:
old_file=/home/somedir/otherdir/foobar 20170919.csv
new_file="$(basename "$old_file")" | awk 'gsub(" ","_")'
This script works fine when I use it with the echo command,
echo "$(basename "$old_file")" | awk 'gsub(" ","_")'
but when it comes to assigning the output to a variable, it doesn't work...
Does anybody have an idea why?
Actually there's no need for awk. Note that the substitution below replaces all spaces with underscores, not just in the filename; it applies to the whole path too:
$ old_file="/home/somedir/otherdir/foobar 20170919.csv"
$ newfile="${old_file// /_}"
$ echo "$newfile"
/home/somedir/otherdir/foobar_20170919.csv
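For completeness, here is why the original assignment misbehaves: the | makes the assignment the first stage of a pipeline, so it runs in a subshell and is lost when the pipeline finishes, and since an assignment writes nothing to standard output, awk gets empty input anyway. If you do want to keep the awk step, wrap the whole pipeline in the command substitution:
$ new_file="$(basename "$old_file" | awk 'gsub(" ","_")')"
$ echo "$new_file"
foobar_20170919.csv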
