Store Result of Bash Command in Shell Variable [duplicate] - bash

This question already has answers here:
How do I set a variable to the output of a command in Bash?
(15 answers)
Closed 4 months ago.
I am trying to store the result of a bash command within a for loop for use in a command. This is what I currently have:
for filename in /home/WIN/USER/files/*
var=$(basename ${filename%.*}) | awk -F'[_.]' '{print $1}'
do echo var
done
However, I am getting these errors:
./script.sh: line 2: syntax error near unexpected token `var=$(basename ${filename%.*})'
./script.sh: line 2: `var=$(basename ${filename%.*}) | awk -F'[_.]' '{print $1}''
Does anyone know how to fix this or how to do what I am trying to do?
Thanks.

Your for statement is malformed (the loop body must come between do and done), and the assignment belongs inside that body. You should write something like this:
for filename in /home/WIN/USER/files/*; do
var=$( your shell code goes here ) # you assign the output of the shell code here
echo "$var" # you echo the results here
done
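Applying that template to the original command, a runnable sketch looks like the following (the sample directory and filenames here are stand-ins for the asker's real files; the pipe into awk goes inside the command substitution):

```shell
#!/bin/sh
# Corrected loop: the whole basename | awk pipeline sits inside $(...),
# and the result is echoed via the variable. Sample files stand in for
# /home/WIN/USER/files/*.
dir=$(mktemp -d)
touch "$dir/report_20240101.csv" "$dir/data.txt"
for filename in "$dir"/*; do
    var=$(basename "${filename%.*}" | awk -F'[_.]' '{print $1}')
    echo "$var"
done
rm -r "$dir"
```

This prints the part of each filename before the first `_` or `.` (here, `data` and `report`).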

how to fix redirection unexpected syntax error [duplicate]

This question already has answers here:
Difference between sh and Bash
(11 answers)
Closed 1 year ago.
I am running a bash script in which one line is this:
VERSION=$(awk -F. '{print $2}' <<< $BISMARK)
VERSION=$(cut -d '.' -f2 <<< $BISMARK )
but I am getting the following error from this line (when I comment it out, the error disappears):
Syntax error: redirection unexpected
do you know what the problem is?
It would seem you are not actually running the script with Bash, but with some other shell instead. Your code works fine for me on Bash, but executing it with BusyBox's ash for example results in the error you mentioned.
What is the first line of your script? It should be either:
#!/bin/bash
or:
#!/usr/bin/env bash
Also, how do you execute the script? If the first line is correct, you should run it like this:
./script.sh
or alternatively like this:
bash script.sh
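If the script must stay runnable under a plain POSIX sh, the here-string line can be rewritten portably, since `<<<` is a bash/zsh feature that dash and ash do not support. A sketch, using an example value for `$BISMARK` (the real value comes from elsewhere in the asker's script):

```shell
#!/bin/sh
# Portable alternative to: VERSION=$(cut -d '.' -f2 <<< $BISMARK)
# printf feeds the value through a pipe instead of a here-string.
BISMARK="v0.22.3"   # example value for illustration
VERSION=$(printf '%s\n' "$BISMARK" | cut -d. -f2)
echo "$VERSION"     # prints: 22
```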

set -e fails to catch all errors when output piped to awk [duplicate]

This question already has answers here:
Exit when one process in pipe fails
(2 answers)
Closed 4 years ago.
I am trying to understand how error catching works when piped to awk in a bash script. I have a bash script with :
#!/bin/bash
set -e
echo "hello"
OUTPUT=`ls /root/* | awk '{print $0}'`
echo "world"
When I run this as a non-privileged user, the output is
hello
ls: cannot access /root/*: Permission denied
world
despite the fact that set -e should cause my script to end if any error occurs. The line echo "world" should not be executed because the script does generate an error.
Now if I replace the awk statement with grep poo, the script exits as expected.
QUESTION: What is awk doing that manages to hide the errors from bash?
This is by definition: the exit status of a pipeline is the exit status of the last command in the pipeline, so the failure of ls is masked by the success of awk.
You can trap this specific error in Awk explicitly:
ls /root/* | awk 'BEGIN {x=1} {x=0; print $NF } END { exit x }'
This causes Awk to set its exit status to reflect whether it received any input at all. This is similar to the grep behavior you discovered (report success if any matches were found).

How to read input from command line? [duplicate]

This question already has answers here:
get the user input in awk
(3 answers)
Closed 4 years ago.
I have to read certain things from the command line to a shell script which calls an awk script. As far as I know, reading into an awk script is not possible, so I would have to read it into the shell script which then would pass the value of the variable to the awk script so that I could work with it. How can it be done?
For example in Bash you use read to read input into a variable and pass the variable to awk with awk's command line parameter:
$ read -p PROMPT var ; awk -v awkvar="$var" 'BEGIN{print "You wrote: " awkvar}'
PROMPTfoo
You wrote: foo
Here you go:
Create a shell script:
vi my.sh
Now add the following script (this version guards the loop so it stops when the arguments run out, instead of echoing a trailing empty line):
#!/bin/bash
while [ $# -gt 0 ]
do
    #add your awk command here with "$1"
    echo "$1"
    shift
done
Save the file and make it executable:
chmod +x my.sh
Now run:
./my.sh hello world how are you doing
Output:
hello
world
how
are
you
doing
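An alternative sketch (my suggestion): instead of looping in the shell and invoking awk once per argument, hand every argument to a single awk process on its stdin. `set --` below stands in for the script's real `"$@"`:

```shell
#!/bin/sh
# Feed all command-line arguments to one awk invocation.
set -- hello world   # stand-in for the script's real arguments "$@"
printf '%s\n' "$@" | awk '{print NR ": " $0}'
```

This prints each argument on its own numbered line and spawns only one awk process regardless of the argument count.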

awk command does not work properly when assigning output to variable [duplicate]

This question already has answers here:
How do I set a variable to the output of a command in Bash?
(15 answers)
Closed 5 years ago.
I'm trying to convert spaces to underscores in a file name; my script is like below.
old_file=/home/somedir/otherdir/foobar 20170919.csv
new_file="$(basename "$old_file")" | awk 'gsub(" ","_")'
This script works fine when I use with echo command,
echo "$(basename "$old_file")" | awk 'gsub(" ","_")'
but when it comes to assigning the output to variables, it doesn't work...
Does anybody know the idea?
Actually there is no need for awk. Note that the parameter expansion below replaces every space with an underscore in the whole string, not just in the filename part; spaces in the path would be replaced too.
$ old_file="/home/somedir/otherdir/foobar 20170919.csv"
$ newfile="${old_file// /_}"
$ echo "$newfile"
/home/somedir/otherdir/foobar_20170919.csv
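For completeness, the original attempt can also be repaired by moving the pipe inside the command substitution. An assignment writes nothing to stdout, so piping `new_file=...` into awk gave awk no input; `echo` worked because echo does write to stdout. A sketch:

```shell
#!/bin/sh
old_file="/home/somedir/otherdir/foobar 20170919.csv"
# The whole basename | awk pipeline goes inside $(...).
# '{gsub(" ","_")} 1' substitutes and then prints every line, which is
# slightly more robust than the original 'gsub(" ","_")' (that form only
# prints lines where at least one substitution happened).
new_file=$(basename "$old_file" | awk '{gsub(" ","_")} 1')
echo "$new_file"   # foobar_20170919.csv
```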

Error when storing the first line of a file to a variable [duplicate]

This question already has answers here:
How do I set a variable to the output of a command in Bash?
(15 answers)
Closed 7 years ago.
I am doing basic programs with Linux shell scripting (bash). I want to read the first line of a file and store it in a variable.
My input file :
100|Surender|CHN
101|Raja|BNG
102|Kumar|CHN
My shell script is below
first_line=cat /home/user/inputfiles/records.txt | head -1
echo $first_line
I am executing the shell script with bash records.sh
It throws me error as
/home/user/inputfiles/records.txt line 1: command not found
Could someone help me with this?
The line
first_line=cat /home/user/inputfiles/records.txt | head -1
temporarily sets the variable first_line to the string cat and then tries to execute /home/user/inputfiles/records.txt itself as a command, resulting in the error.
You should use command substitution to execute cat .../records.txt | head -1 as a command:
first_line=`cat /home/user/inputfiles/records.txt | head -1`
echo $first_line
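In modern scripts the `$(...)` form is generally preferred over backticks, and head can open the file itself, making the cat and the pipe unnecessary. A sketch, with sample data standing in for the real records.txt:

```shell
#!/bin/sh
# Sample data stands in for /home/user/inputfiles/records.txt.
printf '100|Surender|CHN\n101|Raja|BNG\n102|Kumar|CHN\n' > records.txt
first_line=$(head -n 1 records.txt)
echo "$first_line"   # 100|Surender|CHN
rm records.txt
```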
The other answer addresses the obvious mistake you made. However, that is not the idiomatic way of reading the first line of a file. Please consider this instead (more efficient: it avoids a subshell, a pipe, and two external processes, one of which is a useless use of cat):
IFS= read -r first_line < /home/user/inputfiles/records.txt
