I'm getting this error on an HP-UX machine:
+ IFS=;
/home/machine1/folder/borrado_de_logs.sh[45]: read: A specified flag is not valid for this command.
And I'm using this code:
head -1 $rutatemporal/logfechas.log > $rutatemporal/cabecera.txt
cabecera=`cat $rutatemporal/cabecera.txt`
IFS=';' read -a arreglo<<EOF
$cabecera
EOF
It seems that on HP-UX, read -a is not allowed. What argument should I use with read?
The content of cabecera.txt is this:
2019-02-01;/home/user/deletelogs/somelog.log
That's probably because -a is not a POSIX-compliant flag for the read command, so it's not surprising that the default shell on your HP-UX machine doesn't support it.
You can still use read without -a to split the line and store the fields in individual variables, as below. You also don't need a here-doc to read from the input file; use read directly on the file itself:
IFS=\; read -r date path < "$rutatemporal"/cabecera.txt
echo "$date"
echo "$path"
Type
$ help read
to see the available options and their meanings.
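If named variables aren't enough and you want the fields in an indexed form, a POSIX-portable alternative is to split the line into the positional parameters with set -- (a sketch; it assumes the fields contain no glob characters):

```shell
# Read the first line and split it on ';' into $1, $2, ...
line=$(head -1 cabecera.txt)
old_ifs=$IFS
IFS=';'
set -- $line      # the unquoted expansion is field-split on IFS
IFS=$old_ifs
echo "field 1: $1"    # 2019-02-01
echo "field 2: $2"    # /home/user/deletelogs/somelog.log
```

This works in any POSIX shell, including the default HP-UX sh.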
Related
I'm very new to bash scripting and here's what I'm trying to do:
1 - Read a file - this file is a list of names
2 - Ask user if they want to delete {name}
3 - If user enters y, proceed
This is what my script looks like so far:
while IFS= read -r repo
do
read -p "Do you want to delete $repo" ip
echo $ip
if [ "$ip" == "y" ]
then
#do something
fi
done < "$filename"
The read -p line does not wait for user input. I sort of understood what/where the problem is, and I tried to resolve it by reading this link - https://bash.cyberciti.biz/guide/Reads_from_the_file_descriptor_(fd)
But somehow I was unable to resolve this issue. What am I doing wrong? Please help!
Use a different file descriptor for the named file. You know where that data is coming from; you don't know where standard input might be redirected from, so leave it alone.
while IFS= read -r -u 3 repo # Read from file descriptor 3
do
read -p "Do you want to delete $repo" ip # Read from whatever standard input happens to be
echo "$ip"
if [ "$ip" = "y" ]
then
#do something
fi
done 3< "$filename" # Supply $filename on file descriptor 3
-u is bash-specific, but I note you are already using another bash-specific feature, the -p option to read. The POSIX way to read from something other than standard input is IFS= read -r repo <&3 (which says, copy file descriptor 3 onto standard input for this command).
See this question:
Does bash support doing a read nested within a read loop?
Essentially, you're reading the file through standard input, which is the same input stream you type into, so when you prompt the user, the script treats the file's contents as the user's input. If you instead read the file on another input stream, it won't interfere.
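Putting the POSIX-only pieces together, the same loop without -u (and with printf standing in for the bash-specific read -p) might look like this (a sketch; the filename and the "deleting" action are placeholders):

```shell
#!/bin/sh
filename=repos.txt    # hypothetical list of names, one per line

while IFS= read -r repo <&3    # read the list from fd 3 (POSIX)
do
    printf 'Do you want to delete %s? ' "$repo"
    IFS= read -r ip            # read the answer from standard input
    if [ "$ip" = "y" ]
    then
        echo "deleting $repo"  # placeholder for the real action
    fi
done 3< "$filename"
```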
I am trying to write an fgrep statement removing records with a full record match from a file. I can do this on the command line, but not inside a ksh script. The code I am using boils down to these 4 lines of code:
Header='abc def|ghi jkl' #I use the head command to populate this variable
workfile=abc.txt
command="fgrep -Fxv \'$Header\' $workfile" >$outfile
$command
When I echo $command, the command is exactly what I would type on the command line (with the single quotes), and that works on the command line. When I execute it within the ksh script (file), the single quotes seem not to be recognized, because the errors show it is parsing on spaces.
I have tried back ticks, exec, eval, double quotes instead of single quotes, and not using the $command variable. The problem remains.
I can do this on the command line, but not inside a ksh script
Here's a simple, portable, reliable solution using a heredoc.
#!/usr/bin/env ksh
workfile=abc.txt
outfile=out.txt
IFS= read -r Header <<'EOF'
abc def|ghi jkl
EOF
IFS= read -r command <<'EOF'
grep -Fxv "$Header" "$workfile" > "$outfile"
EOF
eval "$command"
Explanation :
(Comments can't be added to the script above because they would affect the lines in the heredoc)
IFS= read -r Header <<'EOF' # Line separated literal strings
abc def|ghi jkl # Set into the $Header variable
EOF # As if it were a text file
IFS= read -r command <<'EOF' # Command to execute
grep -Fxv "$Header" "$workfile" > "$outfile" # As if it were typed into
EOF # the shell command line
eval "$command" # Execute the command
The above example is the same as having a text file called header.txt, which contains the contents: abc def|ghi jkl and typing the following command:
grep -Fxvf header.txt abc.txt
The heredoc addresses the problem of the script operating differently than the command line as a result of quoting/expansions/escaping issues.
A word of caution regarding eval:
The use of eval in this example is specific. Please see Eval command and security issues for information on how eval can be misused and cause potentially very damaging results.
More Detail / Alternate Example:
For the sake of completeness, clarity, and ability to apply this concept to other situations, some notes about the heredoc and an alternative demonstration:
This implementation of the heredoc in this example is specifically designed with the following criteria:
Literal string assignment of contents, to the variables (using 'EOF')
Use of the eval command to expand and execute the variable references stored by the heredoc.
File or heredoc ?
One strength of using a heredoc combined with grep -F (fgrep), is the ability to treat a section of the script as if it were a file.
Case for file:
You want to frequently paste "pattern" lines into the file, and remove them as necessary, without having to modify the script file.
Case for heredoc:
You apply the script in an environment where specific files already exist, and you want to match specific exact literal patterns against it.
Example:
Scenario: I have 5 VPS Servers, and I want a script to produce a new fstab file but to ensure it doesn't contain the exact line:
/dev/xvda1 / ext3 errors=remount-ro,noatime,barrier=0 0 1
This scenario fits the type of situation addressed in this question. I could take the boilerplate from the code in this answer and modify it as follows:
#!/usr/bin/env ksh
workfile=/etc/fstab
IFS= read -r Header <<'EOF'
/dev/xvda1 / ext3 errors=remount-ro,noatime,barrier=0 0 1
EOF
IFS= read -r command <<'EOF'
grep -Fxv "$Header" "$workfile"
EOF
eval "$command"
This would give me a new fstab file, without the line contained in the heredoc.
Bash FAQ #50: I'm trying to put a command in a variable, but the complex cases always fail! provides comprehensive guidance - while it is written for Bash, most of it applies to Ksh as well.[1]
If you want to stick with storing your command in a variable (defining a function is the better choice), use an array, which bypasses the quoting issues:
#!/usr/bin/env ksh
Header='abc def|ghi jkl'
workfile=abc.txt
outfile=out.txt
# Store command and arguments as elements of an array
command=( 'fgrep' '-Fxv' "$Header" "$workfile" )
# Invoke the array as a command.
"${command[@]}" > "$outfile"
Note: only a simple command can be stored in an array, and redirections can't be part of it.
[1] The function examples use local to create local variables, which ksh doesn't support. Omit local to make do with shell-global variables instead, or use function <name> {...} syntax with typeset instead of local to declare local variables in ksh.
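For reference, the function alternative mentioned above might look like this (a sketch; grep -F is used here in place of the legacy fgrep name, and the file names are illustrative):

```shell
#!/usr/bin/env ksh
Header='abc def|ghi jkl'
workfile=abc.txt
outfile=out.txt

# A function takes arguments like any other command; normal quoting
# applies, so nothing is re-parsed and no eval is needed.
filter_header() {
    grep -Fxv "$1" "$2"
}

filter_header "$Header" "$workfile" > "$outfile"
```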
I'm trying to run a simple shell script example:
STR="qwe;ert;rty;tii"
IFS=';' read -r NAMES <<< "$STR"
This is giving me the following error message:
syntax error: got <&, expecting Word
I'm not exactly sure why this isn't working. I thought the syntax I used was
correct and I tried comparing to other examples and I saw the syntax was almost
identical.
Any feedback would help
Thanks
This is MKS bash, not GNU bash. It's not really bash, and doesn't support the genuine shell's syntax.
There are perfectly good (...well, reasonably adequate) builds of GNU bash for Windows. Use them.
Particularly, in real bash, to split a semicolon-separated string into an array of names:
str='qwe;ert;rty;tii'
IFS=';' read -r -a names <<<"$str"
...which you can then verify with
declare -p names
or emit one-to-a-line with
printf '%s\n' "${names[@]}"
Is it possible to extract certain files based on a master list via sftp?
Example:
The directory contains the following files:
aa.txt
bb.txt
cc.txt
masterlist.txt contains:
aa.txt
bb.txt
Files that should be extracted:
aa.txt
bb.txt
Edit:
Thanks @shellter for your feedback.
I did try to write my own code but wasn't able to find samples I could work from (I'm not a Unix person, btw).
Anyhow, as for your suggestion about using a while-read-line loop, I've tried it but I am getting an Invalid command error.
#!/bin/ksh
file=MasterList.txt
while IFS= read -r line
do
echo "fetching $line"
sftp user@192.168.1.101
cd /data/EP_files/balex
get "$line"
bye
done <"$file"
Lastly, if my master file contains a list of 10k files, is this kind of approach OK performance-wise?
Thank you
You need to create the entire sequence of SFTP commands up front - including the individual get commands for all files in the input list - and then invoke sftp only once, passing the command list via stdin (standard input):
#!/usr/bin/env ksh
file=MasterList.txt
sftp -b - user@192.168.1.101 <<EOF
cd /data/EP_files/balex
$(sed -n 's/^file_.*/get "&"/p' "$file")
bye
EOF
The <<EOF ... EOF block is a so-called here-document, which allows passing multiline strings (optionally with embedded variable references and commands) via stdin.
sed -n 's/^file_.*/get "&"/p' "$file" embeds a get command for each filename in $file that starts with file_, ignoring any other lines (as requested by the OP in a comment).
The above assumes that your sftp utility supports "batch" mode via the -b option, with - meaning that the command list is read from stdin.
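To see what the command substitution expands to, you can run the sed step by itself on a sample list (the file names here are illustrative):

```shell
printf 'file_aa.txt\nfile_bb.txt\nother.txt\n' > MasterList.txt
# -n suppresses default output; only lines starting with file_ are
# turned into get commands (& stands for the whole matched line)
sed -n 's/^file_.*/get "&"/p' MasterList.txt
# get "file_aa.txt"
# get "file_bb.txt"
```

Lines that don't match the pattern (other.txt above) are silently dropped from the batch.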
I have a file like
name1=value1
name2=value2
I need to read this file using shell script and set variables
$name1=value1
$name2=value2
Please provide a script that can do this.
I tried the first answer below, i.e. sourcing the properties file but I'm getting a problem if the value contains spaces. It gets interpreted as a new command after the space. How can I get it to work in the presence of spaces?
If all lines in the input file are of this format, then simply sourcing it will set the variables:
source nameOfFileWithKeyValuePairs
or
. nameOfFileWithKeyValuePairs
Use:
while read -r line; do declare "$line"; done <file
Sourcing the file using . or source has the problem that you can also put commands in there that are executed. If the input is not absolutely trusted, that's a problem (hello rm -rf /).
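For shells without declare, a POSIX sketch of the same idea that also skips blank lines and comments, and preserves spaces in values (it assumes the keys are valid variable names):

```shell
while IFS= read -r line; do
    case $line in
        ''|'#'*) continue ;;     # skip blank lines and comments
    esac
    key=${line%%=*}              # everything before the first =
    value=${line#*=}             # everything after the first =
    eval "$key=\$value"          # \$value keeps the value from being re-parsed
done < file
```

Unlike sourcing, nothing in the value is ever executed as a command.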
You can use read to read key value pairs like this if there's only a limited known amount of keys:
read_properties()
{
file="$1"
while IFS="=" read -r key value; do
case "$key" in
"name1") name1="$value" ;;
"name2") name2="$value" ;;
esac
done < "$file"
}
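A quick usage sketch of the function above (the file name and contents are illustrative):

```shell
printf 'name1=hello\nname2=world\n' > some.properties

read_properties()
{
    file="$1"
    while IFS="=" read -r key value; do
        case "$key" in
            "name1") name1="$value" ;;
            "name2") name2="$value" ;;
        esac
    done < "$file"
}

read_properties some.properties
echo "$name1"   # hello
echo "$name2"   # world
```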
If your file location is /location/to/file and the key is mykey:
grep mykey "/location/to/file" | awk -F= '{print $2}'
Improved version of @robinst's answer:
read_properties()
{
file="$1"
while IFS="=" read -r key value; do
case "$key" in
'#'*) ;;
*)
eval "$key=\"$value\""
esac
done < "$file"
}
Changes:
Dynamic key mapping instead of static
Supports (skips) comment lines
A nice one is also the solution by @kurumi, but it isn't supported in BusyBox.
And here a completely different variant:
eval "`sed -r -e "s/'/'\\"'\\"'/g" -e "s/^(.+)=(.+)\$/\1='\2'/" $filename`"
(I tried my best with the escaping, but I'm not sure whether that's enough.)
Suppose the name of your file is some.properties:
#!/bin/sh
# Sample shell script to read and act on properties
# source the properties:
. some.properties
# Then reference them:
echo "name1 is $name1 and name2 is $name2"
Use shdotenv
dotenv support for shell and POSIX-compliant .env syntax specification
https://github.com/ko1nksm/shdotenv
Usage: shdotenv [OPTION]... [--] [COMMAND [ARG]...]
-d, --dialect DIALECT Specify the .env dialect [default: posix]
(posix, ruby, node, python, php, go, rust, docker)
-s, --shell SHELL Output in the specified shell format [default: posix]
(posix, fish)
-e, --env ENV_PATH Location of the .env file [default: .env]
Multiple -e options are allowed
-o, --overload Overload predefined environment variables
-n, --noexport Do not export keys without export prefix
-g, --grep PATTERN Output only those that match the regexp pattern
-k, --keyonly Output only variable names
-q, --quiet Suppress all output
-v, --version Show the version and exit
-h, --help Show this message and exit
Load the .env file into your shell script.
eval "$(shdotenv [OPTION]...)"
To prefix each line with a $ (producing the $name1=value1 form shown above):
sed 's/^/\$/' yourfilename