How do I include a valid regular line instruction before a literal block in YAML?

If I use the pipe delimiter to add a literal block to my yml file like so, this is valid YAML:
build:
  |
    if [ <condition> ]; then
      <command>
    else
      <command>
    fi
    - mv file-a file-b
But if I need the block to follow after some other regular instructions, it's invalid and the code won't run as expected:
build:
  - mv file-a file-b
  |
    if [ <condition> ]; then
      <command>
    else
      <command>
    fi
Why is this the case, and what am I doing wrong? I definitely need the regular instructions to come first before the block.
The regular single-line commands should be evaluated without error, and then execution should proceed to the literal block instructions. But the code is invalid and doesn't work as written; it only works if I place the literal block first.
"It looks like your post is mostly code". Honestly there's nothing else to say.
Okay, would you like me to tell you a story? This site isn't the same as it used to be.

A literal block scalar continues until it encounters a line that is less indented than the block scalar. That is to say, in your first example, the line
- mv file-a file-b
is processed as part of the block scalar.
However, in your second example, you place this line in front of the block scalar, which makes it a block sequence item. Block sequence items can exist on the same level only with other block sequence items. A literal block scalar is not a block sequence item, hence it cannot be on the same level.
You can make the literal scalar a block sequence item:
build:
  - mv file-a file-b
  - |
    if [ <condition> ]; then
      <command>
    else
      <command>
    fi
Or you can simply make the command part of the literal block scalar:
build: |
  mv file-a file-b
  if [ <condition> ]; then
    <command>
  else
    <command>
  fi
It is unclear from your description what you intend the structure to be, but one of these solutions is probably what you're looking for.

Related

cat on a quoted variable fails

I have this code snippet:
userjobs=$(grep -rw "$USER" /my/job/dir/|awk '{print $1}'|sort|uniq|rev|cut -c 2-|rev)
for job in "${userjobs[@]}"; do
  cat "$job"
done
exit 0
When I run it as is, I get the following output:
cat: /my/job/dir/45
/my/job/dir/46: No such file or directory
However, if I unquote $job, I no longer receive this behavior, and it cats each of the files as expected.
I've done some reading up on globbing and splitting to see if this is occurring, but it seems like double-quoting should prevent that from happening. Can anyone explain why the behavior is different between "$job" and $job?
This happens because your variable looks like:
userjobs='/my/job/dir/45
/my/job/dir/46'
If you expand it as an array, with "${userjobs[@]}", it acts as an array with exactly one element -- that string. Thus, the behavior is identical to:
userjobs=( [0]='/my/job/dir/45
/my/job/dir/46' )
...still exactly one string with a literal newline in it.
Thus, cat "$job" looks for a file with a literal newline in its name.
To load your result into a real array you can iterate over with "${userjobs[@]}" expanding to a distinct element per line, use:
readarray -t userjobs < <(grep ...)
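Putting it together, a minimal sketch of the corrected script (same hypothetical paths and pipeline as in the question) might look like:
#!/usr/bin/env bash
# readarray -t stores one array element per line of grep output,
# instead of one big string containing literal newlines.
readarray -t userjobs < <(grep -rw "$USER" /my/job/dir/|awk '{print $1}'|sort|uniq|rev|cut -c 2-|rev)
for job in "${userjobs[@]}"; do
  cat "$job"
done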
userjobs needs to be an array. Put parentheses around the value when assigning it:
userjobs=($(grep -rw "$USER" /my/job/dir/|awk '{print $1}'|sort|uniq|rev|cut -c 2-|rev))

I want to compare one line to the next line, but only in the third column, from a file using bash

So, what I'm trying to do is read in a file, loop through it comparing it line by line, but only in the third column. Sorry if this doesn't make sense, but maybe this will help. I have a file of names:
JOHN SMITH SMITH
JIM JOHNSON JOHNSON
JIM SMITH SMITH
I want to see if (first, col3) SMITH is equal to JOHNSON; if not, move on to the next name. If (first, col3) SMITH is equal to (second, col3) SMITH, then I'll do something with that.
Again, I'm sorry if this doesn't make much sense, but I tried to explain it as best as I could.
I was attempting to see if they were equal, but obviously that didn't work. Here is what I have so far, but I got stuck:
while read -a line
do
  if [ ${line[2]} == ${line[2]} ]
  then
    echo -e "${line[2]}" >> names5.txt
  else
    echo "Not equal."
  fi
done < names4.txt
Store your immediately prior line in a separate variable, so you can compare against it:
#!/usr/bin/env bash
old_line=( )
while read -r -a line
do
  if [ "${line[2]}" = "${old_line[2]}" ]; then
    printf '%s\n' "${line[2]}"
  else
    echo "Not equal." >&2
  fi
  old_line=( "${line[@]}" )
done <names4.txt >>names5.txt
Some other changes of note:
Instead of re-opening names5.txt every time you want to write a single line to it, we're opening it just once, for the whole loop. (You could make this >names5.txt if you want to clear it at the top of the loop and append from there, which is likely to be desirable behavior).
We're avoiding echo -e. See the APPLICATION USE and RATIONALE sections of the POSIX standard for echo for background on why echo use is not recommended for new development when contents are not tightly constrained (known not to contain any backslashes, for example).
We're quoting both sides of the test operation. This is mandatory with [ ] to ensure correct operation when words can be expanded as globs (i.e. if you have a word *, you don't want it replaced with a list of files in your current directory in the final command), or when they can contain spaces (not so much a concern here, since you're using the same IFS value for the read -a as the unquoted expansion); see the short illustration after these notes. Even if using [[ ]], you want to quote the right-hand side so it's treated as a literal string and not a pattern.
We're passing -r to read, which ensures that backslashes are not silently removed (changing \t in the input to just t, for example).
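As a quick standalone illustration of the quoting point (not part of the original answer):
word='*'
[ "$word" = '*' ] && echo "matched the literal *"   # quoted: compares the literal string
# [ $word = '*' ]                                   # unquoted: the shell expands * to the
#                                                   # filenames in the current directory first,
#                                                   # typically failing with "too many arguments"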
When you want to compare each third field with all previous third fields, you need to store the old third fields in an array. You can use awk for this.
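For instance, a minimal awk sketch along those lines (the exact matching rule is an assumption, since the question is ambiguous):
awk 'seen[$3]++ { print $3 }' names4.txt   # prints a third field each time it repeats one seen on an earlier line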
When you only want to see the repeated third fields, you can use other tools:
cut -d" " -f3 names4.txt | sort | uniq -d
EDIT:
When you only want to print doubles from 2 consecutive lines, it is even easier:
cut -d" " -f3 names4.txt | uniq -d

Unexpected end of file in while loop in bash

I am trying to write a bash script that will do the following:
Take a directory or file as input (will always begin with /mnt/user/)
Search other mount points for same file or directory (will always begin with /mnt/diskx)
Return value
So, for example, the input will be "/mnt/user/my_files/file.txt". It will check whether "/mnt/disk1/my_files/file.txt" exists and will incrementally look on each disk (disk2, disk3, etc.) until it finds it or reaches disk20.
This is what I have so far:
#/user/bin/bash
var=$1
i=0
while [ -e $check_var = echo $var | sed 's:/mnt/user:/mnt/disk$i+1:']
do
  final=$check_var
done
It's incomplete yes, but I am not that proficient in bash so I'm doing a little at a time. I'm sure my command won't work properly yet either but right now I am getting an "unexpected end of file" and I can't figure out why.
There are many issues here:
If this is the actual code you're getting "unexpected end of file" on, you should save the file in Unix format, not DOS format.
The shebang should be #!/usr/bin/bash or #!/bin/bash depending on your system
You have to assign check_var before running [ .. ] on it.
You have to use $(..) to expand a command
Variables like $i are not expanded in single quotes
sed can't add numbers
i is never incremented
the loop logic is inverted, it should loop until it matches and not while it matches.
You'd want to assign final after -- not in -- the loop.
Consider doing it in even smaller pieces; it's easier to debug e.g. the single statement sed 's:/mnt/user:/mnt/disk$i+1:' than your entire while loop.
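Applying those fixes while keeping the original loop shape might look roughly like this (an illustrative sketch, not tested against the asker's setup):
#!/bin/bash
var=$1
i=1
while [ "$i" -le 20 ]; do
  # substitute the mount prefix; double quotes so $i actually expands
  check_var=$(printf '%s\n' "$var" | sed "s:/mnt/user:/mnt/disk$i:")
  if [ -e "$check_var" ]; then
    final=$check_var
    break
  fi
  i=$((i + 1))
done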
Here's a more canonical way of doing it:
#!/bin/bash
var="${1#/mnt/user/}"
for file in /mnt/disk{1..20}/"$var"
do
  [[ -e "$file" ]] && final="$file" && break
done
if [[ $final ]]
then
  echo "It exists at $final"
else
  echo "It doesn't exist anywhere"
fi
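A hypothetical run (script name and disk number invented for illustration):
$ ./find_on_disks.sh /mnt/user/my_files/file.txt
It exists at /mnt/disk3/my_files/file.txt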

Bash: Extract user path (/home/userID) from read line containing full path and replace with "~"

I'm constructing a bash script file a bit at a time. I'm learning as I go. But I can't find anything online to help me at this point: I need to extract a substring from a large string, and the two methods I found using ${} (curly brackets) just won't work.
The first, ${x#y}, doesn't do what it should.
The second, ${x:p} or ${x:p:n}, keeps reporting bad substitution.
It only seems to work with constants.
The ${#x} returns a string length as text, not as a number, meaning it does not work with either ${x:p} or ${x:p:n}.
Fact is, it seems really hard to get bash to do much math at all. Except for the for statements, but that is just counting. And this isn't a task for a for loop.
I've consolidated my script file here as a means of helping you all understand what it is that I am doing. It's for working with PureBasic source files, but you only have to change the grep's "--include=" argument, and it can search other types of text files instead.
#!/bin/bash
home=$(echo ~)   # Copy the user's path to a variable named home
len=${#home}     # Showing how to find the length. Problem is, this is treated
                 # as a string, not a number. Can't find a way to make it over
                 # into a number.
echo $home "has length of" $len "characters."
read -p "Find what: " what   # Intended to search PureBasic (*.pb?) source files for text matches
grep -rHn $what $home --include="*.pb*" --exclude-dir=".cache" --exclude-dir=".gvfs" > 1.tmp
while read line   # this checks for and reads the next line
do                # the closing 'done' has the file to be read appended with "<"
  a0=$line        # this is each line as read
  a1=$(echo "$a0" | awk -F: '{print $1}')   # this gets the full path before the first ':'
  echo $a0        # Shows full line
  echo $a1        # Shows just full path
  q1=${line#a1}
  echo $q1        # FAILED! No reported problem, but failed to extract $a1 from $line.
  q1=${a0#a1}
  echo $q1        # FAILED! No reported problem, but failed to extract $a1 from $a0.
  break           # Can't do a 'read -n 1', as it just reads 1 char from the next line.
                  # Can't do a pause, because it doesn't exist. So just run from the
                  # terminal so that after break we can see what's on the screen.
  len=${#a1}      # Can get the length of $a1, but only as a string
  # q1=${line:len}    # Right command, wrong variable
  # q1=${line:$len}   # Right command, right variable, but wrong variable type
  # q1=${line:14}     # Constants work, but all $home's aren't 14 characters long
done < 1.tmp
The following works:
x="/home/user/rest/of/path"
y="~${x#/home/user}"
echo $y
Will output
~/rest/of/path
If you want to use "/home/user" inside a variable, say prefix, you need to use $ after the #, i.e., ${x#$prefix}, which I think is your issue.
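Concretely, a small sketch with the prefix in a variable (names invented for illustration):
prefix="/home/user"
x="/home/user/rest/of/path"
y="~${x#$prefix}"
echo $y   # prints ~/rest/of/path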
The help I got was most appreciated. I got it done, and here it is:
#!/bin/bash
len=${#HOME}   # Showing how to find the length. Problem is, this is treated
               # as a string, not a number. Can't find a way to make it over
               # into a number.
echo $HOME "has length of" $len "characters."
while :
do
  echo
  read -p "Find what: " what   # Intended to search PureBasic (*.pb?) source files for text matches
  a0=""; > 0.tmp; > 1.tmp
  grep -rHn $what $HOME --include="*.pb*" --exclude-dir=".cache" --exclude-dir=".gvfs" >> 0.tmp
  while read line   # this checks for and reads the next line
  do                # the closing 'done' has the file to be read appended with "<"
    a1=$(echo $line | awk -F: '{print $1}')   # this gets the full path before the first ':'
    a2=${line#$a1":"}   # remove path and first colon from rest of line
    if [[ $a0 != $a1 ]]
    then
      echo >> 1.tmp
      echo $a1":" >> 1.tmp
      a0=$a1
    fi
    echo " "$a2 >> 1.tmp
  done < 0.tmp
  cat 1.tmp | less
done
What I don't have yet is an answer as to whether variables can be used in place of constants in the dollar-sign, curly-bracket form where you use colons to mark that you want a substring of the string returned. If it requires constants, then the only choice might be to generate a child script using the variables (which would appear to be constants in the child), execute it there, then return the results in an environment variable or temporary file. I did stuff like that with MSDOS a lot. The limitation here is that you then have to make the produced file executable as well using "chmod +x filename", or call it using "/bin/bash filename".
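For what it's worth, bash does accept variables for the offset and length in that expansion when the script is actually run by bash ("bad substitution" usually points to it being run by a shell without that feature). A quick sketch with invented values:
s="/home/user/rest/of/path"
p=6
n=4
echo "${s:p:n}"     # prints: user
echo "${s:$p:$n}"   # same result; both forms work in bash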
Another bash limitation I found is that you cannot use "sudo" in the script without discontinuing execution of the present script. I guess a way around that is to use sudo to call /bin/bash to call a child script that you produced. I assume then that if the child completes, you return to the parent script where you stopped. Unless you did "sudo -i", "sudo -su", or some other variation where you become super user; then you likely need to do an "exit" to drop the super user overlay.
If you exit the child script still as super user, would typing "exit" put you back to completing the parent script? I suspect so, which makes for some interesting scenarios.
Another question: If doing a "while read line", what can you do in bash to check for a keyboard key press? The "read" option is already taken while in this loop.

Bash parameter expansion

I have a script which uses the following logic:
if [ ! -z "$1" ]; then   # if any parameter is supplied
  ACTION=                # clear $ACTION
else
  ACTION=echo            # otherwise, set it to 'echo'
fi
This works fine, as-is. However, in reading the Shell Parameter Expansion section of the bash manual, it seems this should be able to be done in a single step. However, I can't quite wrap my head around how to do it.
I've tried:
ACTION=${1:-echo} # ends up with $1 in $ACTION
ACTION=${1:+}
ACTION=${ACTION:-echo} # ends up always 'echo'
and a few ways of nesting them, but nesting seems to be disallowed as far as I can tell.
I realize I've already got a working solution, but now I'm genuinely curious if this is possible. It's something that would be straightforward with a ternary operator, but I don't think bash has one.
If this is possible, I'd like to see the logic to do this seeming two-step process, with no if/else constructs, but using only any combination of the Shell Parameter Expansion features.
Thank you.
EDIT for elderarthis:
The remainder of the script is just:
find . -name "*\?[NMSD]=[AD]" -exec ${ACTION} rm -f "{}" +
I just want ACTION=echo as a sanity check against myself; hence, passing any argument will actually do the deletion (by nullifying ${ACTION}), whereas passing no args leaves echo in there.
And I know TIMTOWTDI; I'm looking to see if it can be done with just the stuff in the Shell Parameter Expansion section :-)
EDIT for Mikel:
$ cat honk.sh
#!/bin/bash
ACTION=${1-echo}
echo $ACTION
$ ./honk.sh
echo
$ ./honk.sh foo
foo
The last needs to have ACTION='', and thus return a blank line/null value.
If I insisted on doing it in fewer than 4 lines and no sub-shell, then I think I'd use:
ACTION=${1:+' '}
: ${ACTION:=echo}
This cheats slightly - it creates a blank action rather than an empty action if there is an argument to the script. If there is no argument, then ACTION is empty before the second line. On the second line, if action is empty, set it to 'echo'. In the expansion, since you (correctly) do not quote $ACTION, no argument will be passed for the blank.
Tester (xx.sh):
ACTION=${1:+' '}
: ${ACTION:=echo}
echo $ACTION rm -f a b c
Tests:
$ sh xx.sh 1
rm -f a b c
$ sh xx.sh
echo rm -f a b c
$ sh xx.sh ''
echo rm -f a b c
$
If the last line is incorrect, then remove the colon from before the plus.
If a sub-shell is acceptable, then one of these two single lines works:
ACTION=$([ -z "$1" ] && echo echo)
ACTION=$([ -z "${1+X}" ] && echo echo)
The first corresponds to the first version shown above (empty first arguments are treated as absent); the second deals with empty arguments as present. You could write:
ACTION=$([ -z "${1:+X}" ] && echo echo)
to make the relation with the second clearer - except you're only going to use one or the other, not both.
Since the markdown notation in my comment confused the system (or I got it wrong but didn't get to fix it quickly enough), my last comment (slightly amended) should read:
The notation ${var:+' '} means 'if $var is set and is not empty, then use what follows the +' (which, in this case, is a single blank). The notation ${var+' '} means 'if $var is set - regardless of whether it is empty or not - then use what follows the +'. These other expansions are similar:
${var:=X} - set $var to X unless it already has a non-empty value.
${var:-X} - expands to $var if it has a non-empty value and expands to X if $var is unset or is empty
Dropping the colon removes the 'empty' part of the test.
ACTION=${1:-echo}
is correct.
Make sure it's near the top of your script before anything modifies $1 (e.g. before any set command). Also, it wouldn't work inside a function, because $1 would be the first parameter to the function.
Also check if $1 is set but null, in which case fix how you're calling it, or use ACTION=${1-echo} (note there is no :).
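As a quick illustration of the difference the colon makes (a standalone sketch):
set -- ""            # one argument that is set but null
echo "x${1-echo}x"   # prints xx: $1 is set, so the default is not used
echo "x${1:-echo}x"  # prints xechox: with the colon, null counts as missing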
Update
Ah, I assumed you must have meant the opposite, because it didn't really make sense otherwise.
It still seems odd, but I guess as a mental exercise, maybe you want something like this:
#!/bin/bash
shopt -s extglob
ACTION=$1
ACTION=${ACTION:-echo}
ACTION=${ACTION/!(echo)/} # or maybe ACTION=${ACTION#!(echo)}
echo ACTION=$ACTION
It's not quite right: it gives ACTION=o, but I think something along those lines should work.
Further, if you pass echo as $1, it will stay as echo, but I don't think that's a bad thing.
It's also terribly ugly, but you knew that when asking the question. :-)
