Create file from shell script (e.g., this-is-the-title) - bash

I am trying to create a file using the following script (see below). While the script runs without errors (at least according to shellcheck), I cannot get the resulting file to have the correct name.
#!/bin/bash
# Set some variables
export site_path=~/Documents/Blog
drafts_path=~/Documents/Blog/_drafts
title="$title"
# Create the filename
title=$("$title" | "awk {print tolower($0)}")
filename="$title.markdown"
file_path="$drafts_path/$filename"
echo "File path: $file_path"
# Create the file, Add metadata fields
cat >"$file_path" <<EOL
---
title: \"$title\"
layout:
tags:
---
EOL
# Open the file in BBEdit
bbedit "$file_path"
exit 0
Very new to bash, so I'm not quite sure what I'm doing wrong...

The most glaring error is this:
title=$("$title" | "awk {print tolower($0)}")
It's wrong for several reasons:
This pipeline runs "$title" as a command -- meaning that it looks for a command named after the title of your blog post -- and pipes the output of that command (a command that presumably won't exist) to awk.
Using double-quotes around the entire awk command means you're looking for a single command whose name, spaces and all, is something like awk {print tolower(bash)} (if $0 evaluates to bash, as it will in an interactive interpreter; behavior will differ elsewhere).
Using double-quotes rather than single-quotes to protect your awk script means that the $0 gets evaluated by the shell rather than by awk.
A better alternative might look like:
title=$(awk '{print tolower($0)}' <<<"$title")
...or, to use simpler tools:
title=$(tr '[:upper:]' '[:lower:]' <<<"$title")
...or, to use bash 4.x built-in functionality:
title=${title,,}
Of course, all that assumes that title is set to start with. If you aren't passing it through your environment, you might want something like title=$1 rather than title="$title" earlier in your script.
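Putting the pieces together, a corrected version of the whole script might look like this (a sketch: it assumes the title arrives as the first argument, and it drops the backslashes before the quotes in the heredoc, which in an unquoted heredoc would otherwise be written out literally):
#!/bin/bash
# Set some variables
site_path=~/Documents/Blog
drafts_path=~/Documents/Blog/_drafts
title=$1                         # take the post title from the first argument
# Create the filename
title=${title,,}                 # lowercase the title (bash 4.x); see the tr/awk variants above
filename="$title.markdown"
file_path="$drafts_path/$filename"
echo "File path: $file_path"
# Create the file and add the metadata fields
cat >"$file_path" <<EOL
---
title: "$title"
layout:
tags:
---
EOL
# Open the file in BBEdit
bbedit "$file_path"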

Related

How to properly escape bash function in an alias

I have some bash commands that I have successfully piped together:
$> foo --color=RED | grep -Eo '(v[1-9])'
v1
Assume foo is an alias that prints out many things; I want to grab a version number and pipe it to another command, bar, which gets an id. So I add this and it works:
$> foo --color=RED | grep -Eo '(v[1-9])' | \
awk '{print "bar --version="$1" --color=RED"}' | xargs -0 bash -c
ID: 1234
Great. Now, I'd like to create an entry in my .aliases file so I can just run this like so:
$> wombat RED
Problem: I cannot get this to work:
alias wombat='function _w() {
COLOR=$1; # cache the color
foo --color=$COLOR | grep -Eo "(v[1-9])" | \
awk '{print "bar --version=$1 --color=$COLOR"}' | xargs -0 bash -c;
};_w'
The problem seems to be with how I am escaping (or not escaping) around the awk command. Note: In the awk command I need to reference both the version number and the color that I passed to the alias.
I have tried many variations but cannot seem to get it right. Can anyone assist?
The immediate problem is that quotes don't nest -- you can't nest a single-quoted string (the awk script) inside a single-quoted string (the alias definition). There are ways to get this to work, but it's much simpler to just skip the alias part entirely. Making an alias that defines a function and then immediately executes it is pointless; just define the function once, use it normally. Like this:
wombat() {
    local color="$1"    # cache the color
    foo --color="$color" | grep -Eo "(v[1-9])" | \
        awk -v color="$color" '{print "bar --version=" $1 " --color=" color}' | \
        xargs -0 bash -c
}
Note that I also made several changes to how the color name is handled: I used a lowercase shell variable name (there are a bunch of all-caps names with special meanings, so using lower- or mixed-case names for your own stuff is safer), I made that shell variable local, and I passed it in to awk as a variable rather than trying to embed it literally in the awk script. Finally, in the awk script, I put the references to $1 and the color variable outside the quoted strings, so they'll be expanded to their values rather than used literally.
Oh, and I used the POSIX-standard syntax for function definitions, which uses () instead of the function keyword to signal that this is a function definition.
Since I don't actually have the foo or bar programs, I have not actually tested this. But as far as I can see it should work.
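One way to sanity-check it without the real programs is to define throwaway stubs in the current shell (foo and bar here are hypothetical stand-ins, and the version string is made up):
foo() { printf 'release v1 build 7\n'; }   # fake 'foo' that ignores --color and emits a version
bar() { echo "ID: 1234"; }                 # fake 'bar'
export -f bar    # export it so the 'bash -c' child spawned by xargs can see it
wombat RED       # expected output: ID: 1234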
Turn on Bash shell debugging so you can see how your aliases and functions are interpreted before they get executed:
http://tldp.org/LDP/Bash-Beginners-Guide/html/sect_02_03.html
There are a couple of ways to turn on debugging. One is to start a new Bash session in debug mode with bash -x. Another is to enable debugging in an existing Bash session with set -x.
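For example:
set -x       # start printing each command, after expansion, prefixed with '+'
wombat RED   # the trace shows exactly what grep, awk, and xargs receive
set +x       # turn tracing back off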

Using awk to parse a config file

I'm trying to make an awk command which stores an entire config file as variables.
The config file is in the following form (keys never have spaces, but values may):
key=value
key2=value two
And my awk command is:
$(awk -F= '{printf "declare %s=\"%s\"\n", $1, $2}' $file)
Running this without the outer subshell $(...) results in the exact commands that I want being printed, so my question is less about awk, and more about how I can run the output of awk as commands.
The command evaluates to:
declare 'key="value"'
which is somewhat of a problem, since then the double quotes are stored with the value. Even worse is when a space is introduced, which results in:
declare 'key2="value' two"
Of course, I cannot simply remove the quotes, or the multi-word values cause problems.
I've tried almost every solution I could find, such as set -f, eval, and system().
You don't need Awk for this; you can do it with the shell's built-ins alone. Read the config file properly using input redirection:
#!/bin/bash
while IFS== read -r k v; do
    declare "$k"="$v"
done < config_file
and source the file as
$ source script.sh
$ echo "$key"
value
$ echo "$key2"
value two
If the source built-in is not available, the POSIX way of doing it is to use the dot command:
. ./script.sh
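If the config file may also contain blank lines or comments, a slightly hardened variant of the same loop could be used (a sketch; the leading-# comment convention is an assumption):
while IFS== read -r k v; do
    [[ -z $k || $k == \#* ]] && continue   # skip blank lines and '#' comments
    declare "$k"="$v"
done < config_file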

Trying to run a few awk commands stored in a file

I have a file with the following commands:
cat /some/dir/with/files/file1_name.tsv|awk -F "\\t" '{print $21$19$23$15}'
cat /some/dir/with/files/file2_name.tsv|awk -F "\\t" '{print $2$13$3$15}'
cat /some/dir/with/files/file3_name.tsv|awk -F "\\t" '{print $22$19$3$15}'
When I loop through the file to run the commands, I get the error below:
cat file | while read line; do $line; done
cat: invalid option -- 'F'
Try `cat --help' for more information.
You are not executing the commands the way you intended. Since you are reading the file line by line (for whatever reason), you could pass each line to the interpreter directly, as below:
#!/bin/bash
# ^^^^ for running under 'bash' shell
while IFS= read -r line; do
    printf "%s" "$line" | bash
done < file
But this has the overhead of forking a new process for each line of the file. If the commands in the file are harmless and safe to run in one shot, you can simply run
bash file
and be done with it.
Also, when using awk, avoid the useless cat by passing the filename directly; for example, the first of your lines becomes
awk -F "\\t" '{print $21$19$23$15}' file1_name.tsv
You are expecting the pipe (|) symbol to act as you are accustomed to, but it doesn't. To help you understand, try this :
A="ls / | grep e" # Loads a variable with a command with pipe
$A # Does not work
eval "$A" # Works
When expanding a variable without using eval, expansion and word splitting occur after the shell has already interpreted redirections and pipes, so your pipe symbol is seen as just a literal character.
Some options you have :
A) Avoid piping, by passing the file name as an argument
awk -F "\\t" '{print $21$19$23$15}' /some/dir/with/files/file1_name.tsv
B) Use eval as shown above, the potential security implications of which I would suggest you research.
C) Put the arguments in the file and parse them, avoiding the use of eval, something like:
# Assumes arguments separated by spaces
IFS=" " read -r -a arguments
awk "${arguments[@]}"
D) Implement the parsing of your data files in Bash instead of awk, and use your configuration file to specify output without the need for expanding anything (e.g. by specifying fields to print separated by spaces).
The first three approaches involve some form of interpretation of outside data as code, and that comes with risks if the file used as input cannot be guaranteed safe. Approach C might be considered a bit better in that regard, but since the command you are calling is awk, an actual program is passed to awk, so whatever awk can do, an attacker (or careless user) with write access to your file can cause your script to do anything awk can do.
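As an illustration of approach D, here is a minimal sketch that reproduces the first command from the file in pure Bash (the field numbers come from that command; note that Bash arrays are zero-indexed, so awk's $21 becomes ${f[20]}):
#!/bin/bash
# Print fields 21, 19, 23 and 15 of a tab-separated file, concatenated,
# mirroring: awk -F "\t" '{print $21$19$23$15}' file1_name.tsv
while IFS=$'\t' read -r -a f; do
    printf '%s%s%s%s\n' "${f[20]}" "${f[18]}" "${f[22]}" "${f[14]}"
done < /some/dir/with/files/file1_name.tsv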

Execute awk output exactly inside of executed script

There are some similar topics, but this one is slightly different.
I have a database with the names of scripts and a parameter a. When I execute:
sqlite3 log/log.db "select name, a from result" | awk -F '|' '{printf("a[%s]=%s;\n",$1,$2);}'
I see:
a[inc.bash]=4.23198234894777e-06;
a[inc.c]=3.53343440279423e-10;
In my bash script I would like to use an associative array.
When I execute this code (hand-coding the value of a[inc.bash]):
declare -A a
a[inc.bash]=4.23198234894777e-06;
echo ${a[inc.bash]}
it works correctly and prints
4.23198234894777e-06
But I do not know how to use the output of the first command (with awk) to assign values to the keys of the associative array a declared in my script.
I want to execute the code that is printed by awk inside my script, but when I use something like $() or ``, it prints an error like this:
code:
declare -A a
$(sqlite3 log/log.db "select name, a from result" | awk -F '|' '{printf("a[%s]=%s;\n",$1,$2);}')
echo ${a[inc.bash]}
output:
a[inc.bash]=4.23198234894777e-06;: command not found
To tell Bash to interpret your output as commands, you can use process substitution and the source command:
declare -A a
source <(sqlite3 log/log.db "select name, a from result" |
         awk -F '|' '{printf("a[%s]=%s;\n",$1,$2);}')
echo "${a[inc.bash]}"
The <() construct (process substitution) can be treated like a file, and source (or the equivalent .) runs the commands in its argument without creating a subshell, making the resulting a array accessible in the current shell.
A simplified example to demonstrate, as I don't have your database:
$ declare -A a
$ source <(echo 'a[inc.bash]=value')
$ echo "${a[inc.bash]}"
value
This all being said, this is about as dangerous as using eval: whatever the output of your sqlite/awk script, it will be executed!
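If you would rather not execute generated code at all, you can drop the awk step and read the two columns straight into the array (a sketch, untested against the real database; sqlite3's default column separator is |):
declare -A a
while IFS='|' read -r name value; do
    a[$name]=$value    # a plain assignment; nothing is executed as code
done < <(sqlite3 log/log.db "select name, a from result")
echo "${a[inc.bash]}"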

How to pass a shell script argument as a variable to be used when executing grep command

I have a file called fruit.txt which contains a list of fruit names (apple, banana, orange, kiwi, etc.). I want to create a script that allows me to pass an argument when calling the script, i.e. script.sh orange, which will then search the file fruit.txt for the value (orange) using grep. I have the following script...
script name and argument as follows:
script.sh orange
script snippet as follows:
#!/bin/bash
nameFind=$1
echo `cat` "fruit.txt"|`grep` | $nameFind
But I get the grep usage message, and it seems that the script is waiting for some additional command. Advice greatly appreciated.
The piping syntax is incorrect there. You are piping the output of grep as input to the variable named nameFind, so when the grep command tries to execute, it is only getting the contents of fruit.txt. Do this instead:
#!/bin/bash
nameFind=$1
grep "$nameFind" fruit.txt
Something like this should work:
#!/bin/bash
name="$1"
grep "$name" fruit.txt
There's no need to use cat and grep together; you can simply pass the name of the file as the final argument, after the pattern to be matched. If you want to match fixed strings (i.e. no regular expressions), you can also use the -F modifier:
grep -F "$name" fruit.txt
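Usage then looks like this (assuming fruit.txt holds one name per line):
$ ./script.sh orange
orange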
