Printing to multiple files with shell script for loop variable - bash

None of the files are showing up. I tried putting the file to be written to in quotes, but no files are being written; all I am getting is one file called person.txt
#!/bin/sh
cut -f 1 $1 > temp1.txt
cut -f 2-3 $2 > temp2.txt
for ((i=3;i<103;i++)); do
    cut -f $i $1 > temp3.txt
    paste temp1.txt temp2.txt temp3.txt > $HOME/Desktop/Plots/person$iPlot.txt
done

Without having tested it, a problem in your script is
$HOME/Desktop/Plots/person$iPlot.txt
Bash is not going to substitute your variable i as you are expecting, but tries to resolve a variable named iPlot instead. As this variable has never been assigned a value, you end up with person.txt. This happens because bash reads the longest run of letters, digits, and underscores after the $ as the variable name: the period ends the name, but the letters in Plot do not.
To make sure that your variable i is used, try
$HOME/Desktop/Plots/person${i}Plot.txt
instead.
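For reference, a sketch of the corrected loop (note the shebang as well: the for ((...)) arithmetic loop is a bashism, so the script should be run with bash rather than sh):
#!/bin/bash
cut -f 1 "$1" > temp1.txt
cut -f 2-3 "$2" > temp2.txt
for ((i=3; i<103; i++)); do
    cut -f "$i" "$1" > temp3.txt
    # ${i} keeps the loop variable separate from the literal text "Plot"
    paste temp1.txt temp2.txt temp3.txt > "$HOME/Desktop/Plots/person${i}Plot.txt"
done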

Bash File names will not append to file from script

Hello, I am trying to get all files with Jane's name into a separate file called oldFiles.txt. In a directory called "data" I read a list of file names from a file called list.txt, and I put all the file names containing the name Jane into the files variable. Then I test the names in the files variable against the file system to make sure they exist, and append every file containing jane to the oldFiles.txt file (which will be in the scripts directory) once it passes that test.
#!/bin/bash
> oldFiles.txt
files= grep " jane " ../data/list.txt | cut -d' ' -f 3
if test -e ~data/$files; then
    for file in $files; do
        if test -e ~/scripts/$file; then
            echo $file>> oldFiles.txt
        else
            echo "no files"
        fi
    done
fi
The above code gets the desired files and displays them correctly, and it creates the oldFiles.txt file, but when I open the file after running the script I find that nothing was appended to it. I tried changing the assignment to a command substitution, files= grep " jane " ../data/list.txt | cut -d' ' -f 3 ---> files=$(grep " jane " ../data/list.txt), to see if capturing the raw output would help, but then I get the error "too many arguments on line 5", which is the first if test statement. The only way I get the script to work semi-properly is when I run ./findJane.sh > oldFiles.txt on the shell command line, which is essentially me creating the file manually. How would I go about creating oldFiles.txt and appending to it all within the script?
The biggest problem you have is matching names like "jane" or "Jane's", etc. while not matching "Janes". grep provides the options -i (case-insensitive match) and -w (whole-word match), which can tailor your search to what you appear to want without the kludge of padding your search term with spaces (" jane "). (To do that properly you would use [[:space:]]jane[[:space:]].)
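For example, assuming list.txt holds the space-separated records from the question:
grep -iw "jane" ../data/list.txt   # matches lines containing jane, Jane or Jane's, but not Janes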
You also have the problem of what your "script dir" is if you call the script from a directory other than the one containing it, for example from your $HOME directory with bash scripts/findJane.sh. In that case your script will attempt to append to $HOME/oldFiles.txt. The parameter $0 holds the pathname used to invoke the currently running script, so you can capture the script directory no matter where you call the script from with:
dirname "$0"
You are using bash, so store all the filenames resulting from your grep command in an array rather than in a plain variable (especially since your use of " jane " suggests that your filenames may contain whitespace).
You can make your script much more flexible if you take the input file (e.g. list.txt), the term to search for (e.g. "jane"), the location to check for the files' existence (e.g. $HOME/data), and the output filename to append the names to (e.g. oldFiles.txt) as command-line [positional] parameters. You can give each a default value so the script behaves as you currently intend when no arguments are provided.
Even with the additional scripting flexibility of taking command-line arguments, the script actually has fewer lines: simply fill an array using mapfile (synonymous with readarray) and then loop over the contents of the array. You can also avoid the extra subshell for dirname with a simple parameter expansion plus a test for an empty path component (replacing it with '.'), as sketched below; whether to bother is up to you.
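That parameter-expansion alternative would look roughly like this (a sketch; the full script below keeps dirname for readability):
script="${0%/*}"                      # strip the trailing /name from $0, leaving the directory part
[ "$script" = "$0" ] && script="."    # $0 contained no '/', so the script was invoked from the current directory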
If I've understood your goal correctly, you can put all the pieces together with:
#!/bin/bash
# positional parameters
src="${1:-../data/list.txt}" # 1st param - input (default: ../data/list.txt)
term="${2:-jane}" # 2nd param - search term (default: jane)
data="${3:-$HOME/data}" # 3rd param - file location (defaut: ../data)
outfn="${4:-oldFiles.txt}" # 4th param - output (default: oldFiles.txt)
# save the path to the current script in script
script="$(dirname "$0")"
# if outfn not given, prepend path to script to outfn to output
# in script directory (if script called from elsewhere)
[ -z "$4" ] && outfn="$script/$outfn"
# split names w/term into array
# using the -iw option for case-insensitive whole-word match
mapfile -t files < <(grep -iw "$term" "$src" | cut -d' ' -f 3)
# loop over files array
for ((i=0; i<${#files[@]}; i++)); do
    # test existence of file in data directory, redirect name to outfn
    [ -e "$data/${files[i]}" ] && printf "%s\n" "${files[i]}" >> "$outfn"
done
(note: test expression and [ expression ] are synonymous, use what you like, though you may find [ expression ] a bit more readable)
(further note: "Janes" being plural is not considered the same as the singular -- adjust the grep expression as desired)
Example Use/Output
As was pointed out in the comment, without a sample of your input file, we cannot provide an exact test to confirm your desired behavior.
Let me know if you have questions.
As far as I can tell, this is what you're going for. This is very much a community effort based on the comments that caught your bugs; credit to Mark and Jetchisel for finding most of the issues. Notable changes:
Fixed $files to use command substitution
Fixed path to data/$file, assuming you have a directory at ~/data full of files
Fixed the test to not test for a string of files, but just the single file (also using -f to make sure it's a regular file)
Using double brackets [[ ... ]]; you could instead keep single brackets and double-quote the variables, but you explicitly have a Bash shebang so there's no harm in using Bash syntax
Adding a second message about not matching files, because there are two possible cases there; you may need to adapt depending on the output you're looking for
Removed the initial empty redirection — if you need to ensure that the file is clear before the rest of the script, then it should be added back, but if not, it's not doing any useful work
Changed the shebang to make sure you're using the user's preferred Bash, and added set -e because you should always add set -e
#!/usr/bin/env bash
set -e

files=$(grep " jane " ../data/list.txt | cut -d' ' -f 3)

for file in $files; do
    if [[ -f $HOME/data/$file ]]; then
        if [[ -f $HOME/scripts/$file ]]; then
            echo "$file" >> oldFiles.txt
        else
            echo "no matching file"
        fi
    else
        echo "no files"
    fi
done

Create variable by combining text + another variable

Long story short, I'm trying to grep a value contained in the first column of a text file by using a variable.
Here's a sample of the script, with the grep command that doesn't work:
for ii in `cat list.txt`
do
grep '^$ii' >outfile.txt
done
Contents of list.txt :
123,"first product",description,20.456789
456,"second product",description,30.123456
789,"third product",description,40.123456
If I perform grep '^123' list.txt, it produces the correct output... Just the first line of list.txt.
If I try to use the variable (i.e. grep '^ii' list.txt) I get a "^ii command not found" error. I tried to combine text with the variable to get it to work:
VAR1= "'^"$ii"'"
but the VAR1 variable contained a carriage return after the $ii variable:
'^123
'
I've tried a laundry list of things to remove the CR/LF (i.e. sed & awk), but to no avail. There has to be an easier way to perform the grep command using the variable. I would prefer to stay with grep because it works perfectly when I run it manually.
You have things mixed up in the command grep '^ii' list.txt. The character ^ stands for the beginning of the line, and a $ is needed to get the value of a variable.
When you want to grep for 123 in the variable ii at the beginning of the line, use
ii="123"
grep "^$ii" list.txt
(You should use double quotes here)
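The difference between the two quoting styles is easy to see with echo:
ii="123"
echo '^$ii'   # single quotes: prints ^$ii, no substitution
echo "^$ii"   # double quotes: prints ^123, the value of ii is substituted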
Good moment for learning good habits: keep writing variable names in lowercase (well done) and use curly braces (they do no harm and are needed in other cases):
ii="123"
grep "^${ii}" list.txt
Now we are both forgetting something: our grep will also match
1234,"4-digit product",description,11.1111. Include a , in the grep:
ii="123"
grep "^${ii}," list.txt
And how did you get the "^ii command not found" error? I think you used backquotes (the old way of nesting a command; nowadays echo "example: $(date)" is preferred) and wrote
grep `^ii` list.txt # wrong !
#!/bin/sh
# Start with an empty output file so matches from previous runs don't linger.
: > outfile.txt
# Read every character before the first comma into the variable ii.
while IFS=, read ii rest; do
    # Echo the value of ii. If these values are what you want, you're done; no
    # need for grep.
    echo "ii = $ii"
    # If you want to find something associated with these values in another
    # file, however, you can grep the file for the values. Use double quotes so
    # that the value of $ii is substituted in the argument to grep, and append
    # with >> so the matches for every value of ii are kept.
    grep "^$ii" some_other_file.txt >>outfile.txt
done <list.txt

Investigating a diff error in a bash script when variables are used instead of hardcoded file names

I have a script that looks for files of specific type in a specified directory and if they are present, generates a file with the basenames before creating a tar.gz. Once compressed, I check to ensure the tarball contains all the files by running a diff check.
I have created a pair of variables that are the pre-compressed file list and those found in the tarball. When I run an if statement including diff of the variables, I receive this error:
diff: missing operand after `/my/original/dir/filelist.txt'
diff: Try `diff --help' for more information.
I worked around this by referencing the files themselves rather than the created variables. If I run the if statement in a separate bash script, it works just fine using the variables so I am entirely lost as to what my error is in my larger script. Below I provide both the snippet from the large script and the diff statement as its own script for reference.
The if diff in its own script:
#!/bin/sh
filelist=(filelist.txt)
tarfiles=(tarfiles.txt)
#differences=$(diff filelist.txt tarfiles.txt) #Uncomment if below fails
differences=$(diff $filelist $tarfiles)
if $differences > /dev/null ; then
    echo Same
else
    echo Different
fi
The above works just fine.
Now including this at the end of my larger script:
TARFILES=$(tar -tzf "$ARCHIVES/tarredfiles.tar.gz" | awk -F/ '{ if($NF != "") print $NF }' > $LOGS/tarfiles.txt)
FILELIST=($LOGS/filelist.txt)
#Check to see if it all worked
DIFF=$(diff $FILELIST $TARFILES)
cd $LOGS #I shouldn't need to do this but I do as a safety mechanism
#if diff filelist.txt tarfiles.txt > /dev/null ; then
if diff $FILELIST $TARFILES > /dev/null ; then
echo "Today's files have been archived and checked."
else
echo "Some or none of today's files have been archived, check the logs to find the error."
echo (diff $TARFILES $FILELIST) > $LOGS/$(date '+%Y%m%d')errors.txt
fi
I have tried enclosing the variables in "" and it didn't seem to make a difference.
The way you populate TARFILES results in it being empty. What is it that you're trying to store in the variable?
This line
TARFILES=$(tar -tzf "$ARCHIVES/tarredfiles.tar.gz" | awk -F/ '{ if($NF != "") print $NF }' > $LOGS/tarfiles.txt)
does the following:
Extracts a list of the filenames (-t) from the compressed (-z) tar file (-f) named tarredfiles.tar.gz in the directory referred to by the $ARCHIVES variable
Sends (pipes) that list of filenames into awk where you print the last component of the filename, that is the last field ($NF) of each line when it is split by / (-F/)
Sends (redirects) all of that output into the log file $LOGS/tarfiles.txt
Captures any other output (of which there will be none!) and stores it in the TARFILES variable.
So, the variable TARFILES is always empty, but the file tarfiles.txt has content in it.
It seems that you want the diff to compare the content of tarfiles.txt with the content of filelist.txt, but you're trying to use your variables in a way that isn't really compatible with that.
An expression of the form:
TARFILES=$( command goes here )
captures the output of that command.
And
TARFILES=$( command goes here > some-file.txt )
sends the output of the command into the file, and then captures nothing.
What you want is something like:
TARFILES=some-file.txt
command goes here > $TARFILES
which will set the variable to be the name of your file, and then run a command which puts content into that file.
So, specifically:
TARFILES=$LOGS/tarfiles.txt
tar -tzf "$ARCHIVES/tarredfiles.tar.gz" | awk -F/ '{ if($NF != "") print $NF }' > $TARFILES
When working with shell scripts, it is very common to be running commands that produce output that goes into files, etc. One thing you need to be really clear about in the logic of your script is when you want your variables to contain actual content (that is, the output of a command), and when you want them to contain filenames.
In your case you want to run diff on 2 files ("tarfiles" and "filelist") that happen to contain a list of filenames, so that means there's a little bit more to keep track of, but essentially you want to populate "tarfiles" with the output from a command, and then run a diff where you pass in the 2 files names "tarfiles" and "filelist". So you never want to use $( ... ) to populate tarfiles.txt because that is how you capture the output of a command into a variable, and what you're trying to do is store a filename in your variable.
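Putting that together with the tail end of the larger script, a sketch of the corrected check (paths and messages taken from the question) would be:
FILELIST=$LOGS/filelist.txt
TARFILES=$LOGS/tarfiles.txt
# fill tarfiles.txt with the basenames stored in the tarball
tar -tzf "$ARCHIVES/tarredfiles.tar.gz" | awk -F/ '{ if($NF != "") print $NF }' > "$TARFILES"
# check to see if it all worked
if diff "$FILELIST" "$TARFILES" > /dev/null ; then
    echo "Today's files have been archived and checked."
else
    echo "Some or none of today's files have been archived, check the logs to find the error."
    diff "$TARFILES" "$FILELIST" > "$LOGS/$(date '+%Y%m%d')errors.txt"
fi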

Bash script to replace or append

I'm new to Bash scripting and I'm having a bit of a hard time. I'm trying to alter the configuration values of a config file. If it finds an existing value I want it to update it, but if it doesn't exist I want it to append it. This is as far as I got from various tutorials and snippets online:
# FUNCTION TO MODIFY CONFIG BY APPEND OR REPLACE
# $1 File
# $2 Find
# $3 Replace / Append
function replaceappend() {
    grep -q '^$2' $1
    sed -i 's/^$2.*/$3/' $1
    echo '$3' >> $1
}
replaceappend "/etc/test.conf" "Port 20" "Port 10"
However, as you might imagine, this doesn't work. The problem seems to be with the logic behind it: I'm not sure how to capture the result of grep in order to choose either sed or echo.
Just use the return value of the command and use double-quotes instead of single quotes:
if ! sed -i "/$2/{s//$3/;h};"'${x;/./{x;q0};x;q1}' $1
then
echo "$3" >> $1
fi
SOURCE: Return code of sed for no match for the q command
This is treading outside my normal use of sed, so let me give an explanation of how this works, as I understand it:
sed "/$2/{s//$3/;h};"'${x;/./{x;q0};x;q1}' $1
The first /$2/ is an address - we will do the commands within {...} for any lines that match this. As a by-product it also makes $2 the most recently used regular expression, which the empty pattern in the next command reuses.
The command {s//$3/;h} says to substitute the text that matched $2 with $3 and then save the pattern-space in the "hold-space", a type of buffer within sed.
The $ after the single quote is another address - it says to do this next command on the LAST line.
The command {x;/./{x;q0};x;q1} says:
x = swap the hold-space and the pattern-space
/./ = an address that matches only if the (now swapped-in) pattern-space is non-empty
{x;q0} = if it matched, the hold-space had content (a substitution happened earlier), so swap back and q0 = exit with 0 status (success)
x;q1 = otherwise swap back and q1 = exit with 1 status (fail)
The double-quotes around the first part allow substitution for $2 and $3. The single quotes around the latter part prevent the $ from being expanded erroneously by the shell.
A bit complicated, but it seems to work AS LONG AS YOU HAVE SOMETHING IN THE FILE. With an empty file sed processes no lines and still exits with status 0, so the append never happens even though nothing matched.
To be honest, after all this complication... unless the files you are working with are so long that a double pass over them would really hurt, I would probably go back to the grep solution, like this:
if grep -q "^$2" $1
then
sed -i "s/^$2.*$/$3/" $1
else
echo "$3" >>$1
fi
That's a WHOLE lot easier to understand and maintain later...
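Folded back into the function from your question, that approach would look something like this (a sketch; note that $2 and $3 end up inside the grep/sed expressions, so they must not contain characters such as / or & that are special there):
#!/bin/bash
# Replace a matching line, or append the replacement if nothing matches.
# $1 = file, $2 = text to find (anchored at the start of the line), $3 = replacement line
function replaceappend() {
    if grep -q "^$2" "$1"
    then
        sed -i "s/^$2.*$/$3/" "$1"
    else
        echo "$3" >> "$1"
    fi
}
replaceappend "/etc/test.conf" "Port 20" "Port 10"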

Unix Shell Script to take multiple files from standard input (csh)

Using either the for loop or the pipe (both work with one filename), I need to figure out how to accept an unlimited number of specified files from standard input. I have tried regular expressions and various wildcard forms. The two main issues I'm running into: either only the first file is put through the script, or every single file in the directory is. This is an assignment for a basic Unix course, and my problem thus far is over-complication. Based on the rest of the semester, there's a simple fix for what I'm wanting to do, and here I've spent two hours perusing hundreds of websites and posts making my head spin.
EDIT: The command line prompt would be something like this ~/dir/script currentWord newWord fileName1 fileName2 fileName3
#!/bin/csh
set currentWord=$1
set newWord=$2
set fileName=$3
if { grep -q $1 *$3 } then
    sed -i.bak -e "s/$1/$2/g" $3
else
    echo "The string is not found."
endif
#grep -q $1 $3 | sed -i.bak -e "s/$1/$2/g" $3
You can access the command line arguments using $argv[]. To loop over them but skip the first two, you can use this construct:
foreach file ($argv[3-])
    # do stuff here, eg
    echo $file
end
You shouldn't use csh, though. If you have been instructed to do so by your professor, I would question that.
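That said, if the goal is to apply the same substitution to every file named on the command line, a csh sketch combining that loop with the grep/sed from the question (kept in csh only because the assignment requires it) might look like:
#!/bin/csh
set currentWord = "$1"
set newWord = "$2"
foreach file ($argv[3-])
    # check the file for the word before editing it
    grep -q "$currentWord" "$file"
    if ($status == 0) then
        sed -i.bak -e "s/$currentWord/$newWord/g" "$file"
    else
        echo "The string is not found in $file"
    endif
end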
