Bash script to read my files and execute the command

I have a script which one can run as ./analyze file.txt file.root log.txt
file.txt is the input file, which contains ROOT files with their paths; the other two arguments are outputs. My problem is that I have almost 30 input files and I do not want to type out the command each time to run the code. A bash script would be nice, but I did not manage to write one: it gives an error. See below an example of the code I try to run:
#!/bin/bash
do
echo
./analyze runlist1.txt runlist1.root log1.txt
./analyze runlist2.txt runlist2.root log.txt
./analyze runlist3.txt runlist3.root log.txt
./analyze runlist4.txt runlist4.root res4.txt
done
I get the error "syntax error near unexpected token `do'". Any help would be appreciated.

If all the input files are named like Brunsplit1.txt, Brunsplit2.txt, and so on, the following will help:
for file in Brunsplit*.txt; do
tmp=${file%.txt}
nr=${tmp#Brunsplit}
./analyze ${file} runlist${nr}.root res${nr}.txt
done
The tmp and nr variables are filled using parameter expansion: ${file%.txt} cuts .txt off the end, and ${tmp#Brunsplit} cuts Brunsplit off the start.
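To see those two expansions in isolation, here is a minimal sketch (the file name is made up for illustration):

```shell
file="Brunsplit7.txt"
tmp=${file%.txt}       # strip the ".txt" suffix  -> Brunsplit7
nr=${tmp#Brunsplit}    # strip the "Brunsplit" prefix -> 7
echo "$nr"             # prints: 7
```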

As others indicated in the comments, a do out of context is a syntax error (it is only valid right after a for, while, or until control statement).
Apparently there is no systematic mapping between the input file names and the corresponding log file names, so you need a script which maintains this mapping. Something like this, then:
while read suffix logfile; do
./analyze "runlist$suffix.txt" "runlist$suffix.root" "$logfile"
done <<'____HERE'
1 log1.txt
2 log.txt
3 log.txt
4 res4.txt
____HERE
The here document (the text between << delimiter and the delimiter alone on a line) behaves just like a text file, except it is embedded as part of the script.
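A stripped-down sketch of the mechanism, with made-up data; the quoted delimiter ('HERE') also disables variable expansion inside the body:

```shell
# read splits each here-document line into the two loop variables
while read -r suffix logfile; do
    echo "suffix=$suffix logfile=$logfile"
done <<'HERE'
1 log1.txt
2 log.txt
HERE
```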

Here is a primitive but working modified version.
#!/bin/bash
for nr in 1 2
do
echo Brunsplit$nr.txt
./analyze Brunsplit$nr.txt runlist$nr.root res$nr.txt
done
Thanks.


How do I deal with unusual filenames in a command's parameters?

I have a file ('list') which contains a large list of filenames, with all kinds of character combinations such as:
-sensei
I am using the following script to process this list of files:
#!/bin/bash
while read -r line
do
html2text -o ./text/$line $line
done < list
Which is giving me 'Cannot open input file' errors.
What is the correct way of dealing with these filenames, to prevent any errors?
I have changed the example list above so it now includes only one filename (out of many) which does not work, no matter how I quote or don't quote it.
#!/bin/bash
while read -r line
do
html2text -o "./text/$line" "$line"
done < list
The error I get is:
Unrecognized command line option "-sensei", try "-help".
As such this question does not resolve this issue.
Something like this should fix your issues (unless the file list has CRLF line endings):
while IFS='' read -r file
do
html2text -o ./text/"$file" -- "$file"
done < filelist.txt
notes:
IFS='' read -r is mandatory when you want to capture a line accurately
most commands support -- for signaling the end of options; whatever the following arguments might be, they will not be treated as options. BTW, another common work-around for filenames that start with - is to prepend ./ to them.
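Both work-arounds can be demonstrated with cat and a throwaway file (the directory and file are created just for illustration):

```shell
dir=$(mktemp -d) && cd "$dir"
echo hello > ./-sensei    # a file whose name starts with a dash
# cat -sensei             # would fail: cat parses "-sensei" as options
cat -- -sensei            # "--" ends option parsing; prints: hello
cat ./-sensei             # the ./ prefix hides the leading dash; prints: hello
```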
Are you sure the files are present? Usually there should not be a problem with your script. For example:
#!/bin/bash
while read -r line
do
touch ./text/"$line"
done < "$1"
ls -l ./text
works perfectly fine for me (with your example input); the shell passes the names through unchanged. Are you in the right directory? If you are sure the files are present, there is a problem with html2text.
Also make sure not to have a trailing blank line in your input file.

How to use each line of a file as an argument to a for loop

I'm hoping to run a command on each line listed within foo.txt, where each line of foo.txt is a file name.
There's been plenty of great support for this kind of question, and I have tried a while read, then another while read, and am now trying a for loop. However, I'm starting to think the issue is in the body of the loop.
#!/bin/bash
File=/mnt/d/R_projects/EC/foo.txt
Lines=$(cat $File)
for Line in $Lines
do
echo "fastp -i /mnt/d/R_projects/EC/download/fastq/$Line -o /mnt/e/EC/fastp_trimmed/$Line"
./fastp -i /mnt/d/R_projects/EC/download/fastq/$Line -o /mnt/e/EC/fastp_trimmed/$Line
done
I unfortunately receive the error:
ERROR: Failed to open file: /mnt/d/R_projects/EC/download/fastq/SRR6132950_1.fastq
The file exists, and doing less confirms.
Oddly, the echo doesn't echo what I was expecting and instead states:
" -o /mnt/e/EC/fastp_trimmed/SRR6132950_1.fastqRR6132950_1.fastq"
What could be causing this issue? It's as if the first half was cut off.
Thank you all,
I had suspected the Windows \r\n line endings were causing an issue here, and I tried to delete them in nano and moved on. Notepad++ showed my error, and I was able to fix it with Notepad++: Edit > EOL Conversion > Linux (LF).
I also adapted the suggestions to remove cat and set IFS; the suggestions were helpful.
#!/bin/bash
IFS=$'\n'
for Line in $(< /mnt/d/R_projects/EC/foo.txt); do
echo "$Line"
echo "fastp -i /mnt/d/R_projects/EC/download/fastq/$Line -o /mnt/e/EC/fastp_trimmed/$Line"
./fastp -i "/mnt/d/R_projects/EC/download/fastq/$Line" -o "/mnt/e/EC/fastp_trimmed/$Line"
done
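An alternative sketch using while read instead of for, which also strips a stray carriage return per line in case the list is ever CRLF-encoded again; the fastp call is replaced by echo here for illustration:

```shell
while IFS= read -r Line; do
    Line=${Line%$'\r'}    # drop a trailing CR; harmless if there is none
    echo "would run fastp on: $Line"
done <<'EOF'
SRR6132950_1.fastq
EOF
```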
I have like a million needs for running commands based on a text file and this will be extremely helpful for further applications.

Bash: Failure to go through all files in a directory with mediainfo

I have a music directory on a Debian computer which from time to time accumulates files that are too big. To help me eventually delete these files, I've installed mediainfo and made a script that should go through all the files in the music directory using that command.
I'm trying to use the duration field to decide what needs to be deleted or not. The example input is:
mediainfo --Inform="General;%Duration%" /home/administrator/music/Example\ Full\ OST.mp4
7838987
The output is the duration in milliseconds. Please note that if files have any spaces in their names, mediainfo marks a backslash in front of them. I've taken this into account in my script:
#!/bin/bash
for i in /home/administrator/music/*
do
# Changing i to readable form for mediainfo
i=$(echo $i | sed -r 's/[ ^]+/\\&/g')
echo $i
# Go Through files, storing the output to mediadur variable
mediadur=$(mediainfo --Inform="General;%Duration%" $i);
echo $mediadur;
done
echo outputs are:
/home/administrator/music/Example\ Full\ OST.mp4
 
The mediadur echo doesn't show anything. But when I paste the first echo output into the example command above, it shows the same output.
However, if a media file's name contains no space, the script works fine. The output of the script:
/home/administrator/music/546721638279.mp3
83017
This problem has left me very puzzled. Any help is appreciated.
You should update this line:
mediadur=$(mediainfo --Inform="General;%Duration%" "$i");
Double quotes prevent globbing and word splitting.
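A quick illustration of the difference, with a made-up file name:

```shell
i="Example Full OST.mp4"
printf '<%s>\n' $i     # unquoted: word splitting yields three arguments
printf '<%s>\n' "$i"   # quoted: one argument, the whole file name
```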
Actually this is not related to MediaInfo; you are simply providing a wrong file name on the command line. And MediaInfo adds no "backslash in front of them".
Your escape method does not work the way you expect.
#!/bin/bash
for i in music/*
do
# Go Through files, storing the output to mediadur variable
mediadur=$(mediainfo --Inform="General;%Duration%" "$i");
echo "$mediadur";
done
It works well also with file names containing spaces.

Problems with basename in a loop

I am new at shell script programming and I'm trying to run a piece of software that reads a text and performs POS tagging on it. It requires an input and an output, as can be seen in this example:
$ cat input.txt | /path/to/tagger/run-Tagger.sh > output.txt
What I'm trying to do is execute this line not only for one text, but for a set of texts in a specific folder, and produce output files with the same names as the input files. So I tried this script:
#!/bin/bash
path="/home/rafaeldaddio/Documents/"
program="/home/rafaeldaddio/Documents/LX-Tagger/POSTagger/Tagger/run-Tagger.sh"
for arqin in '/home/rafaeldaddio/Documents/teste/*'
do
out=$(basename $arqin)
output=$path$out
cat $arqin | $program > $output
done
I tried it with only one file and it works, but when I try with more than one, I get this error:
basename: extra operand ‘/home/rafaeldaddio/Documents/teste/3’
Try 'basename --help' for more information.
./scriptLXTagger.sh: 12: ./scriptLXTagger.sh: cannot create /home/rafaeldaddio/Documents/: Is a directory
Any insights on what I'm doing wrong? Thanks.
You don't want quotes around the pattern, and quote your variables:
for arqin in /home/rafaeldaddio/Documents/teste/*
do
out=$(basename "$arqin")
output=$path$out
"$program" <"$arqin" >"$output"
done
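As a side note, basename can be replaced by parameter expansion, which gives the same result without spawning an external command per file (the path below is just illustrative):

```shell
arqin="/home/rafaeldaddio/Documents/teste/input.txt"
out=$(basename "$arqin")   # external command -> input.txt
out=${arqin##*/}           # pure shell: strip everything up to the last "/"
echo "$out"                # prints: input.txt
```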

unix shell: check file list in dir against a list of files from a file

I am trying to write a sh script to check that all files built from a list of file extensions stored in a file are in place in a particular dir. I am doing the following:
The file names look like yyyymmdd.ext.
A sample of hoff_list.lst:
abc
dfg
hij
klm
xxx
...
my script is:
#!/bin/ksh
PATH=$PATH:/usr/bin
_input="/exchange/hoff_list.lst"
hoffdate="20130328"
hsourcedir="/upload_data/"
while IFS=' \t\n' read -r line; do
echo "=$line=" #first problem there
hoff_name=$hsourcedir$hoffdate"."$line
if test ! "$hoff_name"
then echo "$hoff_name DOES exist"
else echo "$hoff_name does NOT exist or is empty"
fi
done < "$_input"
but it doesn't return the relevant reply: it doesn't find a file even when the file really exists in the dir.
echo "=$line="
returns
=abc
=dfg
...
when expected
=abc=
=dfg=
...
The problem looks to be there, but I haven't a clue how to handle it. I will appreciate your help...
The file /exchange/hoff_list.lst has CRLF line endings (http://en.wikipedia.org/wiki/Newline).
Get rid of CR. You can try using the dos2unix utility available on most Linux systems or refer to http://en.wikipedia.org/wiki/Newline#Conversion_utilities.
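If dos2unix is not available, tr can do the same job; this sketch builds a throwaway CRLF file just to show the effect:

```shell
f=$(mktemp)
printf 'abc\r\ndfg\r\n' > "$f"   # a CRLF-encoded list like the asker's
tr -d '\r' < "$f" > "$f.unix"    # delete every carriage-return byte
head -n 1 "$f.unix"              # prints: abc (with no trailing CR)
```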
