Bash For Loop Syntax Error - bash

I am trying to perform a simple for loop, but it keeps telling me there is a syntax error near do. I have tried to find some answers online, but nothing seems to be quite answering my question.
The for loop is as follows. All it does is find the differences between two folders:
#!/bin/bash
for word in $LIST; do
diff DIR1/config $word/config
done
exit
The syntax error is near do. It says "Syntax error near unexpected token 'do '". $LIST is set outside of this script by the program that calls it.
Does anyone know what might be happening here?

That's certainly valid syntax for bash, so I'd check whether you have special characters somewhere in the file, such as CR/LF at the ends of your lines.
Assuming you're on a UNIXy system, od -xcb scriptname.sh should show you this.
In addition, you probably also want to use $word rather than just word since you'll want to evaluate the variable.
Another thing to check is that you are actually running this under bash rather than some "lesser" shell. And it's often handy to place a set -x within your script for debugging purposes as this outputs lines before executing them (use set +x to turn this feature off).
One last thing to check is that LIST is actually set to something, by doing echo "[$LIST]" before the for loop.
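If CR/LF endings do turn out to be the culprit, a minimal sketch of a check-and-fix, assuming the script is called scriptname.sh (a placeholder name), might look like this:
if grep -q $'\r' scriptname.sh; then
    echo "CRLF line endings found, stripping them" >&2
    tr -d '\r' < scriptname.sh > scriptname.fixed.sh   # remove carriage returns
    mv scriptname.fixed.sh scriptname.sh
fi
Where it is installed, dos2unix scriptname.sh does the same job in one step.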

Related

copy paste code works but not as a script

I wrote a script with six if statements that looks like this:
#!/usr/bin/bash
if [ -n $var1 ]
then
for f in /path/*.fastq.gz
do
x=${f/%.fastq.gz/_sample1-forward.fastq.gz}
y=${f/%.fastq.gz/_sample1-forward.out}
q=${f/%.fastq.gz/_temp.fastq.gz}
command [options] -i $f -o $temp${x##*/}
cp $temp${x##*/} $temp${q##*/}
done
else
echo "no $var1"
for f in /path/*.fastq.gz
do
q=${f/%.fastq.gz/_temp.fastq.gz}
cp $f $temp${q##*/}
done
fi
The other five statements do a similar task for var2 to var6. When I run the script I get unexpected output (no errors, no warnings), but when I copy-paste each of the if statements into the terminal I end up with exactly the result I would expect. I've looked for hidden characters and syntax issues for hours now. Could this be a shell issue? The script was written on OSX (default zsh) and executed on a server (default bash). I have a feeling this issue is similar, but I couldn't find an answer to my problem in the replies.
Any and all ideas are most welcome!
Niwatori
You should maybe look at the shebang. I think proper usage would be #!/usr/bin/env bash or #!/bin/bash.
Thanks for the help, much appreciated. The shebang didn't seem to be the problem, although thanks for pointing that out. Shellcheck.net reminded me to use [[ ]] for unquoted variables, but that didn't solve the problem either. Here's what went wrong and how I 'fixed' it:
For every variable the same command (tool) is used, which relies on a support file (similar in format but different in content). Originally, before every if statement I replaced the support file for the previous variable with the one needed for the current variable. For some reason (I'm curious why; any thoughts are welcome) this didn't always happen correctly.
As a quick workaround I made six versions of the tool, each with a different support file, and used PYTHONPATH=/path/to/version/:$PYTHONPATH before every if statement. Best practice would be to adapt the tool so it can use different support files, or to add an option that deals with repetitive tasks, but I don't have the time at the moment.
Have a nice day,
Niwatori
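For reference, the [[ ]] suggestion from shellcheck does point at a real pitfall, even though it wasn't the cause here. A minimal, hedged illustration, with var1 as a stand-in name:
unset var1
[ -n $var1 ]   && echo "single brackets, unquoted: looks non-empty"   # prints: $var1 vanishes, leaving [ -n ], which is true
[ -n "$var1" ] && echo "single brackets, quoted: non-empty"           # does not print
[[ -n $var1 ]] && echo "double brackets: non-empty"                   # does not print (no word splitting inside [[ ]])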

missed $ while using variableName in bash script - how to catch such issues?

I have a bash script which does the following:
#!/bin/bash
moduleName=$1
someInfo=`ls | grep -w moduleName`
echo $someInfo
On line 3 of the script, I was supposed to use $moduleName, but I missed it.
Is there any way to find such issues in Bash scripts?
I used ShellCheck, but it didn't report this issue.
For me, the script looks fine; it lists the files whose names contain the string moduleName.
The script is syntactically correct.
The error in line #3 is a semantic error; it changes the meaning of the script. Only a person that knows the intention of the script can detect it.
There is no way to automatically detect such errors, unless you write software that reads your mind and knows that you intended to write $moduleName but mistakenly wrote moduleName instead.
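For completeness, the version the question intended, with the missing $ restored (the switch to $(...) and the quoting are just tidy-ups, not part of the fix):
#!/bin/bash
moduleName=$1
someInfo=$(ls | grep -w "$moduleName")   # match the value passed in, not the literal string moduleName
echo "$someInfo"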

bash command substitution freezes script (output too long?)--how to cope

I have a bash script that includes a line like this:
matches="`grep --no-filename $searchText $files`"
In other words, I am assigning the result of a grep to a variable.
I recently found that this line of code seems to have a vulnerability: if the grep finds too many results, it simply freezes execution.
First, if anyone can confirm that excessive output (and exactly what constitutes excessive) is a known danger with command substitution, please provide a solid link. I searched the web, and the closest reference I could find is this:
"Do not set a variable to the contents of a long text file unless you have a very good reason for doing so."
That hints that there is a danger, but is very inadequate.
Second, is there a known best practice for coping with this?
The behavior that I really want is for excessive output in command substitution to generate a nice, human-readable error message followed by an error exit code, so that my script terminates instead of freezing. (Note: I always run my scripts with "set -e" as one of the initial lines.) Is there any way that I can get this behavior?
Currently, the only solution that I know of is a hack that sorta works just for my immediate case: I can limit the output from grep using its --max-count option.
Ideally, you shouldn't capture data of unknown length into memory at all; if you read it as you need it, then grep will wait until the content is ready to use.
That is:
while IFS= read -r match; do
    echo "Found a match: $match"
    # example: maybe we want to look at whether a match exists on the filesystem
    [[ -e $match ]] && { echo "Got what we needed!" >&2; break; }
done < <(grep --no-filename "$searchText" "${files[@]}")
That way, grep only writes a line when read is ready to consume it (and it will block, rather than racing ahead, once it has produced more output than can be stored in the relatively small pipe buffer) -- so the lines you don't need don't even get generated in the first place, and there's no need to allocate memory for them or deal with them in any other way.
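If you specifically want the error-and-exit behaviour described in the question rather than streaming, one hedged workaround is to cap how much you capture and treat hitting the cap as a failure. A sketch, where the 1000-line limit is arbitrary and files is assumed to be an array as in the snippet above:
max_lines=1000
# head stops reading after max_lines+1 lines, so grep is cut off early
# instead of producing unbounded output.
matches=$(grep --no-filename "$searchText" "${files[@]}" | head -n "$((max_lines + 1))")
if [ "$(printf '%s\n' "$matches" | wc -l)" -gt "$max_lines" ]; then
    echo "error: more than $max_lines matches; refusing to continue" >&2
    exit 1
fi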

Use of Windows undefined environment variable?

Here's a simple but puzzling question.
For an undefined Windows environment variable, abc for example
In the Command Prompt window ECHO [%abc%] results in [%abc%]
But in a .CMD batch file ECHO [%abc%] results in []
Why the difference? I've researched the ECHO command and can't find anything about this. I'm concerned about where else this subtle difference might apply.
Really good question! Confusing huh?
There are actually two distinct parsers used to parse batch scripts and command line commands.
Quote from this excellent answer:
BatchLineParser - The parser inside of batch files, for lines or blocks
CmdLineParser - Like the BatchLineParser, but directly at the command prompt, works different
The key difference is in the first phase of parsing, particularly the expansion of %var%:
In the BatchLineParser, if var does not exist, %var% is replaced with nothing; in the CmdLineParser, if var isn't defined, the expression is left unchanged.
So why did someone design it this way? I have absolutely no idea.

Converting a history command into a shell script

This is one of those things that I figured a lot of people would do a lot, but I can't seem to find anyone who has written about it.
I find that I often iterate on a command-line one-liner, and when I end up using it a lot, anticipate wanting to use it in the future, or find it cumbersome to work with on one line, it is generally a good idea to turn the one-liner into a shell script and stick it somewhere reasonable and easily accessible like ~/bin.
It's obviously too cumbersome to use any sort of roundabout method involving a text editor to get this done, and it's possible to simply do it on the shell, for instance in zsh typing
echo "#!/usr/bin/env sh" > ~/bin/command_from_history_number_523.sh && echo !523 >> ~/bin/command_from_history_number_523.sh
followed by pressing Tab to inject the !523rd command and somehow shoehorning it into an acceptable string to be saved.
This is particularly cumbersome and has at minimum three problems:
Does not work in bash as it does not complete the !523
Requires some manual inspection and string escaping
Requires too much typing; for example, the script name must be entered twice
So it looks like I need to do some meta shell scripting here.
I think a good solution would function under both bash and zsh, and it should probably work by taking two arguments, an integer for the history command number and a name for the shell script to poop out in a hardcoded directory which contains that one command. Furthermore, under bash, it appears that multi-line commands are treated as separate commands, but I'm willing to assume that we only care about one-liners here and I only use zsh anyway at this point.
The stumbling block here is that I think I'll still be running shell scripts through bash even when using zsh, so the script likely won't be able to parse zsh's history files. I may need to make this into two separate programs, then.
Update: I agree with @Floris's comment that direct use of commands like !! would be helpful, though I am not sure how to make this work. Suppose the usage is
mkscript command_number_24 !24
This is inadequate because mkscript will receive the already-expanded contents of the 24th command. If the 24th command contains any file globs or somesuch, they will have been expanded already. This is bad; I basically want the contents of the history file, i.e. the raw command string. I guess this can be worked around by manually implementing those shortcuts here. Or just screw it and take an integer argument.
function mkscript() {
    # Write a bash shebang, then append the text of history entry $1
    # (expanded by history -p, which prints it without executing it).
    echo '#!/bin/bash' > ~/bin/"$2"
    history -p '!'"$1" >> ~/bin/"$2"
}
Only tested in Bash.
Update from OP: In zsh I can accomplish this with fc -l $2 $2
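Putting the two updates together, a hedged sketch that should work in both bash and zsh, since fc -ln prints the raw text of a single history entry in either shell (the ~/bin location and the argument order are illustrative):
mkscript() {
    local num=$1 name=$2
    printf '%s\n' '#!/usr/bin/env bash' > ~/bin/"$name"
    fc -ln "$num" "$num" >> ~/bin/"$name"   # raw command text for that one entry
    chmod +x ~/bin/"$name"
}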
