Run a Bash Command on File Change? - macos

How can I run a Bash command whenever a file changes?
For example, while writing a C program, I want the command rm output; gcc program.c -o output; ./output to run automatically every time the file is saved.

You could use make and watch:
Makefile:
output: program.c
	gcc program.c -o output
	./output
then
$ watch make
in a separate terminal.
However, there will be a small delay between when you save program.c and when it gets run.

You can use inotifywait for this specific purpose.
while true; do
    change=$(inotifywait -e close_write,moved_to,create .)
    change=${change#./ * }
    if [ "$change" = "program.c" ]; then rm output; gcc program.c -o output; ./output; fi
done
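Since the question is tagged macos, note that inotifywait comes from Linux's inotify-tools and is not available on macOS. A portable fallback is to poll the file's modification time, sketched below; the mtime helper tries GNU stat first and falls back to BSD stat (macOS), and the 1-second sleep sets the worst-case delay.

```shell
#!/bin/sh
# Portable mtime lookup: GNU stat (Linux) first, then BSD stat (macOS).
mtime() { stat -c %Y "$1" 2>/dev/null || stat -f %m "$1" 2>/dev/null; }

# Rebuild and run whenever program.c's timestamp changes.
watch_and_build() {
    last=
    while sleep 1; do
        now=$(mtime program.c) || continue
        [ "$now" = "$last" ] && continue
        last=$now
        rm -f output
        gcc program.c -o output && ./output
    done
}
# watch_and_build   # runs until interrupted
```

This polls instead of blocking on kernel events, so it is less efficient than inotifywait but works on any system with a POSIX shell and stat.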


Cannot execute binary file with bash command

I want to run a C++ executable from my Git for Windows Bash. I do not understand why I can run it with ./Main but not with bash Main or bash Main.exe. In the latter cases, I get an error:
cannot execute binary file
main.cpp
#include <iostream>
int main()
{
    std::cout << "Hello World";
    return 0;
}
script.sh
echo "Hello from Bash script.."
echo "Hostname:${1}"
echo "Port:${2}"
echo "Listing contents:"
ls -a
echo "Launching cpp executable:"
path=$(pwd)
echo "Current path:${path}"
bash "${path}/Main"
To compile the C++ code, I'm using: g++ -o Main main.cpp.
What is the problem? Can someone explain please?
Just remove the bash on the last line of your script:
"${path}/Main"
Don't forget to make it executable.
chmod +x script.sh
It worked for me:
./script.sh hostname 80
Hello from Bash script..
Hostname:hostname
Port:80
Listing contents:
. .. Main main.cpp script.sh
Launching cpp executable:
Current path:/tmp/test
Hello World
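The underlying reason: bash reads its file argument as shell script text, so it can run a text script but not a compiled binary like Main, which only the OS loader can execute. The file command makes the difference visible; in this sketch, /bin/ls stands in for any compiled executable:

```shell
# A text script is valid input for bash; a compiled binary is not.
printf 'echo hi\n' > demo.sh
bash demo.sh          # prints: hi
file demo.sh          # reports some flavor of "text"
file /bin/ls          # reports a compiled executable (ELF, Mach-O, or PE)
```

Running ./Main instead hands the file to the kernel, which recognizes the binary format and executes it directly.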

Bash script to compile a program, feed it with 15 input files and print stdout to file

I'm trying to write a Bash script that feeds my program (./program) with 15 input files named in sequence (file01.txt, file02.txt, etc.) and prints the outputs to the file result.out. Here is the code I wrote:
#!/bin/bash
#Compile the current version
g++ -std=c++11 -fprofile-arcs -ftest-coverage program.cpp -o program
#
#Output file
outFile=result.out
#Loop through files and print output
for i in *.txt; do
    ./program < $i > $outFile
done
I'm getting a segmentation fault when running this script and am not sure what I did wrong. This is my first time writing a bash script, so any help will be appreciated.
Basically, these are the points I learnt from my conversation with Stack Overflow members:
1- The segmentation fault is not related to the bash script. It is definitely related to the program the bash command is running.
2- A bash script that feeds a program with text files and inserts the results into an output file is as follows:
#!/bin/bash
#Compile the current version
g++ -std=c++11 -fprofile-arcs -ftest-coverage program.cpp -o program
#
#Test output file
outFile=results.out
# print "Results" into outFile
printf "Results\n" > $outFile
# loops through text files, send files to stdin
# and insert stdout to outFile
for i in *.txt; do
    printf "\n$i\n"
    ./program < "$i"
done >> $outFile
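The key difference from the first attempt: `> $outFile` inside the loop truncates the file on every iteration, so only the last program's output survives, while redirecting the loop as a whole writes the file once. A minimal demonstration, with cat standing in for ./program and two stand-in input files:

```shell
# Stand-in inputs (the real ones are file01.txt..file15.txt).
printf 'a\n' > file01.txt
printf 'b\n' > file02.txt

# Redirection inside the loop: the file is truncated each pass,
# so only the output for the last input remains.
for i in file0*.txt; do
    cat "$i" > inside.out
done

# Redirection over the whole loop: every output is kept.
for i in file0*.txt; do
    cat "$i"
done > whole.out
```

After running this, inside.out holds only "b" while whole.out holds both lines.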

equivalent of pipefail in GNU make?

Say I have the following files:
buggy_program:
#!/bin/sh
echo "wops, some bug made me exit with failure"
exit 1
Makefile:
file.gz:
	buggy_program | gzip -9 -c >$@
Now if I type make, GNU make will happily build file.gz even though buggy_program exited with non-zero status.
In bash I could do set -o pipefail to make a pipeline exit with failure if at least one program in the pipeline exits with failure. Is there a similar method in GNU make? Or some workaround that doesn't involve temporary files? (The reason for gzipping here is precisely to avoid a huge temporary file.)
Try this:
SHELL=/bin/bash -o pipefail
file.gz:
	buggy_program | gzip -9 -c >$@
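A quick way to check the SHELL trick, with false standing in for buggy_program (the recipe line must start with a tab, which is why the Makefile is generated with printf here):

```shell
# Generate a Makefile whose shell has pipefail enabled, then run it.
dir=$(mktemp -d)
printf 'SHELL=/bin/bash -o pipefail\n\nfile.gz:\n\tfalse | gzip -9 -c >$@\n' > "$dir/Makefile"
if make -C "$dir"; then
    echo "built despite the failure"
else
    echo "make stopped, as intended"
fi
```

Note that file.gz is still left behind on failure, since gzip already wrote it; that is the gap .DELETE_ON_ERROR in the next answer closes.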
You could do:
SHELL=/bin/bash
.DELETE_ON_ERROR:
file.gz:
	set -o pipefail; buggy_program | gzip -9 -c >$@
but this only works with bash.
Here's a possible solution that doesn't require bash. Imagine you have two programs thisworks and thisfails that work fine or fail, respectively. Then the following will only leave you with work.gz, deleting fail.gz, i.e. create the gzipped make target if and only if the program executed correctly:
all: fail.gz work.gz

work.gz:
	( thisworks && touch $@.ok ) | gzip -c -9 >$@
	rm $@.ok || rm $@

fail.gz:
	( thisfails && touch $@.ok ) | gzip -c -9 >$@
	rm $@.ok || rm $@
Explanation:
In the first line of the work.gz rule, thisworks will exit with success, and a file work.gz.ok will be created, and all stdout goes through gzip into work.gz. Then in the second line, because work.gz.ok exists, the first rm command also exits with success – and since || is short-circuiting, the second rm does not get run and so work.gz is not deleted.
OTOH, in the first line of the fail.gz rule, thisfails will exit with failure, and fail.gz.ok will not be created. All stdout still goes through gzip into fail.gz. Then in the second line, because fail.gz.ok does not exist, the first rm command exits with failure, so || tries the second rm command which deletes the fail.gz file.
To easily check that this works as it should, simply replace thisworks and thisfails with the commands true and false, respectively, put it in a Makefile and type make.
(Thanks to the kind people in #autotools for helping me with this.)
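The check suggested above, spelled out as a runnable sketch: true plays the part of thisworks and false plays thisfails, and the Makefile is generated with printf so the required tab indentation survives copy-paste. After make, work.gz should exist and fail.gz should be gone.

```shell
# Build the two-target Makefile with true/false as stand-in programs.
dir=$(mktemp -d)
{
    printf 'all: fail.gz work.gz\n\n'
    printf 'work.gz:\n\t( true && touch $@.ok ) | gzip -c -9 >$@\n\trm $@.ok || rm $@\n\n'
    printf 'fail.gz:\n\t( false && touch $@.ok ) | gzip -c -9 >$@\n\trm $@.ok || rm $@\n'
} > "$dir/Makefile"
make -C "$dir"
ls "$dir"    # work.gz survives; fail.gz was deleted
```

Note that make itself still exits successfully here, since each recipe line ends with a succeeding rm; the scheme guarantees only that no broken .gz file is left behind.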

Giving input to a shell script through command line

I have a file like this with a .sh extension:
clear
echo -n "Enter file name: "
read FILE
gcc -Wall -W "$FILE" && ./a.out
echo
When I execute this file, it asks for a .c file and, when given one, compiles it and shows the program's output.
For this, every time I have to first execute this .sh file and then give it the .c file name when asked. Is there any way I can just give the .c file on the command line, so that it takes that file and does the work?
What I mean is, if I run ./file.sh somecommand cfile.c, then it takes cfile.c as input, compiles it and shows the output.
Use the $1 variable:
clear
gcc -Wall -W $1 && ./a.out
echo
$1 means "first argument from the command line".
Alternatively, if you want to compile multiple files at once using your script, you can use the $@ variable, for example:
gcc -Wall -W "$@" && ./a.out
You will invoke your script as follows (assuming it's called 'script.sh'):
./script.sh file.c
Please see section 3.2.5 of the following article.
If your project gets bigger, you may also want to consider using tools designated for building, like automake.
You can also have it do things either way:
if [ -n "$1" ] ; then
    FILE="$1"
else
    echo -n "Enter file name: "
    read FILE
fi
gcc -Wall -W "$FILE" && ./a.out
This will use the command line argument if it is there, otherwise it asks for a file name.
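The two code paths above can be exercised side by side; in this sketch "would compile:" replaces the real gcc call so no compiler is needed, and the script name ask.sh is illustrative:

```shell
# The argument-or-prompt pattern from the answer, in a throwaway script.
cat > ask.sh <<'EOF'
#!/bin/sh
if [ -n "$1" ]; then
    FILE="$1"
else
    printf 'Enter file name: '
    read FILE
fi
echo "would compile: $FILE"
EOF
chmod +x ask.sh
./ask.sh cfile.c          # argument given: no prompt
echo other.c | ./ask.sh   # no argument: falls back to the prompt
```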

Bash - bad substitution

I have a short bash script to get source code's dependency files.
#!/bin/sh
rule=$(cpp -P -w -undef -nostdinc -C -M file.cc)
rule=${rule##*:}
#echo $rule
echo ${rule//\\}
Unfortunately, it outputs ./findDep.sh: 5: ./findDep.sh: Bad substitution.
But if I uncomment echo $rule, the script will execute without any problem:
lib.h macro.inc fundamental.h lib/fs.h lib/net.h \ lib/net/fetch.h
lib.h macro.inc fundamental.h lib/fs.h lib/net.h lib/net/fetch.h
Anyone know why?
Thanks in advance.
You should change #!/bin/sh to #!/bin/bash or #!/usr/bin/env bash.
I can't reproduce your problem here with Bash 4.2.29.
However, did you know that read will join lines with \ newline continuations by default?
read rule < <(cpp -P -w -undef -nostdinc -C -M file.cc)
echo "${rule##*:}"
Or, in a more sh-compatible way (I think),
cpp -P -w -undef -nostdinc -C -M file.cc | {
    read rule
    echo "${rule##*:}"
}
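The root cause is easy to reproduce in isolation: ${var//pattern} substitution is a bash extension, so a strict POSIX /bin/sh such as dash (the default sh on Debian and Ubuntu) rejects it with "Bad substitution". On systems where sh is bash, both invocations below succeed:

```shell
# Two-line script using the bash-only ${var//pattern} expansion.
printf '%s\n' 'rule="a \ b"' 'echo ${rule//\\}' > subst.sh
bash subst.sh                                   # prints: a b
sh subst.sh || echo "sh rejected the script"
```

This is why changing the shebang from #!/bin/sh to #!/bin/bash fixes the original script.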
