writing data to target in gnu make file - makefile

Could you please let me know what is wrong with the following? The file first contains the text one, and I am trying to write "hello" to firstfile, but it is not working:
firstfile: first
	echo "hello" ; > $@
The following one works, though:
firstfile: first
	for f in $^; do echo "hello" ; done > $@
Could you please let me know what the difference is?

; is a command separator in the shell. So your first recipe:
firstfile: first
	echo "hello" ; > $@
is equivalent to:
firstfile: first
	echo "hello"   # writes "hello" to your console
	> $@           # writes an empty string to `firstfile`
Change it to:
firstfile: first
	echo "hello" > $@

It seems echo "hello" is executed first and then a blank is written to firstfile, and in the for loop the result of the echo command, hello, is taken as a string and that string is written to firstfile. Could you please confirm that?

Related

Create a bash script inside a bash script that uses special variables $1, $#

I'm trying to create a script that creates another script that uses $1 and $#. The problem is that those variables are being interpreted by the first script, so they come out empty. Here's my problem: the first script creates the script /tmp/test.sh
#!/bin/bash
cat << EOF > /tmp/test.sh
#!/bin/bash
echo $1
echo $#
EOF
The result in /tmp/test.sh:
#!/bin/bash
echo
echo 0
Does anyone know how to avoid this so that /tmp/test.sh contains a literal $1 and $#?
I would like to have in /tmp/test.sh:
#!/bin/bash
echo $1
echo $#
Thanks in advance.
Quote the here-document delimiter so that the contents of the here document are treated as literal text (i.e., as if occurring in a single-quoted string).
cat << 'EOF' > /tmp/test.sh
#!/bin/bash
echo $1
echo $#
EOF
Any quoting will work, not just single quotes. The only important thing is that at least one character of the delimiter be quoted or escaped.
cat << \EOF
cat << "EOF"
cat << E"O"F
etc
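For example, a small sketch contrasting the two forms (the /tmp file names are only illustrative):
#!/bin/bash
# Unquoted delimiter: $1 and $# are expanded by the generating script itself.
cat << EOF > /tmp/expanded.sh
#!/bin/bash
echo $1
echo $#
EOF

# Quoted delimiter: the here-document is copied literally.
cat << 'EOF' > /tmp/literal.sh
#!/bin/bash
echo $1
echo $#
EOF

cat /tmp/expanded.sh   # the echo lines contain the outer script's own $1 and $# values
cat /tmp/literal.sh    # the echo lines contain the literal text $1 and $#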

Unix How to check if a specific word is entered in as an argument

I'm writing a script in Unix and I need a way to check whether an argument entered on the command line is a specific word.
So if when using the script the user types:
$ ./script hello
my script can tell that "hello" was entered as an argument and can display a message appropriately.
And if the user types something other than "hello" as an argument then my script can display another message.
Thanks.
This should work:
#!/bin/bash
if [[ $1 == hello ]]; then
    echo "hello was entered"
else
    echo "hello wasn't entered"
fi
There are a number of ways to check positional arguments against a list. When there are a number of items in the list, you can use a case statement instead of a string of if ... elif ... elif ... fi comparisons. The syntax is as follows:
#!/bin/bash
case "$1" in
    "hello" )
        printf "you entered hello\n"
        ;;
    "goodbye" )
        printf "well goodbye to you too\n"
        ;;
    * )
        printf "you entered something I don't understand.\n"
        ;;
esac
exit 0
Output/Use
$ ./caseex.sh hello
you entered hello
$ ./caseex.sh goodbye
well goodbye to you too
$ ./caseex.sh morning
you entered something I don't understand.
In Bash, arguments passed to shell scripts are stored in variables named as follows:
$0 = name of the script.
$1 to $n = the individual arguments.
$# = number of arguments.
$* = all arguments joined into a single string: "arg1 arg2 ...".
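A quick way to see them in action is a throwaway script like this (args.sh is just an example name; run it as ./args.sh foo bar):
#!/bin/bash
# Print the positional-parameter variables listed above.
echo "script name: $0"
echo "first arg:   $1"
echo "arg count:   $#"
echo "all args:    $*"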
You can simply use if [ "$1" == "some string" ]; then ...
You can retrieve the command-line arguments with $N; for example, the first argument is $1, the second is $2, and so on.
You can use conditionals in Bash (I assume you are using bash) just like in any other language; however, the syntax is a bit wonky :). Here is a link for you:
http://tldp.org/LDP/Bash-Beginners-Guide/html/chap_07.html
If you are sure about the position of the argument you can :
#!/bin/bash
if [[ $1 == SearchWord ]]; then
    echo "SearchWord was entered"
else
    echo "SearchWord wasn't entered"
fi
In case you are not sure of the position, you can use $*:
[ `echo $* | grep "$SearchWord" | wc -l` -eq 1 ] && echo "Present" || echo "Not present"
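If you would rather not rely on grep and wc, an alternative sketch is to loop over the arguments directly (SearchWord again stands for whatever word you are looking for):
#!/bin/bash
found=no
for arg in "$@"; do                     # "$@" keeps each argument as a separate word
    if [ "$arg" = "SearchWord" ]; then
        found=yes
        break
    fi
done
[ "$found" = yes ] && echo "Present" || echo "Not present"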

How to use grep in csh script

Below is a simple csh script I wrote, but the set does not work. Can anyone please help me find the error?
#!/bin/csh
echo "hello"
set ans ='grep -r hello ./'
echo ans
I tried backquotes, but it is still not working:
#!/bin/csh
echo "hello"
set ans =`grep -r hello .`
echo $ans
To get the value of a variable you have to add a $ to the beginning:
echo $ans
Furthermore you should remove the space in front of the = sign:
set ans=`grep -r hello .`
You need to use backquotes (``), not simple quotes ('').
Also, variables are assigned without a dollar sign, but you need to access them with one, like $ans:
#!/bin/csh
echo "hello"
set ans =`grep -r hello .`
echo $ans

Bash script to automatically test program output - C

I am very new to writing scripts and I am having trouble figuring out how to get started on a bash script that will automatically test the output of a program against expected output.
I want to write a bash script that will run a specified executable on a set of test inputs, say in1 in2 etc., against corresponding expected outputs, out1, out2, etc., and check that they match. The file to be tested reads its input from stdin and writes its output to stdout. So executing the test program on an input file will involve I/O redirection.
The script will be invoked with a single argument, which will be the name of the executable file to be tested.
I'm having trouble just getting going on this, so any help at all (links to any resources that further explain how I could do this) would be greatly appreciated. I've obviously tried searching myself but haven't been very successful in that.
Thanks!
If I get what you want, this might get you started: a mix of bash and external tools like diff.
#!/bin/bash
# If the number of arguments is less than 1, print usage and exit
if [ $# -lt 1 ]; then
    printf "Usage: %s <application>\n" "$0" >&2
    exit 1
fi
bin="$1"            # The application (from command arg)
diff="diff -iad"    # Diff command, or whatever
# An array; you do not have to declare it, but it is supposedly faster
declare -a file_base=("file1" "file2" "file3")
# Loop over the array
for file in "${file_base[@]}"; do
    # Pad file_base with suffixes
    file_in="$file.in"             # The in file
    file_out_val="$file.out"       # The out file to check against
    file_out_tst="$file.out.tst"   # The out file from the test application
    # Validate that the in file exists (do the same for the out validation file)
    if [ ! -f "$file_in" ]; then
        printf "In file %s is missing\n" "$file_in"
        continue
    fi
    if [ ! -f "$file_out_val" ]; then
        printf "Validation file %s is missing\n" "$file_out_val"
        continue
    fi
    printf "Testing against %s\n" "$file_in"
    # Run the application, redirect the in file to it, and its output to the out file
    "./$bin" < "$file_in" > "$file_out_tst"
    # Execute diff
    $diff "$file_out_tst" "$file_out_val"
    # Check the exit code from the previous command (i.e. diff).
    # We need to save it in a variable, else we can't print it,
    # as it will be changed by the if [.
    # If it is not 0, the files differ (at least according to diff).
    e_code=$?
    if [ $e_code != 0 ]; then
        printf "TEST FAIL : %d\n" "$e_code"
    else
        printf "TEST OK!\n"
    fi
    # Pause with a prompt
    read -p "Enter a to abort, anything else to continue: " input_data
    # If the input is "a", abort
    [ "$input_data" == "a" ] && break
done
# Clean exit with status 0
exit 0
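Invocation would look something like this, assuming you save the script above as runtests.sh (the name is just an example) and your program is called myapp:
chmod +x runtests.sh
./runtests.sh myapp    # tests myapp against file1.in/file1.out, file2.in/file2.out, ...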
Edit:
Added an exit-code check, and a short walk-through.
In short, this will:
Check if argument is given (bin/application)
Use an array of "base names", loop this and generate real filenames.
I.e.: Having array ("file1" "file2") you get
In file: file1.in
Out file to validate against: file1.out
Out file: file1.out.tst
In file: file2.in
...
Execute the application, redirecting the in file to its stdin with <, and its stdout to the test out file with >.
Use a tool such as diff to test whether they are the same.
Check exit / return code from tool and print message (FAIL/OK)
Prompt for continuance.
Any and all of which, of course, can be modified, removed, etc.
Some links:
TLDP; Advanced Bash-Scripting Guide (can be a bit more readable with this)
Arrays
File test operators
Loops and branches
Exit-status
...
bash-array-tutorial
TLDP; Bash-Beginners-Guide
Expect could be a perfect fit for this kind of problem:
Expect is a tool primarily for automating interactive applications
such as telnet, ftp, passwd, fsck, rlogin, tip, etc. Expect really
makes this stuff trivial. Expect is also useful for testing these same
applications.
First take a look at the Advanced Bash-Scripting Guide chapter on I/O redirection.
Then I have to ask: why use a bash script at all? Do it directly from your makefile.
For instance I have a generic makefile containing something like:
# type 'make test' to run a test.
# for example this runs your program with jackjill.txt as input
# and redirects the stdout to the file jackjill.out
test: $(program_NAME)
	./$(program_NAME) < jackjill.txt > jackjill.out
	diff -q jackjill.out jackjill.expected
You can add as many tests as you want like this. You just diff the output file each time against a file containing your expected output.
Of course this is only relevant if you're actually using a makefile for building your program. :-)
Functions. Herestrings. Redirection. Process substitution. diff -q. test.
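As a rough sketch of how those pieces can fit together (not the poster's actual code; square is a made-up function that reads numbers from stdin and prints their squares):
#!/bin/bash
# A tiny "program" under test: reads numbers on stdin, prints their squares.
square() { while read -r n; do echo $(( n * n )); done; }

# Feed input via a herestring, capture both streams with process substitution,
# and let diff -q decide whether actual and expected output match.
if diff -q <(square <<< $'0\n1\n2') <(printf '0\n1\n4\n') > /dev/null; then
    echo "TEST OK"
else
    echo "TEST FAIL"
fi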
Expected outputs are a second kind of input.
For example, if you want to test a square function, you would have input like (0, 1, 2, -1, -2) and expected output as (0, 1, 4, 1, 4).
Then you would compare every result of input to the expected output and report errors for example.
You could work with arrays:
in=(0 1 2 -1 -2)
out=(0 1 4 2 4)
for i in $(seq 0 $(( ${#in[@]} - 1 )))
do
    (( ${in[i]} * ${in[i]} - ${out[i]} )) && echo -n bad" " || echo -n fine" "
    echo $i ": " ${in[i]}"² ?= " ${out[i]}
done
fine 0 : 0² ?= 0
fine 1 : 1² ?= 1
fine 2 : 2² ?= 4
bad 3 : -1² ?= 2
fine 4 : -2² ?= 4
Of course you can read both arrays from a file.
Testing with (( ... )) evaluates arithmetic expressions; test (and [ ... ]) can also check strings and files. Try
help test
for an overview.
Reading strings wordwise from a file:
for n in $(< f1); do echo $n "-" ; done
Read into an array:
arr=($(< file1))
Read file linewise:
for i in $(seq 1 $(cat file1 | wc -l))
do
    line=$(sed -n ${i}p file1)
    echo $line"#"
done
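A more conventional (and much faster) way to read a file line by line is a while read loop; a minimal sketch:
while IFS= read -r line; do
    echo "${line}#"
done < file1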
Testing against program output sounds like string comparison plus capturing of program output, n=$(cmd param1 param2):
asux:~/prompt > echo -e "foo\nbar\nbaz"
foo
bar
baz
asux:~/prompt > echo -e "foo\nbar\nbaz" > file
asux:~/prompt > for i in $(seq 1 3); do line=$(sed -n ${i}p file); test "$line" = "bar" && echo match || echo fail ; done
fail
match
fail
Further useful: regular-expression matching on strings with =~ inside [[ ... ]] brackets:
for i in $(seq 1 3)
do
    line=$(sed -n ${i}p file)
    echo -n $line
    if [[ "$line" =~ ba. ]]; then
        echo " "match
    else
        echo " "fail
    fi
done
foo fail
bar match
baz match

evaluating lines from stdout

I have a bash script that is executing a program in a loop. I want to evaluate each line from the stdout and do something if it matches my condition.
I still want to be able to see stdout on the screen. Is there a simple way to accomplish this? Thanks!
There are several variants of looping over input, but one possibility is thus:
my_cmd | while read line; do
    echo "$line"
    my_process "$line"
done
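One caveat worth noting: in bash the pipe runs the while loop in a subshell, so any variables you set inside it are lost afterwards. If you need them later, process substitution avoids the subshell (the *ERROR* pattern below is only an example condition):
count=0
while read -r line; do
    echo "$line"                                     # still shows every line on screen
    [[ $line == *ERROR* ]] && count=$((count + 1))   # example condition; adapt to your case
done < <(my_cmd)
echo "matched $count lines"                          # the count survives; no subshell involved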
This should do what you want:
for string in "a" "b" "c"
do
    output=`echo ${string}`
    echo ${output}
    if [ ${output} == "b" ] ; then
        echo "do something"
    fi
done
Just replace the first echo with your program.
