Is there any way to make gcc print offending lines when it emits an error? - gcc

I have a large codebase that I've been tasked with porting to 64 bits. The code compiles, but it prints a very large number of incompatible-pointer warnings (as is to be expected). Is there any way I can have gcc print the line on which the error occurs? At this point I'm just using gcc's error messages to try to track down assumptions that need to be modified, and having to look up every one is not fun.

I've blatantly stolen Joseph Quinsey's answer for this. The only difference is I've attempted to make the code easier to understand:
For bash, use make 2>&1 | show_gcc_line with show_gcc_line the following script:
#!/bin/bash
# Read and echo each line only if it is an error or warning message.
# Such lines start with something like "foobar:123:", so line 123 of
# file foobar will then be printed.
while IFS= read -r input
do
    loc=$(echo "$input" | sed -n 's/^\([^ :]*\):\([0-9]*\):.*/\1 \2/p')
    len=${#loc}
    file=${loc% *}
    line=${loc#* }
    if [ "$len" -gt 0 ]
    then
        echo "$input"
        sed -n "${line}p" "$file"
        echo
    fi
done
This was partly because I did not like the formatting of the original. This only prints the warnings/errors, followed by the line of code causing the problem, followed by a blank line. I've removed the string of hyphens too.

Perhaps a script to print the desired lines would help. If you are using csh (unlikely!) use:
make ... |& show_gcc_line
with show_gcc_line the following script:
#!/bin/csh
# Read and echo each line. And, if it starts with "foobar:123:", print line 123
# of foobar, using find(1) to find it, prefaced by ---------------.
set input="$<"
while ( "$input" != "" )
    echo "$input"
    set loc=`echo "$input" | sed -n 's/^\([^ :]*\):\([0-9]*\):.*/\1 \2/p'`
    if ( $#loc ) then
        find . -name $loc[1] | xargs sed -n $loc[2]s/^/---------------/p
    endif
    set input="$<"
end
And for bash, use make ... 2>&1 | show_gcc_line with:
#!/bin/bash
# Read and echo each line. And, if it starts with "foobar:123:", print line 123
# of foobar, using find(1) to find it, prefaced by ---------------.
while IFS= read -r input
do
    echo "$input"
    loc=$(echo "$input" | sed -n 's/^\([^ :]*\):\([0-9]*\):.*/\1 \2/p')
    if [ ${#loc} -gt 0 ]
    then
        find . -name "${loc% *}" | xargs sed -n "${loc#* }s/^/---------------/p"
    fi
done

Use the -W options to control which warnings you want to display; these parameters are explained here.
You can also use this trick to suppress the regular (stdout) output so only the diagnostics remain:
gcc ... 1>/dev/null
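As a further hedged sketch building on that trick (the log file name and grep pattern are my own choices, not from the answer), you can discard stdout while keeping all diagnostics in a file for later grepping:
# Discard normal output, collect diagnostics (stderr) in a log file:
gcc ... 1>/dev/null 2>warnings.log
# Then pick out the messages you care about, e.g. the pointer warnings:
grep 'incompatible pointer' warnings.log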

By the time the compiler emits an error message, the actual source line is long gone (particularly in C): it has been transformed to a token stream, then to an abstract syntax tree, then to a decorated syntax tree... gcc has enough on its plate with the many steps of compiling, so it intentionally does not include functionality to reopen the file and re-retrieve the original source. That's what editors are for, and virtually all of them have commands to start a compilation and jump to the next error at a keypress. Do yourself a favor and use a modern editor to browse errors (and maybe even fix them semi-automatically).

This little script should work, but I can't test it right now; sorry if it needs editing.
LAST_ERROR_LINE=$( (gcc ... 2>&1 >/dev/null) | grep error | tail -n1 )
FILE=$(echo "$LAST_ERROR_LINE" | cut -f1 -d':')
LINE=$(echo "$LAST_ERROR_LINE" | cut -f2 -d':')
sed -n "${LINE}p" "$FILE"
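A hedged extension of the same idea (my own sketch, not part of the original answer): loop over every reported error rather than only the last one, and print each offending source line. Error messages look like "file:line:col: error: ...", so splitting on ":" recovers the file and line number.
( gcc ... 2>&1 >/dev/null ) | grep error | while IFS=: read -r FILE LINE REST; do
    echo "== $FILE:$LINE =="
    sed -n "${LINE}p" "$FILE"
done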

For me, the redirection syntax for error messages is hard to remember.
So here is my version for printing out gcc error messages:
$ ee make
and for both error and warning messages:
$ ee2 make
How:
Add these to your .bashrc:
function ee() {
    "$@" 2>&1 | grep error
}
function ee2() {
    "$@" 2> ha
    echo "-----"
    echo "Error"
    echo "-----"
    grep error ha
    echo "-------"
    echo "Warning"
    echo "-------"
    grep warning ha
}
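A hedged variant of ee2 (my own sketch, not part of the original answer) that swaps the fixed temp file name ha for mktemp and cleans up after itself:
function ee2() {
    local log
    log=$(mktemp)               # temporary file for the captured stderr
    "$@" 2> "$log"
    echo "-----"
    echo "Error"
    echo "-----"
    grep error "$log"
    echo "-------"
    echo "Warning"
    echo "-------"
    grep warning "$log"
    rm -f "$log"                # remove the temp file when done
}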

Related

bash: pipe continuously into a grep

Not sure how to explain this, but what I am trying to achieve is this:
- tailing a file and grepping for a pattern A
- then I want to pipe into another customGrepFunction where it matches pattern B, and if B matches, echo something out. I need the customGrepFunction in order to do some other custom stuff.
The sticky part here is how to make the customGrepFunction work. In other words, when only patternA matches, echo the whole line, and when both patternA & patternB match, print out something custom.
When I only run:
tail -f file.log | grep patternA
I can see the patternA rows being printed/tailed; however, when I add the customGrepFunction, nothing happens.
tail -f file.log | grep patternA | customGrepFunction
And the customGrepFunction should be available globally in my bin folder:
customGrepFunction(){
    if grep patternB
    then
        echo "True"
    fi
}
I have this set up; however, it doesn't do what I need it to do: it only echoes True whenever I do Ctrl+C and exit the tailing.
What am I missing here?
Thanks
What's Going Wrong
The code: if grep patternB; then echo "true"; fi
...waits for grep patternB to exit, which will happen only when the input from tail -f file.log | grep patternA hits EOF. Since tail -f waits for new content forever, there will never be an EOF, so your if statement will never complete.
How To Fix It
Don't use grep on the inside of your function. Instead, process content line-by-line and use bash's native regex support:
customGrepFunction() {
    while IFS= read -r line; do
        if [[ $line =~ patternB ]]; then
            echo "True"
        fi
    done
}
Next, make sure that grep isn't buffering content (if it were, then it would be written to your code only in big chunks, delaying until such a chunk is available). The means to do this varies by implementation, but with GNU grep, it would look like:
tail -f file.log | grep --line-buffered patternA | customGrepFunction
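If your grep does not support --line-buffered, one hedged alternative (an assumption on my part: it requires GNU coreutils' stdbuf and only helps when the program relies on default stdio buffering) is to force line buffering from the outside:
tail -f file.log | stdbuf -oL grep patternA | customGrepFunction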

creating a variable from sed output

I am banging my head against the keyboard on this simple piece of code.
#!/bin/bash
connstate="Connected"
vpnstatus=$(/opt/cisco/anyconnect/bin/vpn state | (grep -m 1 'state:'))
echo $vpnstatus
vpnconn=$(echo $vpnstatus | sed -e 's/>>\ state: //g' | sed "s/ //g")
echo "$vpnconn" "$connstate"
if [ "$vpnconn" = "$connstate" ]; then
    echo $vpnconn
else
    echo "this script still fails"
fi
echo done
This is the output from the above code:
>> state: Connected
Connected Connected
this script still fails
done
I believe the issue revolves around the vpnconn=$(...) line; if I comment that section of code out and set the variable with vpnconn="Connected" instead, this code works fine. Something about how the sed is working on the input from vpnstatus and outputting the result to vpnconn makes what looks like a correct value fail the compare in the if/then.
I have tried splitting up the vpnconn line into two separate lines, and that did not change anything. I took out the sed "s/ //g" and replaced it with tr -d ' ', and that did not change the results. I know it is something small in this tiny piece of code that I am missing.
Did you try?
vpnconn=$(echo "$vpnstatus" | awk '{print $3}')
Something like:
vpnstatus=$(/opt/cisco/anyconnect/bin/vpn state | grep -m 1 'state:' | awk '{print $3}')
should do the job.
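For reference, here is what that field split does to the line shown in the output above ($3 is the third whitespace-separated field, a quick check in a shell):
$ echo ">> state: Connected" | awk '{print $3}'
Connected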

Merging fastq files by identifiers with a shell script

I have to merge files with the following naming pattern :
[SampleID]_[custom_ID01]_ID[RUN_ID]_L001_R1.fastq
[SampleID]_[custom_ID02]_ID[RUN_ID]_L002_R1.fastq
[SampleID]_[custom_ID03]_ID[RUN_ID]_L003_R1.fastq
[SampleID]_[custom_ID04]_ID[RUN_ID]_L004_R1.fastq
I need to merge all files with identical [SampleID] but different "Lanes" (L001-L004).
The following script works fine when directly run in the terminal:
custom_id="000"
RUN_ID="0025"
wd="/path/to/script/" # was missing/ incorrect
# get ALL sample identifiers
touch temp1.txt
for line in $wd/*.fastq ; do
fastq_identifier=$(echo "$line" | cut -d"_" -f1);
echo $fastq_identifier >> temp1.txt
done
# get all uniqe samples identical
cat temp1.txt | uniq > temp2.txt
input_var=$(cat temp2.txt)
# concatenate all fastq (different lanes) with identical identifier
for line in $input_var; do
cat $line*fastq >> $line"_"$custom_id"_ID"$Run_ID"_L001_R1.fastq"
done
rm temp1.txt temp2.txt;
But if I create a script file (concatenate_fastq.sh) and make it executable
$ chmod +x concatenate_fastq.sh
and run it
$ ./concatenate_fastq.sh
I got the following error:
$ concatenate_fastq.sh: line 17: /*.fastq_000_ID_L001_R1.fastq: Keine Berechtigung # = Permission denied
Thanks to your hints below, I solved the problem by fixing
wd=/path/to/script/
The immediate problem seems to be that wd is unset. If your script really genuinely contains exactly the line
wd="/path/to/script/"
then I would suspect invisible control characters in the script file (using a Windows editor is a common way to shoot yourself in the foot).
More generally, your script should cope correctly when the wildcard does not match any files. A common way to do that is shopt -s nullglob, but the subsequent script would then still need some adaptation.
Refactoring the script to loop only over actual matches would help avoid trouble. Perhaps something like this:
shopt -s nullglob # bashism
printf '%s\n' "$wd"/*.fastq |
cut -d_ -f1 |
uniq |
while read -r line; do
cat "$line"*fastq >> "${line}_${custom_id}_ID${Run_ID}_L001_R1.fastq"
done
You'll notice that this simplifies the script tremendously, and avoids the pesky temporary files.
I solved it with:
if [ $# -ne 3 ] ; then
    echo -e "Usage: $0 {path_to_working_directory} {custom_ID:Z+} {run_ID:ZZZZ}\n"
    exit 1
fi
cwd=$(pwd)
wd=$1
custom_id=$2
RUN_ID=$3
folder=$(basename $wd)
input_var=$(ls *fastq | cut --fields 1 -d "_" | uniq)
for line in $input_var; do
    cat $line*fastq >> $line"_"$custom_id"_ID"$RUN_ID"_L001_R1.fastq"
done
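With that in place, the invocation would look something like this (a hedged example using the values from the original script; the script name is assumed):
$ chmod +x concatenate_fastq.sh
$ ./concatenate_fastq.sh /path/to/script/ 000 0025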

how to cat command output to string in shell script

In my script I need to loop through the lines of a file. Once I find a specific line, I need to save it to a variable so that later on I can use it outside the loop. I tried the following, but it won't work:
count=0
res=""
python my.py -p 12345 |
while IFS= read -r line
do
    count=$((count+1))
    if [ "$count" -eq 5 ]; then
        res=`echo "$line" | xargs`
    fi
done
echo "$res"
It outputs nothing. I also tried this:
res=""
... in the loop...
res=$res`echo "$line" | xargs`
Still nothing. Please help. Thanks.
Update: Thanks for all the help. Here is my final code:
res=$(python my.py -p 12345 | sed -n '5p' | xargs)
For finding a specific line in a file, have you considered using grep?
grep "thing I'm looking for" /path/to/my.file
This will output the lines that match the thing you're looking for. Moreover, this can be piped to xargs as in your question.
If you need to look at a particularly numbered line of a file, consider using the head and tail commands (which can also be piped to grep).
cat /path/to/my.file | head -n5 | tail -n1 | grep "thing I'm looking for"
These commands take the first lines specified (in this case, 5 and 1 respectively) and only print those out. Hopefully this will help you accomplish your task.
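Equivalently, sed can print a single numbered line directly, which is essentially what the sed -n '5p' in the final code above does:
sed -n '5p' /path/to/my.file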
Happy coding! Leave a comment if you have any questions.

Grep without filtering

How do I grep without actually filtering, or highlighting?
The goal is to find out if a certain text is in the output, without affecting the output. I could tee to a file and then inspect the file offline, but if the output is large, that is a waste of time, because it only processes the output after the command has finished:
file=$(mktemp)
command | tee "$file"
if grep -q pattern "$file"; then
    echo Pattern found.
fi
rm "$file"
I thought I could also use grep's before (-B) and after (-A) flags to achieve live processing, but that won't output anything if there are no matches.
# Won't even work - DON'T USE.
if command | grep -A 1000000 -B 1000000 pattern; then
    echo Pattern found.
fi
Is there a better way to achieve this? Something like a "pretend you're grepping and set the exit code, but don't grep anything".
(Really, what I will be doing is to pipe stderr, since I'm looking for a certain error, so instead of command | ... I will use command 2> >(... >&2; result=${PIPESTATUS[*]}), which achieves the same, only it works on stderr.)
If all you want to do is set the exit code if a pattern is found, then this should do the trick:
awk -v rc=1 '/pattern/ { rc=0 } 1; END {exit rc}'
The -v rc=1 creates a variable inside the Awk program called rc (short for "return code") and initializes it to the value 1. The stanza /pattern/ { rc=0 } causes that variable to be set to 0 whenever a line is encountered that matches the regular expression pattern. The 1; is an always-true condition with no action attached, meaning the default action will be taken on every line; that default action is printing the line out, so this filter will copy its input to its output unchanged. Finally, the END {exit rc} runs when there is no more input left to process, and ensures that awk terminates with the value of the rc variable as its process exit status: 0 if a match was found, 1 otherwise.
The shell interprets exit code 0 as true and nonzero as false, so this command is suitable for use as the condition of a shell if or while statement, possibly at the end of a pipeline.
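For instance, a minimal sketch of using it as the condition of an if, in the spirit of the examples elsewhere in this thread (command and pattern are placeholders):
if command | awk -v rc=1 '/pattern/ { rc=0 } 1; END {exit rc}'; then
    echo Pattern found.
fi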
To allow output along with the search result, you can use awk:
command | awk '/pattern/{print "Pattern found"} 1'
This will print "Pattern found" whenever the pattern is matched in a line (the matching line itself is printed right after it).
If you want the line to be printed first, then use:
command | awk '{print} /pattern/{print "Pattern found"}'
EDIT: To execute a command on a match, use:
command | awk '/pattern/{system("some_command")} 1'
EDIT 2: To handle special characters in the keyword, use this:
command | awk -v search="abc*foo?bar" 'index($0, search) {system("some_command"); exit} 1'
Try this script. It will not modify the output of your-command in any way, and sed exits with 0 when the pattern is found, 1 otherwise. I think it's what you want, from my understanding of your question and comments:
if your-command | sed -nr -e '/pattern/h;p' -e '${x;/^.+$/ q0;/^.+$/ !q1}'; then
    echo Pattern found.
fi
Below are some test cases:
ubuntu-user:~$ if echo patt | sed -nr -e '/pattern/h;p' -e '${x;/^.+$/ q0;/^.+$/ !q1}'; then echo Pattern found.; fi
patt
ubuntu-user:~$ if echo pattern | sed -nr -e '/pattern/h;p' -e '${x;/^.+$/ q0;/^.+$/ !q1}'; then echo Pattern found.; fi
pattern
Pattern found.
Note that the previous script fails to work when there is no output from your-command, because then sed will not run the expression and will exit with 0 every time.
I take it you want to print out each line of your output, but at the same time, track whether or not a particular pattern is found. Simply passing the output to sed or grep would affect the output. You need to do something like this:
pattern="..."       # the pattern you are looking for
count=0
shopt -s lastpipe   # bash 4.2+: run the last stage of a pipeline in the current
                    # shell (in scripts), so $count survives past the loop
command | while read -r line
do
    echo "$line"
    if grep -q "$pattern" <<< "$line"
    then
        ((count+=1))
    fi
done
if [[ $count -gt 0 ]]
then
    echo "Pattern was found $count times in the output"
else
    echo "Didn't find the pattern at all"
fi
ADDENDUM
If the original command has both stdout and stderr output, which come in a specific order, with the two possibly interleaved, then will your solution ensure that the outputs are interleaved as they normally would be?
Okay, I think I understand what you're talking about. You want both STDERR and STDOUT to be grepped for this pattern.
STDERR and STDOUT are two different things. They both appear on the terminal window because that's where you put them. The pipe (|) only takes STDOUT. STDERR is left alone. In the above, only the output of STDOUT would be used. If you want both STDOUT and STDERR, you have to redirect STDERR into STDOUT:
pattern="..."       # the pattern you are looking for
count=0
shopt -s lastpipe   # as above, so $count survives past the loop
command 2>&1 | while read -r line
do
    echo "$line"
    if grep -q "$pattern" <<< "$line"
    then
        ((count+=1))
    fi
done
if [[ $count -gt 0 ]]
then
    echo "Pattern was found $count times in the output"
else
    echo "Didn't find the pattern at all"
fi
Note the 2>&1. This says to take STDERR (which is File Descriptor 2) and redirect it into STDOUT (File Descriptor 1). Now, both will be piped into that while read loop.
The grep -q will prevent grep from printing its output to STDOUT. It can still print to STDERR, but that shouldn't be an issue in this case: grep only prints to STDERR if it cannot open a requested file or the pattern is missing.
You can do this:
echo "'search string' appeared $(command |& tee /dev/stderr | grep 'search string' | wc -l) times"
This will print the entire output of command followed by the line:
'search string' appeared xxx times
The trick is that the tee command is not used to push a copy into a file, but to copy everything on stdout to stderr. The stderr stream is immediately displayed on the screen, as it is not connected to the pipe, while the copy on stdout is gobbled up by the grep/wc combination.
Since error messages are usually emitted to stderr, and you said that you want to grep for error messages, the |& operator is used for the first pipe to combine the stderr of command into its stdout, and push both into the tee command.
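As a small hedged simplification of the same one-liner (my own variation, not from the answer): grep -c already counts matching lines, so the wc -l stage can be dropped:
echo "'search string' appeared $(command |& tee /dev/stderr | grep -c 'search string') times"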
