NASM not working with /dev/stdout as output - shell

My task is to create a program that generates assembly code from input on stdin and outputs machine code on stdout.
My approach looks like this:
#!/bin/sh
if [ ! -f ./lc ]; then
gcc lc.c -o lc
fi
./lc | nasm /dev/stdin -fbin -o /dev/stdout
lc reads from stdin and writes assembly code to its stdout. I then pipe that assembly into nasm, which reads it from its stdin. Finally, I want the machine code written to /dev/stdout.
But I get this error:
nasm:fatal: unable to open output file `/dev/stdout'
To verify that /dev/stdout works, I checked it like this:
$ cat < /dev/stdin > /dev/stdout
test
test
What am I doing wrong? I tried running it with sudo, with no effect. Is there some command-line switch to make nasm write the code to stdout? I would prefer to avoid creating any temporary source files and work purely on streams.
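If a temporary file turns out to be unavoidable, one workaround sketch is to let nasm write to a short-lived file created with mktemp and stream that file to stdout afterwards; this is not a fix to nasm itself, and the file name below is illustrative:
#!/bin/sh
if [ ! -f ./lc ]; then
gcc lc.c -o lc
fi
# nasm gets a regular file to write to; the result is then streamed to stdout
tmp=$(mktemp) || exit 1
./lc | nasm /dev/stdin -fbin -o "$tmp" && cat "$tmp"
rm -f "$tmp"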

Related

Bash script to compile a program, feed it with 15 input files and print stdout to file

I'm trying to write a Bash script that feeds my program (./program) 15 input files named in sequence (file01.txt, file02.txt, etc.) and prints the outputs to a file (result.out). Here is the code I wrote:
#!/bin/bash
#Compile the current version
g++ -std=c++11 -fprofile-arcs -ftest-coverage program.cpp -o program
#
#Output file
outFile=result.out
#Loop through files and print output
for i in *.txt; do
./program < $i > $outFile
done
I'm getting a segmentation fault when running this script and I'm not sure what I did wrong. This is my first time writing a bash script, so any help will be appreciated.
Basically, these are the points I learnt from my conversation with Stack Overflow members:
1- The segmentation fault is not related to the bash script; it is definitely related to the program the script is running.
2- A bash script that feeds a program with text files and inserts the results into an output file is as follows:
#!/bin/bash
#Compile the current version
g++ -std=c++11 -fprofile-arcs -ftest-coverage program.cpp -o program
#
#Test output file
outFile=results.out
# print "Results" into outFile
printf "Results\n" > $outFile
# loops through text files, send files to stdin
# and insert stdout to outFile
for i in *.txt; do
printf "\n$i\n"
./program < "$i"
done >> $outFile
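If the inputs really are exactly file01.txt through file15.txt, a brace-expansion loop is a small variation on the same idea (a sketch, assuming those exact names) that keeps any unrelated .txt files out of the run:
for i in file{01..15}.txt; do
printf '\n%s\n' "$i"
./program < "$i"
done >> "$outFile"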

Giving input to a shell script through command line

I have a file like this with a .sh extension:
clear
echo -n "Enter file name: "
read FILE
gcc -Wall -W "$FILE" && ./a.out
echo
When I execute this file, it asks for a .c file and, when given one, it compiles it and gives the output of the .c file.
For this, every time I have to first execute this .sh file and then give it the .c file name when asked. Is there any way I can just give the .c file on the command line itself, so that it takes that file and does the work?
What I mean is, if I give "./file.sh somecommand cfile.c", then it takes cfile.c as input, compiles it and gives the output.
Use the $1 variable:
clear
gcc -Wall -W $1 && ./a.out
echo
$1 means "first argument from the command line".
Alternatively, if you want to compile multiple files at once using your script, you can use the "$@" variable, for example:
gcc -Wall -W "$@" && ./a.out
You will invoke your script as follows (assuming it's called 'script.sh'):
./script.sh file.c
Please see section 3.2.5 of the following article.
If your project gets bigger, you may also want to consider using tools designated for building, like automake.
You can also have it do things either way:
if [ -n "$1" ] ; then
FILE="$1"
else
echo -n "Enter file name: "
read FILE
fi
gcc -Wall -W "$FILE" && ./a.out
This will use the command line argument if it is there, otherwise it asks for a file name.
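For example, assuming the script is saved as script.sh, both invocations then work:
./script.sh cfile.c
./script.sh
The first compiles cfile.c straight away; the second falls back to prompting for a file name.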

How to handle error when exit code is zero in bash

I have the following script, ~/bin/cat, that uses pygmentize to display syntax-highlightable files whenever possible, falling back to regular old cat otherwise.
#!/bin/bash
for var; do
pygmentize "$var" 2> /dev/null
if [ $? -ne 0 ]; then
/bin/cat "$var"
fi
done
This works fine on my work machine but not on my home machine. At home, if pygmentize doesn't recognize a file, it displays the same error message but the exit status is 0, whereas at work it returns 1, which breaks the script. The only difference is that at work I run Fedora and at home Ubuntu.
$ pygmentize testfile
Error: no lexer for filename 'testfile' found
$ echo $?
0
$ file testfile
testfile: ASCII text
This is strange, as both machines have the same version:
$ pygmentize -V
Pygments version 1.4, (c) 2006-2008 by Georg Brandl.
I could grep for Error on stderr, but how do I do this without throwing away stdout? How should I handle this?
Well, your best approach is to fix pygmentize to properly return an error code. As Ignacio Vazquez-Abrams mentions, one of the distros has a patch that is either causing or fixing this.
But, here is how to work around it:
If the error message is on stderr
The easiest way is probably to redirect stderr to a temporary file, and leave stdout alone:
pygmentize "$var" 2> "$tmpfile"
then you can grep "$tmpfile". There are other ways, but they're more complicated.
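Folded back into the original loop, that could look like the following sketch (mktemp and the '^Error:' pattern are assumptions based on the message shown above):
#!/bin/bash
for var; do
tmpfile=$(mktemp)
# highlighted output still goes straight to the terminal; only stderr is captured
pygmentize "$var" 2> "$tmpfile"
if grep -q '^Error:' "$tmpfile"; then
/bin/cat "$var"
fi
rm -f "$tmpfile"
done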
If the error message is on stdout
Yep, that'd be another bug in pygmentize; it should be on stderr. The temporary file will work again, however. Just cat the temporary file back to stdout if it's OK. Alternatively, you can use tee to duplicate stdout to several destinations.
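A sketch of that variant, capturing stdout to a temporary file and replaying it only when it holds real output (again, the file name and error pattern are illustrative):
tmpfile=$(mktemp)
pygmentize "$var" > "$tmpfile" 2>&1
if grep -q '^Error:' "$tmpfile"; then
# not highlightable, fall back to plain cat
/bin/cat "$var"
else
# replay the highlighted output to stdout
cat "$tmpfile"
fi
rm -f "$tmpfile"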

Reroute File Output to stdout in Bash Script

I have a script, wacaw (http://webcam-tools.sourceforge.net/), that outputs video from my webcam to a file. I am basically trying to stream that to some sort of display, i.e. vlc, QuickTime, etc., to get a "mirror"-type effect.
Aside from altering the source code for wacaw, is there any way to force a script's file output to stdout so I can pipe it to something like vlc? Is it even possible to stream video like that?
Thanks for your help!
UPDATE: just to clarify, running the wacaw script looks like this:
./wacaw --video --duration 5 --VGA myFile
and it outputs a file called myFile.avi. If I try to use a named pipe:
mkfifo pipe
./wacaw --video --duration 5 --VGA pipe
it outputs a file called pipe.avi.
You can use named pipes. You use mkfifo to create the pipe, hand that filename to the writing process and then read from that file with the other process. I have no idea if video in particular would work that way, but many other things do.
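Given that wacaw appends .avi to whatever name it is handed, one way to try this is to create the FIFO with that suffix already in place. This is only a sketch; whether wacaw will write into an existing FIFO, and whether the player copes with a non-seekable stream, is not guaranteed:
# create the FIFO under the name wacaw will end up writing to
mkfifo pipe.avi
./wacaw --video --duration 5 --VGA pipe &
# read the stream from the other end of the FIFO
vlc pipe.avi
rm -f pipe.avi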
At least in bash, you can do it like this:
Original command:
write-to-file-command -f my-file -c
Updated command:
write-to-file-command -f >(pipe-to-command) -c
write-to-file-command will think >(pipe-to-command) is a write-only file, and pipe-to-command will receive the file data on its stdin.
(If you just want the output to stdout you could do
write-to-file-command >(cat)
)
You may also try using tail -F myFile.avi:
# save stdout to file stdout.avi
man tail | less -p '-F option'
(rm -f myFile.avi stdout.avi; touch myFile.avi; exec tail -F myFile.avi > stdout.avi ) &
rm -f myFile.avi; wacaw --video --duration 1 --VGA myFile
md5 -q myFile.avi stdout.avi
stat -f "bytes: %z" myFile.avi stdout.avi
# pipe stdout to mplayer (didn't work for me though)
# Terminal window 1
# [mov,mp4,m4a,3gp,3g2,mj2 # ...]moov atom not found
#rm -f myFile.avi; touch myFile.avi; tail -F myFile.avi | mplayer -cache 8192 -
# Terminal window 2
#rm -f myFile.avi; wacaw --video --duration 1 --VGA myFile

How do I write to a file and print to a terminal concurrently in Unix?

I have a little bash function to log my MacPorts output to a file (since installs often spew little tidbits that are easy to lose in terminal noise); then I just cat the file to the terminal:
function porti {
command sudo port install "$@" >> $1.log 2>&1; cat $1.log
}
Is there a way to do this concurrently?
BTW I pass "$@" to install but only $1 for the file name, so that I can do something like:
porti git-core +bash_completion
and only get the file git-core.log; however, someone else might prefer to include variants in the file name...
The usual solution is to use tee(1):
sudo port install "$@" 2>&1 | tee -a "$1.log"
should do what you want.
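Folded back into the original function, that would look something like this (a sketch, keeping the $1-based log-file name):
function porti {
# stream install output to the terminal while appending it to the log
command sudo port install "$@" 2>&1 | tee -a "$1.log"
}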
