Using Pipe in Make File - bash

I'm trying to create a make command that uses a pipe. But the command is not executing as expected.
Example
SHELL:=/bin/bash
all:
	$(shell ps aux | grep -i someProcName)
Output:
bash: kls602: command not found
grep: kls602: No such file or directory
grep: 46905: No such file or directory
grep: 0.0: No such file or directory
grep: 0.0: No such file or directory
grep: 4306668: No such file or directory
grep: 6480: No such file or directory
grep: s008: No such file or directory
grep: S: No such file or directory
grep: 12:22PM: No such file or directory
grep: 0:00.57: No such file or directory
...
It's as if the output of ps is being taken in as the file to search in the grep command. I can't figure out what I'm doing wrong.
So how do you write semi-complex bash commands in make? I would prefer a better answer than "Just run a bash script instead of putting the command in a makefile".

Using $(shell...) in a recipe is an anti-pattern. You should never do it except in the most unusual, bizarre situations. A recipe IS a shell script, so running $(shell ...) there is, at best, redundant.
Second, read what the shell function does:
The shell function performs the same function that backquotes (‘`’) perform in most shells: it does command expansion. This means that it takes as an argument a shell command and evaluates to the output of the command.
So your understanding of what's happening is exactly correct, and that's exactly what it's supposed to do.
You should write this simply as:
all:
	ps aux | grep -i someProcName
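If the goal really is to capture the command's output at parse time and reuse it later, $(shell ...) belongs in a variable assignment, not in a recipe. A minimal sketch of both uses side by side (the PROCS variable name is just an illustration):
SHELL := /bin/bash
# Runs once, while make parses the makefile; the output becomes the value of PROCS.
PROCS := $(shell ps aux | grep -i someProcName)
all:
	ps aux | grep -i someProcName
	@echo "Captured at parse time: $(PROCS)"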

Related

file-list with env variable

In linux, I have a file-list named file_list.txt, where the paths in that list include also an env variable (for this example, the file listed in that file-list is $HOME/myfile).
> cat file_list.txt
$HOME/myfile
> ls `cat file_list.txt`
ls: $HOME/myfile: No such file or directory
> ls $HOME/myfile
/home/user/myfile
Why is that? How can I run operations (such as ls, less, vim etc) on the files listed in the file-list?
If you want to expand the variables you will need something like:
$ sh -c "ls `cat file_list.txt`"
In this case the command line is read from a string, so any variables in it are expanded. The same effect can be had with eval:
$ eval "ls `cat file_list.txt`"
Just in case, check also this question: Why should eval be avoided in Bash, and what should I use instead?
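A different route (my own suggestion, not part of the original answers) is to expand the variables line by line with envsubst, which ships with gettext, instead of handing the whole list to eval:
while IFS= read -r path; do
    ls "$(printf '%s\n' "$path" | envsubst)"
done < file_list.txt
envsubst only substitutes environment variables, so $HOME in the list is expanded but arbitrary code in the file is not executed.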

shell scripting no such file or directory

I wrote a shell script that calls the ffmpeg tool, but when I run it, it says "No such file or directory" even though the directory does exist!
Here is my script:
#!/bin/bash
MAIN_DIR="/media/sf_data/pipeline"
FFMPEG_DIR="/media/sf_data/livraison_transcripts/ffmpeg-git-20180208-64bit-static"
for file in MAIN_DIR/audio_mp3/*.mp3;
do
cp -p file FFMPEG_DIR;
done
for file in FFMPEG_DIR/*.mp3;
do
./ffmpeg -i ${file%.mp3}.ogg
sox $file -t raw --channels=1 --bits=16 --rate=16000 --encoding=signed-integer --endian=little ${file%.ogg}.raw;
done
for file in FFMPEG_DIR/*.raw;
do
cp -p file MAIN_DIR/pipeline/audio_raw/;
done
and here is the debug response:
cp: cannot stat ‘file’: No such file or directory
./essai.sh: line 14: ./ffmpeg: No such file or directory
sox FAIL formats: can't open input file `FFMPEG_DIR/*.mp3': No such file or directory
cp: cannot stat ‘file’: No such file or directory
FYI I'm running CentOS7 on VirtualBox
Thank you
Here's a Minimal, Complete, and Verifiable example (MCVE), a version of your script that removes everything not required to show the problem:
#!/bin/bash
MAIN_DIR="/media/sf_data/pipeline"
echo MAIN_DIR
Expected output:
/media/sf_data/pipeline
Actual output:
MAIN_DIR
This is because bash requires a $ when expanding variables:
#!/bin/bash
MAIN_DIR="/media/sf_data/pipeline"
echo "$MAIN_DIR"
The quotes are not required to fix this particular issue, but they prevent problems with whitespace in paths.
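To see why the quotes matter, here is a small illustration with a hypothetical path containing a space:
DIR="/media/sf data/pipeline"
ls $DIR      # word-splits into two arguments: /media/sf and data/pipeline
ls "$DIR"    # passes the path as a single argument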
You need a couple of corrections in your shell script; see below. To get the value assigned to a variable, you need to put $ in front of the variable name.
for file in $"MAIN_DIR"/audio_mp3/*.mp3;
do
cp -p "$file" "$FFMPEG_DIR";
done
for file in "$FFMPEG_DIR"/*.mp3;
./ffmpeg -i ${file%.mp3}.ogg
#provide full path like /usr/bin/ffmpeg
for file in "$FFMPEG_DIR"/*.raw;
do
cp -p "$file" "$MAIN_DIR"/pipeline/audio_raw/;
done

csh doesn't recognize command with command line options beginning with --

I have an rsync command in my csh script like this:
#! /bin/csh -f
set source_dir = "blahDir/blahBlahDir"
set dest_dir = "foo/anotherFoo"
rsync -av --exclude=*.csv ${source_dir} ${dest_dir}
When I run this I get the following error:
rsync: No match.
If I remove the --exclude option it works. I wrote the equivalent script in bash and that works as expected
#!/bin/bash -f
source_dir="blahDir/blahBlahDir"
dest_dir="foo/anotherFoo"
rsync -av --exclude=*.csv ${source_dir} ${dest_dir}
The problem is that this has to be done in csh only. Any ideas on how I can get this to work?
It's because csh is trying to expand --exclude=*.csv into a filename, and complaining because it cannot find a file matching that pattern.
You can get around this by enclosing the option in quotes:
rsync -rv '--exclude=*.csv' ...
or escaping the asterisk:
rsync -rv --exclude=\*.csv ...
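Putting it together, the script from the question needs only the pattern protected from csh's globbing (a sketch using the same placeholder directory names as above):
#! /bin/csh -f
set source_dir = "blahDir/blahBlahDir"
set dest_dir = "foo/anotherFoo"
rsync -av '--exclude=*.csv' ${source_dir} ${dest_dir}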
This is a consequence of the way csh and bash differ in their default treatment of arguments with wildcards that don't match a file. csh will complain while bash will simply leave it alone.
You may think bash has chosen the better way but that's not necessarily so, as shown in the following transcript where you have a file matching the argument:
pax> touch -- '--file=xyzzy.csv' ; ls -- *.csv
--file=xyzzy.csv
pax> echo --file=*.csv
--file=xyzzy.csv
You can see there that the bash shell expands the argument rather than giving it to the program as is. Both sides have their pros and cons.

Shell subcommand output to parent command

I have a one-line command, let's say
grep needle haystack.file
What if I wanted to replace "needle" with the current working directory using the pwd command? It might be possible using pipes, but I need the needle part to show the working directory and the rest of the command to stay the same.
So preferably something like this:
grep (pwd) haystack.file
Which when executed would actually run the following command:
grep /var/www/html/ haystack.file
I've done a bit of searching and have found a lot of examples with pipes, but they can't be applied in my scenario because the first part (grep) and the second part (haystack.file) are fixed in an application.
Use the $PWD variable, which is always set:
grep "$PWD" haystack.file
You can also use command substitution:
grep "$(pwd)" haystack.file
Note the importance of the quotes; without them, word splitting can make strange things happen.
You can use command substitution
Test
$ echo $(pwd) > test
$ grep $(pwd) test
/home/xxx/yyy
OR
$ grep `pwd` test
/home/xxx/yyy
Security
It's always recommended to quote the command substitution to take care of spaces in the output of the pwd command.
Test
$ pwd
/home/xxx/yyy/hello world
$ echo $(pwd) > test
$ grep $(pwd) test #without quoting
grep: world: No such file or directory
test:/home/xxx/yyy/hello world
$ grep "$(pwd)" test #with quoting
/home/xxx/yyy/hello world
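One more caveat (my own note, not from the answers above): grep treats its pattern as a regular expression, so characters such as . or [ in the path are interpreted as metacharacters and can match more than intended. To search for the path as a fixed string, use -F:
$ grep -F "$PWD" haystack.file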

/proc directory script

I'm looking for a ruby script that accesses the /proc directory and saves the process ID and command line (cmdline) information in a file.
You may want to call ps instead of going to /proc.
cmd=`ps -eo pid,cmd`
o = File.open("output","w")
o.write(cmd)
o.close
You can also run the one-liner bash script below and redirect its output anywhere, choosing whatever argument you need for the head command.
ls -alR /proc/$(ls /proc/ |grep -i '[0-9]'|sort -n|head ) > /proc_open_files
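If you would rather read /proc directly and keep only the PID and command line, here is a rough bash sketch (my own, not from the answers; /proc/<pid>/cmdline is NUL-separated, so tr turns the NULs into spaces):
for d in /proc/[0-9]*; do
    pid=${d#/proc/}
    # The process may have exited since the glob was expanded, hence the guard.
    [ -r "$d/cmdline" ] || continue
    cmdline=$(tr '\0' ' ' < "$d/cmdline")
    printf '%s %s\n' "$pid" "$cmdline"
done > output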
