List of object files to gcc format - bash

I'm trying to link all of the object (*.o) files in my directory.
I tried to use:
for i in $(find . -name "*.o" -type f);
do
echo $i >> myFiles
done
Then I need:
gcc -o myFile <myFiles
gcc: fatal error: no input files
compilation terminated.

I see several problems in your approach:
find should not be used in a loop, but rather with -fprint <file>. So in your case:
find . -name "*.o" -type f -fprint myfiles
Secondly, redirecting the file to gcc's stdin will not work the way you expect: gcc treats stdin as source code, not as a list of arguments (see this question). What you want instead is to expand the list of objects into a list of arguments:
cat myfiles | xargs gcc -o myFile
xargs does this nicely. But as @n.m. mentioned, you could do everything at once with a command substitution:
gcc -o myfile $(find . -type f -name '*.o')
Note that the pattern must be quoted so the shell doesn't expand it before find runs. I would not add -print0 here: \0-terminated output only helps when paired with xargs -0; inside a command substitution the \0 bytes are discarded and the names are still split on whitespace.
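Putting both pieces together, a whitespace-safe version of the whole pipeline could look like this (a sketch, assuming GNU find and xargs; myFile is the output name from the question):

```shell
# NUL-delimit the file names so paths containing spaces or newlines
# survive, then let xargs turn them into gcc arguments.
find . -type f -name '*.o' -print0 | xargs -0 gcc -o myFile
```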
Good Luck

Related

Bash script to return all elements given an extension, without using print flags

I want to create a shell script that searches inside all the folders of the current directory and returns all files that satisfy some condition, without using any print flag.
(Here the condition is to end with .py)
What I have done:
find . -name '*.py'| sed -n 's/\.py$//p'
The output:
./123
./test
./abc/dfe/test3
./testing
./test2
What I would like to achieve:
123
test
test3
testing
test2
Use -exec:
find . -name '*.py' -exec sh -c 'for f; do f=${f%.py}; echo "${f##*/}"; done' sh {} +
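The two parameter expansions in the sh -c body can be checked in isolation: `${f%.py}` trims the suffix, and `${f##*/}` trims everything up to the last slash:

```shell
f='./abc/dfe/test3.py'
f=${f%.py}       # strip the .py suffix: ./abc/dfe/test3
echo "${f##*/}"  # strip the leading path: prints test3
```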
If GNU basename is an option, you can simplify this to
find . -name '*.py' -exec basename -s .py {} +
POSIX basename is a little more expensive, as you'll have to call it on every file individually:
find . -name '*.py' -exec basename {} .py \;
Using GNU grep instead of sed:
find . -name '*.py' | grep -oP '[^/]+(?=\.py$)'
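For example, the lookahead `(?=\.py$)` requires a trailing .py without including it in the match, while `[^/]+` keeps only the last path component:

```shell
# -o prints just the matched part; -P enables the Perl lookahead.
printf './abc/dfe/test3.py\n./123.py\n' | grep -oP '[^/]+(?=\.py$)'
# prints:
# test3
# 123
```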
If portability is not a concern, this is a very readable option:
find . -name '*.py' | xargs basename -a
This is also differentiated from chepner's answer in that it retains the .py file ending in the output.
I'm not familiar with the -exec flag, and I'm sure his one-liners can be customized to do the same, but I couldn't do so off the top of my head.
Chepner's version achieves the same with the small modification:
find . -name '*.py' -exec basename {} \;
if you want the literal output from find and didn't intend to drop the file extensions when you used dummy names (123, test, etc.) in your question.
find shows entries relative to where you ask it to search, so you can simply replace the . with a *:
find * -name '*.py'| sed -n 's/\.py$//p'
(Be aware that this skips top level hidden directories)
This might work for you (GNU parallel):
find . -name '*.py' 2>/dev/null | parallel echo "{/.}"

Why is my find and xargs copy command working for one folder and not for the other?

I have two directories, x86_64 and i386.
I want to filter out test RPMs from both of these folders and place them in separate ones: test_release/{version}-x86_64/x86_64 and test_release/{version}-i386/i386, respectively.
So my first command works fine:
find x86_64/ -type f -name '*test*' -o -name '*demo*' -o -name '*log*' |
xargs cp -rt test_release/${RELEASE}-x86_64/x86_64
My second command is exactly the same, except with different folder names:
find i386/ -type f -name '*test*' -o -name '*demo*' -o -name '*log*' |
xargs cp -rt test_release/${RELEASE}-i386/i386
Only the second command gives me the error:
cp: missing file operand
Am I missing something?
The error happens because your find command doesn't return any files. Disappointingly, xargs still runs cp anyway, but ends up calling it with too few arguments, and so you get the error message.
GNU xargs has an -r option which solves this specific case; but a more portable and more robust solution is to use find -exec.
find i386/ -type f \( -name '*test*' -o -name '*demo*' -o -name '*log*' \) \
-exec cp -rt "test_release/${RELEASE}-i386/i386" {} +
(Note the parentheses around the -name tests: without them, -exec would attach only to the last -name '*log*' clause.)
The + terminator for -exec is not entirely portable, but if you have cp -t I guess you're on Linux, or at least using a recent find (GNU or otherwise). If not, replacing + with \; is a workaround, though it will end up running more processes than the equivalent xargs construct.
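The empty-input behaviour is easy to see with a stand-in command (a sketch, assuming GNU xargs for -r):

```shell
# Plain xargs runs its command once even when stdin is empty,
# so a failing command still gets invoked:
printf '' | xargs sh -c 'echo "ran with $# args"' sh    # prints: ran with 0 args

# With GNU xargs -r the command is skipped entirely on empty input:
printf '' | xargs -r sh -c 'echo "ran with $# args"' sh # prints nothing
```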

Using find on multiple file extensions in combination with grep

I am having a problem using find and grep together in MSYS on Windows. However, I also tried the same command on a Linux machine and it behaved the same way. Note that the syntax below is for Windows, in that the semicolon at the end of the command is not preceded by a backslash.
I am trying to write a find expression to find *.cpp and *.h files and pass the results to grep. If I run this alone, it successfully finds all the .cpp and .h files:
find . -name '*.cpp' -o -name '*.h'
But if I add in an exec grep expression like this:
find . -name '*.cpp' -o -name '*.h' -exec grep -l 'std::deque' {} ;
It only greps the .h files. If I switch the .h and .cpp order in the command, it only searches the .h. Essentially, it appears to only grep the last file extension in the expression. What do I need to do to grep both .h and .cpp??
Since you're using -o, you will need to use parentheses around it:
find . \( -name '*.cpp' -o -name '*.h' \) -exec grep -l 'std::deque' {} \;
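To see why the parentheses matter: find inserts an implicit -a (AND) between adjacent tests, and -a binds tighter than -o, so without grouping the -exec attaches only to the -name '*.h' branch. A small experiment in a throwaway directory (a sketch):

```shell
dir=$(mktemp -d)
printf 'std::deque<int> q;\n' > "$dir/a.cpp"
printf 'std::deque<int> q;\n' > "$dir/b.h"

# Parses as: -name '*.cpp'  OR  ( -name '*.h' AND -exec ... ),
# so grep only ever sees the .h file:
find "$dir" -name '*.cpp' -o -name '*.h' -exec grep -l 'std::deque' {} \;

# With explicit grouping, both files reach grep:
find "$dir" \( -name '*.cpp' -o -name '*.h' \) -exec grep -l 'std::deque' {} \;

rm -r "$dir"
```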
Or you can use command substitution:
bash$> grep '/bin' `find . -name "*.pl" -o -name "*.sh"`
./a.sh:#!/bin/bash
./pop3.pl:#!/usr/bin/perl
./seek.pl:#!/usr/bin/perl -w
./move.sh:#!/bin/bash
bash$>
The command above greps for 'bin' in the .sh and .pl files, and it found all of them.

Bash Script How to find every file in folder and run command on it

So I'm trying to create a script that looks in a folder, finds all the files with a .cpp extension, and runs g++ on them. So far I have the following, but it doesn't run; it says "unexpected end":
for i in `find /home/Phil/Programs/Compile -name *.cpp` ; do echo $i ;
done
Thanks
The problem with your code is that the wildcard * is being expanded by the shell before being passed to find. Quote it thusly:
for i in `find /home/Phil/Programs/Compile -name '*.cpp'` ; do echo $i ; done
xargs as suggested by others is a good solution for this problem, though.
find has an option for doing exactly that:
find /p/a/t/h -name '*.cpp' -exec g++ {} \;
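A side note on the terminator: \; runs the command once per file, while + batches as many files as possible into each invocation (much like xargs). The difference can be observed with a counting stand-in (a sketch):

```shell
dir=$(mktemp -d)
touch "$dir/a.cpp" "$dir/b.cpp" "$dir/c.cpp"

# \; : one invocation per file -> three lines, each "got 1 file(s)"
find "$dir" -name '*.cpp' -exec sh -c 'echo "got $# file(s)"' sh {} \;

# +  : files are batched -> a single line, "got 3 file(s)"
find "$dir" -name '*.cpp' -exec sh -c 'echo "got $# file(s)"' sh {} +

rm -r "$dir"
```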
This code works for me:
#!/bin/bash
for i in `find /home/administrator/Desktop/testfolder -name '*.cpp'` ; do echo $i ;
done
I get:
administrator@Netvista:~$ /home/administrator/Desktop/test.sh
/home/administrator/Desktop/testfolder/main.cpp
/home/administrator/Desktop/testfolder/main2.cpp
You could use xargs like:
find folder/ -name "*.cpp" | xargs g++
Or if you want to handle files which contain whitespaces:
find folder/ -name "*.cpp" -print0 | xargs -0 g++
I think you want to use xargs:
For example:
find /home/Phil/Programs/Compile -name '*.cpp' | xargs g++
How about using xargs like this:
find "$working_dir" -type f -name '*.cpp' | xargs -n 1 g++

How to search subdirectories for .c files and compile them (shell scripting)

I need to take an argument which is a directory of the current directory and search its folders and compile any C files in those folders. I'm just beginning shell scripting in Bash and am a little over my head.
So far I've tried using find to search for the files and then piping to xargs to compile, but I kept getting an error saying that testing.c wasn't a directory.
find ~/directory -name *.c | xargs gcc -o testing testing.c
I've also tried ls -R to search folders for .c files but don't know how to then take the paths as arguments to then move to and compile?
find ~/directory -type f -name "*.c" -print0 |
while IFS= read -r -d '' pathname; do
gcc -o "${pathname%.c}" "$pathname"
done
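To preview what the loop above would do without actually invoking the compiler, you can substitute printf for gcc (a sketch; ~/directory is the path from the question):

```shell
# Print each gcc command the loop would run, one per .c file found.
find ~/directory -type f -name '*.c' -print0 |
while IFS= read -r -d '' pathname; do
    printf 'gcc -o %s %s\n' "${pathname%.c}" "$pathname"
done
```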
find directory -type f -name "*.c" -exec sh -c \
'cd "$(dirname "$1")" && make "$(basename "$1" .c)"' sh {} \;
As @shx2 suggested, using make (or some other build system) would arguably be the best approach. You don't want to go compiling files in some source tree without a proper build system.
