Bash - How to execute paths from file [closed] - bash

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 2 years ago.
How can I execute paths from a .txt file in my script?
For example:
###################
foo.txt
/home/foo_1/public/
/home/foo_2/public/
[...]
/home/foo_n/public/
Then I want my script to look for an optional file in every path listed in the .txt.
How can I do this?
Some kind of loop?
Greetings

Using xargs and find Utilities
Assuming your file has no extraneous data (it's hard to tell from your original post) and holds one directory path per line, you can simply use the -I flag with xargs so that find runs once per input line. For example, given a source file like:
/home/foo_1/public/
/home/foo_2/public/
you could invoke find like so:
xargs -I{} find {} -name "filename_to_find" < file_paths.txt
There may be ways to do this more efficiently, of course, but this seems conceptually simpler to me than writing your own read and loop statements. Your mileage (and input data quality) may vary.
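For comparison, the read-and-loop version that paragraph alludes to could look like the sketch below; the directory layout and the file name target.txt are invented for the demo:

```shell
# Demo fixtures (hypothetical paths, mirroring the question's layout)
mkdir -p demo/foo_1/public demo/foo_2/public
touch demo/foo_1/public/target.txt
printf '%s\n' demo/foo_1/public demo/foo_2/public > file_paths.txt

# Read one directory per line; skip lines that are not directories
while IFS= read -r dir; do
    [ -d "$dir" ] && find "$dir" -name 'target.txt'
done < file_paths.txt
# prints: demo/foo_1/public/target.txt
```

Unlike the unquoted for-loop idiom, this handles directory names containing spaces.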

You can just concatenate all of them into a single, colon-separated path variable:
$ path=$(paste -sd: foo.txt)
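To actually use the joined variable afterwards, you can split it on the colon again. A sketch with made-up directory names:

```shell
# Hypothetical layout for the demo
mkdir -p foo_1/public foo_2/public
printf '%s\n' foo_1/public/ foo_2/public/ > foo.txt

# -s joins all lines of the file, -d: uses ':' as the separator
path=$(paste -sd: foo.txt)
echo "$path"    # foo_1/public/:foo_2/public/

# Split the colon-separated variable back into an array of directories
IFS=: read -ra dirs <<< "$path"
for d in "${dirs[@]}"; do
    ls -d "$d"
done
```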

Related

Create Shell Script to Find Shell Scripts in Folder and Execute Them [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 6 years ago.
I have a complex folder structure under one main folder. The folders containing shell scripts are at most 3 levels deep.
I want to write a shell script that scans through all folders, finds every file ending in .sh, and executes it.
Any help would be appreciated.
Try the below script.
#!/bin/bash
# Note: word-splitting on $all_files breaks on paths containing spaces
all_files=$(find Testdir -name "*.sh" -type f)
for file in $all_files
do
    bash "$file"
done
The find command recursively finds all the file names that end with the .sh extension, and the output is stored in a variable.
Using that variable, the for loop executes all the files one by one.
Note:
Make sure to have an if condition inside the for loop that checks the file name is not the currently executing script. Otherwise the script will execute itself again and again, in an infinite loop.
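A sketch of the guard that note describes, reusing the hypothetical Testdir layout from the answer above (realpath is used so the comparison works regardless of how the script was invoked):

```shell
#!/bin/bash
# Resolve this script's own absolute path once
self=$(realpath "$0")

# Pipe find into a read loop so paths with spaces survive intact
find Testdir -name '*.sh' -type f | while IFS= read -r file; do
    # Skip the currently running script to avoid infinite recursion
    [ "$(realpath "$file")" = "$self" ] && continue
    bash "$file"
done
```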
Try
#!/bin/bash
find /mainfolder -name '*.sh' > /home/dumm.txt
while read -r abc
do
    bash "$abc"
done < /home/dumm.txt
Explanation:
The find command searches for and lists the full paths of all file names ending in '.sh' under your main folder
The list is stored in a file
Each line of that file is then executed with bash in the while loop

bash: print all files in directory and subdirectories [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 7 years ago.
Hey, I am very new to bash and I need to write a one-liner that prints the contents of every file in the least recently modified directory and all of its subdirectories.
What is the best way to go about this?
I tried the find command, which looks useful, but I am not sure how to use it for this application, nor how to reach the directory that was least recently used.
Thanks in advance to anyone who can help :)
find "$(ls -1td */ | tail -1)" -type f -exec cat {} \;
Here */ matches only directories, and -t sorts them newest first, so tail -1 picks the least recently modified one.

Simple bash script if function [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 7 years ago.
I want to make a bash script that will ls the catalog and, if it finds the file test, launch mplayer. Is that possible?
If I understand your requirement correctly, one option would be to use find:
find /path/to/catalog -type f -exec mplayer {} +
This searches the catalog directory for any files and builds a command using the results (for example if file1 and file2 were found, the command executed would be mplayer file1 file2). If no files are found, no command will be executed.
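If "test" in the question is meant as a literal filename (an assumption on my part), an explicit existence check before launching the player also works; the catalog path here is a placeholder:

```shell
# Hypothetical catalog location; substitute the real path
catalog=/path/to/catalog

# -e succeeds if the entry exists (file, directory, etc.)
if [ -e "$catalog/test" ]; then
    mplayer "$catalog/test"
fi
```

If the file is absent, the if body is simply skipped and nothing is launched.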

Different results for same argument for different Unix functions [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 8 years ago.
I am getting somewhat different results with using these commands:
ls "$MYDIR/*.avi"
md5sum "$MYDIR/*.avi"
using win-bash. The former lists only the files that end with .avi while the latter does the checksum calculation for all files containing .avi. Is this expected? I thought the wildcard operation should work the same throughout.
Because you're quoting the wildcard, it is not being expanded by the shell (but the variable is). That means you're letting the command decide what to do with the * character.
You want the shell to expand the filenames before invoking the command:
ls "$MYDIR"/*.avi
md5sum "$MYDIR"/*.avi
You might want to store the results in an array if you're reusing them:
files=( "$MYDIR"/*.avi )
ls "${files[@]}"
md5sum "${files[@]}"
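A quick demonstration of the difference, with throwaway files:

```shell
# Demo fixtures
mkdir -p vids
touch vids/a.avi vids/b.avi
MYDIR=vids

ls "$MYDIR"/*.avi               # shell expands the glob: vids/a.avi vids/b.avi
ls "$MYDIR/*.avi" 2>&1 || true  # literal argument "vids/*.avi": ls reports it as missing
```

The second form only "works" with commands that implement their own wildcard matching, which is why the two tools behaved differently in win-bash.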

In Bash scripting, what's a good way to append multiple files of the same names together? [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 10 years ago.
So, I'm building my first script. It unzips files from two different directories, merges said directories together, and then appends all the files of the same name together. The only part I'm struggling with is appending the files of the same name together. What's a good way to go about that?
That depends on the directory structure of each archive: is it the same? If so, assuming the unzipped files are in a/ and b/, do something like this:
mkdir c
for f in a/*; do
    name=${f#a/}
    cat "$f" b/"$name" > c/"$name"
done
Instead of using cp for the second directory, do
for file in source/* ; do
    cat "$file" >> target/"${file#*/}"
done
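Putting the two answers together, an end-to-end sketch (file names invented for the demo) that merges a/ and b/ into c/ and appends files sharing a name:

```shell
# Demo fixtures
mkdir -p a b c
echo one > a/data.txt
echo two > b/data.txt
echo solo > a/only_in_a.txt

for f in a/*; do
    name=${f#a/}            # strip the "a/" prefix from the glob result
    cat "$f" > c/"$name"
    # Append the counterpart from b/ only if it exists
    [ -e b/"$name" ] && cat b/"$name" >> c/"$name"
done
cat c/data.txt              # prints "one" then "two"
```

Files that exist only in b/ are not picked up by this loop; a second pass over b/* would be needed for a full merge.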