I want to make a bash script that will ls the catalog (directory) and, if it finds a file named test, launch mplayer. Is this possible?
If I understand your requirement correctly, one option would be to use find:
find /path/to/catalog -type f -exec mplayer {} +
This searches the catalog directory for any files and builds a command using the results (for example if file1 and file2 were found, the command executed would be mplayer file1 file2). If no files are found, no command will be executed.
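If the goal is specifically to check for a file named test before launching the player, as the question's wording suggests, a minimal sketch (the catalog path and the player command are assumptions, wrapped in a hypothetical helper so the player can be swapped out):

```shell
# Hypothetical helper: play "$dir/test" with the given player (mplayer by
# default) only when that file actually exists.
play_if_present() {
    dir=$1
    player=${2:-mplayer}
    if [ -f "$dir/test" ]; then
        "$player" "$dir/test"
    fi
}

play_if_present /path/to/catalog   # no-op if the file is absent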
How can I read paths from a .txt file and use them in my script?
For example, foo.txt contains:
/home/foo_1/public/
/home/foo_2/public/
[...]
/home/foo_n/public/
Then I want my script to look for an optional file in every path from the .txt.
How can I do this? Some kind of loop?
Using xargs and find Utilities
Assuming your file has no extraneous data (it's hard to tell from your original post) and holds one directory path per line, you can simply use the -I flag with xargs, which runs one find per input line. For example, given a source file like:
/home/foo_1/public/
/home/foo_2/public/
you could invoke find like so:
xargs -I{} find {} -name "filename_to_find" < file_paths.txt
There may be ways to do this more efficiently, of course, but this seems conceptually simpler to me than writing your own read and loop statements. Your mileage (and input data quality) may vary.
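Since the question asks about a loop, here is the equivalent explicit while-read version; the demo setup at the top is an assumption, and file_paths.txt and filename_to_find are the placeholder names from above:

```shell
# Demo setup (hypothetical layout): two directories listed in
# file_paths.txt, only one of which contains the file we search for.
tmp=$(mktemp -d); cd "$tmp"
mkdir d1 d2
touch d1/filename_to_find
printf '%s\n' "$tmp/d1" "$tmp/d2" > file_paths.txt

# The loop itself: read one directory path per line and search it.
while IFS= read -r dir; do
    find "$dir" -name "filename_to_find"
done < file_paths.txt
```

IFS= and read -r keep leading whitespace and backslashes in the paths intact.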
You can just concatenate all of them into a single path variable
$ path=$(paste -sd: foo.txt)
Note the flag order: -s has to come before -d, because -d consumes the rest of its token as the delimiter list (so -ds: would set the delimiters to "s" and ":").
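To illustrate what the substitution produces, a small sketch assuming foo.txt holds the paths from the question (written to a throwaway directory here):

```shell
# Build a sample foo.txt and join its lines with colons via paste -s -d:
# (-s serializes the file's lines, -d: sets the delimiter).
cd "$(mktemp -d)"
printf '%s\n' /home/foo_1/public/ /home/foo_2/public/ > foo.txt
path=$(paste -sd: foo.txt)
echo "$path"   # /home/foo_1/public/:/home/foo_2/public/
```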
I'm trying to make a script in bash, without using awk, that finds every directory that can be accessed.
I did not see any post about this issue.
Thanks
Assuming you meant readable directories, find (at least the GNU version) makes this pretty easy:
find /path/to/root -type d -readable
which could similarly be used to find unreadable dirs,
find /path/to/root -type d ! -readable
There is also a -writable option which does what you'd expect, and also -executable which for directories means searchable.
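If your find lacks the GNU -readable test, a portable sketch using the standard test utility gives the same effect (shown here on a throwaway tree; replace "$root" with your real path):

```shell
# -exec test -r runs the POSIX test utility for each directory, so only
# directories readable by the current user pass through to -print.
root=$(mktemp -d)
mkdir "$root/sub"
find "$root" -type d -exec test -r {} \; -print
```

This spawns one test process per directory, so it is slower than GNU's built-in -readable, but it works with any POSIX find.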
Printing all files in a directory and its subdirectories
Hey, I am very new to bash and I need to write a one-liner that prints the contents of every file in the least recently modified directory and all of its subdirectories.
What is the best way to go about this?
I tried the find command, which looks useful, but I am not sure how to use it for this, or how to reach the directory that was least recently modified.
Thanks in advance to anyone who can help :)
find "$(ls -t1 -d -- */ | tail -n 1)" -type f -exec cat {} \;
The -d -- */ part restricts the listing to directories, ls -t sorts newest first so tail -n 1 (not head -1) picks the least recently modified one, and quoting the command substitution protects names containing spaces (though not newlines).
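A variant that avoids parsing ls output altogether, assuming GNU find and sort and no newlines in directory names (the demo tree at the top is an assumption):

```shell
# Demo setup: two subdirectories, with "old" backdated so it is the
# least recently modified one.
cd "$(mktemp -d)"
mkdir old new
printf 'alpha\n' > old/a.txt
printf 'beta\n'  > new/b.txt
touch -t 202001010000 old

# Pick the least recently modified immediate subdirectory by mtime
# (GNU find's -printf '%T@' emits the epoch timestamp), then print
# every file under it.
oldest=$(find . -mindepth 1 -maxdepth 1 -type d -printf '%T@ %p\n' |
         sort -n | head -n 1 | cut -d' ' -f2-)
find "$oldest" -type f -exec cat {} +
```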
I am getting somewhat different results with using these commands:
ls "$MYDIR/*.avi"
md5sum "$MYDIR/*.avi"
using win-bash. The former lists only the files that end with .avi while the latter does the checksum calculation for all files containing .avi. Is this expected? I thought the wildcard operation should work the same throughout.
Because you're quoting the wildcard, it is not being expanded by the shell (but the variable is). That means you're letting the command decide what to do with the * character.
You want the shell to expand the filenames before invoking the command:
ls "$MYDIR"/*.avi
md5sum "$MYDIR"/*.avi
You might want to store the results in an array if you're reusing them
files=( "$MYDIR"/*.avi )
ls "${files[@]}"
md5sum "${files[@]}"
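One caveat with the array approach, worth knowing: if no .avi files match, bash leaves the literal pattern string in the array. The bash-specific nullglob option makes the array empty instead (demonstrated on a throwaway empty directory, which is an assumption):

```shell
# With nullglob set, a non-matching glob expands to nothing, so the
# array length reflects the real number of matches.
MYDIR=$(mktemp -d)        # empty demo directory
shopt -s nullglob
files=( "$MYDIR"/*.avi )
echo "${#files[@]}"       # 0
```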
So, I'm building my first script. It unzips files from two different directories, merges those directories together, and then appends all files of the same name to one another. The only part I'm struggling with is appending multiple files of the same name. What's a good way to go about that?
That depends on the directory structure of each archive: is it the same in both? In that case, assuming the unzipped files are in a/ and b/, do something like this:
mkdir c
for f in a/*; do
    base=${f#a/}
    cat "a/$base" "b/$base" > "c/$base"
done
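A fuller sketch that also handles files present in only one of the two directories (same a/, b/, c/ layout as above; the demo files are assumptions, and error output from the missing side is discarded):

```shell
# Demo setup: one name shared by both directories, one unique to a/.
cd "$(mktemp -d)"
mkdir a b c
printf '1\n'    > a/x
printf '2\n'    > b/x
printf 'solo\n' > a/only

# Concatenate same-named files from a/ and b/ into c/, visiting each
# name only once; cat's complaint about a missing half is suppressed.
for f in a/* b/*; do
    base=${f##*/}
    [ -e "c/$base" ] && continue
    cat "a/$base" "b/$base" 2>/dev/null > "c/$base"
done
```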
Instead of using cp for the second directory, do
for file in source/* ; do
cat "$file" >> target/"${file#*/}"
done