How to get files and subfolders of a folder - bash

I am trying to create a simple bash script which will echo all the files from a folder, including subfolders. The following is my code. But the output I am getting is just ls $fromFolder
#! /bin/bash
fromFolder="~/proj/activex"
toFolder="~/proj/outgoing"
files='ls $fromFolder'
for file in $files
do
echo $file
done
Thanks

No need to use the ls command here. You can simply replace your for loop with one that globs the source folder directly:
for file in ~/proj/activex/*
do
echo "$file"
done
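If you also need to descend into subfolders, bash 4.0+ can do that with the globstar option, which makes ** match at any depth; a minimal sketch:
shopt -s globstar   # make ** match files and folders recursively
for file in ~/proj/activex/**
do
echo "$file"
done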

find "$fromFolder" -print
will print all of the files and subdirectories in $fromFolder.
This lists only regular files:
find "$fromFolder" -type f -print
This lists only directories:
find "$fromFolder" -type d -print
In your code, this line has a problem:
files='ls $fromFolder'
$fromFolder will never be expanded to its value by bash because of the single quotes.

Note that double quotes alone are not enough either: files="ls $fromFolder" would expand the variable but only store a string, never run ls. What you meant is command substitution:
files=$(ls $fromFolder)
Although anubhava's solution is better
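Putting it together, a corrected version of the original script might look like this (note that ~ has to stay outside the quotes, otherwise the shell will not expand it to your home directory):
#!/bin/bash
fromFolder=~/proj/activex   # unquoted ~ so the shell expands it
for file in "$fromFolder"/*
do
echo "$file"
done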

Related

How to loop through subdirectories and files in subdirectories then print file names in bash

Given a root path, I am trying to loop through the sub-directories to loop through the files in each subdirectory and print the names of the files.
The directory structure is like this:
Root directory
|-dir2
| |-file{1..10}
|-dir3
| |-file{1..10}
|-dir4
| |-file{1..10}
I want to loop through dir2 and print all the filenames in it. Then loop through dir3 and print all the file names...and so on
Here is what I have so far:
#!/bin/bash
cd /the/root/directory
for dir in */
do
for FILE in dir
do
echo "$FILE"
done > /the/root/directory/filenames.txt
done
This is the output I get in filenames.txt:
dir
My expected output is supposed to be:
file{1..10}
file{1..10}
file{1..10}
I am a beginner to bash scripting...well scripting in general.
Any help is greatly appreciated!
You didn't mention what your end goal is, so I'll speculate here.
If your end goal is to only see the files recursively in a list, you can run just a simple find command:
find . -type f
Or if you want to see the details:
find . -type f -ls
A nice way to view them with colors and nice-looking ANSI bars is to install the tree command. Example:
https://www.tecmint.com/linux-tree-command-examples/
If your needs are simple, for example, you want to do an action such as a tail -n1 on each file, you can pipe the command to xargs like this:
find . -type f | xargs tail -n1
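If the filenames may contain spaces, the null-delimited form is safer:
find . -type f -print0 | xargs -0 tail -n1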
But if your end goal is to use bash to process them in some way, then you can continue down the bash looping method as mentioned by @tjm3772.
You mentioned you were just looking for the filenames so you can just run:
find . -type f | sed 's/.*\///'
If you want to write that to a file, just redirect the output to a filename of your choice:
find . -type f | sed 's/.*\///' > filename.txt
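If your find is GNU find, a -printf format can strip the directory part without the sed step (an alternative, not part of the original answer):
find . -type f -printf '%f\n' > filename.txt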
You can use the find command, and this will list the files recursively without needing the for loop
my_bash_script.sh:
find * -type f > filenames.txt
Put this script at the same level as the directories, or point it somewhere else by changing the * to that path
note: if it says permission denied in the terminal run this: chmod u+x the_script_name.sh
You forgot to expand $dir in your inner loop, so the loop is executing one time with FILE set to the literal string 'dir' instead of the directory name.
After that, you need a globbing pattern to expand to the filenames inside the directory.
Fixed example (note that $dir already ends in a slash because of the */ glob, and the redirection also has to move to the outer loop, otherwise each directory overwrites filenames.txt):
#!/bin/bash
cd /the/root/directory
for dir in */
do
for FILE in "$dir"*
do
echo "$FILE"
done
done > /the/root/directory/filenames.txt
Bash's native way to do it is the globstar option, which makes ** expand recursively into directories:
#!/usr/bin/env bash
shopt -s globstar # This enables recursively expanding files in directories
# This prints all the files in all the directories starting from /the/root/directory
printf '%s\n' /the/root/directory/**
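Since ** matches the directories themselves as well as the files in them, you can filter the expansion down to regular files with a test; a minimal sketch:
#!/usr/bin/env bash
shopt -s globstar
for f in /the/root/directory/**; do
[[ -f $f ]] && printf '%s\n' "$f"   # print regular files only, skip directories
done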

Renaming multiple files in a nested structure

I have a directory with this structure:
root
|-dir1
| |-pred_20181231.csv
|
|-dir2
| |-pred_20181234.csv
...
|-dir84
| |-pred_2018123256.csv
I want to run a command that will rename all the pred_XXX.csv files to pred.csv.
How can I easily achieve that?
I have looked into the rename facility but I do not understand the perl expression syntax.
EDIT: I tried with this code: rename -n 's/\training_*.csv$/\training_history.csv/' *.csv but it did not work
Try with this command:
find root -type f -name "*.csv" -exec perl-rename 's/_\d+(\.csv)/$1/g' '{}' \;
Options used:
-type f to match only regular files, not directories.
-name "*.csv" to only match files with the csv extension.
-exec/-execdir to execute a command, in this case, perl-rename.
's/_\d+(\.csv)/$1/g' searches for a string like _20181234.csv and replaces it with .csv; $1 refers to the first captured group.
NOTE
Depending on your OS/distribution, the command may be called just rename instead of perl-rename.
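Before committing to a mass rename, the -n dry-run flag (which your own EDIT already used) shows what would happen without touching anything; for example, with the same substitution:
find root -type f -name "*.csv" -exec perl-rename -n 's/_\d+(\.csv)/$1/g' '{}' \;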
Use some shell looping:
shopt -s globstar   # in bash, ** needs this option to recurse
for file in **/*.csv
do
echo mv "$file" "$(dirname "$file")/pred.csv"
done
In zsh ** matches across directory levels out of the box; in bash you must enable it with shopt -s globstar first. It is an alternative to find, which is a fine solution too. I'm not sure if this should instead be /**/*.csv or root/**/*.csv based on the tree you provided, so I've put echo before the mv to show what it's about to do. After making sure this is going to do what you expect it to do, remove the echo.
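For comparison, here is a sketch of the same dry run with find alone, assuming GNU find (-execdir runs the command inside each file's own directory; drop the echo once the output looks right):
find root -type f -name 'pred_*.csv' -execdir sh -c 'echo mv "$1" pred.csv' _ {} \;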

Bash: loop through all files recursively in sub-directories

I have a bash script that looks like the following:
#!/bin/bash
FILES=public_html/*.php # */ stupid syntax highlighter!
for f in $FILES
do
echo "Processing $f file..."
# take action on each file.
done
Now I need it to go through all subdirectories in public_html so it should run on:
/public_html/index.php
/public_html/forums/status.php
/public_html/really/deep/file/in/many/sub/dirs/here.php
What do I change FILES=public_html/*.php to in order to do that?
Also I need to check to make sure that there is at least one file or else it prints
Processing *.php file...
FILES=$(find public_html -type f -name '*.php')
IMPORTANT: Note the single quotes around the *.php to prevent shell expansion of the *.
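If any of the filenames may contain spaces or newlines, a null-delimited read is more robust than word-splitting a $FILES variable; a sketch:
find public_html -type f -name '*.php' -print0 |
while IFS= read -r -d '' f; do
echo "Processing $f file..."
done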
FILES=`find public_html -type f -name '*.php'`
$FILES will now be a list of every .php file inside public_html.

Bash scripting, loop through files in folder fails

I'm looping through certain files (all files starting with MOVIE) in a folder with this bash script code:
for i in MY-FOLDER/MOVIE*
do
which works fine when there are files in the folder. But when there aren't any, it somehow goes on with one file which it thinks is named MY-FOLDER/MOVIE*.
How can I keep it from entering the loop body after do when there are no matching files in the folder?
With the nullglob option.
$ shopt -s nullglob
$ for i in zzz* ; do echo "$i" ; done
$
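Applied to the loop from the question, the body simply never runs when nothing matches:
shopt -s nullglob
for i in MY-FOLDER/MOVIE*
do
echo "Processing $i"   # skipped entirely when no file matches
done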
for i in $(find MY-FOLDER -name 'MOVIE*' -type f); do
echo "$i"
done
The find utility is one of the Swiss Army knives of linux. It starts at the directory you give it and finds all files in all subdirectories, according to the options you give it.
-type f will find only regular files (not directories).
As I wrote it, the command will find files in subdirectories as well; you can prevent that by adding -maxdepth 1
Edit, 8 years later (thanks for the comment, @tadman!)
You can avoid the loop altogether with
find . -type f -exec echo "{}" \;
This tells find to echo the name of each file by substituting its name for {}. The escaped semicolon is necessary to terminate the command that's passed to -exec.
for file in MY-FOLDER/MOVIE*
do
# Skip if not a file
test -f "$file" || continue
# Now you know it's a file.
...
done

Cropping out files from a list of files in bash

So I would like to do a simple find in a dir with:
find /HOME/ | grep .properties
Then with this list I want to weed out certain files, lets say one is server.properties and another is testing.properties.
After those have been taken out, I want to do a quick for loop that will pass each remaining file that didn't get filtered out into a function, one by one. The function call is just something like
extractHash FILE OUTPUTFILE
I hope this makes sense, I'll try to be more clear if it's not.
Thanks
for file in "`find ~ -name \*.properties |grep -v -e server.properties -e testfile.properties`"; do
extractHash $file output
done
Use while, not for, to iterate over files: for will not work as you expect on the output of a backtick-ed program if the filenames contain whitespace:
find /HOME -name \*.properties \! -name server.properties \! -name testing.properties |
while read -r file; do
extractHash "$file" OUTPUTFILE
done
If all your files are in the current directory, use an extended globbing pattern, and for is appropriate to iterate over filename wildcards:
shopt -s extglob
for file in !(server|testing).properties; do
extractHash "$file" out
done
In csh you would use foreach:
#!/bin/csh
set files=`find /HOME/ | grep .properties`
foreach file ($files)
set outfile = $file.out
extractHash $file $outfile
end
not sure about bash - it has a similar for loop but I never learned it :)
First, I would recommend using the -name argument for find instead of piping every filename through grep. Then you can do something like:
for file in `find /HOME -name \*.properties \! -name server.properties \! -name testing.properties`; do
extractHash "$file" OUTPUTFILE
done
