Unix: file names and locations to an Excel sheet - bash

I am new to Unix systems and am not sure whether this task is possible, or how to do it.
I have this directory structure
c: parent dir/ child dir/file1
c: parent dir/ child dir/file2
c: parent dir/ child dir/file3
.
.
I need to export all these file names and their full locations (paths) to an Excel sheet.
Any hint/help would be appreciated.
thanks

You can export the list returned from find to a CSV file. To get a list of files only (not directories) in a specific folder, not recursively, you would do:
find `pwd` -maxdepth 1 -type f -print > files.csv

I do not know of a way to write to an Excel sheet directly, as it is not a text format. The best approach would probably be to output your data to a file in a format such as CSV or plain text and then copy and paste that into an Excel sheet.
Example:
$] find `pwd` > myfile.txt
/absolute/path/to/file0
/absolute/path/to/file1
...
And copy and paste that into your Excel sheet. Or you could parse the output of find and insert a comma after each line. I'll leave that as homework for you... it should be easy enough.
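As a concrete sketch of that homework, the loop below turns find output into a two-column CSV (file name, full path). It assumes the paths contain no commas, quotes, or newlines; real CSV output would need proper quoting, and the sample files are created only for illustration:

```shell
# Sketch: build a "name,path" CSV from find output.
# Assumes no commas, quotes, or newlines in the paths.
tmp=$(mktemp -d)
mkdir -p "$tmp/parent/child"
touch "$tmp/parent/child/file1" "$tmp/parent/child/file2"

find "$tmp/parent" -type f | sort | while IFS= read -r path
do
    # ${path##*/} strips everything up to the last slash, i.e. the file name
    printf '%s,%s\n' "${path##*/}" "$path"
done > "$tmp/files.csv"

cat "$tmp/files.csv"
```

The resulting files.csv opens directly in Excel, with the name in column A and the full path in column B.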

I don't have access to a Unix system right now, but from a console window try
cd "root of directory structure"
find . -print > myFiles.csv
Load the .csv into Excel then putter as needed.
You may want to read up on the find command - it's very useful.

Related

making text files containing list of files in different directories with random name in bash

I have a script which makes files in pairs (every resulting file has either R1 or R2 in its name) and then puts each pair in a separate directory with a random name (different every time, so I cannot predict it). For example, if I have 6 files, each would be named like file_R1.txt or file_R2.txt, paired like this:
s1_R1.txt and s1_R2.txt
s2_R1.txt and s2_R2.txt
s3_R1.txt and s3_R2.txt
In this example I will have 3 directories (one per pair of files). I want to make 2 text files (d1.txt and d2.txt) containing the file names above: d1.txt will list all the files with R1, and d2.txt all the files with R2. To do so, I wrote the following short bash code, but it does not return what I want. Do you know how to fix it?
For file in /./*R*.txt;
do;
touch "s1.txt"
touch "s2.txt"
echo "${fastq}" >> "s1.txt"
done
Weird question, not sure I get it, but for your d1 and d2 files:
#!/bin/bash
find . -type f -name "*R1*" -print >d1.txt
find . -type f -name "*R2*" -print >d2.txt
find is used since you have files under different sub-directories.
Using > ensures the text files are emptied out if they already exist.
Note that the code in your question is not valid bash ("For" should be lowercase for, "do;" is a syntax error, and ${fastq} is never set).

How do I search for file names that match a set pattern and write the output to a text file in bash

Example: /apps/mft/local/tmp/folder1/folder2
Let's say I have many files at random in folder1 and folder2. My requirement is to find the files that match the pattern $DAY-log.xml and write the resulting file names to a text file. Which command should I use in the bash script to perform this task? Please help.
find -name "$DAY-log.xml" > listOfFiles.txt
It will recursively find files matching the pattern, starting from the current directory, and write all the results to the listOfFiles.txt file.
If you want, you can even give a starting path like this:
find /path/where/to/start -name "$DAY-log.xml" > listOfFiles.txt
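For completeness, $DAY has to be set before either command runs. A minimal sketch, assuming the logs are stamped with an ISO date (the +%F format is an assumption; adjust it to your actual naming scheme):

```shell
# Sketch: derive the search pattern from today's date.
# date +%F produces YYYY-MM-DD; this is an assumption about the log naming.
DAY=$(date +%F)
echo "will search for: $DAY-log.xml"
```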

How to loop through the result of find in bash and perform operations on each of them

I am very new to bash scripting. I need to perform same operation on 300 files. I have tried the following so far:
empty_files=$(find ./ -type f -empty)
echo "$empty_files"
Now I have the full path to all the 300 files, stored in variable empty_files.
Now what I want to do is, for each file in the result
go to its parent's parent directory,
then go to the other child (the sibling of the earlier file's parent directory) and find a file named 'abc.js' inside it,
then, inside abc.js, find a particular word (an object property), "user",
and on the next line, insert a line of js code (the exact same code for all files).
Please let me know if this is possible from the macOS command line.
Thanks
You can use a for loop, provided empty_files is captured as an array (e.g. mapfile -t empty_files < <(find ./ -type f -empty)) rather than the plain string that your $(...) capture produces:
for file in "${empty_files[@]}"
do
... code that uses "$file"
done
You could also pipe directly from find into the loop:
find . -type f -empty | while read -r file
do
... code that uses "$file"
done
This version should work for any filenames that don't contain newlines.
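If filenames might contain newlines too, a NUL-separated variant is safer. A minimal sketch (the temp files here are created purely for illustration), using bash's process substitution so the loop body runs in the current shell and its variables survive:

```shell
#!/bin/bash
# Sketch: newline-proof loop over find results using NUL separators.
# Process substitution (< <(...)) keeps the loop in the current shell,
# so variables set inside it (like count) are still visible afterwards.
tmp=$(mktemp -d)
touch "$tmp/empty one" "$tmp/empty two"
printf 'data' > "$tmp/not empty"

count=0
while IFS= read -r -d '' file
do
    # ... code that uses "$file"; here we just count the empty files
    count=$((count + 1))
done < <(find "$tmp" -type f -empty -print0)

echo "found $count empty files"
```

Had the loop been on the receiving end of a pipe instead, it would run in a subshell and count would be lost when the loop ends.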

Shell script to write folder tree structure to file

I need a shell script that writes the tree structure (including all data in the folders) from a specific folder to a text/dat file.
So far I got this:
find . -type f|sed 's_\.\/__' > PATH/Input.dat
I don't want "./" as the first characters of each path.
This script works fine, but it returns ALL folder structures. I need something that returns only structures from a specific folder, like "Sales".
I need something that returns only structures from a specific folder,
like "Sales".
Specify the desired folder name. Say:
find Sales -type f | sed 's_\.\/__'
^^^^^
Saying find . ... would search in . (i.e. the current directory and subdirectories).
If you need to search more folders, say Purchase, specify those too:
find Sales Purchase -type f | sed 's_\.\/__'
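Note that the sed step only matters when the search starts at "." in the first place: starting find at a named folder produces paths that begin with that folder's name, with no leading "./" to strip. A small sketch (the sample files are created only for illustration):

```shell
# Sketch: find output depends on the starting point you name.
tmp=$(mktemp -d)
cd "$tmp"
mkdir -p Sales/2021
touch Sales/2021/report.txt

find Sales -type f    # prints: Sales/2021/report.txt
find . -type f        # prints: ./Sales/2021/report.txt
```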

batch rename files and folders at once

I got help regarding the following question:
batch rename files with ids intact
It's a great example of how to rename specific files in a group, but I am wondering if there is a similar script I could use to do the following:
I have a group of nested folders and files within a root directory that contain [myprefix_foldername] and [myprefix_filename.ext]
I would like to rename all of the folders and files to [foldername] and [filename.ext]
Can I use a similar methodology to what is found in the post above?
Thanks!
jml
Yes, quite easily, with find.
find rootDir -name "myprefix_*"
This will give you a list of all files and folders in rootDir that start with myprefix_. From there, it's a short jump to a batch rename:
find rootDir -name "myprefix_*" | while read f
do
echo "Moving $f to ${f/myprefix_/}"
mv "$f" "${f/myprefix_/}"
done
EDIT: IFS added per http://www.cyberciti.biz/tips/handling-filenames-with-spaces-in-bash.html
EDIT 2: IFS removed in favor of while read.
EDIT 3: As bos points out, you may need to change while read f to while read -d $'\n' f if your version of Bash still doesn't like it.
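One caveat with the loop above: when a directory itself carries the prefix, renaming the directory before the files queued beneath it invalidates their paths, and ${f/myprefix_/} replaces the first occurrence anywhere in the path, which can hit a parent component. A depth-first sketch that rewrites only the last path component (names containing newlines are still out of scope):

```shell
#!/bin/bash
# Sketch: strip "myprefix_" from files *and* folders safely.
# -depth makes find list a directory's contents before the directory
# itself, so child paths are renamed while they are still valid.
tmp=$(mktemp -d)
mkdir -p "$tmp/myprefix_dir"
touch "$tmp/myprefix_dir/myprefix_file.txt"

find "$tmp" -depth -name "myprefix_*" | while IFS= read -r f
do
    # ${f%/*} is the parent directory; ${f##*/myprefix_} is the last
    # component with the prefix removed. Only the basename changes.
    mv "$f" "${f%/*}/${f##*/myprefix_}"
done
```

After the run, myprefix_dir/myprefix_file.txt has become dir/file.txt.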
