I have 300 directories with the following structure
Directory
|-- zipfile.tgz
| |-- anotherzip.tgz
| | |-- afurtherzip.tgz
| | | |-- ethernet1.txt
| | | |-- ethernet2.txt
| | | |-- ethernet3.txt
| | |-- files.txt
| |-- files.txt
|-- zipfile2.tgz
| |-- anotherzip.tgz
| | |-- afurtherzip.tgz
| | | |-- ethernet1.txt
| | | |-- ethernet2.txt
| | | |-- ethernet3.txt
| | |-- files.txt
| |-- files.txt
I need to grep the contents of the ethernetX.txt files, and for any file containing 'abc=n' have it tell me the path of the containing zipfile.tgz so I know where to find it.
Can anyone suggest a decent script / one-liner I can use to achieve this? I don't want to have to recursively extract every .tgz if I can avoid it, but please let me know what you think.
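One approach that avoids extracting anything to disk is to stream the nested archives with tar's -O option. This is only a sketch: it assumes the .tgz files really are gzipped tars, that GNU tar is available (for --wildcards and for reading an archive from stdin via -f -), and that the member names inside each archive are exactly anotherzip.tgz, afurtherzip.tgz and ethernet*.txt with no leading directory components:

#!/usr/bin/env bash
# For every top-level archive, stream the nested archives to stdout and
# grep the ethernet*.txt contents; print the archive path on a match.
for outer in */*.tgz; do
    if tar -xzOf "$outer" anotherzip.tgz 2>/dev/null \
         | tar -xzO -f - afurtherzip.tgz 2>/dev/null \
         | tar -xzO -f - --wildcards 'ethernet*.txt' 2>/dev/null \
         | grep -q 'abc=n'; then
        echo "$outer"
    fi
done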
I am using django and I have this directory setup
/project
    /apps
        /app1
            /templates
                /app1
                    app1.html
            /static
                /app1
                    /css (desired)
                        app1.css (desired)
                    /scss
                        _app1-partial.scss
                        app1.scss
        /app2
            /templates
                /app2
                    app2.html
            /static
                /app2
                    /css (desired)
                        app2.css (desired)
                    /scss
                        _app2-partial.scss
                        app2.scss
    /templates
        base.html
        navbar.html
    /static
        /images
        /css (desired)
            style.css (desired)
        /scss
            _partial.scss
            style.scss
    /project
        settings.py
    manage.py
Is it possible to have the Live Sass Compiler extension compile multiple source SCSS inputs into multiple CSS output files?
I think I have figured it out, but I'd appreciate corrections so I can learn.
This is what I had before
"liveSassCompile.settings.formats": [
{
"format": "compressed",
"extensionName": ".css",
"savePath": "~/css",
"savePathReplacementPairs": null
}
],
"liveSassCompile.settings.includeItems": [
"/static/*.scss",
"/static/**/*.scss"
],
Changes I made
"liveSassCompile.settings.includeItems": [
"/static/*.scss", // for root static scss file
"/apps/**/static/**/*.scss" // for app specific scss file
],
Issue
I have been struggling to write a Bash command that recursively searches a directory and returns the path of every sub-directory (up to a certain max depth) that contains exclusively hidden files and/or hidden directories.
Visual Explanation
Consider the following File System excerpt:
+--- Root_Dir
| +--- Dir_A
| | +--- abc.txt
| | +--- 123.txt
| | +--- .hiddenfile
| | +--- .hidden_dir
| | | +--- normal_sub_file_1.txt
| | | +--- .hidden_sub_file_1.txt
| |
| +--- Dir_B
| | +--- abc.txt
| | +--- .hidden_dir
| | | +--- normal_sub_file_2.txt
| | | +--- .hidden_sub_file_2.txt
| |
| +--- Dir_C
| | +--- 123.txt
| | +--- program.c
| | +--- a.out
| | +--- .hiddenfile
| |
| +--- Dir_D
| | +--- .hiddenfile
| | +--- .another_hiddenfile
| |
| +--- Dir_E
| | +--- .hiddenfile
| | +--- .hidden_dir
| | | +--- normal_sub_file_3.txt # This is OK because it's within a hidden directory, i.e. it won't be checked
| | | +--- .hidden_sub_file_3.txt
| |
| +--- Dir_F
| | +--- .hidden_dir
| | | +--- normal_sub_file_4.txt
| | | +--- .hidden_sub_file_4.txt
Desired Output
The command I am looking for would output
./Root_Dir/Dir_D
./Root_Dir/Dir_E
./Root_Dir/Dir_F
Dir_D because it only contains hidden files.
Dir_E because it only contains a hidden file and a hidden directory at the level I am searching.
Dir_F because it only contains a hidden directory at the level I am searching.
Attempts
I have attempted to use the find command to get the results I am looking for but I can't seem to figure out what other command I need to pipe the output to or what other options I should be using.
I think the command that will work would look something like this:
$ find ./Root_Dir -mindepth 2 -maxdepth 2 -type d -name "*." -type -f -name "*." | command to see if these are the only files in that directory
Parsing find's output is not a good idea; -exec exists, and sh can do the filtering without breaking anything.
find . -type d -exec sh -c '
for d; do
for f in "$d"/*; do
test -e "$f" &&
continue 2
done
for f in "$d"/.[!.]* "$d"/..?*; do
if test -e "$f"; then
printf %s\\n "$d"
break
fi
done
done' sh {} +
You can adjust the depth using whatever extension your find provides for it.
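For example, with GNU find (an assumption; -mindepth/-maxdepth are not POSIX), restricting the same filter to the directories shown in the desired output would look like this:

# GNU find assumed: -mindepth/-maxdepth pin the search to Root_Dir's children.
find ./Root_Dir -mindepth 1 -maxdepth 1 -type d -exec sh -c '
    for d; do
        for f in "$d"/*; do                   # any visible entry?
            test -e "$f" && continue 2        # yes: not hidden-only, skip it
        done
        for f in "$d"/.[!.]* "$d"/..?*; do    # any hidden entry?
            if test -e "$f"; then
                printf %s\\n "$d"             # hidden-only directory: print it
                break
            fi
        done
    done' sh {} +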
Assuming this structure, you don't need find.
Adjust your pattern as needed.
for d in $ROOT_DIR/Dir_?/; do
lst=( $d* ); [[ -e "${lst[0]}" ]] && continue # normal files, skip
lst=( $d.* ); [[ -e "${lst[2]}" ]] || continue # NO hidden, so skip
echo "$d"
done
I rebuilt your file structure in my /tmp dir and saved this as tst, so
$: ROOT_DIR=/tmp ./tst
/tmp/Dir_D/
/tmp/Dir_E/
/tmp/Dir_F/
Note that the confirmation of hidden files uses "${lst[2]}" because the first 2 will always be . and .., which don't count.
You could probably use for d in $ROOT_DIR/*/ instead of spelling out Dir_?.
I suspect this'll do for you (mindepth=2, maxdepth=2).
If you needed deeper subdirectories (mindepth=3, maxdepth=3), you could add a level -
for d in $ROOT_DIR/*/*/
and/or both (mindepth=2, maxdepth=3)
for d in $ROOT_DIR/*/ $ROOT_DIR/*/*/
or if you didn't want a mindepth/maxdepth,
shopt -s globstar
for d in $ROOT_DIR/**/
This will work if your filenames contain no newlines.
find -name '.*' | awk -F/ -v OFS=/ '{ --NF } !a[$0]++'
Learn awk: https://www.gnu.org/software/gawk/manual/gawk.html
I need to upload files that sit in different folders but share the same name, recursively, to a remote folder. Here is an example.
Local
|-app1-1
| |-src
| |-img
| |-static
| | |-app1-2 <------Upload from this
| | | |-file1-1
| | | |-file1-2
|-app2-1
| |-src
| |-img
| |-static
| | |-app2-2 <------Upload from this
| | | |-file2-1
| | | |-file2-2
Here is the remote result I need.
Remote Result
|-folder1
|-folder2
|-folder3
|-static
| |-app1-2 <------Upload from this
| | |-file1-1
| | |-file1-2
| |
| |-app2-2 <------Upload from this
| | |-file2-1
| | |-file2-2
I have been testing with scripts like this:
- >
lftp
-e "mirror
--include ^static/
--exclude ^\.git.*
-eRv $CI_PROJECT_DIR Remote Result/static; quit;"
sftp://$CREDENTIALS
But this generates output that I do not want:
Remote Result
|-folder1
|-folder2
|-folder3
|-static
| |-app1-1 <------Upload from this
| | |-static
| | |-app1-2
| | |-file1
| | |-file2
...... (same for the other app folder)
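One workaround, sketched here as a plain shell step rather than the exact CI YAML: mirror each app's static directory straight into the remote static folder, so the local appX-1/static prefix is never recreated on the server. It assumes $CREDENTIALS is user:pass@host as in your snippet, and it drops mirror's -e (delete) flag on purpose, because running it once per app would delete the previously uploaded app's files:

# Upload the contents of every <app>/static/ into "Remote Result/static".
for dir in "$CI_PROJECT_DIR"/*/static/; do
    lftp -e "mirror -Rv \"$dir\" \"Remote Result/static\"; quit" "sftp://$CREDENTIALS"
done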
I am a complete Makefile newbie, so I'm wondering how I would loop through a specific directory and remove specific subdirectories.
Say I have:
[app]
|
| - [src]
| |
| |- [foo]
| |
| |- [foo-bar]
| |
| |- [bar]
| |
| |- [baz]
|
How would I loop through src and:
remove foo
remove foo-bar
leave bar
remove baz
I actually have about 8 specific folders I wish to delete from src.
Any help is appreciated
clean:
	rm -rf src/foo
	rm -rf src/foo-bar
	rm -rf src/baz
I was wondering whether I could do the following better than what I have.
Objective: identify files with the same name anywhere in a directory tree. I do not know in advance whether there are any duplicate names, or the location/name of such files.
Expected output: list the duplicate files with their locations.
Input provided: the path of the top directory for the search.
My algorithm:
1. List all files under the target directory (I have used find).
2. List1: sort the file names.
3. List2: uniquify the file names.
4. Diff the lists from steps 2 and 3 to get the repeated names.
5. Extract the locations.
Sample Directory:
temp/
|-- d1
| |-- d2
| | `-- f3
| |-- d3
| | `-- f3
| |-- f1
| `-- f2
`-- d4
|-- d5
| |-- f2
| `-- f6
|-- f4
`-- f5
> find temp/ -type f -follow -print | sed 's;.*/;;' | sort -u > ~/tmp/12
> find temp/ -type f -follow -print | sed 's;.*/;;' | sort -n > ~/tmp/11
> diff ~/tmp/11 ~/tmp/12
3,4d2
< f2
< f3
> find temp/ -name f2
temp/d1/f2
temp/d4/d5/f2
> find temp/ -name f3
temp/d1/d2/f3
temp/d1/d3/f3
I want to simplify this process. Any help would be appreciated. Please let me know if you need further details.
This is a solution I found that fits my needs and may help you:
Your comments are welcome.
# csh: take the first argument as the directory to scan for duplicate names
set idirectory = `echo $* | awk '{print $1}'`
if ( -d $idirectory ) then
    foreach xxx (`find $idirectory -type f -follow -print | sed 's;.*/;;' | sort -n | uniq -d`)
        echo "Multiple files found for " $xxx
        find $idirectory -name $xxx
    end
endif
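The same idea also works as a short bash pipeline. A sketch assuming GNU find (for -printf '%f') and the temp/ layout above:

# Print every basename that occurs more than once, then the matching paths.
find temp/ -type f -follow -printf '%f\n' | sort | uniq -d |
while IFS= read -r name; do
    echo "Multiple files found for $name"
    find temp/ -type f -follow -name "$name"
done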