I have the following directory structure:
.
|__ scripts
|
|__ logs
In my logs folder, I have files formatted this way:
AAAAAAA_X1-09-09-2018.log
BBBBBBB_Y2-09-09-2018.log
CCCCCCC_Z3-09-09-2018.log
When I run the command ls | grep AAAAAAA*.log from the logs directory, it works fine:
user /my/path/logs #> ls | grep AAAA*log
AAAAAAA_X1-09-09-2018.log
But if run from the scripts directory, I get no match:
user /my/path/scripts #> ls ../logs | grep AAAAA*log
I noticed that the command ls ../logs | grep AAAAA* would work, but I need to force the .log suffix to be matched (other files are sometimes generated in that directory). I can work around the issue with:
ls ../logs | grep AAAAA* | grep log
but I wonder: why does ls ../logs | grep AAAAA*log work from logs but not from scripts?
Better to change directory to logs first, like below:
cd /full/path/logs
ls | grep AAAA*log
cd - #go back to the original path
Modify the above in your script and try again.
Also try to follow the instruction given by Kamil Cuk.
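The reason the behaviour differs (presumably Kamil Cuk's point) is that the unquoted AAAAA*log is a shell glob: run from logs it expands to the matching filename before grep even starts, so grep searches for the literal file name; run from scripts nothing matches, so grep receives AAAAA*log as a regex, which requires log to follow the As immediately and therefore matches nothing. Quoting the pattern and using a regex .* makes the command directory-independent, for example:
ls ../logs | grep 'AAAAAAA.*\.log$'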
I am trying to learn find and got stuck when I tried to find files (if there are any) from the current directory that a user can print (i.e., files the user has read permission on). Pretty much what this command does:
find / -type f -user whateverUser
To do this using grep:
ls -o to list the current folder
ls -o . */** to list subfolders as well, but for each file show the full sub-path
ls -o */** | grep -v '^d' to exclude directories from the list
To see which files 'you' can read, you can grep for the owner:
ls -o . */** | grep -v '^d' | grep " $LOGNAME "
A subtlety is that you can also read files that have group-level r permission (for a group you belong to), or world-level (all users) r permission.
To include the latter is easy (the ^.......r.. pattern matches an r in the world-read position of the permission string):
ls -o . */** | grep -v '^d' | egrep "( $LOGNAME |^.......r..)"
Including files you can access via group permissions is a little more complex. Not tested, so here goes:
GROUP_GREP=$( groups | sed 's/ /|/g' )
ls -l */** | grep -v '^d' | egrep "( $LOGNAME |^.......r..|$GROUP_GREP)"
Now, you might need to make refinements for cases where a filename contains any of the grep search terms (e.g. the user name, or group names).
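As an aside, if your find is GNU find it can answer the original question more directly with the -readable test (a GNU extension, so treat its availability as an assumption):
find . -type f -readable    # regular files the current user can read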
What does this command do? Can it be shortened without losing functionality?
echo "abcabcabc" | sed "s/aBc/xyZ/gi;s/Z/a/;s/c/CCC/g" | xargs ls -ld
You would know what the command does if you entered it in a Mac OS terminal. There it does nothing but produce an error, because the i in the sed command seems to be wrong or a typo. However,
echo "abcabcabc" | sed "s/aBc/xyZ/g;s/Z/a/;s/c/CCC/g" | xargs ls -ld
outputs the string abcabcabc, which is used as input for the sed command. Without the i flag the first two substitutions find nothing to replace (the input contains neither aBc with that capitalization nor Z), so only s/c/CCC/g takes effect, which results in the string abCCCabCCCabCCC. That string is in turn the input for the ls -ld command. If you had a file with the name abCCCabCCCabCCC in the current directory it would be found and its details would be shown to you, otherwise the output is
ls: abCCCabCCCabCCC: No such file or directory
Whether it can be shortened depends on what you really want to achieve.
echo "abcabcabc" | sed "s/c/CCC/g" | xargs ls -ld
would have the same result in this case.
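For comparison, GNU sed does accept the i flag, so on Linux the original command runs without an error. Tracing it step by step (a sketch, assuming GNU sed):
echo "abcabcabc" | sed "s/aBc/xyZ/gi"    # case-insensitive, all matches: xyZxyZxyZ
echo "xyZxyZxyZ" | sed "s/Z/a/"          # first Z only: xyaxyZxyZ
echo "xyaxyZxyZ" | sed "s/c/CCC/g"       # no lowercase c left: xyaxyZxyZ
so the final xargs ls -ld would then look for a file named xyaxyZxyZ instead.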
I have an assignment to count all the changes to all the files in an open-source git project. I know that if I use:
git log --pretty=oneline <filename> | wc -l
I will get the number of changes to that file (I use Git Bash on Windows 10).
My idea is to use
find .
and to redirect the output to the git command. How can I do the redirecting? I tried:
$ find . > git log --pretty=online | wc -l
0
find: unknown predicate `--pretty=online'
and
$ find . | git log --pretty=online | wc -l
fatal: invalid --pretty format: online
0
You can do much better than that,
git log --pretty='' --name-only | sort | uniq -c
That's "show only the names of files changed in each commit, no other metadata, sort that list so uniq can easily count the occurrences of each'
You'll need to loop over the results of find.
find . -type f | grep -v '^\./\.git' |
while read -r f; do
    count=$(git log --oneline -- "${f}" | wc -l)
    echo "${f} - ${count}"
done | grep -v ' 0$'
Your find is okay, but I'd restrict it to just files (git doesn't track directories explicitly) and remove the .git folder (we don't care about those files). Pipe that into a loop (I'm using a while), and then your git log command works just fine. Lastly, I'm going to strip anything with a count of 0, since I may have files that are part of .gitignore that I don't want to show up (e.g., things in __pycache__).
Is it possible to make a listing of all the data in the root directory, including the path, but without every directory getting its own section? Something like this, but without a "header" for every directory:
ls -l $PWD/*
When you want to see hidden files too, you can use
find $PWD -maxdepth 1 | xargs ls -ld
Or add grep -v "${PWD}/\." to the pipeline when you don't want them.
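Putting the two together (a sketch; adjust -maxdepth if you also want to descend into subdirectories):
find "$PWD" -maxdepth 1 | grep -v "${PWD}/\." | xargs ls -ld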
Pipe to awk to handle it:
ls -l * | awk '$1!="total"'
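If the glob also picks up subdirectories, the per-directory header and blank lines can be filtered in the same awk call. A sketch (it assumes header lines such as dirname: consist of a single field, which is not true for directory names containing spaces):
ls -l * | awk '$1!="total" && NF>1'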
This searches only in the pwd, despite having specified the directory. It's probably caused by the third argument:
supported="*mov *mp4"
ls /home/kv/m $supported | head -1
Removing the filter, as below, brings up the first file found by ls, but what can I use to tell ls to consider only the file types listed in $supported? It's worth mentioning that the extensions mustn't be case-sensitive.
ls /home/kv/m | head -1
ls /home/kv/m | grep -i -E '\.(mov|mp4)$' | head -1
Run it in a subshell, and cd to the directory first:
first=$( cd /home/kv/m && ls $supported | head -1 )
You might want to shopt -s nullglob first, in case there are no .mov or .mp4 files.
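Since the extensions must not be case-sensitive, another option (a sketch assuming bash, using its nocaseglob option so the glob expansion itself ignores case, and avoiding a bare ls when nothing matches) is:
first=$(
    cd /home/kv/m || exit
    shopt -s nullglob nocaseglob    # empty expansion if nothing matches; ignore case
    set -- $supported               # positional parameters = the matching files
    [ $# -gt 0 ] && printf '%s\n' "$1"
)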