I am trying to create a script that opens automatically any files containing a particular pattern.
This is what I achieved so far:
xargs -d " " vim < "$(grep --color -r test * | cut -d ':' -f 1 | uniq | sed ':a;N;$!ba;s/\n/ /g')"
The problem is that vim does not treat the output as a list of separate files, but as one long filename instead:
zsh: file name too long: ..............
Is there an easy way to achieve it? What am I missing?
The usual way to call xargs is just to pass the arguments with newlines via a pipe:
grep -Rl test * | xargs vim
Note that I'm also passing the -l argument to grep to list the files that contain my pattern.
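Note that filenames containing whitespace would still break the plain pipe above. As a sketch (using a throwaway directory made with mktemp, and printf as a stand-in for vim), GNU grep's -Z plus xargs -0 passes the names NUL-delimited so spaces survive:

```shell
# Demo tree with a space in one matching filename (throwaway, via mktemp).
tmp=$(mktemp -d)
printf 'test here\n' > "$tmp/a.txt"
printf 'no match\n'  > "$tmp/b.txt"
printf 'test too\n'  > "$tmp/c file.txt"

# -l lists matching files, -Z ends each name with NUL; xargs -0 splits on NUL.
# With vim you would run: grep -RlZ test "$tmp" | xargs -0 vim
matches=$(grep -RlZ test "$tmp" | xargs -0 -n1 printf '%s\n' | sort)
printf '%s\n' "$matches"
```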
Use this:
vim -- `grep -rIl test *`
-I skips matching in binary files
-l prints the file name at the first match
Try to omit xargs, because it leads to incorrect behaviour of vim:
Vim: Warning: Input is not from a terminal
What I usually do is append the following to a command that produces a list of files:
> ~/.files.txt && vim $(cat ~/.files.txt | tr "\n" " ")
For example:
grep --color -r test * > ~/.files.txt && vim $(cat ~/.files.txt | tr "\n" " ")
I have the following in my .bashrc to bind VV (uppercase V twice) to insert that snippet automatically:
insertinreadline() {
READLINE_LINE=${READLINE_LINE:0:$READLINE_POINT}$1${READLINE_LINE:$READLINE_POINT}
READLINE_POINT=`expr $READLINE_POINT + ${#1}`
}
bind -x '"VV": insertinreadline " > ~/.files.txt && vim \$(cat ~/.files.txt | tr \"\\n\" \" \")"'
I have a command like below
md5sum test1.txt | cut -f 1 -d " " >> test.txt
I want the output of the above command prefixed with File_CheckSum:
Expected output: File_CheckSum: <checksumvalue>
I tried as follows
echo 'File_Checksum:' >> test.txt | md5sum test.txt | cut -f 1 -d " " >> test.txt
but I get the result as
File_Checksum:
adbch345wjlfjsafhals
I want the entire output on one line:
File_Checksum: adbch345wjlfjsafhals
echo writes a newline after it finishes writing its arguments. Some versions of echo allow a -n option to suppress this, but it's better to use printf instead.
You can use a command group to concatenate the standard output of your two commands:
{ printf 'File_Checksum: '; md5sum test.txt | cut -f 1 -d " "; } >> test.txt
Note that there is a race condition here: you can theoretically write to test.txt before md5sum is done reading from it, causing you to checksum more data than you intended. (Your original command mentions test1.txt and test.txt as separate files, so it's not clear if you are really reading from and writing to the same file.)
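One way to sidestep that race is to capture the checksum into a variable first, so md5sum finishes reading before the redirection opens the file for appending. A sketch with a throwaway file (made via mktemp; the contents are invented for the demo):

```shell
tmp=$(mktemp -d)
printf 'hello\n' > "$tmp/test.txt"

# md5sum finishes reading before we open test.txt for writing.
sum=$(md5sum "$tmp/test.txt" | cut -f 1 -d " ")
printf 'File_Checksum: %s\n' "$sum" >> "$tmp/test.txt"

tail -n 1 "$tmp/test.txt"
```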
You can use command grouping to have a list of commands executed as a unit and redirect the output of the group at once:
{ printf 'File_Checksum: '; md5sum test1.txt | cut -f 1 -d " "; } >> test.txt
printf "%s: %s\n" "File_Checksum" "$(md5sum < test1.txt | cut ...)" > test.txt
Note that if you are trying to compute the hash of test.txt (the same file you are trying to write to), this changes things significantly.
Another option is:
{
printf "File_Checksum: "
md5sum ...
} > test.txt
Or:
exec > test.txt
printf "File_Checksum: "
md5sum ...
but be aware that all subsequent commands will also write their output to test.txt. The typical way to restore stdout is:
exec 3>&1
exec > test.txt # Redirect all subsequent commands to `test.txt`
printf "File_Checksum: "
md5sum ...
exec >&3 # Restore original stdout
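The save/redirect/restore dance can be sketched end to end with a throwaway file (path via mktemp, contents invented):

```shell
tmp=$(mktemp -d)

exec 3>&1                  # save the current stdout on fd 3
exec > "$tmp/test.txt"     # all subsequent output goes to the file
printf 'File_Checksum: '
printf 'abc123\n'
exec >&3                   # restore the original stdout
exec 3>&-                  # close the saved descriptor

cat "$tmp/test.txt"        # this prints to the terminal again
```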
You can also chain commands with the && operator, which runs the second command only if the first succeeds:
e.g. mkdir example && cd example
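A minimal sketch of the short-circuit behaviour, using a throwaway directory made with mktemp: the right-hand side runs only when the left-hand side succeeds.

```shell
tmp=$(mktemp -d)

mkdir "$tmp/example" && first=ran || first=skipped
# The second mkdir fails (directory already exists), so && is skipped.
mkdir "$tmp/example" 2>/dev/null && second=ran || second=skipped

printf '%s %s\n' "$first" "$second"
```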
I'm trying to pipe a series of manipulations into an xargs call that I can use to swap the first value with the second using the sed command (sed is optional if there's a better way).
Basically I'm grabbing method signature in camel case and appending a prefix while trying to retain camel case.
So it should take...
originalMethodSignature
and replace it with...
givenOriginalMethodSignature
Because I'm using a series of pipes to find and modify the text, I was hoping to use multiple parameters with xargs. It seems that most of the questions involving that use sh -c, which would be fine, but in order for sed's in-place edit (-i '') to work on a Mac terminal I need single quotes inside the shell call's single quotes.
Something like this, where the double quotes preserve the functionality of the single quotes in the sed command...
echo "somePrecondition SomePrecondition" | xargs -L1 sh -c 'find ~/Documents/BDD/Definitions/ -type f -name "Given$1.swift" -exec sed -i "''" "'"s/ $0/ given$1/g"'" {} +'
assuming there's a file called "~/Documents/BDD/Definitions/GivenSomePrecondition.swift" with below code...
protocol GivenSomePrecondition { }
extension GivenSomePrecondition {
func somePrecondition() {
print("empty")
}
}
The first awk goes through a list of swift protocols that start with the Given keyword (e.g. GivenSomePrecondition), and the pipeline strips each down to "somePrecondition SomePrecondition" before hitting the final pipe. My intent is that the final xargs call can replace $0 with given$1 in place (overwriting the file).
The original command in context...
awk '{ if ($1 ~ /^Given/) print $0;}' ~/Documents/Sell/SellUITests/BDDLite/Definitions/HasStepDefinitions.swift \
| tr -d "\t" \
| tr -d " " \
| tr -d "," \
| sort -u \
| xargs -I string sh -c 'str=$(echo string); echo ${str#"Given"}' \
| awk '{ print tolower(substr($1,1,1)) substr($1, 2)" "$1 }' \
| xargs -L1 sh -c '
find ~/Documents/Sell/SellUITests/BDDLite/Definitions/ \
-type f \
-name "Given$1.swift" \
-exec sed -i '' "'"s/ $0/ given$1/g"'" {} +'
You don't need xargs or sh -c, and taking them out reduces the amount of work involved.
echo "somePrecondition SomePrecondition" |
while read -r source replace; do
find ~/Documents/BDD/Definitions/ -type f -name "Given${replace}.swift" -print0 |
while IFS= read -r -d '' filename; do
sed -i '' -e "s/ ${source}/ given${replace}/g" "$filename"
done
done
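To see the loop work end to end, here is the same structure run against a throwaway copy of the example file (GNU sed's in-place flag takes no argument, unlike the BSD sed -i '' shown above, so the flag is adjusted for Linux):

```shell
tmp=$(mktemp -d)
printf 'extension GivenSomePrecondition {\n    func somePrecondition() { }\n}\n' \
    > "$tmp/GivenSomePrecondition.swift"

echo "somePrecondition SomePrecondition" |
while read -r source replace; do
    find "$tmp" -type f -name "Given${replace}.swift" -print0 |
    while IFS= read -r -d '' filename; do
        sed -i -e "s/ ${source}/ given${replace}/g" "$filename"
    done
done

cat "$tmp/GivenSomePrecondition.swift"
```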
However, to answer your question rather than sidestep it: you can write functions that use any kind of quotes you want, and export them into your subshell, either with export -f yourFunction in a parent process or by putting "$(declare -f yourFunction)" inside the string passed after bash -c (assuming that bash is the same shell used in the parent process defining those functions).
#!/usr/bin/env bash
replaceOne() {
local source replace
source=$1; shift || return
replace=$1; shift || return
sed -i '' -e "s/ ${source}/ given${replace}/g" "$@"
}
# substitute replaceOne into a new copy of bash, no matter what kind of quotes it has
bash -c "$(declare -f replaceOne)"'; replaceOne "$@"'
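A self-contained sketch of the declare -f mechanism, using a made-up function name (greet is hypothetical, purely to show the technique):

```shell
greet() { printf 'hello %s\n' "$1"; }

# The child bash receives the function's source text, then calls it;
# the extra "_" fills $0 so "world" lands in $1.
out=$(bash -c "$(declare -f greet)"'; greet "$1"' _ world)
printf '%s\n' "$out"
```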
So I was building a script for a co-worker so she can easily scan files for occurrences of strings. But I am having trouble with my grep command.
#!/bin/bash -x
filepath() {
echo -n "Please enter the path of the folder you would like to scan, then press [ENTER]: "
read path
filepath=$path
}
filename () {
echo -n "Please enter the path/filename you would like the output saved to, then press [ENTER]: "
read outputfile
fileoutput=$outputfile
touch $outputfile
}
searchstring () {
echo -n "Please enter the string you would like to search for, then press [ENTER]: "
read searchstring
string=$searchstring
}
codeblock() {
for i in $(ls "${filepath}")
do
grep "'${string}'" "$i" | wc -l | sed "s/$/ occurance(s) in "${i}" /g" >> "${fileoutput}"
done
}
filepath
filename
searchstring
codeblock
exit
I know there are a lot of extra variable "redirects". Just practicing my scripting. Here is the error I am receiving when I run the command.
+ for i in '$(ls "${filepath}")'
+ grep ''\''<OutageType>'\''' *filename*.DONE.xml
+ wc -l
+ sed 's/$/ occurance(s) in *filename*.DONE.xml /g'
grep: *filename*.DONE.xml: No such file or directory
However if I run the grep command with the wc and sed functions from CLI it works fine.
# grep '<OutageNumber>' "*filename*.DONE.xml" | wc -l | sed "s/$/ occurance(s) in "*filename*.DONE.xml" /g"
13766 occurance(s) in *filename*.DONE.xml
There are several things going wrong here.
for i in $(ls "${filepath}")
The value of filepath is *filename*.DONE.xml, and the * will not be expanded here: a double-quoted variable is taken literally by the shell, so no filename globbing happens.
If you want wildcard characters to be expanded to match filename patterns,
then you cannot double-quote the variable in the command.
Next, it's strongly discouraged to parse the output of the ls command. This would be better:
for i in ${filepath}
And this still won't be "perfect", because if there are no files matching the pattern,
then grep will fail. To avoid that, you could enable the nullglob option:
shopt -s nullglob
for i in ${filepath}
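The difference nullglob makes is easy to demonstrate with an empty throwaway directory (a bash sketch; the pattern is the one from the question):

```shell
tmp=$(mktemp -d)
shopt -s nullglob

count=0
for i in "$tmp"/*.DONE.xml; do count=$((count + 1)); done   # no matches: 0 iterations

touch "$tmp/a.DONE.xml" "$tmp/b.DONE.xml"
count2=0
for i in "$tmp"/*.DONE.xml; do count2=$((count2 + 1)); done # now 2 iterations

printf '%d %d\n' "$count" "$count2"
```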
Finally, I suggest eliminating the for loop and using the grep command directly. Note also that the single quotes inside the double quotes become part of the search pattern (grep would look for '<OutageNumber>' including the quotes), so drop them:
grep "${string}" ${filepath} | ...
Suppose we have the following command and its related output :
gsettings list-recursively org.gnome.Terminal.ProfilesList | head -n 1 | grep -oP '(?<=\[).*?(?=\])'
Output :
'b1dcc9dd-5262-4d8d-a863-c897e6d979b9', 'ca4b733c-53f2-4a7e-8a47-dce8de182546', '802e8bb8-1b78-4e1b-b97a-538d7e2f9c63', '892cd84f-9718-46ef-be06-eeda0a0550b1', '6a7d836f-b2e8-4a1e-87c9-e64e9692c8a8', '2b9e8848-0b4a-44c7-98c7-3a7e880e9b45', 'b23a4a62-3e25-40ae-844f-00fb1fc244d9'
I need to use the gsettings command in a script and create filenames according to the output of the gsettings command. For example, a file name should be
b1dcc9dd-5262-4d8d-a863-c897e6d979b9
the next one :
ca4b733c-53f2-4a7e-8a47-dce8de182546
and so on.
How can I do this?
Another solution... just pipe the output of your command to:
your_command | sed "s/[ ']//g" | tr -d '\n' | xargs -d, touch
(the tr -d '\n' keeps the trailing newline out of the last filename)
You can use process substitution to read your gsettings output and store it in an array :
IFS=', ' read -r -a array < <(gsettings)
for f in "${array[@]}"
do
file=$(echo "$f" | tr -d "'") # remove the surrounding single quotes
touch "$file"
done
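Putting it together with a hard-coded sample of the gsettings output (only two of the UUIDs from the question, to keep the sketch short, and a throwaway directory made with mktemp):

```shell
tmp=$(mktemp -d)
list="'b1dcc9dd-5262-4d8d-a863-c897e6d979b9', 'ca4b733c-53f2-4a7e-8a47-dce8de182546'"

# Split on comma/space into an array, then create one file per entry.
IFS=', ' read -r -a array <<< "$list"
for f in "${array[@]}"; do
    file=${f//\'/}        # strip the single quotes
    touch "$tmp/$file"
done

ls "$tmp"
```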
How would one grep backtick from files in a for-loop.
I would like to run grep for the pattern '`define'. The pattern works in a standalone grep command but fails in the foreach loop.
foreach xxx ( `grep -r '`define' $idirectory --no-filename | sed -e 's ; //.* ; ; ' -e 's ; #.* ; ; ' -e 's ; ^\s* ; ; ' | grep -v ^$ | sort -n | awk '{print $2}' | uniq -d`)
echo $xxx
end
The backticks conflict inside the foreach loop.
regards
Srisurya
Simply don't use ' and escape the backtick with a backslash.
So, the following doesn't work:
grep -r '`def' *
and prints
No matching command
But this:
grep -r \`def *
works and prints
ewdwedwe `define`
So, similarly for your script, the following works (file btick.tcsh):
#!/bin/tcsh
set greparg = \`def
foreach xxx ( `grep -l $greparg *` )
echo ===$xxx===
end
and produces the following result:
===btick.tcsh===
===btick1.txt===
===btick2.txt===
The content of the btick1.txt file:
btick1 `def`
This is an alternative solution: use the ASCII code of the backtick in the grep argument.
grep -rP '\x60define' $idirectory
where \x60 is the ASCII code for the backtick.
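A quick check that the escape works (this requires a grep built with PCRE support, i.e. the -P flag; the sample file and its contents are invented for the demo):

```shell
tmp=$(mktemp -d)
printf '`define WIDTH 8\nno backtick here\n' > "$tmp/btick1.txt"

# \x60 matches a literal backtick without any shell quoting trouble.
out=$(grep -rP '\x60define' "$tmp")
printf '%s\n' "$out"
```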
You should not use the old, outdated backticks; use $(code) command substitution instead.
Try this:
for xxx in $(some code $(some more code)); do
echo "$xxx"
done
Nesting with backticks is complicated because they need to be escaped. Compare this:
listing=`ls -l \`cat filenames.txt\``
vs
listing=$(ls -l $(cat filenames.txt))
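A runnable comparison of the $() form, with throwaway files (names invented for the demo; mktemp paths contain no spaces, so the unquoted substitution is safe here):

```shell
tmp=$(mktemp -d)
printf 'hello\n' > "$tmp/data.txt"
printf '%s\n' "$tmp/data.txt" > "$tmp/filenames.txt"

# Nested command substitution, no escaping needed.
listing=$(ls -l $(cat "$tmp/filenames.txt"))
printf '%s\n' "$listing"
```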