Permission denied when running shell script using 'wkhtmltopdf' tool - bash

Hi!
So, I am using the wkhtmltopdf tool in a script on my Ubuntu terminal...
#!/bin/bash
#search for every '*.html' extension and store it in 'output.txt' file
find /home/guidine/09 -name '*.html' > output.txt
#read 'output.txt' file executing the command 'wkhtmltopdf' syntax for each line
for word in $(cat /home/guidine/bin/output.txt)
do
    $wkhtmltopfd $word $word".pdf"
done
This returns the 'Permission denied' message for every line (each one a path) in 'output.txt' (example attached). Can anyone help me?
Screenshot of output
I've already run chmod +x wkhtmltopdf.sh.
I've found an alternative that satisfies me, so I'm closing the question.
find <directory> -name '*.html' -exec wkhtmltopdf {} {}.pdf \;
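For what it's worth, the 'Permission denied' most likely comes from the loop body itself: $wkhtmltopfd is misspelled and never set, so it expands to nothing and the shell tries to execute each HTML file directly. A minimal corrected sketch of the loop approach, assuming wkhtmltopdf is on the PATH and reusing the paths from the question:
#!/bin/bash
# collect every '*.html' under the same directory as in the question
find /home/guidine/09 -name '*.html' > /home/guidine/bin/output.txt
# read the list line by line so paths containing spaces stay intact
while IFS= read -r html; do
    # call the tool by name instead of through an undefined variable
    wkhtmltopdf "$html" "$html.pdf"
done < /home/guidine/bin/output.txt
The while/read form also copes with paths that contain spaces, which the for-over-$(cat ...) version would split into separate words.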

Related

find and rename files gives bash 'no such file or directory' error

I'm trying to strip the numbers and - from the file name.
Below are the files.
ui-service-3.100.503505.json
kibana-store-end-3.103.103505.json
api-application-3.4003.10350665.json
find . -type f -iname "*.json" -exec rename 's/[0-9]//g' {} \;
throws find: ‘rename’: No such file or directory
Tried multiple other combinations but same error.
I'm expecting the output below; where am I going wrong?
ui-service.json
kibana-store-end.json
api-application.json
You probably don't have the rename command on your system. Try this instead:
for file in ./*.json; do
    echo mv "$file" "${file%-*}.json"
done
Drop the echo if the output looks fine.
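If the JSON files also sit in subdirectories, a hedged variant of the same idea can be driven by find; the ${f%-*} expansion simply trims everything from the last '-' onward, so this assumes that last '-' is the one in front of the version number:
#!/bin/bash
# recurse with find and rename inside a small bash -c helper;
# the "_" placeholder fills $0 and the matched paths become "$@"
find . -type f -iname '*.json' -exec bash -c '
    for f; do
        mv -- "$f" "${f%-*}.json"
    done
' _ {} +
As with the loop above, put an echo in front of mv first to preview the renames.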

Execute shell command with multiple options on all files in a directory

I'm trying to do something similar to this post: Execute command on all files in a directory
I have a bunch of test files in a directory, and I need to run each test using a specific run command with multiple options.
I tried using a for loop in a shell script, but I get the following errors:
for: Command not found.
do: Command not found.
f: Undefined variable.
Script:
#!/bin/csh
set FILES = ../../test_files/*.out
for f in $FILES; do
    runcmd -option1 -option2 -option3 -option4 "$f" >> results.log
done
I also tried using find with -exec, but I get the following error:
find: missing argument to `-exec'
I'm under the impression that I'm getting this error since I have multiple options in the command I'm trying to run.
Script:
#!/bin/csh
find ../../test_files -name "*.out" -exec run_cmd -option1 -option2 -option3 -option4 -- echo {} \ > results.log
I ran chmod 777 on my script before trying to run it. I also tried running this same script in bash (using #!/bin/bash in my .sh script and entering bash in the terminal), but I get the same error when I use find -exec, and the script does nothing when I use the for loop.
Could someone please help me figure out why my scripts aren't working, or show me another way to execute the run command on all the test files in a directory?
Thanks a lot @Gordon Davisson, I was able to fix my find command, and the script works now.
#!/bin/csh
find ../../test_files -name "*.out" -exec run_cmd -option1 -option2 -option3 -option4 -- echo {} \; > results.log
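For the record, the "for: Command not found" errors come from using bash's for/do/done syntax under #!/bin/csh; csh has its own foreach ... end loop. If the script is switched to bash instead, a sketch of the loop variant might look like this (runcmd and its options are just the placeholders from the question):
#!/bin/bash
# glob the test files directly; quoting "$f" keeps names with spaces intact
for f in ../../test_files/*.out; do
    runcmd -option1 -option2 -option3 -option4 "$f" >> results.log
done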

use file command instead of -name

I want to write a shell script that searches all .txt files for the word cat and replaces it with mouse. I wrote the following code:
#!/bin/bash
read directory
for F in ` find $directory -name '*.txt' -type f`
do
    echo $F
    `sed -i "s/\<cat\>/mouse/g" $F`
done
I am supposed to use the "file" command. I searched for it, and it seems like the file command finds all the files of a certain type. I want to know how I can include that command in my script.
Assuming you are in the directory where all the *.txt files are, you can execute the following command:
find . -name '*.txt' -exec sed -i "s/\<cat\>/mouse/g" "{}" \;
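To actually work the file command into the script, one hedged possibility is to let file(1) decide which files are text instead of matching on the .txt extension; this sketch keeps the directory read from standard input as in the question, and the grep for 'text' is a deliberately loose match:
#!/bin/bash
# read the directory to search, as the original script does
read -r directory
# walk the tree null-delimited so unusual file names survive,
# and edit only the entries that file(1) reports as text
find "$directory" -type f -print0 | while IFS= read -r -d '' f; do
    if file -b "$f" | grep -q 'text'; then
        sed -i 's/\<cat\>/mouse/g' "$f"
    fi
done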

Finding files from list and copying them into new directory

I want to find files via the command line terminal for OS X and then copy the files that were found to a new directory. The filenames of the files I want to find are in a listexample.txt file and there are about 5000 file names in there.
The file listexample.txt looks like this:
1111 00001 55553.bmp
1113 11312 24125.bmp
…
I tried things like this:
find /directory -type f "`cat listexample.txt`" -exec cp {} …
but can't get it to run.
I now have this, but it doesn't work:
cat listexample.txt | while read line; do grep "$line" listexample.txt -exec find /directorya "$line" -exec cp {} /directoryb \; done
The idea is to read the lines of listexample.txt, take each line with grep, find the file in directory a, and then copy the found file over to a new directory b. I think that, because of the nature of my file names (see above), there is also a problem with the spaces in the names.
I also started this approach to see what is going on, but didn't get far.
for line in `cat listexample.txt`; do grep $line -exec echo "Processing $line"; done
Here's the solution, a find-and-copy script (fcopy.sh), in case somebody has a similar problem:
First, make the script executable with: chmod +x fcopy.sh
Then run it with: ./fcopy.sh listexample.txt
Script content:
#!/bin/bash
target="/directory with images"
while IFS= read -r line
do
    name=$line
    echo "Text read from file - $name"
    find "${target}" -name "$name" -exec cp {} /found_files \;
done < "$1"
Cheers
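With roughly 5000 names in the list, running a full find per name means 5000 passes over the tree. A hedged alternative walks directory a once and checks each basename against the list; /directorya, /directoryb and listexample.txt are the names used in the question, and the list is assumed to hold one filename per line:
#!/bin/bash
# one pass over directory a; copy every file whose basename appears in the list
find /directorya -type f -print0 | while IFS= read -r -d '' path; do
    if grep -qxF "$(basename "$path")" listexample.txt; then
        cp "$path" /directoryb/
    fi
done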

how to use vim to open every .txt file under a directory (with Bash)

I am trying the following to use vim to open every .txt file under the current directory.
find . -name "*.txt" -print | while read aline; do
read -p "start spellchecking fine: $aline" sth
vim $aline
done
Running it in bash complains with
Vim: Warning: Input is not from a terminal
Vim: Error reading input, exiting...
Vim: Finished.
Can anyone explain what could possibly be going wrong? Also, I intended to use read -p to prompt before running vim, but without success.
Try:
vim $( find . -name "*.txt" )
To fix your solution, you can (probably) do:
find . -name "*.txt" -print | while read aline; do
read -p "start spellchecking fine: $aline" sth < /dev/tty
vim $aline < /dev/tty
done
The problem is that the entire while loop is taking its input from find, and vim inherits that pipe as its stdin. This is one technique for getting vim's input to come from your terminal. (Not all systems support /dev/tty, though.)
With shopt -s globstar you can drop find entirely, so vim is no longer executed inside a loop whose standard input comes from find:
shopt -s globstar
shopt -s failglob
for file in **/*.txt ; do
read -p "Start spellchecking fine: $file" sth
vim "$file"
done
Another idea is using
for file in $(find . -name "*.txt") ; do
(assuming there are no filenames with spaces or newlines).
Often the simplest solution is the best, and I believe this is it:
vim -o `find . -name \*.txt -type f`
The -type f ensures that only regular files ending in .txt are opened, since you can't rule out subdirectories whose names also end in ".txt".
This will open each file in a separate window/buffer in vim. If you don't require this and are happy using :next and :prev to navigate through the files, remove "-o" from the suggested command line above.
The proper way to open all the files in a single vim instance is (provided the number of files doesn't exceed the maximum number of arguments):
find . -name '*.txt' -type f -exec vim {} +
Another possibility that fully answers the OP, with the benefit of being safe with file names containing spaces or other unusual characters:
find . -name '*.txt' -type f -exec bash -c 'read -p "start spellchecking $0"; vim "$0"' {} \;
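Combining the pieces above into one hedged sketch that keeps the read -p prompt, stays safe with odd file names, and reattaches both read and vim to the terminal:
#!/bin/bash
# find the files null-delimited, prompt from the terminal,
# then run vim with its input coming from the terminal (the /dev/tty trick)
find . -type f -name '*.txt' -print0 | while IFS= read -r -d '' f; do
    read -r -p "Start spellchecking: $f " < /dev/tty
    vim "$f" < /dev/tty
done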
