Basically, I have a file (say.sh) which uses an API to save and play text-to-speech MP3 files. The API uses the URL: http://api.voicerss.org/?key=keygoeshere&src=TEXT_GOES_HERE&hl=en-gb&c=mp3&r=1&f=32khz_8bit_stereo
The script (below) uses wget to get the file. In theory this code should work, but it doesn't. If I echo the wget command, it returns a working command that runs successfully, but as soon as I remove the echo, it gets confused by the quotation marks and stops working.
#!/bin/bash
TA="http://api.voicerss.org/?key=MY_KEY_IS_HERE&src="
TB="&hl=en-gb&c=mp3&r=1&f=32khz_8bit_stereo"
wget -O example.mp3 \"$TA$@$TB\"
omxplayer example.mp3
If anybody here knows how to fix this, it would be very helpful. Thanks!
EDIT: To run the command I have tried sh say.sh Text here and sh say.sh "Text here". Neither of them works.
I don't know that you need to escape any quotes for the wget command, but I think if variables are appended next to each other, you can try using {}:
#!/bin/bash
TA="http://api.voicerss.org/?key=MY_KEY_IS_HERE&src="
TB="&hl=en-gb&c=mp3&r=1&f=32khz_8bit_stereo"
wget -O example.mp3 "${TA}${TB}"
omxplayer example.mp3
give that a try and let us know how it goes.
Thanks for the responses! I managed to fix this by adding the "${TA}${@}${TB}" and using \" to escape the quotation marks, to get:
#!/bin/bash
TA="http://api.voicerss.org/?key=keygoeshere&src=\""
TB="\"&hl=en-gb&c=mp3&r=1&f=32khz_8bit_stereo"
wget -O example.mp3 "${TA}${@}${TB}"
omxplayer example.mp3
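For what it's worth, a slightly different way to write the same script (just a sketch, assuming the text is passed as script arguments and that only spaces need encoding) is to join the arguments with "$*" and build one URL, so wget gets exactly one quoted argument:
#!/bin/bash
KEY="MY_KEY_IS_HERE"
TEXT="$*"                  # join every argument into a single string
TEXT="${TEXT// /%20}"      # very crude URL encoding: spaces only
URL="http://api.voicerss.org/?key=${KEY}&src=${TEXT}&hl=en-gb&c=mp3&r=1&f=32khz_8bit_stereo"
wget -O example.mp3 "$URL"
omxplayer example.mp3
Called as bash say.sh Hello there or bash say.sh "Hello there", the whole text ends up in the src= parameter.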
Related
I am executing the following command:
wget -O -https://raw.githubusercontent.com/drud/ddev/master/scripts/windows_ddev_nfs_setup.sh | bash
And I get the following:
wget: missing URL
Usage: wget [OPTION]... [URL]...
Looks as if I have the syntax wrong, but I have been searching to find out how to fix this and haven't found an answer.
I checked the command on shellcheck.net and learned the following:
SC2148: Tips depend on target shell and yours is unknown. Add a shebang.
That sounds great, but I don't (yet) know what it means.
You are using -O, which means that the next argument is the output file. Hence,
-https://raw.githubusercontent.com/drud/ddev/master/scripts/windows_ddev_nfs_setup.sh
is the name of your output file, and there is no URL in your command.
Since you are piping the result of wget into bash, you don't need an output file, I think.
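Something like this should work; the only real change is the space after -O - (the -q just keeps wget's progress output out of the pipe):
wget -qO - https://raw.githubusercontent.com/drud/ddev/master/scripts/windows_ddev_nfs_setup.sh | bash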
First of all, I'm quite new to bash scripting and I'm just starting to learn. Evidently there's something wrong with this script, but I don't know what it is...
I created a bash script to automate downloading videos with youtube-dl:
#!/bin/bash
echo url:
read url
export url
youtube-dl -f 'bestvideo[height<=360]+worstaudio/worst[height<=360]' $url
The idea is that I type the name of the script in the command line, e.g. "360", it asks for a URL (e.g. a YouTube video), I paste it, and youtube-dl downloads it with the stated parameters. It works like a charm...
Now, I want to make the script more complex and I think I need to convert the youtube-dl command to a variable (of course, being a newbie, I might be wrong, but let's assume I'm right for a moment...)
#!/bin/bash
video="youtube-dl -f 'bestvideo[height<=360]+worstaudio/worst[height<=360]'"
echo url:
read url
export url
$video $url
When I try this, it throws an error: "ERROR: requested format not available"
I don't know what's wrong... I'd like to solve the problem with as few changes to the code as possible, and I repeat, I'd like to know what's wrong with the current code so I can learn from it.
Thank you very much in advance!
It's explained in detail here: I'm trying to put a command in a variable, but the complex cases always fail!
First, always double-quote your variables, unless you know exactly what will happen if you don't.
You don't need to export that variable: you're not invoking any other program that needs to use it.
When you want to re-use a command, think about putting it in a function:
#!/bin/bash
function video {
    youtube-dl -f 'bestvideo[height<=360]+worstaudio/worst[height<=360]' "$1"
}
read -p "url: " url
video "$url"
Actually, I would do this:
add that function to your ~/.bashrc,
source that file: source ~/.bashrc
then you can use it from the command line:
video 'https://www.youtube.com/watch?v=dQw4w9WgXcQ'
Remove the single quotes from the -f parameter and it will work.
For example:
video="youtube-dl -f bestvideo[height<=360]+worstaudio/worst[height<=360]"
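To see why the original version fails: the single quotes stored inside the variable are ordinary characters by the time the variable is expanded, so youtube-dl receives the format string with literal quotes around it and rejects it. If you really want the command in a variable rather than a function, the FAQ linked above recommends a bash array; a minimal sketch:
#!/bin/bash
# Each array element stays one argument, so the format string survives intact.
video=(youtube-dl -f 'bestvideo[height<=360]+worstaudio/worst[height<=360]')
read -p "url: " url
"${video[@]}" "$url"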
I've started playing around with curl a few days ago. For some reason I couldn't figure out how to achieve the following.
I would like to get the original filename with the output option
-O -J
AND append some kind of variable to it, like a time stamp, source path or whatever. This would avoid the file-overwriting issue and also make further work with the file easier.
Here are a few specs about my setup
Win7 x64
curl 7.37.0
Admin user
just the command line, no PHP or scripts or anything like that
C:>curl --retry 1 --cert c:\certificate.cer --URL https://blabla.com/pdf-file --user username:password --cookie-jar cookie.txt -v -O -J
I've played around with various things i found online like
-o %(file %H:%s)
-O -J/%date%
-o $(%H) bla#1.pdf
but it always just prints out the file named literally, like "%(file.pdf" or some other mangled name. I guess this points to escaping and quoting issues, but I can't find it right now.
No scripting solutions please, I need this command in a single line for Selenium automation.
Preferred output
originalfilename_date_time_source.pdf
Let me know if you get a solution for this.
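One possible workaround, with the caveat that I don't believe curl 7.37 can template the -O -J filename by itself: since cmd expands %date% and %time% before curl ever sees the command line, you can drop -O -J and build a timestamped name with -o instead, still on a single line. The substring offsets below assume a MM/DD/YYYY date format, so adjust them for your locale, and note that this loses the server-supplied original filename:
curl --retry 1 --cert c:\certificate.cer --user username:password --cookie-jar cookie.txt -o "pdf-file_%date:~-4%-%date:~-7,2%-%date:~-10,2%_%time:~0,2%%time:~3,2%%time:~6,2%.pdf" https://blabla.com/pdf-file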
I'm working with a function to parse a file that has a list of desired file names to download. I'm using curl to download them, but is there a better way? The output is shown, which is okay, but is there a way for the output not to be shown? Is there a way to handle exceptions if a file isn't found and move on to the next file to download if something happens? You might want to ignore what I do to get the proper link name; it was a pain. The directory layout follows a pattern based on the file name.
#!/bin/bash
# reads from the file and assigns to $MYARRAY and download to Downloads/*
FILENAME=$1
DLPATH=$2
VARIABLEDNA="DNA"
index=0
function Download {
    VARL=$1
    #VARL=$i
    echo $VARL
    VAR=${VARL,,}              # lowercase the PDB id
    echo $VAR
    VAR2=${VAR:1:2}            # characters 2-3 of the id name the "divided" subdirectory
    echo $VAR2
    HOST=ftp://ftp.wwpdb.org/pub/pdb/data/structures/divided/pdb/
    HOSTCAT=$HOST$VAR2
    FILECATB='/pdb'
    FILECATE='.ent.gz'
    NOSLASH='pdb'
    DLADDR=$HOSTCAT$FILECATB$VAR$FILECATE
    FILECATNAME=$NOSLASH$VAR$FILECATE
    echo $DLADDR
    curl -o Downloads/$FILECATNAME $DLADDR
    gunzip Downloads/$FILECATNAME
}
mkdir -p Downloads
while read line ; do
    MYARRAY[$index]="$line"
    index=$(($index+1))
done < $FILENAME
echo "MYARRAY is: ${MYARRAY[*]}"
echo "Total pdbs in the file: ${index}"
for i in "${MYARRAY[@]}"
do
    Download $i
done
I'm trying to write the log file to a folder that I made before the downloading, but it doesn't seem to end up in that folder. It writes to the directory of the file being executed, and it doesn't write it correctly either. My syntax might be wrong?
curl -o Downloads/$FILECATNAME $DLADDR >> Downloads\LOGS\$LOGFILE 2>&1
Okay, first of all, I'm not sure if I got it all right, but I'll give it a try:
I'm using curl to download them but is there a better way?
I don't know a better one. You could use wget instead of curl, but curl is much more powerful.
The output is shown which is okay but is there way for the output not be shown?
You could use nohup (e.g. nohup curl -o Downloads/$FILECATNAME $DLADDR). If you don't redirect the output to a specific file, it will be stored in nohup.out. By adding an ampersand (&) at the end of your command you can also let it run in the background, so the command keeps executing even if you lose the connection to the server.
Is there way to handle exceptions if the file isn't found and move on to the next file to be download if something happens?
You could use something like test -e to check whether the file exists. Or you could just check your nohup.out for errors or anything else with a grep.
I hope this helped you in any way.
cheers
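One more option, in case plain curl flags already cover the last two questions: -sS hides the progress meter but still prints real errors, and --fail makes curl exit non-zero on HTTP errors such as a missing file, so the loop can simply skip that entry. A hedged sketch of just the download part of the Download function:
# -sS: silent, but keep error messages; --fail: non-zero exit status on HTTP errors
if curl -sS --fail -o "Downloads/$FILECATNAME" "$DLADDR"; then
    gunzip "Downloads/$FILECATNAME"
else
    echo "skipping $VARL: download failed" >&2
fi
As for the follow-up about the log file: on Linux the path separator is a forward slash and the directory has to exist first, so something like mkdir -p Downloads/LOGS followed by curl ... >> "Downloads/LOGS/$LOGFILE" 2>&1 should put the log where you expect.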
I'm trying to execute this command in a script:
axel "$1"
Where "$1" is a URL sent to this command in a script by the Firefox plugin FlashGot. However, the URL is long and it keeps getting cut off short. The only way to overcome this is to enclose the URL in single or double quotes, e.g. "http://....".
Thanks, in advance.
EDIT:
Ok, so an example of the URL is http://audio-sjl-t1-2.pandora.com/access/Letting%20Go%20-%20Isaac%20Shepard%2Emp4a?version=4&lid=290476474&token=Z6TTYtio6FYbhzesbxzPyWA%2F%2Bfa2uT5atbV8L0QF%2FMubHshmLJ1hgkN6B8SMZe74V8Q1feGMNmkmyTJO343qYkQ3aklQVKo4mDE2VVl1nkYk05gu0%2BBfP3WtxTCrn8r0gz0wwDgMfzQd68fBcmOTKtB%2FjR2kqVs9ZY7tZQUuabjGcP84ws%2BuIsuTqkKkHyrWaaLkGhk71GoPng2IMrm0L%2B6MeyHu6bvWn%2FoqNhXNerpFLpRZqXZ8JrX9uKVkDmkeQxUVV5%2F8y8uv2yYpG3P5tx1mfAY6U7ZteDLCfCT4JQWzlZscpl7GmtW4gf64KBExGA98xucIp%2Bt1x%2Bjru2Jt%2F7PVeeKWGv2en0%2Fetf1CQWjVUbDoWy4q9cEnYOc7rkpX
Well, it keeps cutting it off at
http://audio-sjl-t1-2.pandora.com/access/Letting%20Go%20-%20Isaac%20Shepard%2Emp4a?version=4
and that is all that gets sent to axel.
I added an echo command in the script:
#!/bin/bash
cd /home/caleb/Desktop
echo "$1"
axel "$1"
I can see the debug of a script by sending a URL to it through the terminal:
./axel.sh <URL>
The only error message I see is because of the shortened URL.
Here's the output of the script above:
http://audio-sjl-t1-2.pandora.com/access/Letting%20Go%20-%20Isaac%20Shepard%2Emp4a?version=4
Initializing download: http://audio-sjl-t1-2.pandora.com/access/Letting%20Go%20-%20Isaac%20Shepard%2Emp4a?version=4
HTTP/1.1 400 Bad Request
axel "$1" should work and I'm not surprised axel ""$1"" doesn't work because that's equivalent to axel $1.
To debug this we'll need an error message or something, because saying "it doesn't work" doesn't help at all.
You say the script is called from Firefox. I'm not sure if you can easily see the error message, maybe you can't. I have an idea for that. Let's call your script script.sh. Create a wrapper script script-wrapper.sh like this:
#!/bin/bash
log=/tmp/script.log
: > "$log"                      # start with an empty log file each run
for arg; do
    echo "arg='$arg'" | tee -a "$log"
done
/path/to/script.sh "$@" >>"$log" 2>&1
Make this script executable, trigger it from Firefox, and then look at the log, which will include both the output and error output of your original script. If you still can't figure out what is wrong, then edit your question, and paste in the content of /tmp/script.log so we can debug.
UPDATE
Based on your update, it looks like the script does not receive the URL correctly. In particular, it looks like the URL is not quoted properly when you pass it to the script. It's not surprising that the cut-off happens right in front of a & character, as that means something to the shell. You should call your script like this:
./axel.sh "http://....?version=4&lid=..."
But this is not happening; it looks like it's getting called without the double quotes, which results in the behavior you're observing.
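A stripped-down example (hypothetical example.com URL, just for illustration) shows the difference:
# Unquoted: the shell treats & as "run the left part in the background",
# so the script only ever sees the URL up to version=4.
./axel.sh http://example.com/track.m4a?version=4&lid=290476474
# Quoted: the whole URL reaches the script as a single $1.
./axel.sh 'http://example.com/track.m4a?version=4&lid=290476474'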
Just using
#!/bin/sh
axel "$1"
will work. If it doesn't, you're going to need to give a lot more information...