I have been tasked with replacing ISQL with sqlcmd in a lot of our bash scripts. ISQL allows piping a variable into its execution.
An example would be:
SQL_STATEMENT="SELECT TOP 1 SYS_USER_NAME FROM SYS_USER"
echo $SQL_STATEMENT | isql -b -d, $DSN $DBUID $DBPWD >> setupdb_test.txt
From what I can tell this is not viable in sqlcmd. How can I do this? What flags does sqlcmd have to allow this to happen?
Here is what I have tried, and it gives a good result, BUT I really do not want to create the file sql_command.sql every time a particular script runs:
echo "SELECT TOP 1 SYS_USER_NAME FROM SYS_USER" > sql_command.sql
sqlcmd -S $DB -U $DBUID -P $DBPWD -d $DSN -i sql_command.sql >> setupdb_test.txt
Programs originating on Windows can be picky about how they handle non-regular files and I don't have the opportunity to test, but you can try the typical Unix tricks for providing a "file" with data from an echo.
Either /dev/stdin:
echo "SELECT TOP 1 SYS_USER_NAME FROM SYS_USER" | sqlcmd -S "$DB" -U "$DBUID" -P "$DBPWD" -d "$DSN" -i /dev/stdin
or process substitution:
sqlcmd -S "$DB" -U "$DBUID" -P "$DBPWD" -d "$DSN" -i <(echo "SELECT TOP 1 SYS_USER_NAME FROM SYS_USER")
Say I have a file at the URL http://mywebsite.example/myscript.txt that contains a script:
#!/bin/bash
echo "Hello, world!"
read -p "What is your name? " name
echo "Hello, ${name}!"
And I'd like to run this script without first saving it to a file. How do I do this?
Now, I've seen the syntax:
bash < <(curl -s http://mywebsite.example/myscript.txt)
But this doesn't seem to work like it would if I saved to a file and then executed. For example readline doesn't work, and the output is just:
$ bash < <(curl -s http://mywebsite.example/myscript.txt)
Hello, world!
Similarly, I've tried:
curl -s http://mywebsite.example/myscript.txt | bash -s --
With the same results.
Originally I had a solution like:
timestamp=`date +%Y%m%d%H%M%S`
curl -s http://mywebsite.example/myscript.txt -o /tmp/.myscript.${timestamp}.tmp
bash /tmp/.myscript.${timestamp}.tmp
rm -f /tmp/.myscript.${timestamp}.tmp
But this seems sloppy, and I'd like a more elegant solution.
I'm aware of the security issues regarding running a shell script from a URL, but let's ignore all of that for right now.
source <(curl -s http://mywebsite.example/myscript.txt)
ought to do it. Alternately, leave off the initial redirection on yours, which is redirecting standard input; bash takes a filename to execute just fine without redirection, and <(command) syntax provides a path.
bash <(curl -s http://mywebsite.example/myscript.txt)
It may be clearer if you look at the output of echo <(cat /dev/null)
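For example, on a typical Linux system the process substitution expands to a path under /dev/fd (the exact descriptor number will vary):
$ echo <(cat /dev/null)
/dev/fd/63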
This is the way to execute a remote script, passing some arguments to it (arg1 arg2):
curl -s http://server/path/script.sh | bash /dev/stdin arg1 arg2
For bash, Bourne shell and fish:
curl -s http://server/path/script.sh | bash -s arg1 arg2
Flag "-s" makes shell read from stdin.
Use:
curl -s -L URL_TO_SCRIPT_HERE | bash
For example:
curl -s -L http://bitly/10hA8iC | bash
Using wget, which is usually part of a default system installation:
bash <(wget -qO- http://mywebsite.example/myscript.txt)
You can also do this:
wget -O - https://raw.github.com/luismartingil/commands/master/101_remote2local_wireshark.sh | bash
The best way to do it is
curl http://domain/path/to/script.sh | bash -s arg1 arg2
which is a slight change to the answer by @user77115
You can use curl and send it to bash like this:
bash <(curl -s http://mywebsite.example/myscript.txt)
I often use the following, and it is enough:
curl -s http://mywebsite.example/myscript.txt | sh
But on an old system (kernel 2.4) it ran into problems; the following solved it. I tried many other approaches, and only this one works:
curl -s http://mywebsite.example/myscript.txt -o a.sh && sh a.sh && rm -f a.sh
Examples
$ curl -s someurl | sh
Starting to insert crontab
sh: _name}.sh: command not found
sh: line 208: syntax error near unexpected token `then'
sh: line 208: ` -eq 0 ]]; then'
$
The problem may be caused by a slow network, or by a bash version too old to handle a slow network gracefully.
However, the following solves the problem
$ curl -s someurl -o a.sh && sh a.sh && rm -f a.sh
Starting to insert crontab
Insert crontab entry is ok.
Insert crontab is done.
okay
$
Also:
curl -sL https://.... | sudo bash -
Just combining amra and user77115's answers:
wget -qO- https://raw.githubusercontent.com/lingtalfi/TheScientist/master/_bb_autoload/bbstart.sh | bash -s -- -v -v
It executes the remote bbstart.sh script, passing it the -v -v options.
In some unattended scripts I use the following command:
sh -c "$(curl -fsSL <URL>)"
I recommend avoiding executing scripts directly from URLs. You should be sure the URL is safe and check the content of the script before executing it; you can use a SHA256 checksum to validate the file before executing.
Instead of executing the script directly, first download it and then execute it:
SOURCE='https://gist.githubusercontent.com/cci-emciftci/123123/raw/123123/sample.sh'
curl $SOURCE -o ./my_sample.sh
chmod +x my_sample.sh
./my_sample.sh
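A minimal sketch of the checksum check mentioned above; the expected value below is a placeholder you would replace with the known-good hash published for the script:
EXPECTED_SHA256="replace-with-known-good-sha256"   # placeholder, not a real hash
curl $SOURCE -o ./my_sample.sh
# verify before running; abort on mismatch
echo "$EXPECTED_SHA256  ./my_sample.sh" | sha256sum -c - || exit 1
chmod +x my_sample.sh
./my_sample.sh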
This approach is clean and conventional:
17:04:59#itqx|~
qx>source <(curl -Ls http://192.168.80.154/cent74/just4Test) Lord Jesus Loves YOU
Remote script test...
Param size: 4
---------
17:19:31#node7|/var/www/html/cent74
arch>cat just4Test
echo Remote script test...
echo Param size: $#
If you want the script run using the current shell, regardless of what it is, use:
${SHELL:-sh} -c "$(wget -qO - http://mywebsite.example/myscript.txt)"
if you have wget, or:
${SHELL:-sh} -c "$(curl -Ls http://mywebsite.example/myscript.txt)"
if you have curl.
This command will still work if the script is interactive, i.e., it asks the user for input.
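The interactive case works because the whole script is passed as the argument to -c, so standard input is still the terminal and read can prompt normally. With the myscript.txt from the question, a session would look roughly like this:
$ bash -c "$(curl -s http://mywebsite.example/myscript.txt)"
Hello, world!
What is your name? Alice
Hello, Alice!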
Note: OpenWRT has a wget clone but not curl, by default.
curl http://your.url.here/script.txt | bash
actual example:
juan@juan-MS-7808:~$ curl https://raw.githubusercontent.com/JPHACKER2k18/markwe/master/testapp.sh | bash
Oh, wow im alive
juan@juan-MS-7808:~$
I'm trying to run a script for pulling finance history from yahoo. Boris's answer from this thread
wget can't download yahoo finance data any more
works for me ~2 out of 3 times, but fails if the crumb returned from the cookie has a "\" character in it.
Code that sometimes works looks like this
#!/usr/bin/sh
symbol=$1
today=$(date +%Y%m%d)
tomorrow=$(date --date='1 days' +%Y%m%d)
first_date=$(date -d "$2" '+%s')
last_date=$(date -d "$today" '+%s')
wget --no-check-certificate --save-cookies=cookie.txt https://finance.yahoo.com/quote/$symbol/?p=$symbol -O C:/trip/stocks/stocknamelist/crumb.store
crumb=$(grep 'root.*App' crumb.store | sed 's/,/\n/g' | grep CrumbStore | sed 's/"CrumbStore":{"crumb":"\(.*\)"}/\1/')
echo $crumb
fileloc=$"https://query1.finance.yahoo.com/v7/finance/download/$symbol?period1=$first_date&period2=$last_date&interval=1d&events=history&crumb=$crumb"
echo $fileloc
wget --no-check-certificate --load-cookies=cookie.txt $fileloc -O c:/trip/stocks/temphistory/hs$symbol.csv
rm cookie.txt crumb.store
But that doesn't get processed by wget the way I intend either; wget seems to interpret it as described here:
https://askubuntu.com/questions/758080/getting-scheme-missing-error-with-wget
Any suggestions on how to pass the $crumb variable into wget so that wget doesn't error out if $crumb has a "\" character in it?
Edited to show the full script. To clarify: I've got Cygwin installed with the wget package. I call the script from the cmd prompt as follows (in this example the script above is named "stocknamedownload.sh", the stock symbol I'm downloading is "A", and the start date is 19800101):
c:\trip\stocks\StockNameList>bash stocknamedownload.sh A 19800101
This script seems to work fine - unless the crumb returned contains a "\" character in it.
The following implementation appears to work 100% of the time -- I'm unable to reproduce the claimed sporadic failures:
#!/usr/bin/env bash
set -o pipefail
symbol=$1
today=$(date +%Y%m%d)
tomorrow=$(date --date='1 days' +%Y%m%d)
first_date=$(date -d "$2" '+%s')
last_date=$(date -d "$today" '+%s')
# store complete webpage text in a variable
page_text=$(curl --fail --cookie-jar cookies \
"https://finance.yahoo.com/quote/$symbol/?p=$symbol") || exit
# extract the JSON used by JavaScript in the page
app_json=$(grep -e 'root.App.main = ' <<<"$page_text" \
| sed -e 's#^root.App.main = ##' \
-e 's#[;]$##') || exit
# use jq to extract the crumb from that JSON
crumb=$(jq -r \
'.context.dispatcher.stores.CrumbStore.crumb' \
<<<"$app_json" | tr -d '\r') || exit
# Perform our actual download
fileloc="https://query1.finance.yahoo.com/v7/finance/download/$symbol?period1=$first_date&period2=$last_date&interval=1d&events=history&crumb=$crumb"
curl --fail --cookie cookies "$fileloc" >"hs$symbol.csv"
Note that the tr -d '\r' is only necessary when using a native-Windows jq mixed with an otherwise native-Cygwin set of tools.
You are adding quotes to the value of the variable instead of quoting the expansion. You are also trying to use tools that don't know what JSON is to process JSON; use jq.
wget --no-check-certificate \
--save-cookies=cookie.txt \
"https://finance.yahoo.com/quote/$symbol/?p=$symbol" \
-O C:/trip/stocks/stocknamelist/crumb.store
# Something like this; it's hard to reverse engineer the structure
# of crumb.store from your pipeline.
crumb=$(jq -r '.CrumbStore.crumb' crumb.store)
echo "$crumb"
fileloc="https://query1.finance.yahoo.com/v7/finance/download/$symbol?period1=$first_date&period2=$last_date&interval=1d&events=history&crumb=$crumb"
echo "$fileloc"
wget --no-check-certificate \
    --load-cookies=cookie.txt "$fileloc" \
    -O "c:/trip/stocks/temphistory/hs$symbol.csv"
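If the "quoting the expansion" wording is unclear, here is a tiny made-up illustration of the difference (nothing to do with Yahoo specifically):
msg=$"has   spaces"        # $"..." is locale-translation quoting; it does not protect later uses
printf '<%s>\n' $msg       # unquoted expansion: word-splits into <has> and <spaces>
printf '<%s>\n' "$msg"     # quoted expansion: one word, spacing preserved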
I'm getting a 7: Syntax error: "(" unexpected error while running the below code on Ubuntu, but it runs on CentOS without any issues.
#!/bin/sh
#
TODATE=`date '+%Y-%b-%d'`
#
# Backup Creation for Databases
#
databases=(`echo 'show databases;' | mysql -u root -ppaSSword | grep -v ^Database$`)
for DB in "${databases[#]}"; do
mysqldump --force --opt --user=root --password=paSSword $DB | gzip > /mnt/Backup/DB/${DB}_${TODATE}.sql.gz
done
#
Please help me to solve this.
I can't figure out the problem, but I'm using the below code for backup and it's working fine on Ubuntu:
#!/bin/bash
#
TODATE=`date '+%Y-%b-%d'`
databases="$(mysql -u root -ppaSSword -Bse 'show databases')"
for DB in $databases
do
mysqldump -u root -psqlMYadmin $DB | gzip > /mnt/Backup/DB/${DB}_${TODATE}.sql.gz
done
You can redirect the 'show databases' output to a dump.txt file; once that's done, try:
#!/bin/bash
da=$(date +"%d-%m-%y")
for db in `cat dump.txt` ; do mysqldump --force --opt --user=root --password=paSSword $db | gzip > /path/to/backup/"${db}_${da}".sql.gz ; done
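For completeness, dump.txt can be produced with the same -Bse trick used in the earlier answer (assuming the same root credentials):
# one database name per line, no header row
mysql -u root -ppaSSword -Bse 'show databases' > dump.txt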
You need to escape the last '$' on the line databases= ...
There's only one ( in the script and you have the shebang line #!/bin/sh. My best guess is that the program /bin/sh does not recognize array assignment, whereas /bin/bash would.
Change your shebang to #!/bin/bash.
You'd probably do better to use $(...) in place of the backticks. Also, as Sami Laine points out in his answer, it would be better if you quoted the regex to the grep command (though it is not the cause of your problem):
databases=( $(echo 'show databases;' | mysql -u root -ppaSSword | grep -v '^Database$') )
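You can reproduce the difference directly: on Ubuntu /bin/sh is dash, which does not understand array assignment, while bash does. Roughly:
$ sh -c 'databases=(one two three)'
sh: 1: Syntax error: "(" unexpected
$ bash -c 'databases=(one two three); echo "${databases[@]}"'
one two three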
I have a bash script and am trying to run a command inside it.
That's ok
echo ${something:="zip -r -q $TAG -P $PASS $LOCPATH"}
>zip -r -q evolution -P evolution ~/.gconf/apps/evolution
That's ok too
zip -r -q evolution -P evolution ~/.gconf/apps/evolution
But here, when the values are passed in via the variables, the order appears to change and a strange ". -i" shows up:
zip -r -q $TAG -P $PASS $LOCPATH
>zip error: Nothing to do! (try: zip -r -q -P evolution evolution . -i ~/.gconf/apps/evolution
Thanks for any advice.
BASH FAQ entry #50: "I'm trying to put a command in a variable, but the complex cases always fail!"
something=(zip -r -q "$TAG" -P "$PASS" "$LOCPATH")
"${something[#]}"
Try doing type zip; it seems zip is aliased to something.
Maybe use the full path of zip to override this, something like:
/usr/bin/zip