Issue executing query using cqlsh -e - shell

I'm having a hard time executing a query with the -e option when the column name is in quotes.
I want to execute something like the following from the Unix shell (I'm running it from a shell script). When I put my values in quotes, the shell strips the quotes around my column name.
select * from keyspace.cf where "columnname"=''
Tried this:
cqlsh hostname -e "select * from keyspace.cf where "columnname"=''"
It is executing as cqlsh hostname -e 'select * from keyspace.cf where columnname='
<stdin>:1:InvalidRequest: Error from server: code=2200 [Invalid query] message="Undefined name columnname in where clause ('columnname = 'value'')"

You don't need to put quotes around the column name inside the query; you can set it in a shell variable and reference it with a $.
cqlsh -u cassandra -p cassandra -e "SELECT $COLUMN FROM $KEYSPACE.$TABLE;"
That's an excerpt from a script I wrote called getVersion.sh.
#!/bin/bash
KEYSPACE="system"
TABLE="local"
COLUMN="release_version"
~/local/apache-cassandra-3.10/bin/cqlsh -u cassandra -p cassandra -e "SELECT $COLUMN FROM $KEYSPACE.$TABLE;"
aploetz@dockingBay94:~/scripts$ ./getVersion.sh
release_version
-----------------
3.10
(1 rows)
The same will work if your column names contain quotes. Just be sure to escape them in your variable definition. This is a similar script, but it queries the "columnName" TEXT column:
#!/bin/bash
KEYSPACE="stackoverflow"
TABLE="stuff_with_quotes"
COLUMN="\"columnName\""
~/local/apache-cassandra-3.10/bin/cqlsh -u cassandra -p cassandra -e "SELECT $COLUMN FROM $KEYSPACE.$TABLE;"
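Alternatively, if you don't want an intermediate variable, you can escape the inner double quotes so the shell passes them through to cqlsh. A minimal sketch with hypothetical keyspace, table, and value names:
# hypothetical names; \" survives the outer double quotes and reaches cqlsh intact
cqlsh hostname -e "SELECT * FROM my_keyspace.my_table WHERE \"columnName\" = 'value';"
This is exactly what went wrong in the question: the unescaped inner quotes ended the shell string early instead of being passed along to cqlsh.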

Related

how to run beeline and hive query from a bash shell script

I am able to run the steps below manually, in order, after logging in to a Unix bash shell.
echo "Connecting beeline"
beeline
!connect jdbc:hive2://a301-1234-1234.stm.XXX.com:10000/default;;ssl=true;sslTrustStore=/app/bds/cloudera_truststore.jks;sslTrustPassword=;principal=hive/_HOST@BDS.XXXX.COM
INSERT OVERWRITE DIRECTORY "/dev/ref/HIVE_EXPORT/" ROW FORMAT DELIMITED FIELDS TERMINATED BY "," ESCAPED BY "\\" SELECT * FROM test_ref_st.Daily_report limit 10;
hadoop fs -get /dev/ref/HIVE_EXPORT/000000_0 /user/rj/hiveExtract.csv
echo "Query result extracted "
I need to run all the above steps in sequence via a shell script, test1.sh. But when I run it as
bash-4.2$ sh -x test1.sh
it only runs up to beeline; the remaining commands are not executed.
Current output:
bash-4.2$ sh test1.sh
Picked up JAVA_TOOL_OPTIONS:
Beeline version 1.1.0-cdh5.16.2 by Apache Hive
beeline>
Bash is processing your script line by line. It runs beeline and waits for your input.
You can use a heredoc to feed beeline's stdin from your script (quoting the delimiter keeps the "\\" in the query literal):
beeline <<'EOF'
!connect jdbc:hive2://a301-1234-1234.stm.XXX.com:10000/default;;ssl=true;sslTrustStore=/app/bds/cloudera_truststore.jks;sslTrustPassword=;principal=hive/_HOST@BDS.XXXX.COM
INSERT OVERWRITE DIRECTORY "/dev/ref/HIVE_EXPORT/" ROW FORMAT DELIMITED FIELDS TERMINATED BY "," ESCAPED BY "\\" SELECT * FROM test_ref_st.Daily_report limit 10;
EOF
Using !connect opens the interactive Beeline console. To run Beeline non-interactively from a shell script, you can do the following:
#!/bin/bash
HIVE_CONN="jdbc:hive2://a301-1234-1234.stm.XXX.com:10000/default" ## limited for simplicity
echo "executing query using beeline"
beeline -u "$HIVE_CONN" -e "INSERT OVERWRITE DIRECTORY '/dev/ref/HIVE_EXPORT/' ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' ESCAPED BY '\\\\' SELECT * FROM test_ref_st.Daily_report limit 10;" ## backslashes doubled so Hive receives '\\'
...
rest of your code
-e stands for the query you want to execute.
More on the Beeline CLI here.
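Putting the pieces together, a sketch of what test1.sh could look like, combining the heredoc with the remaining steps from the question (the JDBC URL and paths are copied from the question, not verified; the quoted 'EOF' keeps the heredoc body, including the "\\", literal):
#!/bin/bash
echo "Connecting beeline"
beeline <<'EOF'
!connect jdbc:hive2://a301-1234-1234.stm.XXX.com:10000/default;;ssl=true;sslTrustStore=/app/bds/cloudera_truststore.jks;sslTrustPassword=;principal=hive/_HOST@BDS.XXXX.COM
INSERT OVERWRITE DIRECTORY "/dev/ref/HIVE_EXPORT/" ROW FORMAT DELIMITED FIELDS TERMINATED BY "," ESCAPED BY "\\" SELECT * FROM test_ref_st.Daily_report limit 10;
EOF
hadoop fs -get /dev/ref/HIVE_EXPORT/000000_0 /user/rj/hiveExtract.csv
echo "Query result extracted"
Note that !connect may still prompt for a username and password interactively; passing the connection string with -u (plus -n/-p for credentials), as in the second answer, avoids that.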

psql execute stored function and store the result in the variable

I need to execute an SQL function and store the result:
#!/bin/bash
RESULT=`psql -A -t postgresql://user:password@localhost:5432/db -c "select main.return_anything();" db`
echo $RESULT
I expect the result to contain 1, but instead I get:
psql: warning: extra command-line argument "select main.return_anything();" ignored
psql: warning: extra command-line argument "db" ignored
Then psql just waits for input and doesn't produce any result. What is the problem?
From psql manual:
psql [option...] [dbname [username]]
So first options, then optionally dbname and then optionally username.
Try this:
RESULT=$(psql -A -t -c "select main.return_anything();" postgresql://user:password@localhost:5432/db)
Side note: backticks ` are deprecated. Use $(...) which look cleaner and allow for nesting.
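As a minimal sketch of the fixed script (connection string as in the question; assumes main.return_anything() returns a single scalar):
#!/bin/bash
# -A (unaligned) and -t (tuples only) strip headers and formatting,
# so RESULT holds just the bare value
RESULT=$(psql -A -t -c "select main.return_anything();" postgresql://user:password@localhost:5432/db)
if [ "$RESULT" = "1" ]; then
    echo "got expected result: $RESULT"
fi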

use sql file in "\copy" psql command line

I'm trying to run a query from a bash script, but the SELECT query itself lives in a file. For compatibility and testing, I'd like to keep it that way.
In short, I'd like to write something like:
psql data_base -c "\copy (<file.sql>) To './test.csv' With CSV"
Would you know how to do that?
EDIT
The "file.sql" contains a query :
$ cat file.sql
SELECT country FROM world;
You can use bash command substitution:
psql data_base -c "\copy ($(<file.sql)) To './test.csv' With CSV"
$(<file.sql) expands to the contents of the file.
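One caveat worth sketching: if file.sql ends with a semicolon, the parenthesized subquery inside \copy becomes invalid SQL, so you may want to strip it before substituting:
QUERY=$(<file.sql)
QUERY=${QUERY%;}    # drop a trailing semicolon, if any
psql data_base -c "\copy ($QUERY) To './test.csv' With CSV"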

Passing parameters to Impala shell

I am running an Impala query in a while loop, and for that I have created a separate query file that I call from my shell script.
My question is: can I pass a shell variable into the Impala query in the query file?
A="INSERT_SBP_ME_VS_ME_INCOME_LAST_THIRTY_DAYS_Q"${Count}
echo "value of A is $A"
source ${SBP2_MNY_IN_LAST_THIRTY_DAYS_QF}
${IMPALA_CON} -q "${${A}}"
The value of 'A' is INSERT_SBP_ME_VS_ME_INCOME_LAST_THIRTY_DAYS_Q1 (as Count is 1).
Doing it this way I get a bad substitution error. I also tried
${IMPALA_CON} -q "${A}"
but that doesn't give a successful result either.
You seem to be looking for --var (IMPALA-2179). To substitute from the command line, you can do:
impala-shell -f test.q --var=L=2;
where test.q is:
select * from p_test limit ${VAR:L};
Your query should be:
impala-shell -q "$A"
Refer:
impala-shell Configuration Options
similar post
impala-shell -i node.domain:port -B --var="table=metadata" --var="db=retail" -f "file.sql"
file.sql:
SELECT * FROM ${var:db}.${var:table}
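If the query text really does live in a shell variable whose name is built at runtime, as in the question, the bash construct to reach for is indirect expansion, ${!A}, rather than the invalid ${${A}}. A sketch with a hypothetical query string:
# hypothetical query text; in the question it comes from the sourced query file
INSERT_SBP_ME_VS_ME_INCOME_LAST_THIRTY_DAYS_Q1="SELECT 1;"
Count=1
A="INSERT_SBP_ME_VS_ME_INCOME_LAST_THIRTY_DAYS_Q${Count}"
${IMPALA_CON} -q "${!A}"    # ${!A} expands to the value of the variable whose name is stored in A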

Bash string parsing, skewed by internal strings

I have a string in my bash shell like:
out=$(su - user -c "someCommand -f 'string text "problemString"'")
The problem here is that it's getting parsed like so:
out=\$(su - user -c \"someCommand -f 'string text \"problemString\"'\")
I don't want "problemString" to be parsed out -- i.e., it needs to stay exactly as-is, including the quotes. How can I do that?
Update: I've attempted to escape the inner " with:
out=$(su - user -c "someCommand -f 'string text \"problemString\"'"),
but when the command is executed on the host machine, it returns an error from someCommand:
Unknown command '\p'
Update 2:
Real example:
OUTPUT=$(su - mysql -c "mysql --skip-column-names --raw --host=localhost --port=3306 --user=user --password=pass -e 'show variables where variable_name = \"max_connections\"'")
I'm passing this bash script via fabric in Python:
# probably not relevant, but just in case..
import os

def ParseShellScripts(runPath, commands):
    for i in range(len(commands)):
        if commands[i].startswith('{shell}'):
            # todo: add validation/logging for directory `sh` and that scripts actually exist
            with open(os.path.join(runPath, 'sh', commands[i][7:]), "r") as shellFile:
                commands[i] = shellFile.read()
            print commands[i]
    return commands
This prints:
OUTPUT=$(su - mysql -c "mysql --skip-column-names --raw --host=localhost --port=3306 --user=pluto_user --password=pluto_user -e 'show variables where variable_name = \"max_connections\"'")
which then gets executed on some remote box via fabric, which results in ERROR at line 1: Unknown command '\m'.
You can write:
out=$(su - user -c "someCommand -f 'string text \"problemString\"'")
Simply use single quotes. Strings in single quotes don't get parsed or interpreted. For instance:
echo 'a"b'
outputs:
a"b
Because no parsing occurs.
For reference: bash manual on quoting.
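Applied to the MySQL example above, a sketch (connection flags trimmed for brevity): single-quote the argument to su -c so the outer shell passes it through untouched, and let the inner shell spawned by su handle the escaped quotes:
# the outer single quotes prevent any parsing by the calling shell;
# the shell that su spawns then turns \" into " for mysql's -e argument
OUTPUT=$(su - mysql -c 'mysql --skip-column-names --raw -e "show variables where variable_name = \"max_connections\""')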
