MonetDB doesn't recognize function names given to mclient via command line - bash

I am trying to export a few columns from a table as encoded integers.
Basically I want to use a bash script to pass the SQL commands to mclient as command-line arguments. My bash script looks like:
#!/bin/bash
dir=`pwd`
for col in occupation native_country martial_status race sex
do
mclient -d adult -s \
"create function encode${col}(s varchar(200)) returns int begin return (select code from ${col}_dict where ${col}=s); end;"
mclient -d adult -s \
"COPY (select encode${col}($col) from adult) INTO '${dir}/${col}.txt' NULL AS '0'"
mclient -d adult -s \
"drop function encode${col}"
done
In each iteration, I want to create a SQL function on the fly, then use that function to encode an attribute and export it to a text file, and lastly drop the function.
However, the output strangely contains garbled characters, as if mclient can't recognize the function name.
If I remove the second mclient command, the other operations are successful.
operation successful
Function 'X�..X�' not defined
operation successful
operation successful
Function '��"X�.X�' not defined
operation successful
operation successful
Function ' X�.PX�' not defined
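One way to narrow this down (a debugging sketch, not a confirmed fix) is to build each statement in a shell variable and echo it before handing it to mclient, to verify that the shell side is producing clean SQL:
#!/bin/bash
# Debugging variant of the script above: print every statement before running it.
dir=$(pwd)
for col in occupation native_country martial_status race sex
do
    create_sql="create function encode${col}(s varchar(200)) returns int begin return (select code from ${col}_dict where ${col}=s); end;"
    copy_sql="COPY (select encode${col}(${col}) from adult) INTO '${dir}/${col}.txt' NULL AS '0'"
    echo ">> $create_sql"
    echo ">> $copy_sql"
    mclient -d adult -s "$create_sql"
    mclient -d adult -s "$copy_sql"
    mclient -d adult -s "drop function encode${col};"
done
If the echoed statements look correct but the garbled names still appear, the corruption is happening inside mclient/MonetDB rather than in the shell.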

Related

Why is the result set value not stored in an array in a shell script?

Sample code below:
psql -h $host -U postgres -d postgres -At -c "select partner_country_id as country , case when (threshold is null) then global_threshold else threshold end as threshold from ra_country_day_threshold " \
| while read -a Record
do
arrIN=(${Record[0]//|/ })
col1=${arrIN[0]}
col2=${arrIN[1]}
country_array["$col1"]="$col2"
echo "Col1:$col1 Col2:$col2"
done
echo "Elements:${country_array[#]}"
echo "length: ${#country_array[#]}"
Result
empty elements and length 0
The answer is simple: a while loop at the end of a pipeline runs in a subshell with its own context, and a variable created in that context is not accessible outside of it.
Meaning country_array is no longer visible once you are outside the loop.
My suggestion is to store the result in a temporary file that is available to the whole script, then read that file outside the loop.
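For example, a minimal sketch of that temp-file approach, reusing the query from the question ($host and the associative-array declaration are assumptions):
#!/bin/bash
declare -A country_array    # needed for string keys like country codes
tmpfile=$(mktemp)
psql -h "$host" -U postgres -d postgres -At \
  -c "select partner_country_id as country, case when threshold is null then global_threshold else threshold end as threshold from ra_country_day_threshold" \
  > "$tmpfile"
# Redirecting the file into the loop keeps the loop in the current shell,
# so the array assignments survive past "done".
while IFS='|' read -r col1 col2
do
    country_array["$col1"]="$col2"
    echo "Col1:$col1 Col2:$col2"
done < "$tmpfile"
rm -f "$tmpfile"
echo "Elements:${country_array[@]}"
echo "length: ${#country_array[@]}"
Process substitution (done < <(psql ...)) would achieve the same thing without the temporary file.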

Passing parameter in a BigQuery Script

I want to pass an argument to a BigQuery script in shell; here is an example of the script I wrote:
#!/bin/bash
bq query --use_legacy_sql=false --destination_table=abc --append 'select * from (select * from `xyz.INFORMATION_SCHEMA.VIEWS` union all select * from `def.VIEWS`) where table_name = "$1"'
When I run this script and pass the argument, I do not get any errors, but no row is appended to the table; whereas when I hard-code the table_name as rty, that row is appended. What am I missing here?
When you run the script you'll get a prompt like:
Waiting on <BIGQUERY_JOB_ID> ... (0s) Current status: DONE
You can inspect the job in many ways, including with the bq tool:
bq show -j --format=prettyjson <BIGQUERY_JOB_ID>
If you have jq installed (sudo apt install jq) you can get just the translated query with:
bq show -j --format=prettyjson <BIGQUERY_JOB_ID> | jq '.configuration.query.query'
which will get you something similar to:
select * from xyz.INFORMATION_SCHEMA.VIEWS where table_name = \"$1\"
As you can see, the variable was never expanded: single quotes prevent bash from substituting $1, so no table matches the WHERE filter. To avoid this you can enclose the query in double quotes and the variable in single ones, like this:
#!/bin/bash
bq query \
--use_legacy_sql=false \
--destination_table=xyz.abc \
--append \
"select * from xyz.INFORMATION_SCHEMA.VIEWS where table_name='$1'"
If you keep the back-ticks inside a double-quoted query, bash treats them as command substitution and you'll get the INFORMATION_SCHEMA.VIEWS: command not found error. You can omit them or escape them with a backslash:
"select * from \`xyz\`.INFORMATION_SCHEMA.VIEWS where table_name='$1'"

BigQuery bash script -- not transferring to the destination

I wrote a simple bash script that takes the results from a query and appends them to an existing table. My script executes, but the data doesn't seem to make it to the destination table. Any idea what I might be doing wrong? Is it possible that I can't use a partition decorator ($) in the destination?
Thank you so much for your help.
#!/bin/bash
bq query \
--destination_table=logs.p_activity_428001$20170803 \
--append_table <<EOF
SELECT
*
FROM log.p_activity_428001
where _PARTITIONTIME = TIMESTAMP('2017-08-03')
EOF
You need to escape the dollar sign: bash parses $20170803 as the positional parameter $2 followed by the literal text 0170803, so the partition decorator never reaches bq as written. A single backslash will suffice:
#!/bin/bash
bq query \
--destination_table=logs.p_activity_428001\$20170803 \
--append_table <<EOF
SELECT
*
FROM log.p_activity_428001
where _PARTITIONTIME = TIMESTAMP('2017-08-03')
EOF
although single-quoting the whole table name may be more readable:
#!/bin/bash
bq query \
--destination_table='logs.p_activity_428001$20170803' \
--append_table <<EOF
SELECT
*
FROM log.p_activity_428001
where _PARTITIONTIME = TIMESTAMP('2017-08-03')
EOF
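A quick interactive check (hypothetical session) makes the expansion visible:
$ set -- first second
$ echo logs.p_activity_428001$20170803
logs.p_activity_428001second0170803
$ echo 'logs.p_activity_428001$20170803'
logs.p_activity_428001$20170803
With the name single-quoted (or the $ escaped), the decorator reaches bq intact.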

How to fetch more than one column value from oracle select query to shell variable

I am trying to fetch a row with more than one column value into different shell variables. In fact I found that all the column values can be stored in a single shell variable at a time, but how can I put those column values into separate shell variables? Below is an example I am trying for the time being:
function sqlQuery {
sqlplus -S shiyas/********* <<'EOF'
set heading OFF termout ON trimout ON feedback OFF
set pagesize 0
SELECT name,open_mode from v$database;
EOF
}
OUTPUT="$( sqlQuery )"
echo $OUTPUT
Here I am getting the output as
ORCL READ WRITE
But my requirement is that the column values ORCL and READ WRITE get assigned to different shell variables.
I tried parsing it as below,
echo "$OUTPUT" | while read name open_mode
but it was throwing an "unexpected end of file" error.
-bash-3.2$ sh call_sql_col_val_1.sh
ORCL READ WRITE
call_sql_col_val_1.sh: line 18: syntax error: unexpected end of file
Please let me know what approach I can use to fetch a single row's column values into different shell variables.
I do this via eval myself:
oracle@******:/*****> cat test.sh
#!/bin/bash
function sqlQuery {
sqlplus -S / as sysdba <<'EOF'
set heading OFF termout ON trimout ON feedback OFF
set pagesize 0
SELECT name,open_mode from v$database;
EOF
}
eval x=(`sqlQuery`)
NAME=${x[0]}
OPEN_MODE="${x[1]} ${x[2]}"
echo NAME IS $NAME
echo OPEN_MODE IS $OPEN_MODE
So we are running the same function you have above, passing its output into x and running it through eval to handle the word splitting. Then you have an array and can access its items as ${x[0]} for the first one, and so on.
Output is:
oracle@******:/******> sh test.sh
NAME IS ******
OPEN_MODE IS READ WRITE
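If you'd rather avoid eval, a minimal alternative sketch is to let read split the line directly (the variable names are just examples):
# read assigns the first word to NAME and the rest of the line to OPEN_MODE,
# so the two-word open mode stays intact and no eval is needed.
read -r NAME OPEN_MODE <<< "$(sqlQuery)"
echo "NAME IS $NAME"
echo "OPEN_MODE IS $OPEN_MODE"
The here-string is a bashism, but so is the array syntax in the eval version.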

Bash script makes connection using FreeTDS, interacts, doesn't exit (just hangs)

I'm using FreeTDS in a script to insert records into an MSSQL database. The USE and INSERT commands work, but the exit command doesn't and it hangs. I've tried redirecting stdout but cat complains. I suppose I will use Expect otherwise. Meh. Thanks.
echo -e "USE db\nGO\nINSERT INTO db_table (id, data, meta)\nVALUES (1, 'data', 'meta')\nGO\nexit" > tempfile
cat tempfile - | tsql -H 10.10.10.10 -p 1433 -U user -P pass
Did you mean to do this: cat tempfile -? The trailing - makes cat read from standard input as well, so it waits for you to press Ctrl+D.
If not, remove the -.
Also, as Ignacio suggests, you could write it more cleanly as a heredoc:
tsql -H 10.10.10.10 -p 1433 -U user -P pass <<EOF
USE db
GO
INSERT INTO db_table (id, data, meta)
VALUES (1, 'data', 'meta')
GO
exit
EOF
Or just do the echo with literal newlines rather than \n:
echo "
USE db
GO
INSERT INTO db_table (id, data, meta)
VALUES (1, 'data', 'meta')
GO
exit
" > tempfile
and then run it by using standard input redirection (<) like this:
tsql -H 10.10.10.10 -p 1433 -U user -P pass < tempfile
