badly placed ()'s when creating oracle database link in tcsh - oracle

#my code
echo \
'create database link remotec101 \
connect to "os_user" \
identified by "password" \
using ' \
(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP) \
(HOST=c101) \
(PORT=1521)) \
(CONNECT_DATA=(SID=XE)))';'|sqlplus
I tried to run some SQL this way and it worked, but when creating the database link I got an error saying badly placed ()'s.
This code is in tcsh.
Please help me.
Thanks

The parentheses are not quoted, so they're treated as shell metacharacters.
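As a quick illustration (hypothetical interactive session, not from the original post), tcsh rejects an unquoted ( anywhere it expects an ordinary argument:
% echo using (DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)))
Badly placed ()'s.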
This:
echo \
'create database link remotec101 \
connect to "os_user" \
identified by "password" \
using \
(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP) \
(HOST=c101) \
(PORT=1521)) \
(CONNECT_DATA=(SID=XE)));' | sqlplus
will feed the following to the sqlplus command:
create database link remotec101
connect to "os_user"
identified by "password"
using
(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)
(HOST=c101)
(PORT=1521))
(CONNECT_DATA=(SID=XE)));
But a "here document" is probably cleaner:
sqlplus <<'EOF'
create database link remotec101
connect to "os_user"
identified by "password"
using
(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)
(HOST=c101)
(PORT=1521))
(CONNECT_DATA=(SID=XE)));
EOF
If you want the last 4 lines to become a single line of input to sqlplus, I think you'll need to put them all on one line in your script. Or you might find it easier to use the printf command to organize your output, for example:
printf '%s\n%s\n%s\n%s\n%s %s %s %s\n' \
'create database link remotec101' \
'connect to "os_user"' \
'identified by "password"' \
'using' \
'(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)' \
'(HOST=c101)' \
'(PORT=1521))' \
'(CONNECT_DATA=(SID=XE)));' | sqlplus
This prints the last 4 lines as a single line. You can adjust the format string as needed.
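For reference, piping that printf through cat instead of sqlplus should show the statement it builds, with the last four arguments joined on a single line:
create database link remotec101
connect to "os_user"
identified by "password"
using
(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP) (HOST=c101) (PORT=1521)) (CONNECT_DATA=(SID=XE)));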

I figured it out. You have to use the fact that tcsh supports both quote types and that echo accepts multiple arguments:
echo 'create database link remotec101 \
connect to "os_user" \
identified by "password" \
using ' "'" ' ( DESCRIPTION= ( ADDRESS= ( PROTOCOL=TCP )
( HOST=c101 )
( PORT=1521 ) )
( CONNECT_DATA= ( SID=XE ) ) ) ' "';"

Related

Saving a single log file from multiples sh scripts

I have 2 sh scripts that save log files accordingly, and I want to create another sh script to call them.
They use a properties file that contains 3 log functions.
My first script (bteq.sh) creates a table in Teradata:
#!/bin/bash
source "/valdc_procs/properties/general_config_file"
exec_logs "/valdc_procs/logs/" "BTEQ_VALDC_PROCS"
echo "INFO : Starting table creation"
function exec_btqe (){
bteq<<EOF 2>&1
.logmech LDAP
.LOGON ${ipServerPR}/${userNamePR},${passwordPR};
.SET TITLEDASHES OFF;
.SET WIDTH 2000
.SET SEPARATOR ';'
SELECT 1
FROM DBC.TABLES
WHERE DatabaseName = ''
AND TableName = ''
AND TableKind = 'T';
.IF ACTIVITYCOUNT = 0 THEN .GOTO CreateNewTable;
DROP TABLE ;
.LABEL CreateNewTable;
CREATE MULTISET TABLE xpto AS (
SELECT
...
)
WITH DATA;
.LOGOFF;
.QUIT;
EOF
}
exec_btqe
#it gets the rc of the last command if it's an error
rc_result "Error creating table " "/valdc_procs/logs/" "BTEQ_VALDC_PROCS"
#it logs the process when there were no errors
log_output "/valdc_procs/logs/" "BTEQ_VALDC_PROCS"
exit $rc
My second script (tdch.sh) exports that table to a file:
#!/bin/bash
source "/valdc_procs/properties/general_config_file"
exec_logs "/valdc_procs/logs/" "TDCH_VALDC_PROCS"
hadoop jar $TDCH_JAR com.teradata.connector.common.tool.ConnectorImportTool \
-libjars $LIB_JARS \
-Dmapred.job.queue.name=default \
-Dtez.queue.name=default \
-Dmapred.job.name=TDCH \
-classname com.teradata.jdbc.TeraDriver \
-url jdbc:teradata://$ipServer/logmech=ldap,database=$database,charset=UTF16 \
-jobtype hdfs \
-fileformat textfile \
-separator ',' \
-enclosedby '"' \
-targettable ${targetTable} \
-username ${userName} \
-password ${password} \
-sourcequery "select * from ${database}.${targetTable}" \
-nummappers 1 \
-sourcefieldnames "" \
-targetpaths ${targetPaths}
#it gets the rc of the last command if it's an error
rc_result "Error exporting the file " "/valdc_procs/logs/" "TDCH_VALDC_PROCS"
echo "INFO : Moving file from HDFS to the FileSystem"
hdfs dfs -copyToLocal ${targetPaths}/"part-m-00000" ${targetFileSystemPath}/ARQ_VALDC_PROCS_OPBK_$TIMESTAMP
rc_result "Error moving the file " "/valdc_procs/logs/" "TDCH_VALDC_PROCS"
echo "INFO : File Moved Arquivo movido com sucesso"
#it logs the process when there were no errors
log_output "/valdc_procs/logs/" "TDCH_VALDC_PROCS"
exit $rc
This log process works fine when I run each script individually, but now I want another script that calls them both.
I'm not sure how I can get the return codes of their commands (bteq.sh and tdch.sh) and save them in a single log file.
Not 100% certain I understand your need, but this might solve your problem:
#!/bin/bash
logfile="/tmp/somefile.txt"
bteq.sh
status_bteq=$?
tdch.sh
status_tdch=$?
echo "Status bteq: $status_bteq" >"$logfile"
echo "Status tdch: $status_tdch" >>"$logfile"
If you want just the return codes, raw without any other text, you could do:
#!/bin/bash
logfile="/tmp/somefile.txt"
bteq.sh
echo "$?" >"$logfile"
tdch.sh
echo "$?" >>"$logfile"

Exporting data from Teradata to HDFS using TDCH

I'm trying to export a table from Teradata into a file in my hdfs using TDCH.
I'm using the following parameters:
hadoop jar $TDCH_JAR com.teradata.connector.common.tool.ConnectorImportTool \
-libjars $LIB_JARS \
-Dmapred.job.queue.name=default \
-Dtez.queue.name=default \
-Dmapred.job.name=TDCH \
-classname com.teradata.jdbc.TeraDriver \
-url jdbc:teradata://$ipServer/logmech=ldap,database=$database,charset=UTF16 \
-jobtype hdfs \
-fileformat textfile \
-separator ',' \
-enclosedby '"' \
-targettable ${targetTable} \
-username ${userName} \
-password ${password} \
-sourcequery "select * from ${database}.${targetTable}" \
-nummappers 1 \
-sourcefieldnames "" \
-targetpaths ${targetPaths}
It's working, but I need the headers in the file, and when I add the parameter:
-targetfieldnames "ID","JOB","DESC","DT","REG" \
It doesn't work; the file isn't even generated anymore.
Can anyone help me?
The -targetfieldnames option is only valid for -jobtype hive.
It does not put headers in the HDFS file, it specifies Hive column names.
(There is no option to prefix CSV with a header record.)
Also the value supplied for -targetfieldnames should be a single string like "ID,JOB,DESC,DT,REG" rather than a list of strings.
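If you do need a header record in the HDFS extract, one possible workaround (just a sketch with hypothetical local paths; the header must match the column order of your -sourcequery) is to prepend it after copying the part file out of HDFS:
# hypothetical paths; adjust to your environment
hdfs dfs -copyToLocal ${targetPaths}/part-m-00000 /tmp/export_body.csv
# prepend a header line in the same quoted style produced by -enclosedby '"'
{ echo '"ID","JOB","DESC","DT","REG"'; cat /tmp/export_body.csv; } > /tmp/export_with_header.csv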

LiquiBase CLI BASH script not outputting full CLI with ECHO

Hey all I am new to the world of BASHing :)
I am trying to put together this CLI line of code by doing the following:
echo "------"
echo '--url=$URL '\
'--username=$UN '\
'--password=$PW '\
'--referenceUrl=$RURL '\
'--referenceUsername=$RUN '\
'--referencePassword=$RPW '\
'--changeSetAuthor=TD '\
'--diffTypes=tables, views, columns, indexes, foreignkeys, primarykeys, uniqueconstraints '\
'$LIQUI_ACTION'
echo "------"
And the output I am getting is this:
------
--url=$URL --username=$UN
/jenkins/liquibase-3.6.2/liquibase: line 141: --password=$PW : command not found
I can tell right off that it's not placing the variables' values within the CLI. Second, I'm not sure why it's saying command not found when I am only outputting text? I am just doing the above to make sure all of it is indeed the correct data in the $ variables and also that it's spaced between each command correctly.
So any help about the above would be great!
UPDATE FOR @RavinderSingh13:
code used:
echo "------"
echo "--url=$URL \
--username=$UN \
--password=$PW \
--referenceUrl=$RURL \
--referenceUsername=$RUN \
--referencePassword=$RPW \
--changeSetAuthor=TD \
--diffTypes=tables, views, columns, indexes, foreignkeys, primarykeys, uniqueconstraints \
$LIQUI_ACTION"
echo "------"

Cron job - exported file emailed is blank

I am receiving the email from the following CRON job but with a blank file attached. I have checked the /home/XXXXXX/public_html/tmp/ directory and a good tqhsa_hikashop_order.txt file is created in that /tmp/ folder but the one attached to the email that I receive is empty.
mysql --database=XXXXXXXX_jml_6UCHadruwR8veju3 -u XXXXXXXX_8eh6Nu --password=fRa6r7wRerEtuzud -B -e "SELECT order_number, order_created, order_invoice_number, order_full_price, order_discount_code, order_discount_price, order_payment_method, order_payment_price FROM tqhsa_hikashop_order;" > /home/XXXXXXXX/public_html/tmp/tqhsa_hikashop_order.txt | mail -s "Daily Discount Table Dump" -a /home/XXXXXXXX/public_html/tmp/tqhsa_hikashop_order.txt luke@example.com
Any ideas why the good text file is not being attached?
The two sides of a pipe run concurrently, so mail can read (and attach) the file before mysql has finished writing it. You need to ensure that the first command finishes before running the second, so that the file exists on disk.
So essentially, replace the | with a &&:
mysql \
--database=XXXXXXXX_jml_6UCHadruwR8veju3 \
-u XXXXXXXX_8eh6Nu \
--password=fRa6r7wRerEtuzud \
-B \
-e "SELECT order_number, order_created, order_invoice_number, order_full_price, order_discount_code, order_discount_price, order_payment_method, order_payment_price FROM tqhsa_hikashop_order;" > /home/XXXXXXXX/public_html/tmp/tqhsa_hikashop_order.txt \
&& \
mail \
-s "Daily Discount Table Dump" \
-a /home/XXXXXXXX/public_html/tmp/tqhsa_hikashop_order.txt luke@example.com
To illustrate using an example:
myuser@myshell:~ $ echo "testing" > testy.txt | cat testy.txt
cat: testy.txt: No such file or directory
myuser@myshell:~ $ echo "testing" > testy.txt && cat testy.txt
testing

Escape character for sql query

I'm using the query below in a properties file and using it in a shell script, but due to the special characters in the query it's not giving me the output with the special characters.
query="select top 10 source_system,updt_etl_instnc_run_id,negative_posting_flag, to_number(to_varchar(to_date(create_tmstmp),'yyyymm')) as part_date from c_fin_a.gl_transaction_data where to_number(to_varchar(to_date(create_tmstmp),'yyyymm'))=$NOW and \$CONDITIONS"
I have used escape characters for all the special characters, but even then it's not giving me the same output:
query= \ " select top 10 source_system,updt_etl_instnc_run_id,negative_posting_flag, to_number \ (to_varchar \ ( to_date \ ( create_tmstmp \ ) , \ ' yyyymm \ ' \ ) \ ) as part_date from c_fin_a.gl_transaction_data where to_number \ ( to_varchar \ ( to_date \ ( create_tmstmp \ ) , \ ' yyyymm \ ' \ ) \ )= \ $NOW and \ \$CONDITIONS \ "
First: never use spaces around the equal sign when you assign a value to a variable. I mean var=value is OK, var = value is not OK.
Now, let's assume your shell variables have the following values:
NOW=201603
CONDITIONS="city='New York'"
Then you need to use the following:
query="select top 10 source_system,updt_etl_instnc_run_id,negative_posting_flag, to_number(to_varchar(to_date(create_tmstmp),'yyyymm')) as part_date from c_fin_a.gl_transaction_data where to_number(to_varchar(to_date(create_tmstmp),'yyyymm'))=${NOW} and ${CONDITIONS}"
to generate a valid SQL statement.
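A quick way to check the expansion (using the example values above; the column list is the one from the question):
NOW=201603
CONDITIONS="city='New York'"
query="select top 10 source_system,updt_etl_instnc_run_id,negative_posting_flag, to_number(to_varchar(to_date(create_tmstmp),'yyyymm')) as part_date from c_fin_a.gl_transaction_data where to_number(to_varchar(to_date(create_tmstmp),'yyyymm'))=${NOW} and ${CONDITIONS}"
echo "$query"
# the output should end with: =201603 and city='New York'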
