sqlplus SP2-0734 Command not found ignoring rest of the line - oracle

I have a script which runs the following sqlplus query
sqlplus -s "username/password#//hostname:port/SID" <<< $1 > $2
In the above statement, $1 is the query to be executed and $2 is the path of the file where the results need to be written.
$2 is something like /data/files/...
On executing the script, I get "SP2-0734 command not found '/data/files/...' ignoring rest of the line".
I am not sure what's wrong here.
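SP2-0734 generally means sqlplus itself received text it could not interpret as a SQL*Plus command, so the output path has apparently ended up in the input. A common cause is the query not being passed as a single quoted argument at the call site, so the path becomes part of what gets fed to sqlplus. A minimal sketch, assuming the query arrives in $1 and the output path in $2 (run_query.sh is a hypothetical name; note that an EZConnect string normally separates the credentials from the host with @, so the # in the question may be a typo unless it is part of the password):

#!/bin/bash
# Hypothetical wrapper; adjust the connect string to your environment.
query="$1"
outfile="$2"

sqlplus -s "username/password@//hostname:port/SID" > "$outfile" <<EOF
$query
EOF

# Call site: the query must be one quoted argument, e.g.
# ./run_query.sh "select sysdate from dual;" /data/files/out.txt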

Related

Execute awk output exactly inside of executed script

There are some similar topics, but this is slightly different.
I have a database with script names and parameter values a. When I execute:
sqlite3 log/log.db "select name, a from result" | awk -F '|' '{printf("a[%s]=%s;\n",$1,$2);}'
I see:
a[inc.bash]=4.23198234894777e-06;
a[inc.c]=3.53343440279423e-10;
In my bash script I would like to use an associative array.
When I execute this code (hard-coding the value of a[inc.bash]):
declare -A a
a[inc.bash]=4.23198234894777e-06;
echo ${a[inc.bash]}
It works correctly and prints
4.23198234894777e-06
But I do not know how to use the output of the first command (the awk pipeline) to assign values to keys of the associative array a declared in my script.
I want to execute the code that is printed by awk inside my script, but when I use something like $() or ``, it prints an error like this:
code:
declare -A a
$(sqlite3 log/log.db "select name, a from result" | awk -F '|' '{printf("a[%s]=%s;\n",$1,$2);}')
echo ${a[inc.bash]}
output:
a[inc.bash]=4.23198234894777e-06;: command not found
To tell Bash to interpret your output as commands, you can use process substitution and the source command:
declare -A a
source <(sqlite3 log/log.db "select name, a from result" |
awk -F '|' '{printf("a[%s]=%s;\n",$1,$2);}')
echo ${a[inc.bash]}
The <() construct (process substitution) can be treated like a file, and source (or the equivalent .) runs the commands in its argument without creating a subshell, making the resulting a array accessible in the current shell.
A simplified example to demonstrate, as I don't have your database:
$ declare -A a
$ source <(echo 'a[inc.bash]=value')
$ echo "${a[inc.bash]}"
value
This all being said, this is about as dangerous as using eval: whatever the output of your sqlite/awk pipeline is, it will be executed!
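If executing generated code is a concern, the array can also be filled without source or awk by reading the pipeline directly; a sketch assuming the same name|value output format from sqlite3:

declare -A a
# Read each "name|value" row and assign it; nothing from the database is executed.
while IFS='|' read -r name value; do
    a[$name]=$value
done < <(sqlite3 log/log.db "select name, a from result")
echo "${a[inc.bash]}"

The process substitution keeps the while loop in the current shell, so the assignments to a are still visible after the loop.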

Passing arguments to a shell script

I want to execute a shell script that has a usage() function in it.
I have to pass three parameters, as shown in the usage line below:
./script_name server_name path flag
How do I write the code block for this?
Finally, I need to read all three parameters on one line to execute a jar file.
Once you pass parameters, you can access them with $1, $2 ... $n, and with "$@" to get everything at once.
./script_name server_name path flag
In the script code
echo $1, $2, $3
echo "$@" # will print the same values
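A minimal sketch of such a script with a usage() block shown when the argument count is wrong; myapp.jar is a placeholder:

#!/bin/bash
usage() {
    echo "Usage: $0 server_name path flag" >&2
    exit 1
}

# Require exactly three parameters, otherwise print the usage block.
[ $# -eq 3 ] || usage

server_name=$1
path=$2
flag=$3

# All three parameters on one line, e.g. to run a jar:
java -jar myapp.jar "$server_name" "$path" "$flag"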

Passing a variable into awk within a shell script

I have a shell script that I'm writing to search for a process by name and return output if that process is over a given value.
I'm working on finding the named process first. The script currently looks like this:
#!/bin/bash
findProcessName=$1
findCpuMax=$2
#echo "parameter 1: $findProcessName, parameter2: $findCpuMax"
tempFile=`mktemp /tmp/processsearch.XXXXXX`
#echo "tempDir: $tempFile"
processSnapshot=`ps aux > $tempFile`
findProcess=`awk -v pname="$findProcessName" '/pname/' $tempFile`
echo "process line: "$findProcess
`rm $tempFile`
The error is occurring when I try to pass the variable into the awk command. I checked my version of awk and it definitely does support the -v flag.
If I replace the '/pname/' portion of the findProcess variable assignment with a hard-coded pattern, the script works.
I checked my syntax and it looks right. Could anyone point out where I'm going wrong?
The processSnapshot variable will always be empty: the ps output is going to the file.
When you pass the pattern as a variable, use the pattern-match operator:
findProcess=$( awk -v pname="$findProcessName" '$0 ~ pname' $tempFile )
Only use backticks when you need the output of a command. This
`rm $tempFile`
executes the rm command, returns the output back to the shell and, if the output is non-empty, the shell attempts to execute that output as a command.
$ `echo foo`
bash: foo: command not found
$ `echo whoami`
jackman
Remove the backticks.
Of course, you don't need the temp file at all:
pgrep -fl $findProcessName
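Putting those points together, a sketch of the script with the temp file and the stray backticks removed (variable names kept from the question):

#!/bin/bash
findProcessName=$1
findCpuMax=$2

# Match the process name against the full ps output; no temp file needed.
findProcess=$(ps aux | awk -v pname="$findProcessName" '$0 ~ pname')
echo "process line: $findProcess"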

Error in awk code in shell script

I'm using the ls command to list files to be used as input. For each file found, I need to
1. Perform a system command (importdb) and write to a log file.
2. Write to an error log file if the first character of column 2, line 6 of the log file created in step 1 is not "0".
3. Rename the file processed so it won't get re-processed on the next run.
My script:
#!/bin/sh
ls APCVENMAST_[0-9][0-9][0-9][0-9]_[0-9][0-9] |
while read LINE
do
importdb -a test901 APCVENMAST ${LINE} > importdb${LINE}.log
awk "{if (NR==6 && substr($2,1,1) != "0")
print "ERROR processing ", ${LINE} > importdb${LINE}err.log
}" < importdb${LINE}.log
mv ${LINE} ${LINE}.PROCESSED
done
This is very preliminary code, and I'm new to this, but I can't get past parsing errors like the one below.
The error context is:
{if (NR==6 && >>> substr(, <<< awk The statement cannot be correctly parsed.
Issues:
Never double quote an awk script.
Always quote literal strings.
Pass in shell variables correctly: either with -v if you need to access the value in the BEGIN block, i.e. awk -v awkvar="$shellvar" 'condition{code}' file, or after the script, i.e. awk 'condition{code}' awkvar="$shellvar" file.
Always quote shell variables.
The conditional should be outside the block.
There is ambiguity between redirection and concatenation precedence, so use parentheses.
So the corrected (syntactical) script:
awk 'NR==6 && substr($2,1,1) != 0 {
print "ERROR processing ", line > ("importdb" line "err.log")
}' line="${LINE}" "importdb${LINE}.log"
You have many more issues, but as I don't know what you are trying to achieve, it's difficult to suggest the correct approach...
You shouldn't parse the output of ls.
Awk reads files itself; you don't need to loop over its input with shell constructs. A sketch along those lines follows.
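A sketch of the loop without parsing ls, assuming the three steps from the question (the importdb options are copied from it unchanged):

#!/bin/sh
for f in APCVENMAST_[0-9][0-9][0-9][0-9]_[0-9][0-9]
do
    [ -e "$f" ] || continue   # nothing matched the glob
    importdb -a test901 APCVENMAST "$f" > "importdb${f}.log"
    # Flag the file if the first character of column 2 on line 6 is not "0".
    awk 'NR==6 && substr($2,1,1) != "0" {
        print "ERROR processing ", f > ("importdb" f "err.log")
    }' f="$f" "importdb${f}.log"
    mv "$f" "$f.PROCESSED"
done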

Shell script isn't working correctly on crontab, works when manually called

I've got a script in sh under Solaris 5.8 that isn't working as expected, and I don't really know why...
The script reads a list of URLs from a file, tests them with curl and writes the output to a log file:
#!/bin/sh
# Logs path
LOG_DIR=/somedir/logs
# URLs file path
URL_FILE=/somedir/url
# Actual date
DATE=`date +%Y%m%d%H%M`
# CURL
CURL=/somedir/bin/curl
test_url()
{
cat $URL_FILE | grep -i $1 | while read line
do
NAME=`echo $line | awk '{ print $1 }'`
URL=`echo $line | awk '{ print $2 }'`
TIME=`$CURL -s -o /dev/null -w %{time_total} $URL`
echo "$DATE $TIME" >> $LOG_DIR/${NAME}_${1}.log
done
}
test_url someurl
test_url someotherurl
The URL_FILE has this layout:
somename1 http://someurl/test
somename2 http://someotherurl/test
The script loads the URLs from the file, uses curl to get the total time each URL takes to load, and then prints the date and the time (in seconds). The problem is that the TIME variable doesn't work when the script is called from crontab, but it does when I run it manually as my user:
# Output when called with the user ./script.sh
201202201018 0.035
# Output when called from crontab.
201202201019
If I redirect all output * * * * * /path/to/script/script.sh 1&2 > /tmp/output, the output file is blank.
Also I haven't been able to see any output in /var/log/syslog about it. Any clue why TIME variable isn't displaying correctly when called via crontab?
Things you should check out:
Is /path/to/script/script.sh 1&2 > /tmp/output valid in your cron? On my machine, it would run the script with argument "1" and try to find a program called "2" to run it. Failing to find it, it creates an empty output file. I think you're looking for something like /path/to/script/script.sh >> /tmp/output 2>&1
Do you set CURL to the full path of curl? Cron normally doesn't have the same path information that you have.
Which user do you use for running cron? Could there be access restrictions to curl or the network?
% is indeed handled differently by cron when it appears in the crontab itself: it means newline and otherwise has to be escaped. Have you been running tests with curl directly from the crontab? As long as the % stays inside your script, you should be fine.
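Putting the first two points together: fix the redirection on the cron line, and since old Solaris cron may not honor variable assignments inside the crontab, set PATH at the top of the script instead (the paths below are placeholders):

# crontab entry, with corrected redirection:
* * * * * /path/to/script/script.sh >> /tmp/output 2>&1

# at the top of script.sh:
PATH=/usr/bin:/bin:/somedir/bin
export PATH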
