Windows batch script output not written to log - oracle

We have this Windows batch script:
call commands/do-work.cmd | tee my.log
The do-work.cmd includes
impdp user/pw@db directory=mydir dumpfile=my.dmp logfile=logdir:imp.log schemas=a,b,c,c parallel=6
(
echo my.sql
echo exit
) | sqlplus user/pw@db
call mvn clean install
Of these commands the output from sqlplus and mvn is written to my.log but the output of impdp is not. How can I get impdp output into my.log?
Tried using "call" ahead of impdp but the impdp command choked for some reason... complaining about log not found.
Any ideas?

I don't have IMPDP available to test how it works. So "logfile=logdir:imp.log" generates an imp.log file, right? And you want the content of that file inside MY.LOG?
Try:
TYPE imp.log >> my.log
If IMPDP writes the info to the console, you can try adding ">> my.log" at the end of the command line instead.
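If impdp's console output is what's missing from the pipe, one thing to try inside do-work.cmd is folding stderr into stdout so the outer "| tee my.log" can see it, and/or typing the Data Pump log afterwards. A sketch only; the C:\logdir path is an assumption about where the logdir directory object points:

```batch
rem Fold impdp's stderr into stdout so the outer "| tee my.log" captures it
rem (impdp may write some of its progress messages to stderr).
impdp user/pw@db directory=mydir dumpfile=my.dmp logfile=logdir:imp.log schemas=a,b,c,c parallel=6 2>&1

rem Alternatively, send the detailed Data Pump log to stdout afterwards.
rem ASSUMPTION: the "logdir" directory object maps to C:\logdir on disk.
type C:\logdir\imp.log
```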

Related

Snowsql script not executed when run as here document from bash script

I have a bash script that contains something like this:
snowsql -a <account> -u <user> --authenticator externalbrowser -d <dbname> -o quiet=false <<-EOF
!source foo.sql
EOF
When I run it I don't see any of the output from the commands in foo.sql on the screen. It also appears that none of the SQL in foo.sql is executed (not reflected in state of database). Terminal output is:
* SnowSQL * v1.2.13
Type SQL statements or !help
Goodbye!
If I run foo.sql from an interactive Snowsql session the output from foo.sql is shown and the database is updated accordingly.
Why is foo.sql not executed when called in batch mode from the bash script?
Can you try running it with the -f parameter?
https://docs.snowflake.com/en/user-guide/snowsql-use.html#running-while-connecting-f-connection-parameter
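That is, skipping the here-document entirely, something like this sketch (same connection options as in the question; -f is the documented option for running a script file):

```
snowsql -a <account> -u <user> --authenticator externalbrowser -d <dbname> -o quiet=false -f foo.sql
```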
As @Gokhan says, this looks to be down to the authentication method. Authenticating via a non-browser method makes the problem go away.

Run sqlplus commands from multiple files with query logging or spool in Windows Batch File

I am quite new to batch scripting, and I am trying to run multiple SQL files, each of which may contain multiple SQL DML/DDL queries, from a bat file. The output files must contain all the queries being executed and the query output. Unlike this example, I don't have a spool command inside my SQL file, and I cannot edit the input SQL files. The following command works for me in a ksh file (thanks to the here-document):
$sqlplus /nolog <<! >>$sqlLogs.lst
connect $USERNAME/$PASSWORD@${DBNAME}
set echo on timing on
spool ${SCRIPTRUNFILE_SPOOL}
select name from v\$database;
@${SCRIPTRUNFILE};
spool off
exit
!
I want the exact same thing in a Windows bat file. I have tried using ^. I can't combine all the SQL files into one, as I need the logging for each SQL file to go to a different file. My attempt at the bat file is as follows; I have played around with this a lot, and it fails with "spool" not being recognized as a command. I also tried prefixing the lines below with sqlplus, but was still unable to achieve something like the ksh file above:
sqlplus -s username/pwd@DBName >> sqlLogs.lst
set echo on timing on
spool %RUNFILENAME%.lst
@%RUNFILENAME% > %RUNFILENAME%.lst
select name from v\$database;
spool off
quit
The following logic executes my scripts but does not log the queries being executed. Also, I don't want to connect to the database twice.
echo select name from v$database; | sqlplus -s username/pwd@DBName >> sqlLogs.lst
echo quit | sqlplus -s username/pwd@DBName @%SCRIPTRUNFILE% >> %SCRIPTRUNFILE_SPOOL%.lst
Can someone here please help to spool to a file where I can log the queries as well, while maintaining a single DB Connection?
Pass your immediate SQL*Plus commands to sqlplus via stdin, grouped together by Windows' ( and ) symbols...
(
echo.set echo on timing on
echo.spool %SCRIPTRUNFILE_SPOOL%
echo.select name from v$database;
echo.#%SCRIPTRUNFILE%
echo.spool off
echo.exit
) | sqlplus -s %USERNAME%/%PASSWORD%@%DBNAME% >> sqlLogs.lst
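For what it's worth, the same grouping trick exists in POSIX shells too; here is a toy sketch with cat standing in for sqlplus, just to show that the parenthesised echoes arrive as a single stdin stream:

```shell
# Group several echo lines; the pipe delivers them as one stdin stream.
(
  echo 'set echo on timing on'
  echo 'select name from v$database;'
  echo 'exit'
) | cat    # cat stands in for sqlplus here
```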

PSQL: How can I prevent any output on the command line?

My problem: I'm trying to run a database generation script at the command line via a batch file as part of a TFS build process to enable nightly testing on a known dataset.
The scripts we run are outputting Notices, Warnings and some Errors on the command line. I would like to suppress at least the Notices and Warnings, and if possible the Errors as they don't seem to have an impact on the overall success of the scripts. This output seems to be affecting the success or failure of the process as far as the TFS build process is concerned. It's highlighting every line of output from the scripts as errors and failing the build.
As our systems are running on Windows, most of the potential solutions I've found online don't work as they seem to target Linux.
I've changed the client_min_messages to error in the postgresql.conf file, but when looking at the same configuration from pgAdmin (tools > server configuration) it shows the value as Error but the current value as Notice.
All of the lines in the batch file that call psql use the -q flag as well but that only seems to prevent the basics such as CREATE TABLE and ALTER TABLE etc.
An example line from the batch file is:
psql -d database -q < C:\Database\scripts\script.sql
Example output line from this command:
WARNING: column "identity" has type "unknown"
DETAIL: Proceeding with relation creation anyway.
Specifying the file with the -f flag makes no difference.
I can manually run the batch file on my development machine and it produces the expected database regardless of what errors or messages show on the command prompt.
So ultimately I need all psql commands in my batch files to run silently.
psql COMMAND > output.txt 2>&1
Or, using your example command:
psql -d database -q < C:\Database\scripts\script.sql > output.txt 2>&1
(Note that the bash shorthand &> output.txt does not work in cmd.exe; use the explicit > output.txt 2>&1 form in a batch file.)
Use the psql -o flag to send the command output to a filename of your choice, or to /dev/null (NUL on Windows) if you don't care about it.
The -q option will not prevent the query output.
-q, --quiet run quietly (no messages, only query output)
to avoid the output you have to send the query result to a file
psql -U username -d db_name -pXXXX -c "SELECT * FROM table_name LIMIT 5;" > C:\test.csv
use > : creates a new file each time
use >> : creates the file if needed and keeps appending
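To make the operators above concrete, here is a POSIX-shell sketch (echo stands in for psql; cmd.exe accepts the same >, >> and 2>&1 forms):

```shell
# ">" truncates and recreates the file on every run; ">>" appends.
echo "first run" > out.txt
echo "second run" > out.txt       # out.txt now holds only "second run"
echo "appended line" >> out.txt   # out.txt now holds both lines

# "2>&1" folds stderr into the same file: the portable spelling of bash's "&>".
{ echo "stdout line"; echo "stderr line" 1>&2; } > both.txt 2>&1
cat out.txt both.txt
```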

Oracle Script File Execution in *.bat file not completing

I have the following *.bat file executing on a schedule. The file consists of different tasks, primarily getting the backup dump file imported into the Test DB machine every day.
#echo off
#echo %time%
del C:\Factory\index.sql
echo exporting indexes...
Imp jksbschema/password file=C:\DBDUMP\jksb.dmp full=yes log=C:\Factory\log\dump_index.log indexfile=index.sql
#echo %time%
echo connecting to Oracle...
echo Delete and Re-Creation of User
sqlplus "sys/password as sysdba" @C:\Factory\script\step_1.sql
echo importing dump file begins...
#echo %time%
Imp jksbschema/password file=C:\DBDUMP\jksb.dmp full=yes log=C:\Factory\log\dump.log Compile=n INDEXES=n
echo Applying security privileges
#echo %time%
sqlplus "sys/password as sysdba" @C:\Factory\script\step_2.sql
#echo %time%
ssr 0 "CONNECT" "REM CONNECT" C:\Factory\index.sql
echo Loading Indexes....
sqlplus "jksbschema/password as sysdba" @C:\Factory\index.sql
#echo %time%
exit
shutdown.exe -r -t 00
Since it's a long operation, I run it every night, expecting it to be ready the next morning. I also include a command to restart the machine once the entire process is over. All the commands finish without asking for input except the following command:
sqlplus "jksbschema/password as sysdba" @C:\Factory\index.sql
After the above command executes, the batch file asks for input to exit the process, as shown in the snapshot below.
Can anybody help in automatically finish the job and restart itself, rather than ask for an input after the index.sql file is loaded?
Create a file named quit.txt and put the text "quit" inside it.
Where you see the lines that execute sqlplus, like this one:
sqlplus "jksbschema/password as sysdba" @C:\Factory\index.sql
change them to this:
sqlplus "jksbschema/password as sysdba" @C:\Factory\index.sql < quit.txt
Old question, but anyway: make "exit;" the last command of your SQL script. That tells sqlplus to quit and finish execution.
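The helper file can also be created inside the bat file itself; a sketch:

```batch
rem Write the single word "quit" into quit.txt.
rem (No space before ">" so the line carries no trailing blank.)
echo quit>quit.txt
sqlplus "jksbschema/password as sysdba" @C:\Factory\index.sql < quit.txt
```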

How to make stderr of OS commands ran with "!" go to Sql*plus spool?

Background
I have a long Sql*plus script that for some reason needs to run some unix commands using the exclamation mark syntax.
I write to a spool in order to have a log file at the end of the process.
The problem is that if an OS command fails, the stderr is lost and doesn't go to the spooled file.
Example code
spool mylog.txt
!echo alter user xxx identified by yyyy;
alter user xxx identified by yyyy;
!echo cp file1 folder1/
!cp file1 folder1/
!echo alter user yyy identified by xxx;
alter user yyy identified by xxx;
!echo cp file2 folder2/
!cp file2 folder2/
spool off
If one cp fails, I wouldn't know just by looking at mylog.txt
Obviously doing !cp file1 folder1/ &> mylog.txt would only mess up the log being spooled to in unpredictable ways.
Question
What can be done so that the stderr of the unix commands is written to the file being spooled to?
Update
I tried lc.'s suggestion, appending 2>&1 at the end of every cp command in order to redirect stderr to stdout, but I get this:
Enter value for 1:
Update 2
SET DEFINE OFF stopped it prompting for a value. It allowed me to discover that it's not only stderr that doesn't get spooled: stdout doesn't either. It seems that everything executed with "!" is un-spool-able.
Actually, the stdout and stderr are not lost but they won't go in the spool file.
Given the script echo.sql :
spool sql.out
select 1 from dual ;
!echo 1
!invalid command
spool off
If you launch your script from a shell as so :
sqlplus *connect string* @echo.sql > host.out 2> host.err
You will get the output from your SQL command in sql.out, the output from echo in host.out, and the error message in host.err. If you're launching your script non-interactively (from cron or something) you'll have to capture stdout/stderr as you would for any other non-sql*plus script.
Edit regarding comment
With the option SET TERMOUT ON in the script file you will have the output of your sql commands both in the spool file and stdout.
So finally, if you want everything in one file you can call your script as so :
sqlplus *connect string* @echo.sql &>echo.log
You will still have the output of just the sql in the spool file.
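The stdout/stderr split is easy to see with any command; here is a minimal POSIX-shell sketch, with a failing cp standing in for the ! commands:

```shell
# stdout lands in host.out; stderr (the failed cp's message) lands in host.err.
# Anything sqlplus spools would go to its own spool file independently of both.
{ echo 'output from !echo'; cp no-such-file /tmp/; } > host.out 2> host.err
cat host.err    # the cp error message ends up here
```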
You can append a 2>&1 to each command to redirect stderr to stdout.
If you want to send shell command output to your sqlplus spool file, you can't do it while SPOOL has the file locked for writing. Here's the sequence in the .sql script that worked for me:
SPOOL out.log
...sql stuff here...
SPOOL OFF
host ps -eaf | grep something 1>>out.log
SPOOL out.log APPEND
...more sql stuff...