Can someone please tell me how to call an Oracle stored procedure from a Perl script?
I have a procedure that already exists in the Oracle database, say the one below.
Case 1) It returns a ref cursor and accepts an IN parameter.
CREATE OR REPLACE PROCEDURE procCursorExample(
cursorParam OUT SYS_REFCURSOR, userNameParam IN VARCHAR2)
IS
BEGIN
OPEN cursorParam FOR
SELECT * FROM DBUSER WHERE USERNAME = userNameParam;
END;
In SQL Developer I can then execute it directly:
DECLARE
dbUserCursor SYS_REFCURSOR;
dbUserTable DBUSER%ROWTYPE;
BEGIN
procCursorExample(dbUserCursor,'mkyong');
LOOP
FETCH dbUserCursor INTO dbUserTable;
EXIT WHEN dbUserCursor%NOTFOUND;
dbms_output.put_line(dbUserTable.user_id);
END LOOP;
CLOSE dbUserCursor;
END;
Can someone tell me how to invoke this procedure with an argument from a Perl script?
Answer
#!/usr/bin/perl
use warnings ;
use strict ;
use DBI;
print "Connecting to DB..";
my $dbh = DBI->connect('dbi:Oracle:xe', 'scott', 'tiger') or
die "Cannot connect to DB => " . DBI->errstr;
# prepare ????????
I am not sure about the prepare statement. Any help is highly appreciated.
Please read the documentation examples for calling stored procedures in Perl with DBD::Oracle, which is the driver you are using.
From this link specifically:
use DBI;
my($db, $csr, $ret_val);
$db = DBI->connect('dbi:Oracle:database','user','password')
or die "Unable to connect: $DBI::errstr";
# So we don't have to check every DBI call we set RaiseError.
# See the DBI docs now if you're not familiar with RaiseError.
$db->{RaiseError} = 1;
# Example 1 Eric Bartley <bartley#cc.purdue.edu>
#
# Calling a PLSQL procedure that takes no parameters. This shows you the
# basics of what you need to execute a PLSQL procedure. Just wrap your
# procedure call in a BEGIN END; block just like you'd do in SQL*Plus.
#
# p.s. If you've used SQL*Plus's exec command all it does is wrap the
# command in a BEGIN END; block for you.
$csr = $db->prepare(q{
BEGIN
PLSQL_EXAMPLE.PROC_NP;
END;
});
$csr->execute;
# Example 2 Eric Bartley <bartley#cc.purdue.edu>
#
# Now we call a procedure that has 1 IN parameter. Here we use bind_param
# to bind our parameter to the prepared statement just like you might
# do for an INSERT, UPDATE, DELETE, or SELECT statement.
#
# I could have used positional placeholders (e.g. :1, :2, etc.) or
# ODBC style placeholders (e.g. ?), but I prefer Oracle's named
# placeholders (but few DBI drivers support them so they're not portable).
my $err_code = -20001;
$csr = $db->prepare(q{
BEGIN
PLSQL_EXAMPLE.PROC_IN(:err_code);
END;
});
$csr->bind_param(":err_code", $err_code);
# PROC_IN will RAISE_APPLICATION_ERROR which will cause the execute to 'fail'.
# Because we set RaiseError, the DBI will croak (die) so we catch that with eval.
eval {
$csr->execute;
};
print 'After proc_in: $#=',"'$#', errstr=$DBI::errstr, ret_val=$ret_val\n";
# Example 3 Eric Bartley <bartley#cc.purdue.edu>
#
# Building on the last example, I've added 1 IN OUT parameter. We still
# use placeholders in the call to prepare, the difference is that
# we now call bind_param_inout to bind the value to the place holder.
#
# Note that the third parameter to bind_param_inout is the maximum size
# of the variable. You normally make this slightly larger than necessary.
# But note that the Perl variable will have that much memory assigned to
# it even if the actual value returned is shorter.
my $test_num = 5;
my $is_odd;
$csr = $db->prepare(q{
BEGIN
PLSQL_EXAMPLE.PROC_IN_INOUT(:test_num, :is_odd);
END;
});
# The value of $test_num is _copied_ here
$csr->bind_param(":test_num", $test_num);
$csr->bind_param_inout(":is_odd", \$is_odd, 1);
# The execute will automagically update the value of $is_odd
$csr->execute;
print "$test_num is ", ($is_odd) ? "odd - ok" : "even - error!", "\n";
# Example 4 Eric Bartley <bartley#cc.purdue.edu>
#
# What about the return value of a PLSQL function? Well treat it the same
# as you would a call to a function from SQL*Plus. We add a placeholder
# for the return value and bind it with a call to bind_param_inout so
# we can access its value after execute.
my $whoami = "";
$csr = $db->prepare(q{
BEGIN
:whoami := PLSQL_EXAMPLE.FUNC_NP;
END;
});
$csr->bind_param_inout(":whoami", \$whoami, 20);
$csr->execute;
print "Your database user name is $whoami\n";
$db->disconnect;
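For your specific case (an IN parameter plus a SYS_REFCURSOR OUT parameter), the same pattern applies, except that the cursor placeholder is bound with DBD::Oracle's ORA_RSET type and comes back as an ordinary DBI statement handle you can fetch from. A minimal sketch, assuming the procCursorExample procedure above and the xe/scott/tiger connection from your stub:
#!/usr/bin/perl
use strict;
use warnings;
use DBI;
use DBD::Oracle qw(:ora_types);   # exports ORA_RSET

my $dbh = DBI->connect('dbi:Oracle:xe', 'scott', 'tiger', { RaiseError => 1 })
    or die "Cannot connect to DB => " . DBI->errstr;

# Wrap the procedure call in an anonymous PL/SQL block, as in the examples above.
my $sth = $dbh->prepare(q{
    BEGIN
        procCursorExample(:cursor, :username);
    END;
});

my $cursor;  # will receive a statement handle for the ref cursor
$sth->bind_param(':username', 'mkyong');
$sth->bind_param_inout(':cursor', \$cursor, 0, { ora_type => ORA_RSET });
$sth->execute;

# $cursor now behaves like a normal DBI statement handle for SELECT * FROM DBUSER ...
while (my @row = $cursor->fetchrow_array) {
    print join(', ', map { defined $_ ? $_ : 'NULL' } @row), "\n";
}
$cursor->finish;
$dbh->disconnect;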
Related
I have a stored procedure in which a linked server is addressed only under certain conditions.
Especially over slow network connections, it becomes clear that the execution of the stored procedure is slowed down simply by naming (and not using) the linked server (the runtime grows by roughly 5 seconds).
DROP PROC IF EXISTS dbo.slowMystery;
GO
CREATE PROC dbo.slowMystery
AS
BEGIN
DECLARE @innerTimestamp DATETIME = GETDATE();
IF 1 = 0
BEGIN -- this will never be executed!
SELECT TOP 1
*
FROM
[remoteSqlInstance].someDb.dbo.someTable;
END;
ELSE
BEGIN
PRINT 'Hello world!';
END;
PRINT CONCAT('inner runtime: ', DATEDIFF(MILLISECOND, @innerTimestamp, GETDATE()));
END;
GO
Now, give it a try:
DECLARE @outerTimestamp DATETIME = GETDATE();
EXEC dbo.slowMystery;
PRINT CONCAT('outer runtime: ', DATEDIFF(MILLISECOND, @outerTimestamp, GETDATE()));
Result:
Hello world!
inner runtime: 0
outer runtime: 3733
As you can see in the example, the code referring to the linked server is never executed.
The (inner) runtime of the stored procedure is therefore 0 milliseconds. However, the total execution time is around 4 seconds (the connection to the linked server goes over a slow VPN), even though the linked server is de facto never addressed.
If you remove the SELECT statement, it quickly becomes clear that the outer runtime is not determined by other factors:
...
IF 1 = 0
BEGIN -- this will never be executed!
PRINT 'This will never be executed!'
END;
...
Result:
Hello world!
inner runtime: 0
outer runtime: 0
I think this is because the local server checks if the linked server is available before executing the stored procedure.
Is there an option to turn off this check or some other way to get around these delays?
Thanks for any input.
Robert
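One workaround that is often suggested for this kind of compile-time linked-server lookup (offered here only as a hedged sketch, not a verified fix for this environment): move the remote query into dynamic SQL, so the four-part name is only resolved when that branch actually executes. The procedure name below is made up for illustration:
DROP PROC IF EXISTS dbo.slowMysteryDeferred;
GO
CREATE PROC dbo.slowMysteryDeferred
AS
BEGIN
    DECLARE @innerTimestamp DATETIME = GETDATE();
    IF 1 = 0
    BEGIN
        -- The linked-server name now lives inside a string, so it should only be
        -- resolved (and the remote instance contacted) if this EXEC really runs.
        EXEC sp_executesql N'SELECT TOP 1 * FROM [remoteSqlInstance].someDb.dbo.someTable;';
    END
    ELSE
    BEGIN
        PRINT 'Hello world!';
    END
    PRINT CONCAT('inner runtime: ', DATEDIFF(MILLISECOND, @innerTimestamp, GETDATE()));
END;
GO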
I'm using Airflow for some ETL things, and in some stages I would like to use temporary tables (mostly to keep the code and data objects self-contained and to avoid using a lot of metadata tables).
Using the Postgres connection in Airflow and the PostgresOperator, the behaviour I found was that each execution of a PostgresOperator gets a new connection (or session, if you like) in the database. In other words, we lose all temporary objects created by the previous component of the DAG.
To emulate the problem with a simple example, I use this code (do not run it, just look at the objects):
import os
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.postgres_operator import PostgresOperator
default_args = {
'owner': 'airflow'
,'depends_on_past': False
,'start_date': datetime(2018, 6, 13)
,'retries': 3
,'retry_delay': timedelta(minutes=5)
}
dag = DAG(
'refresh_views'
, default_args=default_args)
# Create database workflow
drop_exist_temporary_view = "DROP TABLE IF EXISTS temporary_table_to_be_used;"
create_temporary_view = """
CREATE TEMPORARY TABLE temporary_table_to_be_used AS
SELECT relname AS views
,CASE WHEN relispopulated = 'true' THEN 1 ELSE 0 END AS relispopulated
,CAST(reltuples AS INT) AS reltuples
FROM pg_class
WHERE relname = 'some_view'
ORDER BY reltuples ASC;"""
use_temporary_view = """
DO $$
DECLARE
is_correct integer := (SELECT relispopulated FROM temporary_table_to_be_used WHERE views LIKE '%<<some_name>>%');
view_to_refresh text := 'some_view';
start_time timestamptz;
BEGIN
start_time := clock_timestamp();
IF is_correct = 0 THEN
EXECUTE 'REFRESH MATERIALIZED VIEW ' || view_to_refresh || ' WITH DATA;';
ELSE
EXECUTE 'REFRESH MATERIALIZED VIEW CONCURRENTLY ' || view_to_refresh || ' WITH DATA;';
END IF;
END;
$$ LANGUAGE plpgsql;
"""
# Objects to be executed
drop_exist_temporary_view = PostgresOperator(
task_id='drop_exist_temporary_view',
sql=drop_exist_temporary_view,
postgres_conn_id='dwh_staging',
dag=dag)
create_temporary_view = PostgresOperator(
task_id='create_temporary_view',
sql=create_temporary_view,
postgres_conn_id='dwh_staging',
dag=dag)
use_temporary_view = PostgresOperator(
task_id='use_temporary_view',
sql=use_temporary_view,
postgres_conn_id='dwh_staging',
dag=dag)
# Data workflow
drop_exist_temporary_view >> create_temporary_view >> use_temporary_view
At the end of execution, I receive the following message:
[2018-06-14 15:26:44,807] {base_task_runner.py:95} INFO - Subtask: psycopg2.ProgrammingError: relation "temporary_table_to_be_used" does not exist
Does anyone know whether Airflow has some way to retain the same connection to the database? I think it could save a lot of work creating and maintaining several objects in the database.
You can retain the connection to the database by building a custom operator that leverages the PostgresHook to keep a single connection to the db open while you perform a set of SQL operations.
You may find some examples in contrib on incubator-airflow or in Airflow-Plugins.
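A minimal sketch of such an operator, assuming the Airflow 1.x-style imports used in the question (the class name here is made up for illustration). Because every statement runs through one connection, temporary tables created by one statement are still visible to the next:
from airflow.hooks.postgres_hook import PostgresHook
from airflow.models import BaseOperator
from airflow.utils.decorators import apply_defaults

class PostgresMultiStatementOperator(BaseOperator):
    """Run several SQL statements on a single connection/session."""

    @apply_defaults
    def __init__(self, sql_statements, postgres_conn_id='postgres_default', *args, **kwargs):
        super(PostgresMultiStatementOperator, self).__init__(*args, **kwargs)
        self.sql_statements = sql_statements
        self.postgres_conn_id = postgres_conn_id

    def execute(self, context):
        hook = PostgresHook(postgres_conn_id=self.postgres_conn_id)
        conn = hook.get_conn()            # one session for everything below
        cursor = conn.cursor()
        for statement in self.sql_statements:
            cursor.execute(statement)     # temp tables survive between these calls
        conn.commit()
        cursor.close()
        conn.close()

The three SQL strings from the DAG above could then be passed together as sql_statements of a single task, so they share one session instead of three.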
Another option is to persist this temporary data to XCOMs. This will give you the ability to keep the metadata used with the task in which it was created. This may help troubleshooting down the road.
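For small result sets, a sketch of the XCom route could look like the following, reusing the dag object from the question and Airflow 1.x-style PythonOperators (task and key names are illustrative):
from airflow.hooks.postgres_hook import PostgresHook
from airflow.operators.python_operator import PythonOperator

def extract_view_stats(**context):
    hook = PostgresHook(postgres_conn_id='dwh_staging')
    rows = hook.get_records(
        "SELECT relname, relispopulated, CAST(reltuples AS INT) "
        "FROM pg_class WHERE relname = 'some_view';")
    # Push the (small) result to XCom instead of a temporary table.
    context['ti'].xcom_push(key='view_stats', value=rows)

def refresh_view(**context):
    rows = context['ti'].xcom_pull(task_ids='extract_view_stats', key='view_stats')
    # decide between REFRESH MATERIALIZED VIEW and ... CONCURRENTLY based on rows
    print(rows)

extract = PythonOperator(task_id='extract_view_stats', python_callable=extract_view_stats,
                         provide_context=True, dag=dag)
refresh = PythonOperator(task_id='refresh_view', python_callable=refresh_view,
                         provide_context=True, dag=dag)
extract >> refresh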
I am trying to write a batch script using Oracle's SQLcl. In this script, I want to insert a new table row with util.execute, which only returns true/false as a boolean indicator of success or failure.
My question is: how do I get the error message of the exception that is thrown, so that I can find out what the problem is with my INSERT statement?
What I do:
First of all, I connect to my database server and start my script:
me#pc:/myproject$ /sqlcl/bin/sql schemaname/pw#server.com:1521/sid
SQLcl: Release 17.3.0 Production [...]
Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit [...]
SQL>
SQL> #mybatchscript.js path/image.jpg
My mybatchscript.js looks like this:
script
var tabName = "MY_TABLE_NAME";
var HashMap = Java.type("java.util.HashMap");
var bindmap = new HashMap();
var filePath="&1";
print("\nreading file: "+ filePath);
var blob=conn.createBlob();
var bstream=blob.setBinaryStream(1);
java.nio.file.Files.copy(java.nio.file.FileSystems.getDefault().getPath(filePath),bstream);
bstream.flush();
bindmap.put("content",blob); // has content
bindmap.put("size",blob.length()); // is 341989
// the follow command fails
var doInsert = util.execute("insert into "
+ tabName
+ " (id, main_id, file_name, file_type,"
+ " file_size, file_content, table_name)"
+ " values("
+ " SEQ_MY_TABLE_NAME.nextval, 1,"
+ " 'testname', 'image/jpeg', :size, :content,"
+ " 'my_table_name')"
,bindmap);
sqlcl.setStmt(
"show errors \n"
);
sqlcl.run();
if(!doInsert) {
print("insert failed");
print(doInsert);
exit;
}
/
The console output looks like this:
reading file: path/image.jpg
insert failed
false
The script works until the util.execute insert statement. It returns false, so the INSERT failed, but it doesn't tell me why. How do I get access to the error message or the exception that is thrown inside util.execute?
I also tried turning on SERVEROUTPUT and ERRORLOGGING, but the output is the same as above and the error log table is empty:
SQL> set errorlogging on
SQL> show errorlogging
errorlogging is ON TABLE SPERRORLOG
SQL> set serveroutput on
SQL> show serveroutput
serveroutput ON SIZE UNLIMITED FORMAT WORD_WRAPPED
My knowledge source was this set of slides, which my script is also based on; I didn't find any information about error/exception handling for the util functions in general.
There are basically two ways.
1- When using util.execute (or any of the util.XYZ functions), the last error message is retrieved with the following. I also just updated the scripting README with this: https://github.com/oracle/oracle-db-tools/blob/master/sqlcl/README.md
var msg = util.getLastException()
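Applied to the script in the question, the failing branch could print that message, for example (a sketch; doInsert is the variable from mybatchscript.js above):
if (!doInsert) {
    // util.getLastException() returns the error raised inside util.execute,
    // e.g. the ORA- message explaining why the INSERT failed
    print("insert failed: " + util.getLastException());
    exit;
}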
2- When using sqlcl.run()
There's an example I wrote here:
https://github.com/oracle/oracle-db-tools/blob/master/sqlcl/examples/audio.js
The example is a tad silly in that it makes noises on success/failure, but you'll see the code that gets the error. Check the ctx.getProperty("sqldev.last.err.message") call; that will get the last SQL error message.
if ( ctx.getProperty("sqldev.last.err.message") ) {
//
// FAILED !
//
play("chew_roar.wav");
} else {
//
// Success !!
//
play("R2.wav");
}
I installed the DEXIF package and am able to read some EXIF entries, but not the computed values described in the documentation.
The following code shows what works. For the commented-out lines I get errors like: identifier idents no member "focalLenght", and so on.
How can I get hold of these and more fields?
procedure TForm1.EXIFAnzeigen(filename: string);
var
ImgData: TImgData;
i :integer;
begin
// Clear the editor
ValueListEditor1.Strings.Clear;
if FileExists(filename) then begin
ImgData:= TImgData.Create();
ImgData.Tracelevel :=1;
try
if uppercase(ExtractFileExt(filename)) = '.JPG' then begin
if ImgData.ProcessFile(filename) then begin
if ImgData.HasEXIF then begin
ValueListEditor1.InsertRow('Camera Make',
ImgData.ExifObj.CameraMake,True);
ValueListEditor1.InsertRow('Camera Modell',
ImgData.ExifObj.CameraModel,True);
ValueListEditor1.InsertRow('Picture DateTime',
FormatDateTime(ISO_DATETIME_FORMAT, ImgData.ExifObj.GetImgDateTime),True);
ValueListEditor1.InsertRow('Width',
inttostr(ImgData.ExifObj.Width),True);
ValueListEditor1.InsertRow('FlashUsed',
intToStr(ImgData.ExifObj.FlashUsed),True);
// ValueListEditor1.InsertRow('FocalLength',
// inttostr(ImgData.ExifObj.FocalLength),True);
// ValueListEditor1.InsertRow('ApertureFNumber',
// ImgData.ExifObj.ApertureFNumber,True);
// ValueListEditor1.InsertRow('ExposureTime',
// ImgData.ExifObj.ExposureTime,True);
// ValueListEditor1.InsertRow('Distance',
// ImgData.ExifObj.Distance,True);
// ValueListEditor1.InsertRow('Process',
// ImgData.ExifObj.Process,True);
end else begin
ValueListEditor1.InsertRow('No EXIF','No Data',True);
end;
end else begin
ValueListEditor1.InsertRow('No EXIF','Processdata',True);
end;
end else begin
ValueListEditor1.Strings.Clear;
end;
finally
ImgData.Free;
end;
end;
end;
The documentation says:
Some of the more common fields are accessible as properties of the
EXIFObj of the ImgData.
and shows an example of reading those properties, partly the same ones you already read successfully with your code.
But FocalLength, and the others that fail in your code, have to be accessed in a different way, as the documentation says:
Other EXIF fields can be read by using the property TagValue and
specifying the name of the EXIF property
The following example clarifies:
ValueListEditor1.InsertRow('FocalLength',
inttostr(ImgData.ExifObj.TagValue['FocalLength']),True);
I have the following code, which connects to an Oracle database via SOAP, creates an XML blob, and returns it to the screen.
I am receiving the following error and cannot figure out why:
array(3) {
["faultcode"]=>
string(11) "soap:Client"
["faultstring"]=>
string(22) "Error processing input"
["detail"]=>
array(1) {
["OracleErrors"]=>
string(39) "
Incorrect Input Doc/URL
"
}
}
I am using the following function to call the stored procedure:
function getUsersData(){
$xmlfunc = 'GETUSERS';
$pkg = 'JSON_EXPORTS';
$inparam = array("SESSIONHASH-VARCHAR2-IN" => $_SESSION['sessionhash']);
$outparam = array("USERSDATA-XMLTYPE-OUT");
$oradb = oradb::getconnection();
$oradb->newxml($xmlfunc,$pkg,$inparam,$outparam);
$result = $oradb->getxml(false,false,false,true);
print_r($result);
}
This is the stored procedure I am calling:
CREATE OR REPLACE PACKAGE BODY vivouser.json_exports IS
-- #Oracle bexV2
PROCEDURE getusers(sessionhash IN VARCHAR2,
usersdata OUT XMLTYPE)
IS
p_companyid number;
p_storegroupid number;
p_userid number;
BEGIN
bexcore.checksessionid(sessionhash, p_companyid, p_storegroupid, p_userid);
usersdata := bexcore.CreateXMLData(
'select userid,
tbu.companyid,
tbu.firstname,
tbu.middlename,
tbu.lastname,
tbu.gender,
tbu.payrollnumber,
tbu.ismanager,
tpt.description,
tpt.wagerate
from tbuser tbu
left join tbposition tbp using (USERID)
left join tbpositiontype tpt using (POSITIONTYPEID);'
);
END getusers;
END json_exports;
Also, please note: $_SESSION['sessionhash'] is known to be a valid hash value. All other SOAP calls using this format work as expected. Bexcore.checksessionid is also proven to be valid and is not the cause of this error, as is bexcore.CreateXMLData (they are each used in thousands of other cases in the same way and run as expected).
The problem I was having was that the user accessing the database did not have permissions set to allow calling the requested package.
use
grant all on <packagename> to <user>;
to solve this problem.
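For example, assuming a hypothetical application user named WEBAPP, the minimal grant for the package in the question would be (EXECUTE is the privilege that controls calling a package):
GRANT EXECUTE ON vivouser.json_exports TO webapp;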