Get output of a stored procedure using PutSQL in NiFi (Oracle)

I need to get the output of an Oracle stored procedure in NiFi.
I've tried PutSQL with the following SQL statement:
declare out VARCHAR2(100); begin PKG_TEST.P_TEST(1, out); end;
It works fine, but it just executes the script.
How can I get the value of the output parameter 'out'?
Edit: I tried the Groovy script here:
https://community.cloudera.com/t5/Support-Questions/Does-ExecuteSQL-processor-allow-to-execute-stored-procedure/td-p/158922
I get the following error :
2022-06-17 13:38:53,353 ERROR [Timer-Driven Process Thread-9] o.a.n.p.groovyx.ExecuteGroovyScript ExecuteGroovyScript[id=26ab18f1-3b0c-18cf-d90b-3d5904676458] groovy.lang.MissingMethodException: No signature of method: Script6a6d0a35$_run_closure1.doCall() is applicable for argument types: (String, String, java.sql.Date, null, String, null, null, String...) values: [xxxx, xxxx, 2022-05-30, null, OK, null, null, ...]: groovy.lang.MissingMethodException: No signature of method: Script6a6d0a35$_run_closure1.doCall() is applicable for argument types: (String, String, java.sql.Date, null, String, null, null, String...) values: [xxxx,xxxx, 2022-05-30, null, OK, null, null, ...]
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:255)
So the procedure does produce output, but I get this error!
Script :
import org.apache.commons.io.IOUtils
import org.apache.nifi.controller.ControllerService
import org.apache.nifi.processor.io.StreamCallback
import java.nio.charset.*
import groovy.sql.OutParameter
import groovy.sql.Sql
import java.sql.ResultSet
////Get the session values from Nifi flow Start
def flowFile = session.get()
if(!flowFile) return
String TYPE_NOTIFICATION = flowFile.getAttribute('TYPE_NOTIFICATION')
String ID_NOTIFICATION = flowFile.getAttribute('ID_NOTIFICATION')
////Get the session values from Nifi flow END
String sqlString ="""{call PKG_TEST.P_TEST(?,?,?,?,?,?,?,?,?,?,?)}""";
def parametersList = [ID_NOTIFICATION, TYPE_NOTIFICATION,Sql.VARCHAR,Sql.VARCHAR,Sql.DATE,Sql.VARCHAR,Sql.VARCHAR,Sql.VARCHAR,Sql.VARCHAR,Sql.VARCHAR,Sql.DATE ];
SQL.mydbxx.call(sqlString, parametersList) {out1, out2,...->
flowFile.putAttribute("out1",out1)...
};
session.transfer(flowFile, REL_SUCCESS)
Signature of my stored procedure:
Thank you!

I can't test it, so this is just reference code.
Use the ExecuteGroovyScript processor.
Add an SQL.mydb property at the processor level and link it to the required DBCP pool.
Then set approximately this as the script body:
def ff=session.get()
if(!ff)return
def statement = '''
begin
  PKG_TEST.P_TEST(?, ?);
end;
'''
//parameters for each ? placeholder
def params = [
ff.param_input as Long, //get parameter value from flowfile attribute
SQL.mydb.VARCHAR, //out varchar parameter https://docs.groovy-lang.org/latest/html/api/groovy/sql/Sql.html#VARCHAR
]
SQL.mydb.call(statement, params){p_out-> //we have only one out parameter
//closure to process output parameters
ff.param_output = p_out //assign value into flowfile attribute
}
//transfer flowfile to success
REL_SUCCESS << ff
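Groovy's Sql.call is a thin wrapper over a JDBC CallableStatement, so the same OUT parameter can be read with plain JDBC. A minimal sketch, assuming P_TEST takes one numeric IN and one VARCHAR2 OUT parameter as in the question; the connection details are placeholders and buildCall is just a hypothetical helper for the JDBC call escape syntax:

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Types;

public class OutParamSketch {
    // Builds the JDBC call escape syntax for a procedure with n parameters.
    static String buildCall(String procName, int paramCount) {
        StringBuilder sb = new StringBuilder("{call ").append(procName).append("(");
        for (int i = 0; i < paramCount; i++) {
            sb.append(i == 0 ? "?" : ", ?");
        }
        return sb.append(")}").toString();
    }

    // Executes the procedure and returns the VARCHAR2 OUT parameter.
    static String callProcedure(Connection con) throws SQLException {
        try (CallableStatement cs = con.prepareCall(buildCall("PKG_TEST.P_TEST", 2))) {
            cs.setLong(1, 1L);                         // IN parameter
            cs.registerOutParameter(2, Types.VARCHAR); // OUT parameter
            cs.execute();
            return cs.getString(2);                    // the value of 'out'
        }
    }

    public static void main(String[] args) {
        System.out.println(buildCall("PKG_TEST.P_TEST", 2));
        // A live call would need a real connection, e.g.:
        // try (Connection con = DriverManager.getConnection(url, user, pass)) {
        //     System.out.println(callProcedure(con));
        // }
    }
}
```

In the NiFi case, the Connection is exactly what the DBCP pool behind SQL.mydb hands out; the Groovy closure receives the values that registerOutParameter/getString expose here.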

Related

DolphinDB error when calling udf: “The object date is neither a XDB connection nor a function definition.”

server version: 2.00.8 2022.09.28
I define a function myFunc in DolphinDB as follows. The definition and each line of the function body execute successfully on their own, but an error occurs when the whole script is run:
Server response: 'myFunc("20230118", t) => myFunc: tmp = select * from t where date(dt) == date => The object date is neither a XDB connection nor a function definition.'
dt = 2023.01.18T04:01:51.100 2023.01.19T04:01:51.000 2023.01.19T04:01:51.900
sym = ["IBM", "MSFTN", "GOOGS"]
value = 1..3
t=table(dt, sym, value)
def myFunc(day, t){
date = temporalParse(day, "yyyyMMdd")
tmp = select * from t where date(dt)=date
return tmp
}
myFunc("20230118", t)
It works after simply renaming the variable date to dateParsed:
def myFunc(day, t){
dateParsed = temporalParse(day, "yyyyMMdd")
tmp = select * from t where date(dt)=dateParsed
return tmp
}
The error is reported because a variable named "date" is defined in the function body and the built-in function date is called later. When the function is parsed, date(dt) cannot be recognized as a function call because the variable is resolved first. It is therefore not recommended to define variables with the same name as built-in functions or other keywords.
The script can be corrected to:
def myFunc(day, t){
myDate = temporalParse(day, "yyyyMMdd")
tmp = select * from t where date(dt)=myDate
return tmp
}
myFunc("20230118", t)

Oracle OCI error when executing stored procedure with output parameter

I am trying to execute a simple stored procedure using Oracle OCI. The stored procedure takes a string as an input and copies it to the output parameter. Below is the oracle statement that I am executing:
DECLARE OutParam VARCHAR2(50);
BEGIN
my_stored_procedure('Test String', OutParam);
END;
And I wrote the OCI code as follows:
/* Bind a placeholder for the output parameter */
if (status = OCIBindByPos(stmthp, &bnd6p, errhp, 1,
(dvoid *)result, 1024, SQLT_STR,
(dvoid *)0, (ub2 *)0, (ub2 *)0, (ub4)0, (ub4 *)0, OCI_DEFAULT))
{
checkerr(errhp, status);
cleanup();
return OCI_ERROR;
}
/* execute and fetch */
if (status = OCIStmtExecute(svchp, stmthp, errhp, (ub4)1, (ub4)0,
(CONST OCISnapshot *) NULL, (OCISnapshot *)NULL, OCI_DEFAULT))
{
if (status != OCI_NO_DATA)
{
checkerr(errhp, status);
cleanup();
return OCI_ERROR;
}
}
With Oracle 11g and older versions, this worked fine and I was able to get the output parameter stored in the 'result' variable I used in the OCIBindByPos call.
However, with Oracle 12 and above this does not work for me; I get the following error:
OCI_ERROR - ORA-03137: malformed TTC packet from client rejected: [kpoal8Check-5] [32768]
Does anyone know why this does not work with Oracle versions 12 and above? I tested this with Oracle 12 and Oracle 19 and got the same error.

What is wrong with my Oracle RAW return param?

I have a stored procedure with the following signature and local variables:
PROCEDURE contract_boq_import(i_project_id IN RAW,
i_boq_id IN RAW,
i_master_list_version IN NUMBER,
i_force_update_if_exists IN BOOLEAN,
i_user_id IN NUMBER,
o_boq_rev_id OUT RAW) AS
v_contract_id RAW(16);
v_contract_no VARCHAR2(100);
v_series_rev_id_count NUMBER(1);
v_project_id_count NUMBER(5);
v_now DATE;
v_boq_import_rev_id RAW(16);
v_master_project_id RAW(16);
v_prj_duplicate_items VARCHAR2(1000) := '';
I set up an output parameter using one of our DAL utilities:
var revParam = new byte[16];
dataHandler.CreateParameterRaw("o_boq_rev_id", revParam).Direction = ParameterDirection.Output;
Where CreateParameterRaw is declared as:
public DbParameter CreateParameterRaw(string name, object value)
{
OracleParameter oracleParameter = new OracleParameter();
oracleParameter.ParameterName = name;
oracleParameter.OracleDbType = OracleDbType.Raw;
oracleParameter.Value = value;
this.Parameters.Add((DbParameter) oracleParameter);
return (DbParameter) oracleParameter;
}
Then when I execute the procedure with ExecuteNonQuery I get the following error:
Oracle.ManagedDataAccess.Client.OracleException
HResult=0x80004005
Message=ORA-06502: PL/SQL: numeric or value error: raw variable length too long
ORA-06512: at "ITIS_PRCDRS.PA_PRJ_IMP", line 1235
The exception is thrown on line 1235:
o_boq_rev_id := v_boq_import_rev_id;
As you can see from the procedure declaration above, v_boq_import_rev_id has type RAW(16) and o_boq_rev_id has type OUT RAW, so why should the assignment on line 1235 fail? What am I doing wrong?
PS: The proc executes fine when I call it in plain PL/SQL.
In OracleParameter, the default Size is 0 for parameters that can carry a size. (Official reference here.)
That is why you need to modify the method which generates the RAW parameters. Below is the modified method:
public DbParameter CreateParameterRaw(string name, object value, int parameterSize)
{
OracleParameter oracleParameter = new OracleParameter();
oracleParameter.ParameterName = name;
oracleParameter.OracleDbType = OracleDbType.Raw;
oracleParameter.Value = value;
oracleParameter.Size = parameterSize; /* THIS IS THE ADDED PARAMETER */
this.Parameters.Add((DbParameter) oracleParameter);
return (DbParameter) oracleParameter;
}
And as a result, you can pass the size when calling CreateParameterRaw, as in your existing code:
var revParam = new byte[16];
/* CHECK THE 16 value in the parameters that are sent to CreateParameterRaw */
dataHandler.CreateParameterRaw("o_boq_rev_id", revParam, 16).Direction = ParameterDirection.Output;
Additional suggestion: to keep apples with apples, I would suggest taking the Direction parameter into the CreateParameterRaw method as well. That way, CreateParameterRaw becomes wholly responsible for generating the parameters.
Credits:
Official page: https://docs.oracle.com/en/database/oracle/oracle-database/12.2/odpnt/ParameterCtor5.html#GUID-04BE7E69-A80A-4D28-979A-CDC2516C0F93
A blog that has similar problem: http://devsilos.blogspot.com/2013/01/ora-06502-with-out-parameter-called.html?m=1
Size usage from Microsoft: https://learn.microsoft.com/en-us/dotnet/api/system.data.oracleclient.oracleparameter.size?view=netframework-4.8
This is an interesting problem with a weird solution.
When using RAW as an output parameter, you MUST provide some buffer space for it when adding the parameter.
Please provide buffer space for this variable and try something like the following:
byte[] RAWPlaceholder = new byte[16];
cmd.AddParameter(new OracleParameter("o_boq_rev_id",
OracleDbType.Raw,
16,
RAWPlaceholder,
ParameterDirection.Output));
Please share the result of the aforementioned exercise.
Thanks

Passing large BLOBs to Stored Procedure

I have a simple (example) script to upload a file into the database (Oracle, if it matters):
<cfscript>
param string filename;
if ( FileExists( filename ) )
{
result = new StoredProc(
datasource = "ds",
procedure = "FILE_UPLOAD",
result = "NA",
parameters = [
{ value = FileReadBinary( filename ), type = "in", cfsqltype = "CF_SQL_BLOB" }
]
).execute();
}
</cfscript>
However, the ColdFusion CFML Reference states for FileReadBinary( filepath ):
Note:
This action reads the file into a variable in the local Variables scope. It is not intended for use with large files, such as logs, because they can bring down the server.
If I should not use FileReadBinary( filepath ), how should I upload a large (0.5-1 TB) file?
If using Java is an option, then you can pass an InputStream object to a PreparedStatement for filling a Blob field. Something like this, exception handling and all other stuff to be added:
Connection con = someDataSource.getConnection();
String sql = "INSERT INTO MY_TABLE(MY_BLOB) VALUES(?)";
PreparedStatement ps = con.prepareStatement(sql);
InputStream fis = new FileInputStream("MyBigFile.big");
ps.setBlob(1, fis);
ps.executeUpdate();
I think Java will do it using buffers, and not load the whole file into memory.
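That intuition can be checked without a database: copying a stream in fixed-size chunks bounds peak memory at the buffer size no matter how large the source is. A sketch of the principle only, not of the driver's actual internals:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class ChunkedCopy {
    // Copies in fixed-size chunks, so peak memory stays at the buffer size
    // regardless of how large the input stream is.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(buffer)) != -1) {
            out.write(buffer, 0, n);
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[100_000];
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(data), sink);
        System.out.println(copied); // prints 100000
    }
}
```

A JDBC driver that honours setBlob(int, InputStream) can pull from the stream in a similar chunked fashion, which is why the whole file never needs to be materialised in memory.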
As suggested by @Galcoholic, you can utilise the underlying Java classes and use CallableStatement.setBlob( int, InputStream ):
<cfscript>
param string filename;
// Get the necessary Java classes:
Files = createObject( 'java', 'java.nio.file.Files' );
Paths = createObject( 'java', 'java.nio.file.Paths' );
// Do not timeout the request
setting requesttimeout = 0;
try {
input = Files.newInputStream( Paths.get( filename, [] ), [] );
connection = createObject( 'java', 'coldfusion.server.ServiceFactory' )
.getDataSourceService()
.getDataSource( 'ds' )
.getConnection()
.getPhysicalConnection();
statement = connection.prepareCall( '{call FILE_UPLOAD(?)}' );
statement.setBlob( JavaCast( 'int', 1 ), input );
statement.executeUpdate();
}
finally
{
if ( isDefined( "statement" ) )
statement.close();
if ( isDefined( "connection" ) )
connection.close();
}
</cfscript>
Note:
Every argument must be supplied to a Java method, so for methods with a variable number of arguments the varargs must be passed as an array (or an empty array for no additional arguments).
ColdFusion numeric values will not be implicitly coerced to Java numeric literals so JavaCast( 'int', value ) is required.
Error handling is not included in the above example.
If the files have been uploaded then the "Maximum size of post data" and "Request Throttle Memory" settings in the Admin console will need to be increased from the default sizes to an appropriate limit for the size of the files being uploaded (otherwise coldfusion.util.MemorySemaphore will throw out-of-memory exceptions when it handles the upload before the script gets parsed).

Return one value from a table using a function

I have this code:
public int GetUserIdByEmail(string email)
{
using (SqlConnection conn = new SqlConnection(ZincModelContainer.CONNECTIONSTRING))
{
using (SqlCommand cmd = conn.CreateCommand())
{
conn.Open();
cmd.CommandType = System.Data.CommandType.Text;
cmd.CommandText = String.Concat("SELECT [Zinc].[GetUserIdByEmail] (", email, ")"); //is this correct??? the problem lies here
return (int)cmd.ExecuteScalar();
}
}
}
I get the error in the above code; this is still not right.
I now have my function as below, suggested by veljasije.
Thanks.
Modify your procedure:
CREATE PROCEDURE [Zinc].[GetUserIdByEmail]
(
@Email varchar(100)
)
AS
BEGIN
SELECT zu.UserId from Zinc.Users zu WHERE Email = @Email
END
And in you code change type of parameter from NVarChar to VarChar
Function
CREATE FUNCTION [Zinc].[GetUserIdByEmail]
(
@Email varchar(100)
)
RETURNS int
AS
BEGIN
DECLARE @UserId int;
SET @UserId = (SELECT zu.UserId from Zinc.Users zu WHERE Email = @Email)
RETURN @UserId
END
Firstly, specify the size for the @Email parameter in the sproc - without it, it will default to 1 character and will therefore never match the value you expect it to.
Always specify the size explicitly to avoid any issues (e.g. per Marc_s's comment, plus a demo I blogged about here, it behaves differently by defaulting to 30 chars when using CAST/CONVERT).
Secondly, use SqlCommand.ExecuteScalar()
e.g.
userId = (int)cmd.ExecuteScalar();
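For completeness, the concatenation in the question is the underlying bug: the email ends up unquoted in the SQL text (and is an injection risk). The fix is a bound parameter. A sketch of the same call in JDBC terms, since ADO.NET's SqlParameter follows the identical pattern; connection setup is omitted and the function name is taken from the question:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class ScalarFunctionCall {
    // The scalar function is invoked with a bound parameter
    // instead of concatenating the value into the SQL text.
    static final String SQL = "SELECT [Zinc].[GetUserIdByEmail](?)";

    static int getUserIdByEmail(Connection con, String email) throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(SQL)) {
            ps.setString(1, email); // the driver quotes/escapes the value safely
            try (ResultSet rs = ps.executeQuery()) {
                rs.next();
                return rs.getInt(1);
            }
        }
    }

    public static void main(String[] args) {
        System.out.println(SQL);
    }
}
```

The C# equivalent is cmd.Parameters together with a "@Email" placeholder in the CommandText; the ExecuteScalar cast shown above then works as expected.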
