NHibernate Transaction fails with Insert and Update on Oracle

Post Update: I have tracked the problem down to the "ExecuteNonQuery" command. That's the one that fails during an update or hangs during an insert. Trying a simple example using plain ADO.NET and its transactions works perfectly. Also... it works great on my local home computer connecting to an Oracle Express edition. Does that point again to some kind of server config?
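For reference, the plain ADO.NET check that does work is roughly the following (a minimal sketch against the same table, using the System.Data.OracleClient provider that shows up in the stack trace below; connectionString is a stand-in for the real one):
using System.Data.OracleClient;
using (OracleConnection conn = new OracleConnection(connectionString))
{
    conn.Open();
    using (OracleTransaction tx = conn.BeginTransaction())
    {
        OracleCommand cmd = conn.CreateCommand();
        cmd.Transaction = tx;
        cmd.CommandText = "UPDATE MOB_PL_MAPPING_TEST SET DES = :p0 WHERE TEST_ID = :p1";
        cmd.Parameters.Add("p0", OracleType.VarChar).Value = "Changed!";
        cmd.Parameters.Add("p1", OracleType.Number).Value = 61;
        cmd.ExecuteNonQuery(); // succeeds here, but hangs/fails when NHibernate issues the equivalent call
        tx.Commit();
    }
}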
It would be nice to step into the NHibernate code while debugging, but so far I'm still not able to set this up, even though I have rebuilt the source and use those DLL and PDB files. Has anyone been able to do this before?
I've been scratching my head over this for a while now. I've been developing with NHibernate and an Oracle 10g database for a few days, so far only using select statements, which all work great with the mapping.
I've now started to implement my first insert (Save) and update statements, but the tests all fail.
They all fail on the transaction.Commit() part.
When performing an INSERT (Save), the code reaches transaction.Commit(), but then gets stuck. The test keeps on running without moving forward.
This is the output of the test (note that the test keeps running):
NHibernate: select hibernate_sequence.nextval from dual
NHibernate: INSERT INTO MOB_PL_MAPPING_TEST (DES, TEST_ID) VALUES (:p0, :p1);:p0 = 'This is a test!', :p1 = 161
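For context, the failing insert test boils down to something like this (a simplified sketch: Test and Description mirror the mapping shown at the end of this post, and session is an NHibernate ISession obtained from my unit-of-work wrapper):
using (ITransaction tx = session.BeginTransaction())
{
    var test = new Test { Description = "This is a test!" };
    session.Save(test);
    tx.Commit(); // the test never gets past this line
}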
When performing an UPDATE, transaction.Commit() fails and I receive the following error stack:
NHibernate: SELECT test0_.TEST_ID as TEST1_10_0_, test0_.DES as DES10_0_ FROM MOB_PL_MAPPING_TEST test0_ WHERE test0_.TEST_ID=:p0;:p0 = 61
NHibernate: UPDATE MOB_PL_MAPPING_TEST SET DES = :p0 WHERE TEST_ID = :p1;:p0 = 'Changed!', :p1 = 61
TestCase 'Data.Tests.Test_Update_on_Test_Table'
failed: NHibernate.TransactionException : Rollback failed with SQL Exception
----> System.InvalidOperationException : This OracleTransaction has completed; it is no longer usable.
c:\CSharp\NH\nhibernate\src\NHibernate\Transaction\AdoTransaction.cs(260,0): at NHibernate.Transaction.AdoTransaction.Rollback()
E:\SubVersion\Application\Src\Data\UnitOfWork\Data.UnitOfWork\GenericTransaction.cs(26,0): at Data.UOW.GenericTransaction.Rollback()
E:\SubVersion\Application\Src\Data\UnitOfWork\Data.UnitOfWork\UnitOfWorkImplementor.cs(49,0): at Data.UOW.UnitOfWorkImplementor.TransactionFlush(IsolationLevel isolationLevel)
E:\SubVersion\Application\Src\Data\UnitOfWork\Data.UnitOfWork\UnitOfWorkImplementor.cs(36,0): at Data.UOW.UnitOfWorkImplementor.TransactionFlush()
E:\SubVersion\Application\Src\Data\Data.Tests\Repositories\LoyaltyRepositoryTests.cs(159,0): at Data.Tests.Test_Update_on_Test_Table()
--InvalidOperationException
at System.Data.OracleClient.OracleTransaction.AssertNotCompleted()
at System.Data.OracleClient.OracleTransaction.Rollback()
c:\CSharp\NH\nhibernate\src\NHibernate\Transaction\AdoTransaction.cs(246,0): at NHibernate.Transaction.AdoTransaction.Rollback()
I must say I'm new to Oracle, but it seems that establishing the transaction causes the problem, though the same code (using transactions) works fine for a select statement (GET).
Could this be an Oracle config problem (blocking insert/update transactions), or do I have to configure something else at the application level?
Can anybody help me out here or shed more light on what might be causing the problem?
Thanks in advance.

After managing to hook up the NHibernate code to my debugger, I was able to step through the code up to the point where the Command object is executed.
There, the problem was to be found in the parameter types. Parameters that were strings had their type set to "String", where they were supposed to be "AnsiString".
It turned out I had already run into this article when I was mapping a string to an id:
http://www.jameskovacs.com/blog/NHibernateAndTheCaseOfTheCrappyOracleErrorMessage.aspx
but didn't think more of it.
Either way, adding the type to each string property in the mapping resolved the problem.
<property name="Description" column="DES" type="AnsiString" />
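For ad-hoc queries (rather than mapped properties), the same idea can be applied per parameter by passing NHibernateUtil.AnsiString explicitly when binding; a rough sketch, using the same simplified Test entity from above:
IQuery query = session.CreateQuery("from Test t where t.Description = :des");
query.SetParameter("des", "This is a test!", NHibernateUtil.AnsiString);
IList<Test> results = query.List<Test>();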
A hectic 3 days... but it's solved :D

Related

SSIS For Loop ODBC

I'm extracting data from a table in Oracle.
I have an ODBC connection manager to the Oracle database, and the query for extraction should include a WHERE clause, because the table contains transactional data and there is no reason to extract all of it every time.
I want to initialize the table once and do it with a For Loop which will iterate over the whole table.
Since it's an ODBC connection I can't just put in a WHERE clause, because I need to use a variable; hence I realized I need to parameterize the Data Flow task and write my query into the SqlCommand property of the ODBC source.
The property value is:
SELECT *
FROM DDC.DDC_SALES_TBL
WHERE trunc(CALDAY) between to_date('"+ #[User::vstart]+"','MM/DD/YYYY')
and to_date('"+ #[User::vstop]+"','MM/DD/YYYY')
Where #vstart and #vstop are variables containing the 'from/to' dates to be extracted, based on a DATEADD function and another variable (#vcount) which is supposed to be the iterator, as follows:
(DT_WSTR, 2) MONTH( DATEADD( "day", #[User::vcount] , GETDATE() ) )+"/"+
(DT_WSTR, 2) DAY( DATEADD( "day", #[User::vcount] , GETDATE() ) )+"/"+
(DT_WSTR, 4) YEAR( DATEADD( "day", #[User::vcount] , GETDATE() ) )
What's happening is that the first iteration works fine, but the second one generates an error and the package fails.
I marked the variable as EvaluateAsExpression=True.
I also set DelayValidation=True on both the For Loop and the Data Flow tasks.
The errors are:
(1)Data Flow Task:Error: SQLSTATE: HY010, Message: [Microsoft][ODBC Driver Manager] Function sequence error;
(2) Data Flow Task:Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "ODBC Source.Outputs[ODBC Source Output]" failed because error code 0xC020F450 occurred, and the error row disposition on "ODBC Source" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
(3) Data Flow Task:Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on ODBC Source returned error code 0xC0209029. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
Please assist.
I don't know why I didn't use OLE DB initially, as I thought it wouldn't work.
What I tried was to create an OLE DB connection via the Oracle driver; the connection manager worked, so I used it.
This way you can parameterize the source directly, and the loop worked just fine.
I don't know what causes the conflict with the ODBC source, but that's my workaround.
I didn't find a way to set up the SqlCommand property in the ODBC source and use it in a loop that changes the command every iteration. It crashed after the first iteration no matter what I tried.
Thanks,
I was having the same issue when using the Oracle Source; updating the Attunity Connectors for Oracle as well as the OLE DB driver for SQL Server fixed the problem.

Issue with GenerateTableFetch on SQL Server 2016

I'm trying to pull data from SQL Server using GenerateTableFetch. When I use a MySQL database instead of SQL Server, the same GenerateTableFetch works as expected. Whenever I connect to SQL Server I get the error below.
GenerateTableFetch[id=07bed292-0162-1000-0000-00004bc12345] failed to process session due to java.lang.IllegalArgumentException: Order by clause cannot be null or empty when using row paging: Order by clause cannot be null or empty when using row paging
SQL Server Version: 2016
I went through the link below and came to know that there is a bug in GenerateTableFetch for SQL Server. However, I'm not sure whether the bug is fixed or not.
https://github.com/apache/nifi/pull/1510
The NiFi version I'm using is 1.5.
Could someone please let me know whether the bug is fixed or not? If not, is there any workaround for it?
Here is my flow.
Edit:
GenerateTableFetch:
This is a bug in some of the DatabaseAdapters in NiFi when using GenerateTableFetch with no Max-value Column set. In this case there's a workaround: you can use the 2008 driver, then a ReplaceText processor to replace "ORDER BY asc" with "ORDER BY newid() asc". I'm trying to find out everywhere this could be an issue, and I'll write up a Jira to cover all the cases. The general symptom is OFFSET/LIMIT clauses without an ORDER BY clause.

SAP BO - Report from stored proc

I'm trying to get SAP Business Objects to build a report off of a stored procedure. I had no luck, so now I'm just trying an example/tutorial I found online of how to do it, and I can't get that to work either.
I'm following:
https://irfansworld.wordpress.com/2012/11/17/what-you-should-know-about-stored-procedure-universe-in-bi-4-0/
I created the objects in exactly the same way, as shown here:
create or replace package emp_package
as
  type emp_row_type is ref cursor return emp%rowtype;
end emp_package;
/
create or replace procedure getEmployeesByDepartment
(
  return_rows_cursor in out emp_package.emp_row_type,
  dept_parameter in emp.deptno%type
)
as
begin
  open return_rows_cursor for
    SELECT *
    FROM emp
    WHERE emp.deptno = dept_parameter;
end;
/
I get good results back:
Package EMP_PACKAGE compiled
Procedure GETEMPLOYEESBYDEPARTMENT compiled
But here is where I see a glaring difference... for me, it takes what should only be the out/return parameter and prompts me as if it were an input parameter.
Even if I say "OK" to this, it doesn't recognize the fields in the "out" cursor as fields for me to show on the report.
I've even tried changing the cursor parameter from "in out" to just "out"... but still no luck.
Any ideas as to why I can't make this example work for me?
Using SAP Universe Design Tool 4.1.
Oracle 11g
This isn't really an answer, but too long for a comment.
I just tried this with BI4.1 SP Patch 5, and the SP Editor displayed as expected (only showing the DEPT_PARAMETER parameter).
Two possibilities I can think of for the different behavior you're seeing:
One is that there's something going on with the database middleware client that's confusing BO about the parameters. I'm using Oracle 11g client, and an Oracle Native connection in BO (i.e., not ODBC or JDBC). If you're using an ODBC or JDBC connection, try the native client instead.
It might be a bug with the specific version of UDT that you're using. I'd suggest upgrading, or contacting SAP support to see if it's a known issue.

Cannot update Oracle view from JDBC

Overview: I need to read a row from an Oracle view, create a Notes document, save the document, then write the Document Unique ID back to Oracle.
I am able to connect and read data, no problem. I am using a type 4 connection to an Oracle 11 database. The Oracle view is set up to allow updating. The view has nothing in it that would prevent updates, per what is outlined here: In Oracle, is it possible to INSERT or UPDATE a record through a view?
*With the same username and password, I am able to successfully update the view by typing in a SQL statement.
*Tried using conn.setAutoCommit(false); this had no effect.
*Verified that the result set was updatable (1008)
*User has been given full DBA access (temporarily)
*I have tried every possible combination of the first parameter in the createStatement method
...
Statement statement = conn.createStatement(ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_UPDATABLE);
ResultSet rs = statement.executeQuery(fetch);
...
String UNID = doc.getUniversalID(); //gets unique id from saved Notes Document
System.out.println("This is what to write to Oracle:" + UNID);
System.out.println("is updatable=1008, not updatable=1007 value is:" + rs.getConcurrency());
System.out.println("is Result Set Closed:" + rs.isClosed());
rs.updateString("NOTES_DOC_ID", UNID);
System.out.println("got past updating NOTES_DOC_ID column");
rs.updateRow(); //fails here
Here is the error from console:
This is what to write to Oracle:BF8091259610C61B87257B16005C14FB
is updatable=1008, not updatable=1007 value is:1008
is Result Set Closed:false
got past updating NOTES_DOC_ID column
java.security.AccessControlException: Access denied (java.lang.RuntimePermission exitVM.0)
Prior to asking for the user to have DBA access I would get a
java.sql.SQLSyntaxErrorException: ORA-01031: insufficient privileges
I think this is a big clue. My DBA doesn't know what further access to give me.
The DBA wants me to start using ref cursors, which is fine, but I suspect there is some kind of security setting for JDBC access that is tripping me up, and I want to explore that first. If there is a security issue, then I don't think changing the way I read the rows is going to make a difference. Most of the documentation on how to do this was obtained from Oracle's website, as well as this site.
I am going to answer my own question and explain how I got past this roadblock. In the end, I basically did what 'a_horse_with_no_name' suggested.
Instead of using the ResultSet cursor or a ref cursor to perform the update, I was able to use a plain UPDATE statement. This was possible because I was able to convince the DBA to create a column for a unique identifier. We could never get around the exceptions caused by the updateRow() method of the ResultSet. Before he added the unique identifier, there was no key with which to reliably use an UPDATE statement.
Here is the code, where updateSQL is a string holding the UPDATE statement:
updateResultInt = updateStatement.executeUpdate(updateSQL);
It returns 1 if successful.
One word of caution: if you are using tools like TOra or SQL*Plus to check your update statements, you have to remember to manually commit them. If you don't, your Java agent will hang when it tries to run. Here is a good reference that helped me with that issue: SQL Update hangs Java program
Thanks to those who commented!

NHibernate + Oracle: speed problems when querying data

We have a WCF application which uses NHibernate to query data from the database. After installing the application in a new test environment we're facing some performance problems with queries. Our old and new environments use different Oracle servers, but both databases contain the same data.
We have gone through our NHibernate logs and identified the problematic part:
2010-12-02 07:14:22,673 NHibernate.SQL - SELECT this_.CC...
2010-12-02 07:14:22,688 NHibernate.Loader.Loader - processing result set
2010-12-02 07:14:27,235 NHibernate.Loader.Loader - result set row: 0
In this case the query returned one row. But it seems that in our new environment the "processing result set" step takes much longer (5 seconds vs. 0.5 seconds) than in our other environment. Is there some way to find out exactly what inside "processing result set" is taking so long?
Note: executing the exact same query directly against the DB with Toad doesn't reproduce the problem. With Toad, both database servers are equally fast.
We are using DetachedCriteria to create the query, and then it is executed like this:
Dim criteria As ICriteria = crit.GetExecutableCriteria(GetSession())
Return New Generic.List(Of T)(criteria.List(Of T))
The version of NHibernate is 2.1.2.4 and we're using ActiveRecord 2.1.0 to create the mappings. The Oracle servers are version 10g.
So in our case we have two environments that run the same version of the application with identical configuration files and query identical databases, but which have different application and Oracle servers. In one environment querying through NHibernate takes around 5.5 seconds and in the other 0.5 seconds. The results are consistent, and the same query has been executed around 50 times in both environments.
Is there something in the Oracle configuration which could cause it to misbehave with NHibernate? And is there a way to get more detailed logging out of NHibernate, so that the exact problem inside "processing result set" can be found?
Any advice is greatly appreciated.
We were able to fix our problem by switching the database driver from Microsoft's to Oracle's ODP.NET. Now both servers are equally fast, and even our previously fast server executes the queries much more rapidly. We don't know what setting on our new server made Microsoft's Oracle driver so slow.
And it seems that Microsoft nowadays recommends that everyone use something other than their own Oracle driver: http://blogs.msdn.com/b/adonet/archive/2009/06/15/system-data-oracleclient-update.aspx
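For completeness, the driver switch itself is only a configuration change; a rough sketch in code, which is the equivalent of setting the connection.driver_class property in whatever configuration source you use (hibernate.cfg.xml, app.config, or the ActiveRecord config section). The ODP.NET assembly, Oracle.DataAccess, has to be deployed alongside the application:
var cfg = new NHibernate.Cfg.Configuration();
// previously: "NHibernate.Driver.OracleClientDriver" (the System.Data.OracleClient based driver)
cfg.SetProperty(NHibernate.Cfg.Environment.ConnectionDriver, "NHibernate.Driver.OracleDataClientDriver");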
Do a SQL trace in both environments by adding these statements to your session:
alter session set timed_statistics=true;
alter session set max_dump_file_size=unlimited;
alter session set events '10046 trace name context forever, level 8' ;
-- your query goes here
select * from mytable where x= 1;
alter session set events '10046 trace name context off';
Then use tkprof to examine the trace file (go to the user_dump_dest, usually a directory with udump in the name, and run tkprof inputtracefilename.trc outputfile.log).
Type tkprof by itself to see the help screen and command options.
Also, check that you are using the same settings in the INIT.ORA for things like CURSOR_SHARING= in both databases.
