Performance issue with BEFORE INSERT trigger: much slower than calling the procedure directly after the insert - Oracle

I have a BEFORE INSERT trigger that calls a procedure to process the XML in the XMLTYPE field of STAGE_TBL and insert the resulting data into PROCESSED_DATA_TBL.
I went with a BEFORE INSERT trigger (a compound trigger might also work, but I have not tried it yet) because I need to update the status on the STAGE_TBL row based on the outcome of processing the XML.
The issue I am having is that the XML can be huge, roughly 100 to 2000 rp_sendRow chunks, and when it is, the trigger takes a very long time. With about 100 rp_sendRow elements it takes around 4 minutes through the trigger.
But if I disable the trigger, insert into STAGE_TBL, and then call XML_PROCESS for the newly inserted record using its ID, it completes (processes the XML and inserts into PROCESSED_DATA_TBL) in less than a second from SQL Developer.
I cannot insert the huge XML with a plain SQL INSERT from SQL Developer because of the 4000-character literal limit, and since the database is not local I cannot use the XMLType(bfilename('XMLDIR', 'MY.xml')) option either, so I am using JDBC code to insert the large XML.
I have also called XML_PROCESS directly from JDBC for the same XML, and it took less than a second to process and insert into PROCESSED_DATA_TBL.
Please let me know why the trigger is taking so long.
I am using Oracle 11g and SQL Developer 4.1.0.19.
--Trigger Code
create or replace TRIGGER STAGE_TRIGGER
BEFORE INSERT ON STAGE_TBL
FOR EACH ROW
DECLARE
  ROW_COUNT          NUMBER;
  PROCESS_STATUS     VARCHAR2(1);
  STATUS_DESCRIPTION VARCHAR2(300);
BEGIN
  XML_PROCESS(:NEW.ID, :NEW.XML_DOCUMENT, PROCESS_STATUS, STATUS_DESCRIPTION, ROW_COUNT);
  IF (ROW_COUNT > 0) THEN
    :NEW.STATUS             := PROCESS_STATUS;
    :NEW.STATUS_DATE        := SYSDATE;
    :NEW.STATUS_DESCRIPTION := STATUS_DESCRIPTION;
    :NEW.SHRED_TS           := SYSTIMESTAMP;
  ELSE -- This is to handle the 0 records inserted scenario & exception scenarios
    :NEW.STATUS             := STATUS.ERROR;
    :NEW.STATUS_DATE        := SYSDATE;
    :NEW.STATUS_DESCRIPTION := STATUS_DESCRIPTION;
  END IF;
EXCEPTION
  WHEN OTHERS THEN
    :NEW.STATUS             := PROCESS_STATUS;
    :NEW.STATUS_DESCRIPTION := STATUS_DESCRIPTION;
    NULL;
END STAGE_TRIGGER;
--Stored Procedure
create or replace PROCEDURE XML_PROCESS (ID IN RAW, xData IN XMLTYPE, PROCESS_STATUS OUT VARCHAR2, STATUS_DESCRIPTION OUT VARCHAR2, ROW_COUNT OUT NUMBER) AS
BEGIN
INSERT ALL INTO PROCESSED_DATA_TBL
(ID,
STORE,
SALES_NBR,
UNIT_COST,
ST_FLAG,
ST_DATE,
ST,
START_QTY,
START_VALUE,
START_ON_ORDER,
HAND,
ORDER,
COMMITED,
SALES,
RECEIVE,
VALUE,
COST,
ID_1,
ID_2,
ID_3,
UNIT_PRICE,
EFFECTIVE_DATE,
STATUS,
STATUS_DATE,
STATUS_REASON)
VALUES (ID
,storenbr
,SalesNo
,UnitCost
,StWac
,StDt
,St
,StartQty
,StartValue
,StartOnOrder
,Hand
,Order
,Commit
,Sales
,Rec
,Value
,Id1
,Id2
,Id3
,UnitPrice
,to_Date(EffectiveDate||' '||EffectiveTime, 'YYYY-MM-DD HH24:MI:SS')
,'N'
,SYSDATE
,'XML PROCESS INSERT')
SELECT E.* FROM XMLTABLE('rp_send/rp_sendRow' PASSING xData COLUMNS
store VARCHAR(20) PATH 'store'
,SalesNo VARCHAR(20) PATH 'sales'
,UnitCost NUMBER PATH 'cost'
,StWac VARCHAR(20) PATH 'flag'
,StDt DATE PATH 'st-dt'
,St NUMBER PATH 'st'
,StartQty NUMBER PATH 'qty'
,StartValue NUMBER PATH 'value'
,StartOnOrder NUMBER PATH 'order'
,Hand NUMBER PATH 'hand'
,Order NUMBER PATH 'order'
,Commit NUMBER PATH 'commit'
,Sales NUMBER PATH 'sales'
,Rec NUMBER PATH 'rec'
,Value NUMBER PATH 'val'
,Id1 VARCHAR(30) PATH 'id-1'
,Id2 VARCHAR(30) PATH 'id-2'
,Id3 VARCHAR(30) PATH 'id-3'
,UnitPrice NUMBER PATH 'unit-pr'
,EffectiveDate VARCHAR(30) PATH 'eff-dt'
,EffectiveTime VARCHAR(30) PATH 'eff-tm'
) E;
ROW_COUNT := SQL%ROWCOUNT;
PROCESS_STATUS := STATUS.PROCESSED;
STATUS_DESCRIPTION := ROW_COUNT || ' Rows Successfully Inserted ';
EXCEPTION
WHEN DUP_VAL_ON_INDEX THEN
BEGIN
ROW_COUNT := 0;
PROCESS_STATUS := STATUS.ERROR;
STATUS_DESCRIPTION := SUBSTR(SQLERRM, 1, 250);
END;
WHEN OTHERS THEN
BEGIN
ROW_COUNT := 0;
PROCESS_STATUS := STATUS.ERROR;
STATUS_DESCRIPTION := SUBSTR(SQLERRM, 1, 250);
END;
END XML_PROCESS;
--Standalone anonymous block calling XML_PROCESS
SET DEFINE OFF
DECLARE
ROW_COUNT NUMBER;
PROCESS_STATUS VARCHAR2(1);
STATUS_DESCRIPTION VARCHAR2(300);
V_ID NUMBER;
V_XML XMLTYPE;
BEGIN
SELECT ID, XML_DOCUMENT INTO V_ID, V_XML FROM STAGE_TBL WHERE ID = '7954';
XML_PROCESS(V_ID, V_XML, PROCESS_STATUS, STATUS_DESCRIPTION, ROW_COUNT);
update STAGE_TBL SET STATUS = PROCESS_STATUS,
STATUS_DATE = SYSDATE,
STATUS_DESCRIPTION = STATUS_DESCRIPTION
WHERE ID = V_ID;
END;
--Sample XML
<?xml version = "1.0" encoding = "UTF-8"?>
<rp_send xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<rp_sendRow>
<store>0123</store>
<sales>022399190</sales>
<cost>0.01</cost>
<flag>true</flag>
<st-dt>2013-04-19</st-dt>
<st>146.51</st>
<qty>13.0</qty>
<value>0.0</value>
<order>0.0</order>
<hand>0.0</hand>
<order>0.0</order>
<commit>0.0</commit>
<sales>0.0</sales>
<rec>0.0</rec>
<val>0.0</val>
<id-1/>
<id-2/>
<id-3/>
<unit-pr>13.0</unit-pr>
<eff-dt>2015-06-16</eff-dt>
<eff-tm>09:12:21</eff-tm>
</rp_sendRow>
</rp_send>

There are too many unknown variables to pinpoint the problem, but with this information I see four possible answers (edited to include more options):
1) If you are inserting many rows in a single statement (INSERT ... SELECT), a FOR EACH ROW trigger will slow the statement down because it fires once per inserted row.
But your standalone example operates on only one row (ID = '7954'), so I assume the problem persists even for a single-row insert. In that case 1) is not the problem.
2) You have some kind of index on STAGE_TBL(XML_DOCUMENT). When the BEFORE INSERT trigger fires, the XMLType is not yet indexed, so your trigger calls the procedure with a non-indexed version of XML_DOCUMENT. In your standalone example, however, XML_DOCUMENT has already been inserted and indexed, so the procedure can use the index.
Complex indexes on complex objects can be used by the Oracle optimizer not only when selecting data from a table, but also when processing that data. This means that if you have an index on a particular piece of data, it can be used by a procedure that works with that data. And Oracle's XMLType is a complex object that can be indexed in many, many ways (see: http://docs.oracle.com/cd/B28359_01/appdev.111/b28369/xdb_indexing.htm#CHDCGACG).
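If you want to check whether such an index actually exists, a quick data-dictionary query (using the table name from the question) will show it; a domain index such as XDB.XMLINDEX on XML_DOCUMENT would be listed with its index type:
SELECT index_name, index_type, ityp_owner, ityp_name
FROM   user_indexes
WHERE  table_name = 'STAGE_TBL';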
I suspect the XMLTABLE function is being optimized once XML_DOCUMENT has actually been inserted into STAGE_TBL.
You can test this by calling your standalone procedure with an XML_DOCUMENT that was not extracted from STAGE_TBL (or from any table that could index the document). In that case the trigger and standalone performance should be similar.
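One way to run that test is to rebuild the XMLType from its serialized text, so the document handed to the procedure cannot be backed by anything stored for STAGE_TBL. This is only a sketch, reusing the names and the ID from the question:
DECLARE
  ROW_COUNT          NUMBER;
  PROCESS_STATUS     VARCHAR2(1);
  STATUS_DESCRIPTION VARCHAR2(300);
  V_ID               STAGE_TBL.ID%TYPE;
  V_XML              XMLTYPE;
BEGIN
  -- Re-create the XMLType from its CLOB serialization; the transient copy
  -- is not associated with any index or storage on STAGE_TBL.
  SELECT s.ID, XMLTYPE(s.XML_DOCUMENT.getClobVal())
    INTO V_ID, V_XML
    FROM STAGE_TBL s
   WHERE s.ID = '7954';
  XML_PROCESS(V_ID, V_XML, PROCESS_STATUS, STATUS_DESCRIPTION, ROW_COUNT);
  DBMS_OUTPUT.PUT_LINE(PROCESS_STATUS || ' - ' || STATUS_DESCRIPTION);
END;
/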
EDITED: You comment that you have tested the second answer and the performance problem persists. So I include a third option:
3) You have included an XML validation check constraint on STAGE_TBL, and this validation is the source of the performance difference. The standalone example does not validate the XML document, but the insert does.
You can check whether this is what is happening by disabling the trigger. If the insert without the trigger is still slow, then the problem is not the trigger but the XML validation.
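A quick way to run that check, using the trigger name from the question, is to time the bare insert with the trigger disabled:
ALTER TRIGGER STAGE_TRIGGER DISABLE;
-- insert the same large XML document into STAGE_TBL (e.g. via the JDBC code) and time it;
-- if it is still slow, the trigger is not the culprit
ALTER TRIGGER STAGE_TRIGGER ENABLE;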
EDITED: You comment that you have tested the third answer and the performance problem persists. So I include a fourth option:
4) In https://community.oracle.com/thread/2526907 a performance problem with XMLTable is described when working with big XML documents. The suggestion there is that the TABLE(XMLSequence()) approach works better in these cases, because XMLTable builds large intermediate results and TABLE(XMLSequence()) does not.
So in your INSERT statement change your SELECT from:
SELECT E.* FROM XMLTABLE('rp_send/rp_sendRow' PASSING xData COLUMNS
store VARCHAR(20) PATH 'store'
,SalesNo VARCHAR(20) PATH 'sales'
,UnitCost NUMBER PATH 'cost'
,StWac VARCHAR(20) PATH 'flag'
,StDt DATE PATH 'st-dt'
...
...
,EffectiveTime VARCHAR(30) PATH 'eff-tm'
) E;
To:
SELECT value(e).extract('//store/text()').getStringVal() store,
value(e).extract('//sales/text()').getStringVal() SalesNo,
value(e).extract('//cost/text()').getNumberVal() UnitCost,
value(e).extract('//flag/text()').getStringVal() StWac,
to_date(value(e).extract('//st-dt/text()').getStringVal(),'YYYY-MM-DD') StDt,
...
...
value(e).extract('//eff-tm/text()').getStringVal() EffectiveTime
FROM TABLE(XMLSEQUENCE(EXTRACT(xData, '/rp_send/rp_sendRow'))) e;

Related

Update of column value within a trigger

Before insert or update of any column, I want to update one system column with a standard MD5 hash of all the columns of the table the trigger is attached to. My intention is not to tailor this trigger with an enumeration of all columns for each table, but to have a function that returns the concatenated list of columns per table.
Table DDL:
create table TEST (
id int,
test varchar(100),
"_HASH" varchar(32)
);
Here is my trigger DDL that I would love to get working:
CREATE TRIGGER TEST_SYS_HASH_BEFORE_INSERT_OR_UPDATE
BEFORE INSERT OR UPDATE
ON TEST
FOR EACH ROW
DECLARE
var_columns VARCHAR2(10000);
BEGIN
var_columns := FUNC_LISTAGG_EXT('TEST');
EXECUTE IMMEDIATE 'SELECT STANDARD_HASH(' || var_columns || ', ''MD5'') from dual'
INTO :new."_HASH";
END;
However, this simply takes the column headers and sets the same hash for every row. If I do this manually, the trigger looks like the one below, which works as I want, but creating it for several tens of tables would be overwhelming:
CREATE OR REPLACE TRIGGER TEST_SYS_HASH_BEFORE_INSERT_OR_UPDATE
BEFORE INSERT OR UPDATE
ON TEST
FOR EACH ROW
DECLARE
var_columns VARCHAR(10000);
BEGIN
var_columns := FUNC_LISTAGG_EXT('TEST');
SELECT STANDARD_HASH( :new."ID" || :new."TEST" , 'MD5' )
INTO :new."_HASH"
FROM DUAL;
END;
So my question is whether this solution is achievable.
Note:
The FUNC_LISTAGG_EXT function returns a concatenated list of columns from a system view.
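One direction that looks achievable, sketched here only as an illustration (the procedure name is made up, FUNC_LISTAGG_EXT is replaced by a plain dictionary query, and STANDARD_HASH is assumed to be available as in the question): generate one trigger per table, so the :new column concatenation is resolved at creation time rather than at run time.
CREATE OR REPLACE PROCEDURE create_hash_trigger (p_table IN VARCHAR2) IS
  v_cols VARCHAR2(32767);
  v_ddl  VARCHAR2(32767);
BEGIN
  -- Build ':new."COL1" || :new."COL2" || ...' from the dictionary, skipping the hash column.
  -- (LISTAGG is limited to 4000 bytes in SQL; very wide tables would need a PL/SQL loop.)
  SELECT LISTAGG(':new."' || column_name || '"', ' || ')
           WITHIN GROUP (ORDER BY column_id)
    INTO v_cols
    FROM user_tab_columns
   WHERE table_name = p_table
     AND column_name <> '_HASH';

  v_ddl := 'CREATE OR REPLACE TRIGGER TRG_' || p_table || '_HASH' || chr(10) ||
           'BEFORE INSERT OR UPDATE ON ' || p_table || ' FOR EACH ROW' || chr(10) ||
           'BEGIN' || chr(10) ||
           '  SELECT STANDARD_HASH(' || v_cols || ', ''MD5'')' || chr(10) ||
           '    INTO :new."_HASH" FROM dual;' || chr(10) ||
           'END;';
  EXECUTE IMMEDIATE v_ddl;
END create_hash_trigger;
/
Calling create_hash_trigger('TEST') once per table would then give each table a trigger equivalent to the hand-written one above.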

Create insert record dynamically by changing pk of existing record for passed in table

I want to pass a table name and schema into a procedure, and have it generate insert, update and delete statements for the particular table. This is part of an automated testing solution (in a development environment) in which I need to test some change data capture. I want to make this dynamic as it is going to need to be done for lots of different tables over a long period of time, and I need to call it via a REST request through ORDS, so I don't want to have to make an endpoint for every table.
Update and delete are fairly easy, however I am struggling with the insert statement. Some of the tables being passed in have hundreds of columns with various constraints, fks etc. so I think it makes sense to just manipulate an existing record by changing only the primary key. I need to be able to modify the primary key to a new value known to me beforehand (e.g. '-1').
Ideally I would create a dynamic rowtype, and select into where rownum = 1, then loop round the primary keys found from all_constraints, and update the rowtype.pk with my new value, before inserting this into the table. Essentially the same as this but without knowing the table in advance.
e.g. rough idea
PROCEDURE manipulate_records(p_owner in varchar2, p_table in varchar2)
IS
cursor c_pk is
select column_name
from all_cons_columns
where owner = p_owner
and constraint_name in (select constraint_name
from all_constraints
where table_name = p_table
and constraint_type = 'P');
l_row tbl_passed_in%ROWTYPE --(I know this isn't possible but ideally)
BEGIN
-- dynamic sql or refcursor to collect a record
select * into tbl_passed_in from tablename where rownum = 1;
-- now loop through pks and reassign their values to my known value
for i in c_pk loop
...if matches then reassign;
...
end loop;
-- now insert the record into the table passed in
END manipulate_records;
I have searched around but haven't found any examples which fit this exact use case, where an unknown column needs to be modified and the row inserted into a table.
Depending on how complex your procedure is, you might be able to store it as a template in a CLOB. Then pull it in, replace table and owner, then compile it.
DECLARE
prc_Template VARCHAR2(4000);
vc_Owner VARCHAR2(0008);
vc_Table VARCHAR2(0008);
BEGIN
vc_Table := 'DUAL';
vc_Owner := 'SYS';
-- Pull code into prc_Template from CLOB, but this demonstrates the concept
prc_Template := 'CREATE OR REPLACE PROCEDURE xyz AS r_Dual <Owner>.<Table>%ROWTYPE; BEGIN NULL; END;';
prc_Template := REPLACE(prc_Template,'<Owner>',vc_Owner);
prc_Template := REPLACE(prc_Template,'<Table>',vc_Table);
-- Create the procedure
EXECUTE IMMEDIATE prc_Template;
END;
Then you have the appropriate ROWTYPE available:
CREATE OR REPLACE PROCEDURE xyz AS r_Dual SYS.DUAL%ROWTYPE; BEGIN NULL; END;
But you can't create the procedure and run it in the same code block.
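As a hedged illustration of how the same template idea could cover the insert case from the question (the names here are mine, and a single-column numeric primary key is assumed): the template body can select one row into a %ROWTYPE variable, overwrite the primary key, and insert it, with the PK column name substituted at generation time just like the owner and table.
DECLARE
  prc_template VARCHAR2(4000);
  vc_owner     VARCHAR2(30) := 'MY_SCHEMA';   -- assumption
  vc_table     VARCHAR2(30) := 'MY_TABLE';    -- assumption
  vc_pk_col    VARCHAR2(30);
BEGIN
  -- Look up the primary key column (single-column PK assumed).
  SELECT acc.column_name
    INTO vc_pk_col
    FROM all_cons_columns acc
    JOIN all_constraints ac
      ON ac.owner = acc.owner AND ac.constraint_name = acc.constraint_name
   WHERE ac.owner = vc_owner
     AND ac.table_name = vc_table
     AND ac.constraint_type = 'P';

  prc_template :=
    'CREATE OR REPLACE PROCEDURE clone_one_row AS ' ||
    '  l_row <Owner>.<Table>%ROWTYPE; ' ||
    'BEGIN ' ||
    '  SELECT * INTO l_row FROM <Owner>.<Table> WHERE ROWNUM = 1; ' ||
    '  l_row.<PkCol> := -1; ' ||                 -- assumes a numeric key value known beforehand
    '  INSERT INTO <Owner>.<Table> VALUES l_row; ' ||
    'END;';

  prc_template := REPLACE(prc_template, '<Owner>', vc_owner);
  prc_template := REPLACE(prc_template, '<Table>', vc_table);
  prc_template := REPLACE(prc_template, '<PkCol>', vc_pk_col);

  EXECUTE IMMEDIATE prc_template;
END;
/
As noted above, the generated clone_one_row procedure then has to be executed from a separate block after this one has run.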

Inserting into a table using a procedure only if the record doesn't exist yet

I have a table that I'm trying to populate via a PL/SQL script (run from PL/SQL Developer). The actual DML statement
is contained in a procedure inside a package. The procedure only inserts if the record doesn't exist yet.
It doesn't work. The part that checks for existence returns true after the first iteration of the script loop, even if the record doesn't actually exist in the table.
If I put the commit outside of the loop, nothing gets inserted at all and the existence check returns true for every iteration even if the table is empty.
When I try to simplify the insert with the existence check into just one statement without the exception handling, I get the same outcome.
Please tell me what I'm doing wrong here.
CREATE OR REPLACE PACKAGE BODY some_package
IS
PROCEDURE add_to_queue(id IN NUMBER)
IS
pending_record VARCHAR2(1);
BEGIN
-- this part succeeds even if nothing matches the criteria
-- during the loop in the outside script
SELECT 'Y'
INTO pending_record
FROM dual
WHERE EXISTS (SELECT 'x' FROM some_queue smq
WHERE smq.id = id AND smq.status IS NULL);
EXCEPTION
WHEN NO_DATA_FOUND THEN
INSERT INTO some_queue (seqno, id, activity_date)
VALUES (some_sequence.nextval, id, SYSDATE);
WHEN OTHERS THEN
NULL;
END;
END some_package;
CREATE TABLE some_queue
(
seqno VARCHAR2(500) NOT NULL,
id NUMBER NOT NULL,
activity_date DATE NOT NULL,
status VARCHAR2(25),
CONSTRAINT some_queue_pk PRIMARY KEY (seqno)
);
-- script to randomly fill in the table with ids from another table
declare
type ids_coll_tt is table of number index by pls_integer;
ids_coll_table ids_coll_tt;
cursor ids_coll_cur is
select tab.id
from (select *
from ids_source_table
order by dbms_random.value ) tab
where rownum < 10;
begin
open ids_coll_cur;
fetch ids_coll_cur bulk collect into ids_coll_table;
close ids_coll_cur;
for x in 1..ids_coll_table.count
loop
some_package.add_to_queue(ids_coll_table(x));
commit; -- if this is here, the first iteration gets inserted
end loop;
-- commit; -- if the commit is done here, nothing gets inserted
end;
Note: I translated this code to be more generic for posting. Forgive me if there are any typos.
Update: even if I put everything inside the script and do not use the package, I'm not able to properly check for existence and I get the same results.
I figured out the solution:
CREATE OR REPLACE PACKAGE BODY some_package
IS
PROCEDURE add_to_queue(p_id IN NUMBER)
IS
pending_record VARCHAR2(1);
BEGIN
-- this part succeeds even if nothing matches the criteria
-- during the loop in the outside script
SELECT 'Y'
INTO pending_record
FROM dual
WHERE EXISTS (SELECT 'x' FROM some_queue smq
WHERE smq.id = p_id AND smq.status IS NULL);
EXCEPTION
WHEN NO_DATA_FOUND THEN
INSERT INTO some_queue (seqno, id, activity_date)
VALUES (some_sequence.nextval, p_id, SYSDATE);
WHEN OTHERS THEN
NULL;
END;
END some_package;
Changing the parameter name fixed it. Inside the SQL statement the unqualified name id resolved to the table column rather than the parameter, so the EXISTS check was effectively comparing the column with itself and matched any row with a null status regardless of the id passed in.
Don't name the parameter the same as the column (use a prefix like p_ or in_) and you can do it in a single statement if you use a MERGE statement self-joining on the ROWID pseudo-column:
CREATE OR REPLACE PACKAGE BODY some_package
IS
PROCEDURE add_to_queue(
in_id IN NUMBER
)
IS
BEGIN
MERGE INTO some_queue dst
USING ( SELECT ROWID AS rid
FROM some_queue
WHERE id = in_id
AND status IS NULL ) src
ON ( src.rid = dst.ROWID )
WHEN NOT MATCHED THEN
INSERT (seqno, id, activity_date)
VALUES (some_sequence.nextval, in_id, SYSDATE);
END;
END some_package;

Is there an easy way to iterate over all :NEW values from an Oracle database trigger execution?

I am attempting to write a generic trigger that will provide all of the :NEW values for the row inserted. Ultimately I want to turn them into XML and insert the XML string into a binary field on another table.
There are a variable number of columns in each table - many times over 100 fields and over 100 tables in all, so individual mapping to XML per table is extremely time consuming.
Is there a way to reference the :NEW pseudorecord as a collection of column values - or perhaps a way to pass the whole :NEW record to a Stored Procedure that could pass it to a Java function (hosted on the database) that might make the individual values iterable?
I've found an example here:
https://docs.oracle.com/database/121/LNPLS/triggers.htm
Create history table and trigger:
CREATE TABLE tbl_history ( d DATE, old_obj t, new_obj t)
/
CREATE OR REPLACE TRIGGER Tbl_Trg
AFTER UPDATE ON tbl
FOR EACH ROW
BEGIN
INSERT INTO tbl_history (d, old_obj, new_obj)
VALUES (SYSDATE, :OLD.OBJECT_VALUE, :NEW.OBJECT_VALUE);
END Tbl_Trg;
/
This seems to imply there is some way of storing all of the values in a single variable, but the example just puts them straight back into a database table. I want to get the 'text' values of the columns listed.
You can create a stored procedure that generates your trigger.
For a table tbl like
create table tbl (id number, value varchar2(10));
and a history table like
create table tbl_history (d date,id number, value varchar2(10));
you can create your trigger like this
create or replace procedure CREATE_TRIGGER IS
trig_str VARCHAR2(32767);
col_str VARCHAR2(32767) := '(d';
values_str VARCHAR2(32767) := '(sysdate';
begin
trig_str := 'CREATE OR REPLACE TRIGGER Tbl_Trg AFTER UPDATE ON tbl FOR EACH ROW'||chr(10)||
'BEGIN'||chr(10)||chr(9)||'INSERT INTO tbl_history ';
for col in (
SELECT column_name FROM all_tab_columns where table_name = 'TBL'
) loop
col_str := col_str||','||col.column_name;
values_str := values_str||','||':OLD.'||col.column_name;
end loop;
col_str := col_str||')';
values_str := values_str||')';
trig_str := trig_str||col_str||' VALUES '||values_str||';'||chr(10)||'END;';
execute immediate trig_str;
END;
/
With a history table containing both old and new values it's a bit more complicated, but the same idea applies.
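A rough sketch of that extension, assuming the history table simply holds every source column twice with OLD_/NEW_ prefixes (those column names are an assumption, not from the original answer):
create or replace procedure CREATE_TRIGGER IS
  trig_str VARCHAR2(32767);
  col_str VARCHAR2(32767) := '(d';
  values_str VARCHAR2(32767) := '(sysdate';
begin
  trig_str := 'CREATE OR REPLACE TRIGGER Tbl_Trg AFTER UPDATE ON tbl FOR EACH ROW'||chr(10)||
              'BEGIN'||chr(10)||chr(9)||'INSERT INTO tbl_history ';
  for col in (
    SELECT column_name FROM all_tab_columns where table_name = 'TBL'
  ) loop
    -- one OLD_ and one NEW_ history column per source column
    col_str    := col_str||',OLD_'||col.column_name||',NEW_'||col.column_name;
    values_str := values_str||',:OLD.'||col.column_name||',:NEW.'||col.column_name;
  end loop;
  col_str    := col_str||')';
  values_str := values_str||')';
  trig_str := trig_str||col_str||' VALUES '||values_str||';'||chr(10)||'END;';
  execute immediate trig_str;
END;
/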

How to deal with sequence in insert from XMLTable?

I have written a PL/SQL function that takes input in XML format for the
following table:
TABLE: TBL_MEDICAL_CENTER_BILLS
Name Null Type
------------- -------- -------------
MED_RECORDNO NOT NULL NUMBER
MED_EMPID NVARCHAR2(10)
MED_BILL_HEAD NVARCHAR2(20)
MED_DATE DATE
MED_AMOUNT FLOAT(126)
Here is the function code:
FUNCTION save_medical_center_bills(medical_bill_data NVARCHAR2 ) RETURN clob IS ret clob;
xmlData XMLType;
v_code NUMBER;
v_errm VARCHAR2(100);
BEGIN
xmlData:=XMLType(medical_bill_data);
INSERT INTO TBL_MEDICAL_CENTER_BILLS SELECT x.* FROM XMLTABLE('/medical_center_bill'
PASSING xmlData
COLUMNS MED_RECORDNO NUMBER PATH 'MED_RECORDNO' default null,
MED_EMPID NVARCHAR2(11) PATH 'employee_id',
MED_BILL_HEAD NVARCHAR2(20) PATH 'bill_head' ,
MED_DATE DATE PATH 'effective_date',
MED_AMOUNT FLOAT PATH 'bill_amount'
) x;
ret:=to_char(sql%rowcount);
COMMIT;
RETURN '<result><status affectedRow='||ret||'>success</status></result>';
EXCEPTION
WHEN OTHERS THEN
v_code := SQLCODE;
v_errm := SUBSTR(SQLERRM, 1, 100);
DBMS_OUTPUT.PUT_LINE (v_code || ' ' || v_errm);
-- '<result><status>Error</status> <error_message>'|| 'Error Code:' || v_code || ' ' || 'Error Message:' || v_errm ||'</error_message> </result>';
RETURN '<result><status>Error</status> <error_message>'|| 'Error Message:' || v_errm ||'</error_message> </result>';
END save_medical_center_bills;
However, I want the table's first column MED_RECORDNO to be populated from an incrementing sequence (at the moment I am keeping it null since I don't know how to put the sequence in the XMLTable clause), while the rest of the
inputs [MED_EMPID, MED_BILL_HEAD, MED_DATE, MED_AMOUNT] will be taken from the XML passed to the function.
I created a sequence and a trigger to keep this sequence incremented for that table column MED_RECORDNO:
CREATE SEQUENCE MED_RECORDNO_SEQ;
create or replace TRIGGER MED_RECORDNO_TRIGGER
BEFORE INSERT ON TBL_MEDICAL_CENTER_BILLS FOR EACH ROW
WHEN (new.MED_RECORDNO is null)
DECLARE
v_id TBL_MEDICAL_CENTER_BILLS.MED_RECORDNO%TYPE;
BEGIN
SELECT MED_RECORDNO_seq.nextval INTO v_id FROM DUAL;
:new.MED_RECORDNO := v_id;
END;
As you can see, my XMLTable insert supplies 4 column values for a 5-column table, because the MED_RECORDNO column will take its value from the sequence MED_RECORDNO_SEQ via the trigger MED_RECORDNO_TRIGGER.
I don't know anything about doing this. If you have ever dealt with something like this, please share your ideas.
I sort of hinted at this in an earlier answer. You should specify the names of the columns in the table you are inserting into; this is good practice even if you are populating all of them, as it avoids surprises if the table structure changes (or differs between environments), and makes it much easier to spot mistakes like having columns or values in the wrong order.
INSERT INTO TBL_MEDICAL_CENTER_BILLS (MED_EMPID, MED_BILL_HEAD, MED_DATE, MED_AMOUNT)
SELECT x.MED_EMPID, x.MED_BILL_HEAD, x.MED_DATE, x.MED_AMOUNT
FROM XMLTABLE('/medical_center_bill'
PASSING xmlData
COLUMNS MED_EMPID NVARCHAR2(11) PATH 'employee_id',
MED_BILL_HEAD NVARCHAR2(20) PATH 'bill_head' ,
MED_DATE DATE PATH 'effective_date',
MED_AMOUNT FLOAT PATH 'bill_amount'
) x;
The insert you have should actually work (if the column order in the table matches); the trigger will still replace the null value you get from the XMLTable with the sequence value. At least, until you make the MED_RECORDNO column not-null, and you probably want to if it's the primary key.
Incidentally, if you're on 11g or higher your trigger can assign the sequence straight to the NEW pseudorecord:
create or replace TRIGGER MED_RECORDNO_TRIGGER
BEFORE INSERT ON TBL_MEDICAL_CENTER_BILLS
FOR EACH ROW
BEGIN
:new.MED_RECORDNO := MED_RECORDNO_seq.nextval;
END;
The WHEN (new.MED_RECORDNO is null) check implies you sometimes want to allow a value to be specified; that is a bad idea, as manually inserted values can clash with sequence values, giving you either duplicates or a unique/primary key exception.
