How can I extract or read data from an xlsx file with the Oracle APEX data parser using the worksheet name instead of the worksheet code?
In the sample code below, you can see that the worksheet's internal code name is what is used:
SELECT line_number,
       col001,
       col002,
       col003
  FROM TABLE (apex_data_parser.parse (p_content         => (SELECT file_content
                                                              FROM loader_files
                                                             WHERE id = 1),
                                      p_file_name       => 'data.xlsx',
                                      p_xlsx_sheet_name => 'sheet1.xml',
                                      p_skip_rows       => 0));
My requirement is to use the worksheet name "employees" instead of the code name "sheet1.xml". The reason is that we need to extract data from multiple worksheets, and the position of the worksheets might change, which changes the worksheet code names and therefore breaks my code.
Is there any way this can be done?
Use the function GET_XLSX_WORKSHEETS to convert a SHEET_DISPLAY_NAME into a SHEET_FILE_NAME:
select
    line_number, col001, col002, col003
from table
     (
       apex_data_parser.parse
       (
         p_content   => (select file_content from loader_files where id = 1),
         p_file_name => 'data.xlsx',
         p_skip_rows => 0,
         -- resolve the internal sheet file name from the display name
         p_xlsx_sheet_name =>
         (
           select sheet_file_name
           from table
                (
                  apex_data_parser.get_xlsx_worksheets
                  (
                    p_content => (select file_content from loader_files where id = 1)
                  )
                )
           where sheet_display_name = 'employees'
         )
       )
     );
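To see what the parser knows about each worksheet, you can also run GET_XLSX_WORKSHEETS on its own; it returns one row per sheet, with both the display name and the internal file name:
select sheet_display_name, sheet_file_name
  from table
       (
         apex_data_parser.get_xlsx_worksheets
         (
           p_content => (select file_content from loader_files where id = 1)
         )
       );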
I have used the APEX_ITEM.DATE_POPUP() function to give the user an entry for selecting a date from a calendar; it should be populated with SYSDATE by default:
APEX_ITEM.DATE_POPUP (2, ROWNUM, ISSUED_DATE, 'dd-mon-yyyy')
Try this:
APEX_ITEM.DATE_POPUP (2, ROWNUM, nvl(ISSUED_DATE, sysdate), 'dd-mon-yyyy')
Directly copied from my code, where I have it as:
APEX_ITEM.DATE_POPUP (
    p_idx         => 2,
    p_value       => (CASE
                        WHEN UPPER(cur_parameter.default) = 'SYSDATE' THEN SYSDATE
                        WHEN UPPER(cur_parameter.default) != 'SYSDATE'
                             AND cur_parameter.default IS NOT NULL THEN TO_DATE(cur_parameter.default)
                        ELSE NULL
                      END),
    p_item_id     => cur_parameter.name,
    p_date_format => cur_parameter.format,
    p_item_label  => cur_parameter.name)
The CASE is there because I keep the defaults stored in a table, and most of the time the default date is SYSDATE; however, it is stored as a string, not as SYSDATE. cur_parameter is a row from a cursor.
If a call has more than one or two parameters, it's always best to use named notation; it avoids confusion and helps prevent mix-ups.
APEX_ITEM.DATE_POPUP (p_idx => 2, p_value => SYSDATE, p_item_id => ISSUED_DATE, p_date_format => 'DD-MON-YYYY') IssuedDate,
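For context, a minimal sketch of how such a call might sit inside a report query (the invoices table and its issued_date column are assumptions):
SELECT APEX_ITEM.DATE_POPUP (
           p_idx         => 2,
           p_value       => NVL(issued_date, SYSDATE),  -- default to today when empty
           p_item_id     => 'ISSUED_DATE',
           p_date_format => 'DD-MON-YYYY') IssuedDate
  FROM invoices;  -- invoices is a placeholder table name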
I'm building an Oracle connector that reads data periodically from a couple of very big tables; some are divided into partitions.
I'm trying to figure out which tables were updated since the last time they were read, to avoid unnecessary queries. I have the last ora_rowscn or updated_at, but the only methods I have found require a full table scan to see whether there are new or updated rows in the table.
Is there a way to tell whether a row was inserted or updated in a table without the full scan?
A couple of ideas:
1. Create a meta table that stores the last DML timestamp per table_name, then create a simple trigger on each big table to update the meta table (see the sketch after this list).
2. Create a Materialized View Log on the table and use the data from the log to determine the changes.
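A minimal sketch of idea 1 (all object names here are hypothetical):
CREATE TABLE table_dml_log (
    table_name  VARCHAR2(128) PRIMARY KEY,
    last_dml_at TIMESTAMP
);

-- statement-level trigger: fires once per INSERT/UPDATE/DELETE statement,
-- so the overhead on the big table stays small
CREATE OR REPLACE TRIGGER trg_big_table_dml
AFTER INSERT OR UPDATE OR DELETE ON big_table
BEGIN
    MERGE INTO table_dml_log l
    USING (SELECT 'BIG_TABLE' AS table_name FROM dual) s
       ON (l.table_name = s.table_name)
    WHEN MATCHED THEN
        UPDATE SET l.last_dml_at = SYSTIMESTAMP
    WHEN NOT MATCHED THEN
        INSERT (table_name, last_dml_at)
        VALUES (s.table_name, SYSTIMESTAMP);
END;
/
The connector then only needs to read table_dml_log to decide which tables changed. For idea 2, the log is created with a single statement (again, big_table is a placeholder):
CREATE MATERIALIZED VIEW LOG ON big_table WITH ROWID;
and the change records appear in the generated MLOG$_BIG_TABLE table.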
If there are archived logs for the search period, you can use the LogMiner utility. For example, suppose this row was inserted:
insert into "ASOUP"."US"("KEY_COLUMN","COD_ROAD","COD_COMPUTER","COD_STATION_OPER","NUMB_TRAIN","STAT_CREAT","NUMB_SOSTAVA","STAT_APPOINT","COD_OPER","DIRECT_1","DIRECT_2","DATE_OPER","PARK","PATH","LOCOMOT","LATE","CAUSE_LATE","COD_CONNECT","CATEGORY","TIME") values ('42018740','988','0','9200','2624','8642','75','9802','1','8891','0',TO_DATE('18-Dec-2018', 'DD-Mon-RRRR'),'0','0','0','0','0','0',NULL,TO_DATE('18-Dec-2018', 'DD-Mon-RRRR'));
First, find the archived logs that cover the period:
select name, first_time, next_time
  from v$archived_log
 where first_time > sysdate - 3/24;
/oracle/app/oracle/product/11.2/redolog/edcu/1_48060_769799469.dbf 18-Dec-2018 09:03:06 18-Dec-2018 10:22:00
/oracle/app/oracle/product/11.2/redolog/edcu/1_48061_769799469.dbf 18-Dec-2018 10:22:00 18-Dec-2018 10:30:02
/oracle/app/oracle/product/11.2/redolog/edcu/1_48062_769799469.dbf 18-Dec-2018 10:30:02 18-Dec-2018 10:56:07
Run the LogMiner utility:
EXECUTE DBMS_LOGMNR.ADD_LOGFILE(LOGFILENAME => '/oracle/app/oracle/product/11.2/redolog/edcu/1_48060_769799469.dbf', OPTIONS => DBMS_LOGMNR.NEW);
EXECUTE DBMS_LOGMNR.ADD_LOGFILE(LOGFILENAME => '/oracle/app/oracle/product/11.2/redolog/edcu/1_48061_769799469.dbf', OPTIONS => DBMS_LOGMNR.ADDFILE);
EXECUTE DBMS_LOGMNR.ADD_LOGFILE(LOGFILENAME => '/oracle/app/oracle/product/11.2/redolog/edcu/1_48062_769799469.dbf', OPTIONS => DBMS_LOGMNR.ADDFILE);
EXECUTE DBMS_LOGMNR.START_LOGMNR(OPTIONS => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG);
Then query v$logmnr_contents for the table:
SELECT scn, row_id, to_char(timestamp, 'DD-MM-YYYY HH24:MI:SS') time_stamp,
       table_name, seg_name, operation, sql_redo, sql_undo
  FROM v$logmnr_contents
 WHERE seg_owner = 'ASOUP' AND table_name = 'US';
SCN            ROW_ID              TIME_STAMP           TABLE_NAME  SEG_NAME       OPERATION
1398405575908  AAA3q2AAoAACFweABi  18-12-2018 09:03:15  US          US,ADCU201902  INSERT

SQL_REDO:
insert into "ASOUP"."US" ("KEY_COLUMN","COD_ROAD","COD_COMPUTER","COD_STATION_OPER",
                          "NUMB_TRAIN","STAT_CREAT","NUMB_SOSTAVA","STAT_APPOINT",
                          "COD_OPER","DIRECT_1","DIRECT_2","DATE_OPER","PARK","PATH",
                          "LOCOMOT","LATE","CAUSE_LATE","COD_CONNECT","CATEGORY","TIME")
values ('42018727','988','0','8800','4404','1','895','8800','1','8838','0',
        TO_DATE('18-Dec-2018','DD-Mon-RRRR'),'4','2','0','0','0','0',NULL,
        TO_DATE('18-Dec-2018','DD-Mon-RRRR'));

SQL_UNDO:
delete from "ASOUP"."US"
 where "KEY_COLUMN" = '42018727' and "COD_ROAD" = '988' and "COD_COMPUTER" = '0'
   and "COD_STATION_OPER" = '8800' and "NUMB_TRAIN" = '4404' and "STAT_CREAT" = '1'
   and "NUMB_SOSTAVA" = '895' and "STAT_APPOINT" = '8800' and "COD_OPER" = '1'
   and "DIRECT_1" = '8838' and "DIRECT_2" = '0'
   and "DATE_OPER" = TO_DATE('18-Dec-2018','DD-Mon-RRRR')
   and "PARK" = '4' and "PATH" = '2' and "LOCOMOT" = '0' and "LATE" = '0'
   and "CAUSE_LATE" = '0' and "COD_CONNECT" = '0' and "CATEGORY" IS NULL
   and "TIME" = TO_DATE('18-Dec-2018','DD-Mon-RRRR')
   and ROWID = 'AAA3q2AAoAACFweABi';
You can then see the inserted row without a full scan:
select * from asoup.us where ROWID = 'AAA3q2AAoAACFweABi';
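When you are done, end the LogMiner session to release its resources:
EXECUTE DBMS_LOGMNR.END_LOGMNR;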
I am working with an API that does a lot of calculations, covering almost 100 database fields, inside a big foreach loop.
On every iteration I insert data into the database. I want to insert all the data once at the end instead (a batch insert, like in CodeIgniter).
Does anybody have an idea how to insert all the data at the end of the loop, rather than inserting a row on every iteration? Any help or idea is appreciated.
Use the insert() method for bulk insertion. First, build an array with this structure:
$data = [
['name' => 'John', 'age' => 25],
['name' => 'Maria', 'age' => 31],
['name' => 'Julia', 'age' => 55],
];
Then insert the data using an Eloquent model:
Model::insert($data);
Or using the query builder:
DB::table('table_name')->insert($data);
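Putting it together with the loop from the question (a minimal sketch; $items and the column names are assumptions):
$data = [];

foreach ($items as $item) {
    // ... your calculations for the ~100 fields go here ...
    $data[] = [
        'name' => $item->name,
        'age'  => $item->age,
    ];
}

// one round trip to the database instead of one insert per iteration
DB::table('table_name')->insert($data);
If the array can get very large, you can also split it up, e.g. foreach (array_chunk($data, 500) as $chunk) { DB::table('table_name')->insert($chunk); }, to keep each statement a manageable size.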
I'm using ctx_doc.markup to highlight search results and insert them into a temporary table, and then I retrieve the results from the temporary table. Everything runs in one transaction. However, the results get deleted from the temporary table (or are never inserted?) before I can retrieve them. If I use a normal table, it works fine. Here's the code I'm using:
BEGIN
FOR cur_rec IN (SELECT id FROM contents WHERE CONTAINS(text, 'test', 1) > 0)
LOOP
CTX_DOC.markup(
index_name => 'I_CONTENTS_TEXT',
textkey => TO_CHAR(cur_rec.id),
text_query => 'test',
restab => 'CONTENTS_MARKUP',
query_id => cur_rec.id,
plaintext => FALSE,
tagset => 'HTML_NAVIGATE');
END LOOP;
END;