I'm new to Oracle on AWS RDS; we have an RDS instance deployed and an S3 bucket. The download from S3 works fine, but I wanted to write a script that checks whether the file download has completed.
DECLARE
V_TASKID VARCHAR2(100);
V_CTR integer := 0;
V_CMD VARCHAR2(4000) := NULL;
BEGIN
SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(p_bucket_name => 'humm-cards-dev', p_directory_name => 'DATA_PUMP_DIR') INTO V_TASKID FROM DUAL;
dbms_output.put_line( V_TASKID ) ;
WHILE V_CTR = 0 LOOP
dbms_output.put_line(V_CMD);
V_CMD := 'SELECT count(*) FROM table(rdsadmin.rds_file_util.read_text_file(''BDUMP'', ''dbtask-' || V_TASKID || '.log'')) WHERE text LIKE ''%finished successfully%''';
dbms_output.put_line(V_CMD);
execute immediate V_CMD INTO V_CTR;
IF V_CTR > 0 THEN EXIT; END IF;
END LOOP;
END;
/
The script kicks off the download - I get the TaskID and the V_CMD output in the DBMS Output.
However, the script fails on
execute immediate V_CMD INTO V_CTR
with this error
ORA-06512: at "RDSADMIN.RDS_FILE_UTIL", line 90 ORA-06512: at line 14
29283. 00000 - "invalid file operation%s"
*Cause: An attempt was made to read from a file or directory that does
not exist, or file or directory access was denied by the
operating system.
*Action: Verify file and directory access privileges on the file system,
and if reading, verify that the file exists.
The interesting thing is that if I run the DBMS output of V_CMD separately, it works just fine. The DBMS output is below.
SELECT count(*) FROM table(rdsadmin.rds_file_util.read_text_file('BDUMP', 'dbtask-1636692191529-641.log')) WHERE text LIKE '%finished successfully%';
Does anyone have any idea what I'm doing wrong?
Not sure if your issue was resolved, but I faced the same problem and I think I found the reason for it. Once you start a download from the S3 bucket, you need to wait until the file download is complete. What happened with your code (and my old code too) was that the file download operation had started but not yet completed, so the log is not available yet when you immediately try to read it.
What I did was, after starting the download operation, add a while loop that checks whether the log file dbtask-taskid.log exists in the BDUMP directory.
v_logfilename := 'dbtask-' || taskid || '.log';
v_logs_created := 0;
WHILE v_logs_created < 1
LOOP
  SELECT COUNT(1)
    INTO v_logs_created
    FROM TABLE(rdsadmin.rds_file_util.listdir(p_directory => 'BDUMP'))
   WHERE filename = v_logfilename;
END LOOP;
Once it existed, I would proceed to the next step, which is reading the log to check whether it contains the success message.
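A minimal sketch of that follow-up check, reusing the dynamic query pattern from the question (v_cmd, v_ctr and v_logfilename are assumed to be declared as above):
v_cmd := 'SELECT count(*) FROM table(rdsadmin.rds_file_util.read_text_file(''BDUMP'', ''' || v_logfilename || ''')) WHERE text LIKE ''%finished successfully%''';
execute immediate v_cmd INTO v_ctr; -- v_ctr > 0 means the task log reports success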
The solution from @Yogesh Sati also works well in the same way for uploads to S3:
set serveroutput on format wrapped;
DECLARE
V_TASKID VARCHAR2(100);
V_LOGFILENAME VARCHAR2(100);
V_CTR integer := 0;
V_LOGS_CREATED integer := 0;
V_CMD VARCHAR2(4000) := NULL;
BEGIN
-- DOWNLOAD
SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(p_bucket_name => 'humm-cards-dev', p_directory_name => 'DATA_PUMP_DIR') INTO V_TASKID FROM DUAL;
-- UPLOAD
-- SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(p_bucket_name => 'humm-cards-dev', p_prefix => 'mydump.dmp', p_s3_prefix => 'DEV/', p_directory_name => 'DATA_PUMP_DIR') INTO V_TASKID FROM DUAL;
dbms_output.put_line('TASKID: ' || V_TASKID);
V_LOGFILENAME:='dbtask-'|| V_TASKID ||'.log';
WHILE V_LOGS_CREATED < 1
LOOP
Select Count(1)
INTO V_LOGS_CREATED
FROM TABLE(rdsadmin.rds_file_util.listdir(p_directory=>'BDUMP')) where filename=V_LOGFILENAME;
END LOOP;
WHILE V_CTR = 0 LOOP
V_CMD := 'SELECT count(*) FROM table(rdsadmin.rds_file_util.read_text_file(''BDUMP'', ''dbtask-' || V_TASKID || '.log'')) WHERE text LIKE ''%finished successfully%''';
execute immediate V_CMD INTO V_CTR;
IF V_CTR > 0 THEN
dbms_output.put_line('The task finished successfully.');
EXIT;
END IF;
END LOOP;
END;
/
I just had this issue right now:
ORA-00904: "RDSADMIN"."RDSADMIN_S3_TASKS"."DOWNLOAD_FROM_S3": invalid
identifier.
All my RDS instances have the same configuration and work fine; just one brand-new instance was giving the error when using rdsadmin.rdsadmin_s3_tasks.download_from_s3. After reading the AWS page on RDS for Oracle S3 integration, I decided to remove the S3_INTEGRATION option, then I removed the IAM role and rebooted the RDS instance. Then I added the same IAM role back and re-added the S3_INTEGRATION option. It works just fine now.
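As a quick sanity check (my own addition, not something from the AWS documentation), you can verify whether the S3 integration package is visible before calling it; if this query returns no rows, either the S3_INTEGRATION option is not active on the instance or your user lacks privileges on the package:
SELECT owner, object_name, object_type, status
  FROM all_objects
 WHERE owner = 'RDSADMIN'
   AND object_name = 'RDSADMIN_S3_TASKS';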
create or replace directory MYCSV as 'E:\sqlloader\';
grant read, write on directory MYCSV to public;
declare
F UTL_FILE.FILE_TYPE;
V_LINE VARCHAR2 (1000);
V_id NUMBER(4);
V_NAME VARCHAR2(10);
V_risk VARCHAR2(10);
BEGIN
F := UTL_FILE.FOPEN ('MYCSV', 'testfile.csv', 'R');
IF UTL_FILE.IS_OPEN(F) THEN
LOOP
BEGIN
UTL_FILE.GET_LINE(F, V_LINE, 1000);
IF V_LINE IS NULL THEN
EXIT;
END IF;
V_id := REGEXP_SUBSTR(V_LINE, '[^,]+', 1, 1);
V_NAME := REGEXP_SUBSTR(V_LINE, '[^,]+', 1, 2);
V_risk := REGEXP_SUBSTR(V_LINE, '[^,]+', 1, 3);
INSERT INTO loader_tab VALUES(V_id, V_NAME, V_risk);
COMMIT;
EXCEPTION
WHEN NO_DATA_FOUND THEN
EXIT;
END;
END LOOP;
END IF;
UTL_FILE.FCLOSE(F);
END;
/
CSV file content, wherein I need to start loading from the line 1,a,aa and skip the first 4 lines:
portal,,
ex portal,,
,,
i_id,i_name,risk
1,a,aa
2,b,bb
3,c,cc
4,d,dd
5,e,ee
6,f,ff
7,g,gg
8,h,hh
9,i,ii
10,j,jj
I want to load the data from an Excel file, but I am getting an invalid file operation error even though the file is present on my local system. Will someone help with this?
though file is present in my local system
It won't work unless your local system (I presume you mean your own PC) also runs the database into which you're trying to load data. An Oracle directory (in probably 99% of all cases) resides on the database server.
I want to load the data from excel
It won't work either, if that's really an Excel file. The code you posted suggests that it is a comma-separated values file (i.e. a text file), and yes - it should be such a file, not an XLSX.
Here's the command to get the DDL of the procedure and make it pretty:
EXEC DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM , 'PRETTY' , TRUE);
SELECT DBMS_METADATA.GET_DDL('PROCEDURE', UPPER('LOOPPROC'), 'MYSCHEMA') FROM DUAL;
This is the output, which is exactly like the input and the same as if PRETTY were set to FALSE above.
CREATE OR REPLACE EDITIONABLE PROCEDURE "MYLANID"."LOOPPROC" (inval NUMBER)
IS
tmpvar NUMBER;
tmpvar2 NUMBER;
total NUMBER;
BEGIN
tmpvar := 0;
tmpvar2 := 0;
total := 0;
FOR lcv IN 1 .. inval
LOOP
total := 2 * total + 1 - tmpvar2;
tmpvar2 := tmpvar;
tmpvar := total;
END LOOP;
IF inval = 1 THEN
DBMS_OUTPUT.put_line ('IN IF TRUE branch, inval = ' || inval);
DBMS_OUTPUT.put_line ('IN IF TRUE branch, inval is still = ' || inval);
ELSE
DBMS_OUTPUT.put_line ('IN ELSE, inval = ' || inval);
END IF;
DBMS_OUTPUT.put_line ('TOTAL IS: ' || total);
END loopproc;
Note how the IF - THEN - ELSE clause is indented like the Snake River.
Is there a way to get the procedure to indent more conventionally?
I'm also open to open-source tools that can be called from .NET. As a last resort, I'm considering using Selenium to take advantage of free web sites that do SQL formatting.
Idea: SQL Developer already has built-in formatting capabilities and allows you to set multiple options.
Then, using the Command-Line Interface for SQL Developer:
For certain operations, you can invoke SQL Developer from the command line as an alternative to the graphical user interface. To use the command-line interface, go to the sqldeveloper\sqldeveloper\bin folder or sqldeveloper/sqldeveloper/bin directory under the location where you installed SQL Developer, and enter sdcli.
C:\sqldeveloper\sqldeveloper\bin>sdcli
Available features:
format: Format Task
Invocation:
sdcli format input=<here goes file>
I could imagine building the following batch script/pipeline (a rough sketch follows this list):
get object definition from Oracle
save it as file
use sdcli format
read the file content back
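A minimal SQL*Plus sketch of that pipeline, assuming sdcli is on the PATH; the file names are made up and the output= parameter is an assumption, so check the usage text printed by sdcli format for the exact options on your version:
-- get_and_format.sql: spool the DDL, then hand the file to SQL Developer's formatter
set long 100000 pagesize 0 linesize 32767 trimspool on feedback off
spool loopproc_raw.sql
SELECT DBMS_METADATA.GET_DDL('PROCEDURE', 'LOOPPROC', 'MYSCHEMA') FROM DUAL;
spool off
-- run the formatter from the OS shell
host sdcli format input=loopproc_raw.sql output=loopproc_pretty.sql
-- then read loopproc_pretty.sql back from your .NET code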
I run the write_test procedure, which works fine.
begin
koll_data_pkg.write_test(p_customer_id=>247, p_addr=>'address', p_dir=>'\\SERVER01\Backup\Log\');
end;
But when I change the value of p_dir to another directory, p_dir=>'\\SERVER12\Backup\Log\', it gives the following error:
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE",
ORA-29283: invalid file operation
ORA-06512: at "DATA_PKG",
ORA-06512: at line
I have tried granting permissions using the following commands, but I still get the same error:
CREATE OR REPLACE DIRECTORY DEVO_INVREC_DIR AS '\\SERVER12\Backup\Log\';
GRANT READ, WRITE ON DIRECTORY DEVO_INVREC_DIR TO USER1;
GRANT EXECUTE ON UTL_FILE TO USER1;
Procedure:
procedure write_test(p_customer_id in koll_customer_party.customer_id%type,
p_addr in varchar,
p_dir in varchar,
p_filename in varchar2 default null)
is
lt_id id_tt;
lt_bolagsnamn bolagsnamn_tt;
l_file utl_file.file_type;
l_line varchar2(2048);
l_name varchar2(300):= 'DEVO_INVREC_DIR';
l_filename varchar2(100):= 'testfile.txt';
l_sql varchar2(512);
begin
select devo_id, bolagsnamn
bulk collect into lt_id, lt_bolagsnamn
from documents where customer_id=p_customer_id;
if lt_id.count > 0 then
l_sql := 'create or replace directory ' || l_name || ' as ''' || p_dir || '''';
execute immediate l_sql;
if p_filename is not null then
l_filename := p_filename;
end if;
l_file := utl_file.fopen(l_name,l_filename,'w');
if utl_file.is_open(l_file) is not null then
for i in lt_id.first .. lt_id.last loop
l_line:= lt_id(i) || ';' || replace(lt_bolagsnamn(i),';','');
utl_file.put_line(l_file, l_line);
end loop;
end if;
utl_file.fclose(l_file);
end if;
end;
Check out this forum response: https://community.oracle.com/thread/4145239?start=0&tstart=0
In summary, Oracle can't access network shares in its default installed configuration because the Windows SYSTEM user can't access network shares by definition. You either have to reconfigure Oracle to run as a user other than SYSTEM, with permissions on the share, or allow SYSTEM to access network shares (a HUGE security risk). I was going to include a link describing how to change the service to run as another service account, but they all seem to be broken or removed. It may depend on your exact versions of Oracle and Windows, too, so your best bet in the absence of other documentation would be to contact Oracle Support. There is no simple PL/SQL programming answer to your problem.
I am currently learning PL/SQL, using Oracle. I am trying to get data that is older than the PARAM days declared in another table. I want the procedure to take all the data, check whether some records are older (recv_date) than the parameter from param_value and, if so, fire my alarm procedure. I have a problem with declaring the CURSOR and ln_pallets_container. I know I could fetch into ln_pallets only the data WHERE recv_date is already filtered, but I have no idea how to do that correctly either. Maybe I should declare the cursor before the procedure and not inside of it?
procedure CHECK_STOCK_DATE(warehouse_id_in IN warehouse.warehouse_id%TYPE)
IS
ln_pallet_count NUMBER;
ln_days_till_expiration param_value.param_value%TYPE;
CURSOR ln_pallets IS
SELECT container_id, recv_date
FROM wms_stock ws
ln_pallets_container%ROWTYPE;
BEGIN
OPEN ln_pallets;
LOOP
FETCH ln_pallets INTO ln_pallets_container;
EXIT WHEN ln_pallets%NOTFOUND;
SELECT param_value.param_value
INTO ln_days_till_expiration
FROM param_value
WHERE param_value.parameter_id = 266;
IF(ln_pallets_container.recv_date >= trunc(sysdate - ln_days_till_expiration)
ALARM.ALARM(WAREHOUSE_ID =>MY_COMMONS.GET_WHRS_ID,
SOURCE_TEXT => ln_pallets_container.container_id,
MESSAGE_CODE => 'Cannot find this container on warehouse. Check container code.');
END IF;
END LOOP;
CLOSE ln_pallets;
END;
There are several things wrong with your code, which I've fixed and highlighted in the following:
PROCEDURE check_stock_date(warehouse_id_in IN warehouse.warehouse_id%TYPE) IS
ln_pallet_count NUMBER;
ln_days_till_expiration param_value.param_value%TYPE;
CURSOR ln_pallets IS
SELECT container_id,
recv_date
FROM wms_stock ws; -- added semicolon
ln_pallets_container ln_pallets%ROWTYPE; -- amended to set the datatype of the variable to be the cursor rowtype
BEGIN
OPEN ln_pallets;
LOOP
FETCH ln_pallets
INTO ln_pallets_container;
EXIT WHEN ln_pallets%NOTFOUND;
SELECT param_value.param_value
INTO ln_days_till_expiration
FROM param_value
WHERE param_value.parameter_id = 266;
IF /*removed bracket*/ ln_pallets_container.recv_date >= trunc(SYSDATE - ln_days_till_expiration)
THEN --added
alarm.alarm(warehouse_id => my_commons.get_whrs_id,
source_text => ln_pallets_container.container_id,
message_code => 'Cannot find this container on warehouse. Check container code.');
END IF;
END LOOP;
CLOSE ln_pallets;
END check_stock_date;
However, this could be done much more efficiently. Currently, you are looping through all the rows in wms_stock, plus you are explicitly opening, fetching and closing the cursor yourself.
That means for every row in wms_stock, you are finding the value of parameter_id 266 (which I assume won't change whilst you're looping through the results!), as well as doing the check to see if you can run your alarm procedure.
Instead of fetching all the rows, why not move the check into the cursor - that way, you'll only be fetching the parameter 266 value once and filtering out any rows that don't need to have the alarm procedure run.
At the same time, why not switch to using a cursor-for-loop? That way, you don't have to worry about opening/fetching from/closing the cursor, as Oracle handles all that for you.
Doing that will result in far less code, which happens to be more efficient and easier to read and maintain, like so:
PROCEDURE check_stock_date(warehouse_id_in IN warehouse.warehouse_id%TYPE) IS
BEGIN
FOR ln_pallets_rec IN (SELECT container_id,
recv_date
FROM wms_stock ws
WHERE recv_date >= (SELECT trunc(SYSDATE - param_value.param_value)
                      FROM param_value
                     WHERE param_value.parameter_id = 266))
LOOP
alarm.alarm(warehouse_id => my_commons.get_whrs_id,
source_text => ln_pallets_rec.container_id,
message_code => 'Cannot find this container on warehouse. Check container code.');
END LOOP;
END check_stock_date;
Fixed some issues in your code.
procedure check_stock_date(warehouse_id_in in warehouse.warehouse_id%type) is
ln_pallet_count number;
ln_days_till_expiration param_value.param_value%type;
l_container_id wms_stock.container_id%type;
l_recv_date wms_stock.recv_date%type;
cursor ln_pallets is
select container_id
,recv_date
from wms_stock ws;
begin
open ln_pallets;
loop
fetch ln_pallets
into l_container_id
,l_recv_date;
exit when ln_pallets%notfound;
select param_value.param_value
into ln_days_till_expiration
from param_value
where param_value.parameter_id = 266;
if l_recv_date >= trunc(sysdate - ln_days_till_expiration)
then
alarm.alarm(warehouse_id => my_commons.get_whrs_id
,source_text => l_container_id
,message_code => 'Cannot find this container on warehouse. Check container code.');
end if;
end loop;
close ln_pallets;
end;
Hi, you don't specify the table (or cursor) name for the ln_pallets_container %ROWTYPE variable, and a ';' is also missing after the cursor declaration. Fix these and try again.
I need to debug in PL/SQL to figure out the timing of procedures. I want to use:
SELECT systimestamp FROM dual INTO time_db;
DBMS_OUTPUT.PUT_LINE('time before procedure ' || time_db);
but I don't understand where the output goes and how I can redirect it to a log file that will contain all the data I want to collect?
DBMS_OUTPUT is not the best tool to debug, since most environments don't use it natively. If you want to capture the output of DBMS_OUTPUT however, you would simply use the DBMS_OUTPUT.get_line procedure.
Here is a small example:
SQL> create directory tmp as '/tmp/';
Directory created
SQL> CREATE OR REPLACE PROCEDURE write_log AS
2 l_line VARCHAR2(255);
3 l_done NUMBER;
4 l_file utl_file.file_type;
5 BEGIN
6 l_file := utl_file.fopen('TMP', 'foo.log', 'A');
7 LOOP
8 EXIT WHEN l_done = 1;
9 dbms_output.get_line(l_line, l_done);
10 utl_file.put_line(l_file, l_line);
11 END LOOP;
12 utl_file.fflush(l_file);
13 utl_file.fclose(l_file);
14 END write_log;
15 /
Procedure created
SQL> BEGIN
2 dbms_output.enable(100000);
3 -- write something to DBMS_OUTPUT
4 dbms_output.put_line('this is a test');
5 -- write the content of the buffer to a file
6 write_log;
7 END;
8 /
PL/SQL procedure successfully completed
SQL> host cat /tmp/foo.log
this is a test
As an alternative to writing to a file, how about writing to a table? Instead of calling DBMS_OUTPUT.PUT_LINE, you could call your own DEBUG.OUTPUT procedure, something like:
procedure output (p_text varchar2) is
pragma autonomous_transaction;
begin
if g_debugging then
insert into debug_messages (username, datetime, text)
values (user, sysdate, p_text);
commit;
end if;
end;
The use of an autonomous transaction allows you to retain debug messages produced from transactions that get rolled back (e.g. after an exception is raised), as would happen if you were using a file.
The g_debugging boolean variable is a package variable that can be defaulted to false and set to true when debug output is required.
Of course, you need to manage that table so that it doesn't grow forever! One way would be a job that runs nightly/weekly and deletes any debug messages that are "old".
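For completeness, here is a minimal sketch of the supporting objects this approach assumes; the table columns simply mirror the INSERT above and the names are only examples:
CREATE TABLE debug_messages (
  username VARCHAR2(128),
  datetime DATE,
  text     VARCHAR2(4000)
);
CREATE OR REPLACE PACKAGE debug AS
  g_debugging BOOLEAN := FALSE; -- set to TRUE when debug output is required
  PROCEDURE output (p_text VARCHAR2);
END debug;
/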
use
set serveroutput on;
for example:
set serveroutput on;
DECLARE
x NUMBER;
BEGIN
x := 72600;
dbms_output.put_line('The variable X = '); dbms_output.put_line(x);
END;
If you are just testing your PL/SQL in SQL*Plus, you can direct the output to a file like this:
spool output.txt
set serveroutput on
declare
  time_db timestamp;
begin
  SELECT systimestamp INTO time_db FROM dual;
  DBMS_OUTPUT.PUT_LINE('time before procedure ' || time_db);
end;
/
spool off
IDEs like Toad and SQL Developer can capture the output in other ways, but I'm not familiar with how.
In addition to Tony's answer, if you are looking to find out where your PL/SQL program is spending its time, it is also worth checking out this part of the Oracle PL/SQL documentation.
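As a rough illustration of the timing idea (this is my own sketch, not what the linked documentation shows), DBMS_UTILITY.GET_TIME returns an elapsed-time counter in hundredths of a second:
DECLARE
  l_start PLS_INTEGER;
BEGIN
  l_start := DBMS_UTILITY.GET_TIME;
  DBMS_LOCK.sleep(1); -- stand-in for the procedure you actually want to time
  DBMS_OUTPUT.PUT_LINE('elapsed: ' || (DBMS_UTILITY.GET_TIME - l_start) / 100 || ' seconds');
END;
/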
Using UTL_FILE instead of DBMS_OUTPUT will redirect output to a file:
http://oreilly.com/catalog/oraclebip/chapter/ch06.html
As a side note, remember that all this output is generated on the server side.
With DBMS_OUTPUT, the text is generated on the server while it executes your query and stored in a buffer. It is only sent to your client app once the server finishes the query's data retrieval; that is, you only get this info when the query ends.
With UTL_FILE, all the logged information is stored in a file on the server. When the execution finishes, you have to go to that file to get the information.
Hope this helps.
It's possible to write a file directly on the DB server that hosts your database, and that file will be updated as your PL/SQL program executes.
This uses the Oracle directory TMP_DIR; you have to create it (see the commented-out CREATE DIRECTORY below), and create the following procedure:
CREATE OR REPLACE PROCEDURE write_log(p_log varchar2)
-- file mode; this requires
--- CREATE OR REPLACE DIRECTORY TMP_DIR as '/directory/where/oracle/can/write/on/DB_server/';
AS
l_file utl_file.file_type;
BEGIN
l_file := utl_file.fopen('TMP_DIR', 'my_output.log', 'A');
utl_file.put_line(l_file, p_log);
utl_file.fflush(l_file);
utl_file.fclose(l_file);
END write_log;
/
Here is how to use it:
1) Launch this from your SQL*PLUS client:
BEGIN
write_log('this is a test');
for i in 1..100 loop
DBMS_LOCK.sleep(1);
write_log('iter=' || i);
end loop;
write_log('test complete');
END;
/
2) on the database server, open a shell and
tail -f -n500 /directory/where/oracle/can/write/on/DB_server/my_output.log
An old thread, but there is another alternative.
Since 9i, you can use a pipelined table function.
First, create a type as a table of varchar:
CREATE TYPE t_string_max IS TABLE OF VARCHAR2(32767);
Second, wrap your code in a pipelined function declaration:
CREATE FUNCTION fn_foo (bar VARCHAR2) -- your params
RETURN t_string_max PIPELINED IS
-- your vars
BEGIN
-- your code
END;
/
Replace all your DBMS_OUTPUT.PUT_LINE calls with PIPE ROW.
Finally, call it like this:
SELECT * FROM TABLE(fn_foo('param'));
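For instance, a filled-in sketch (the body is purely illustrative) could look like this:
CREATE OR REPLACE FUNCTION fn_foo (bar VARCHAR2)
  RETURN t_string_max PIPELINED IS
BEGIN
  PIPE ROW ('started with bar = ' || bar);
  PIPE ROW ('time now: ' || TO_CHAR(SYSTIMESTAMP));
  RETURN; -- a pipelined function ends with a plain RETURN
END fn_foo;
/
SELECT * FROM TABLE(fn_foo('param'));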
Hope it helps.
Try This:
SELECT systimestamp INTO time_db FROM dual ;
DBMS_OUTPUT.PUT_LINE('time before procedure ' || time_db);