AWS RDS Oracle Datapump error ORA-39001: invalid argument value - oracle

I want to import my dump file from my local machine to AWS. I've uploaded my pdv.dpdm file into my S3 bucket:
expdp sys/pass schemas=PDV dumpfile=pdv.dpdm NOLOGFILE=YES directory=TEST_DIR
I successfully downloaded that file into the Oracle DATA_PUMP_DIR with the rdsadmin.rdsadmin_s3_tasks.download_from_s3 procedure. When I list the files inside that directory, I get the following, so I don't think the problem is a failure when moving the data:
select * from table
(rdsadmin.rds_file_util.listdir(p_directory => 'DATA_PUMP_DIR'));
datapump/ directory 4096 2020-03-04 06:49:40
pdv2.log file 28411 2020-03-04 06:49:40
29012020.DMP file 825552896 2020-03-03 09:36:59
pdv2.dpdm file 685617152 2020-03-04 06:49:40
pdv.dpdm file 685613056 2020-03-04 06:49:27
When I start importing that file with DBMS_DATAPUMP.ADD_FILE, I get an error on that line.
DECLARE
hdnl NUMBER;
BEGIN
hdnl := DBMS_DATAPUMP.OPEN( operation => 'IMPORT', job_mode => 'SCHEMA', job_name=> NULL, version => 12);
DBMS_DATAPUMP.ADD_FILE(
handle => hdnl,
filename => 'pdv.dpdm',
directory => 'DATA_PUMP_DIR',
filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE
);
DBMS_DATAPUMP.START_JOB(hdnl);
END;
Error :
SQL Error [39001] [99999]: ORA-39001: invalid argument value
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4087
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4338
ORA-06512: at line 6
It seems I'm missing something, maybe some configuration on the AWS side. I've searched through a couple of answers but none of them fixed my problem. Could you help me with this? I don't know what else to try. Thanks

You are passing the import version incorrectly:
hdnl := DBMS_DATAPUMP.OPEN( operation => 'IMPORT', job_mode => 'SCHEMA', job_name=> NULL, version => 12);
The version parameter expects a string, so it should be:
version => '12.0.0'
Oracle Documentation
COMPATIBLE - (default) the version of the metadata corresponds to the
database compatibility level and the compatibility release level for
feature (as given in the V$COMPATIBILITY view). Database compatibility
must be set to 9.2 or higher.
LATEST - the version of the metadata corresponds to the database
version.
A specific database version, for example, '11.0.0'.
Specify a value of 12 to allow all existing database features, components, and options to be exported from Oracle Database 11g Release 2 (11.2.0.3) or later into an Oracle Database 12c Release 1 (12.1) database (either a multitenant container database (CDB) or a non-CDB).
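Putting the fix together, the whole import block would then look something like this (a sketch based on the code in the question; only the version argument changes):

```sql
DECLARE
  hdnl NUMBER;
BEGIN
  -- version must be a string such as '12.0.0', 'COMPATIBLE', or 'LATEST',
  -- not the bare number 12
  hdnl := DBMS_DATAPUMP.OPEN(
    operation => 'IMPORT',
    job_mode  => 'SCHEMA',
    job_name  => NULL,
    version   => '12.0.0');
  DBMS_DATAPUMP.ADD_FILE(
    handle    => hdnl,
    filename  => 'pdv.dpdm',
    directory => 'DATA_PUMP_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  DBMS_DATAPUMP.START_JOB(hdnl);
END;
/
```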
You must export as a non-SYS user!
grant read, write on directory TEST_DIR to PDV;
expdp PDV/password schemas=PDV dumpfile=pdv.dpdm NOLOGFILE=YES directory=TEST_DIR
Oracle Documentation
Note: Do not start Export as SYSDBA, except at the request of Oracle technical support. SYSDBA is used internally and has specialized functions; its behavior is not the same as for general users.

Related

Resolving ORA-02019 error during DBMS_FILE_TRANSFER.PUT_FILE()

I am using DBMS_FILE_TRANSFER.PUT_FILE() on a local Oracle Express instance to transfer a local file to a remote AWS RDS Oracle instance, but I am receiving the following error:
ERROR at line 1:
ORA-02019: connection description for remote database not found
ORA-06512: at "SYS.DBMS_FILE_TRANSFER", line 60
ORA-06512: at "SYS.DBMS_FILE_TRANSFER", line 168
ORA-06512: at line 2
I receive this error while executing the following SQL script:
BEGIN
DBMS_FILE_TRANSFER.PUT_FILE(
'DATA_PUMP_DIR',
'some_file.txt',
'DATA_PUMP_DIR',
'some_file.txt',
'MY_DATABASE_LINK'
);
END;
/
MY_DATABASE_LINK is a public database link located in my local Oracle Express instance:
CREATE PUBLIC DATABASE LINK MY_DATABASE_LINK CONNECT TO example_schema IDENTIFIED BY "example_user" USING '(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=example_host_info)(PORT=1521))(CONNECT_DATA=(SID=example_sid)))';
I'm fairly confident that the connection string behind the database link is correct, but I'm not sure how to be 100% sure.
The ORA-02019: connection description for remote database not found error does not make sense because the connection description is defined by the database link. It is not present in tnsnames.ora, and I am confident that it doesn't have to be for DBMS_FILE_TRANSFER.PUT_FILE() to work.
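One quick way to sanity-check the link itself, before involving DBMS_FILE_TRANSFER at all, is to query through it from the same local instance; if the connect descriptor or credentials are wrong, this fails immediately with an ORA-02019 or a network-layer error:

```sql
-- Should return one row ('X') if the link resolves and the credentials are valid
SELECT * FROM dual@MY_DATABASE_LINK;
```

Note that ORA-02019 can also mean the session cannot see the link at all (for example, a private link owned by another schema, or a mismatched link name), so confirming the query above works narrows the problem down considerably.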

Can't enable JavaScript MLE on APEX

I am running Oracle APEX 21.1 (build 21.1.3.r1531102) on the 21c express database and for some reason I cannot execute server side code with the JavaScript MLE option.
In the development environment it does not show the option at all.
In the SQL commands screen I can only execute pure sql or PL/SQL. If I try to execute the following example
SET SERVEROUTPUT ON;
DECLARE
ctx dbms_mle.context_handle_t;
user_code clob := q'~
console.log('Hello World!');
~';
BEGIN
ctx := dbms_mle.create_context();
dbms_mle.eval(ctx, 'JAVASCRIPT', user_code);
dbms_mle.drop_context(ctx);
EXCEPTION
WHEN others THEN
dbms_mle.drop_context(ctx);
RAISE;
END;
I get the following message:
ORA-00922: missing or invalid option
ORA-06512: at "SYS.WWV_DBMS_SQL_APEX_210100", line 673
ORA-06512: at "SYS.DBMS_SYS_SQL", line 1703
ORA-06512: at "SYS.WWV_DBMS_SQL_APEX_210100", line 659
ORA-06512: at "APEX_210100.WWV_FLOW_DYNAMIC_EXEC", line 1854
A few weeks ago we solved this, and I forgot to update this question.
After some work we found out that the user needed to be granted access to MLE. I was not the one who fixed it, but I asked for the code to post here; check below:
GRANT MLE JAVA
GRANT EXECUTE DYNAMIC MLE to XYZ;
GRANT EXECUTE ON JAVASCRIPT TO XYZ;
Remove the first line or comment it out; I suppose it is there for on-premises DB version support, but it is not allowed in ADB.
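To confirm the grants took effect, you can check the system privileges visible to the session of the schema that runs the MLE code (a sketch; XYZ above stands in for your actual schema name):

```sql
-- Run while connected as the schema that executes the MLE code;
-- EXECUTE DYNAMIC MLE should appear in the result if the grant worked
SELECT privilege
FROM   session_privs
WHERE  privilege LIKE '%MLE%';
```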

Datapump Import Fails With ORA-39006, ORA-39213: “Metadata processing is not available”

I am trying to import multiple dmp files using the impdp command, and the Data Pump import (impdp) reports the errors:
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - 64bit Production
With the Partitioning, OLAP and Data Mining options
ORA-39006: internal error
ORA-39213: Metadata processing is not available
Attempting to correct the error ORA-39213 via
dbms_metadata_util.load_stylesheets also reports errors similar to:
SQL> exec dbms_metadata_util.load_stylesheets;
BEGIN dbms_metadata_util.load_stylesheets; END;
*
ERROR at line 1:
ORA-22288: file or LOB operation FILEEXISTS failed
Permission denied
ORA-06512: at "SYS.DBMS_METADATA_UTIL", line 1807
ORA-06512: at line 1
Logging in as SYS and running the statement below worked for me on Oracle 11.2:
exec dbms_metadata_util.load_stylesheets;
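In other words, the fix is to reload the Data Pump metadata stylesheets from a session with sufficient privileges; the ORA-22288 "Permission denied" above suggests the earlier attempt ran as a user without access to the stylesheet files (a sketch of the session):

```sql
-- Connect with SYSDBA privileges, then reload the stylesheets
CONNECT / AS SYSDBA
EXEC dbms_metadata_util.load_stylesheets;
-- Afterwards, re-run the failing impdp job
```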

Error while importing oracle dump using DBMS_DATAPUMP in oracle 12g

I have been trying to import a dump file into Oracle 12c under RDS in AWS, and I keep getting the error below all the time and can't figure out what is happening. Any help or pointers are greatly appreciated.
Here is what I am trying to do.
1) I got a .dat dump file from a customer (I'm not sure how they exported this dump from Oracle), and my aim is to import it into an AWS Oracle RDS instance.
2) As per How to transfer Oracle dump file to AWS RDS instance?
I created a new EC2 instance, installed Oracle Express Edition, and then copied the .dat file to the DATA_PUMP_DIR directory (/u01/app/oracle/admin/XE/dpdump/).
3) Created a database link to the AWS Oracle RDS instance and was able to copy the file to the RDS instance using DBMS_FILE_TRANSFER.PUT_FILE.
4) I can see the file under the DATA_PUMP_DIR path in the Oracle RDS instance.
So far so good.
5) Now, I am trying to import the dump file into RDS using the procedure below:
DECLARE
hdnl NUMBER;
BEGIN
hdnl := DBMS_DATAPUMP.OPEN( operation => 'IMPORT', job_mode => 'SCHEMA', job_name => null);
DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'sample.dat', directory => 'DATA_PUMP_DIR');
DBMS_DATAPUMP.START_JOB(hdnl);
END;
/
Every time, I get the error below:
DECLARE
*
ERROR at line 1:
ORA-39001: invalid argument value
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4087
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4338
ORA-06512: at line 5
I understand the error points to the DBMS_DATAPUMP.ADD_FILE() line, but I cannot make out what is wrong here.
I have tried various options to import, but none helped. I created the tablespace and also made sure the user has the correct permissions.
Any pointers are greatly appreciated.
1) Connect to the RDS instance with the Amazon RDS master user account.
2) Run this PL/SQL block:
DECLARE
hdnl NUMBER;
BEGIN
hdnl := DBMS_DATAPUMP.OPEN( operation => 'IMPORT', job_mode => 'SCHEMA', job_name=>null);
DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'sample_copied.dmp', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_dump_file);
DBMS_DATAPUMP.METADATA_FILTER(hdnl,'SCHEMA_EXPR','IN (''SCHEMA_NAME'')');
DBMS_DATAPUMP.START_JOB(hdnl);
END;
/
3) You can start the import from any Oracle client, for example an EC2 instance:
impdp RDS_master_user/password@rds_instance DUMPFILE=sample.dmp DIRECTORY=DATA_PUMP_DIR parfile=import_parfile
where import_parfile contains:
SCHEMAS=SCHEMA_NAME
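If the job still fails on RDS, the Data Pump log written into the directory object can be read directly with the rdsadmin utilities, which usually shows the real cause (the log file name below is an assumption; it depends on your job):

```sql
-- Read the Data Pump log from the RDS directory object
SELECT * FROM TABLE(
  rdsadmin.rds_file_util.read_text_file(
    p_directory => 'DATA_PUMP_DIR',
    p_filename  => 'import.log'));
```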

FTP using UTL_FTP package fails for large files

I am trying to FTP a file from one Unix box to another using Tim Hall's FTP package (which is built on UTL_TCP).
BEGIN
--pl_release_id := 'IT3';
pl_release_id := release_id;
l_conn:= ftp.login(SOURCESERVER,'21',SOURCEUSER,SOURCEPASSWORD);
ftp.binary(p_conn => l_conn);
ftp.get
(
p_conn => l_conn,
p_from_file => SOURCEPATH,
p_to_dir => INTERMEDIATEPATH,
p_to_file => FILE
);
-- ftp.logout(l_conn);
utl_tcp.close_connection(c => l_conn);
EXCEPTION
WHEN OTHERS THEN
utl_tcp.close_connection(c => l_conn);
raise;
END;
This is successful for files less than 50 Mb in size, but for large files, I get the following error:
Error at line 1
ORA-29260: network error: not connected
ORA-06512: at "SYS.UTL_TCP", line 231
ORA-06512: at "SYS.UTL_TCP", line 460
ORA-06512: at "SYS.FTP", line 301
ORA-20000: 550 sendfile: Broken pipe.
ORA-06512: at "SYS.FTP_FILES", line 32
ORA-06512: at line 20
I am able to FTP the same files between the database server and the source server using the operating system's ftp command.
There is ample space in the partitions; I have also tried FTPing to different partitions.
Server OS: AIX Unix. Oracle version: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production