Datapump import is unable to open the log file - oracle

I'm trying to import the Oracle BISAMPLE.dmp schema and I get this error (unable to open the log file).

The argument to the directory parameter is the name of an Oracle directory object, not a direct reference to the operating system directory.
If you do not already have an Oracle directory object pointing to that operating system directory, which has to be on the database server not a client machine, you (as DBA) will have to create it, and grant privileges to any other Oracle users who will need to use it.
For example:
create directory MY_DATAPUMP_DIR as 'C:\installs\datapumpdir';
and then
impdp directory=MY_DATAPUMP_DIR dumpfile=...
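If another Oracle user will run the import, you also need to grant access on the directory object (the user name here is only a placeholder):
grant read, write on directory MY_DATAPUMP_DIR to some_user;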
Alternatively you can move your .dmp file to the default directory, and either omit the directory parameter or specify the default for that, DATA_PUMP_DIR.
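If you are not sure where DATA_PUMP_DIR points on the server, you can look it up in the data dictionary (this assumes you can query DBA_DIRECTORIES; ALL_DIRECTORIES works for directories you have been granted access to):
select directory_path from dba_directories where directory_name = 'DATA_PUMP_DIR';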
Also, note the big warning from the documentation:
Do not invoke Import as SYSDBA, except at the request of Oracle technical support. SYSDBA is used internally and has specialized functions; its behavior is not the same as for general users.

Related

How to generate files on the client file system using oracle UTL_FILE package

I'd like to know: is it possible to generate and retrieve a file using the UTL_FILE package in Oracle on the client side?
As far as I can tell, UTL_FILE operates on files on the server, stored in a directory accessible to Oracle. Therefore, you should CREATE DIRECTORY (an Oracle object) pointing to a local directory (possibly using a UNC path), grant the required privileges (most probably READ and WRITE) and use that directory while working with UTL_FILE.
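A minimal sketch of that approach, assuming the share \\client-host\shared is reachable from the database server (all names here are illustrative):
CREATE OR REPLACE DIRECTORY client_files AS '\\client-host\shared';
GRANT READ, WRITE ON DIRECTORY client_files TO hr;

DECLARE
  f UTL_FILE.FILE_TYPE;
BEGIN
  -- open the file for writing in the directory object created above
  f := UTL_FILE.FOPEN('CLIENT_FILES', 'report.txt', 'w');
  UTL_FILE.PUT_LINE(f, 'written by the database server to the shared path');
  UTL_FILE.FCLOSE(f);
END;
/
Note that it is still the database server process, not the client, that performs the file I/O.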

export and import schema using expdp and impdp

I am using this tutorial to export/import a schema. The steps in the tutorial work until the expdp command; see the screenshot:
I am using Oracle 12c. Any idea?
The article you linked to notes that:
The directory object is only a pointer to a physical directory, creating it does not actually create the physical directory on the file system of the database server.
You have to create the physical operating system directory separately, outside the database. That physical directory has to be readable and writable by the operating system user that is running the Oracle database; as you seem to be on Windows that will be the account the services are running under.
You can create the physical directory before or after creating the directory object, as they are completely independent, except when Oracle is trying to access it through UTL_FILE or a related activity - data pump uses UTL_FILE, as you can see from the error message stack.
The CREATE DIRECTORY statement doesn't check that the physical directory it points to exists, and you can delete or create the physical directory without Oracle noticing, as long as it is there and accessible when you try to use it.
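For example, on Windows the two steps might look like this (the path and user name are illustrative only). First, on the database server, create the physical directory from a command prompt:
mkdir C:\oracle\dpump
Then, inside the database, create the directory object and grant access to it:
CREATE OR REPLACE DIRECTORY pump_dir AS 'C:\oracle\dpump';
GRANT READ, WRITE ON DIRECTORY pump_dir TO my_user;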
From the Oracle documentation:
A directory object specifies an alias for a directory on the server file system ...
and
For file storage, you must also create a corresponding operating system directory, an Oracle Automatic Storage Management (Oracle ASM) disk group, or a directory within an Oracle ASM disk group. Your system or database administrator must ensure that the operating system directory has the correct read and write permissions for Oracle Database processes.
Privileges granted for the directory are created independently of the permissions defined for the operating system directory, and the two may or may not correspond exactly. For example, an error occurs if sample user hr is granted READ privilege on the directory object but the corresponding operating system directory does not have READ permission defined for Oracle Database processes.

Can't import dump from mapped net drive using data pump

I'm trying to import a few users from a .dmp file on a network drive. Unfortunately, it seems that I lack the rights to do so, since I get:
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31640: unable to open dump file "\\net\drive\directory\placeholder\my_dump.dmp" for read
ORA-27041: unable to open file
OSD-04002: unable to open file
O/S-Error: (OS 5) Access is denied.
I'm not sure why, because I can both access that directory, and for example save a txt file there.
The directory is saved in the database as '\\net\drive\directory\placeholder'. The log file has a different directory specified (not on the network drive).
Is there any workaround to import this dump without actually moving it to a local drive? The dump is really big, and I don't have space for it (not even close), and I (probably) can't change my rights on this mapped drive either.
Also, I can't really make the dump smaller.
On one site I've found this advice: "Remember, your OS user ID may not be the ID that is running a submitted RMAN job, in an operating system, UNIX, Linux or Windows."
The solution was to:
"In the Control Panel services:
Right-click on the service
Select "Properties"
Select "Log On"
Change the default user ID to an Oracle user with Windows administrator privileges"
But I'm not sure what changing this would actually do to the server/database, and I'm working on a client's server, so I don't want to act rashly. I also don't want to reset the database or the server itself.
Any help with what I should do?
The problem is that your Oracle instance is running under a different user account which doesn't have access to the network drive.
Unless you want to run Oracle under a different account, you can give read access on your network share to the account the Oracle instance currently runs under (usually LocalSystem on Windows). Another option could be to import the data from the source database via a database link (you won't need a dump file at all in this case).
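A rough sketch of the database-link option, run against the target database (the link name, credentials and connect string are placeholders, not taken from your setup):
CREATE DATABASE LINK source_link
  CONNECT TO source_user IDENTIFIED BY source_pwd
  USING '//source-host:1521/source_service';
and then, from the command line on the target side:
impdp your_dba_user schemas=user1,user2 network_link=source_link directory=DATA_PUMP_DIR logfile=net_import.log
With NETWORK_LINK the data is pulled straight over the link, so no dump file is written; the directory object is only needed for the log file.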

impdp in Oracle. Why does it not create users?

I am a newbie in Oracle and I am facing trouble with impdp. I have a production server and I have created a new server for testing purposes, so I installed CentOS and Oracle and created the database "sire". Now I make a dump from the production server with the following command:
expdp system/password@sire full=Y
directory=pump_dir dumpfile=sire_dump.dmp logfile=sire.log
Then I come to the new server and execute impdp:
impdp system/password@sire full=Y
directory=pump_directorio dumpfile=sire_dump.dmp logfile=sire_imp.log
It starts to do the import but then I receive errors such as:
"the user vberrios does not exist". And also error beause it cannot
found some schemas and tablespaces.
My question is: isn't impdp full=Y supposed to import all users and schemas? I have read that I have to create the users on the destination server, but I have about 300 users in the database. How can I do a full import into an empty server? I just want to import the full database with all users and all objects.
The documentation states that impdp will create users when the dump file contains the necessary CREATE USER statements:
If the schema you are remapping to does not already exist, the import
operation creates it, provided the dump file set contains the
necessary CREATE USER metadata and you are importing with enough
privileges.
So either your dump file is incomplete (for example due to missing privileges) or you are lacking privileges on the target database.
So please check your privileges on both the source database and the target database, and update your question with the corresponding information. For the export to include the schema definitions, you must have the DATAPUMP_EXP_FULL_DATABASE role.
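A quick way to check this is to query the role grants on each database (run as a privileged user; the user names here are placeholders):
SELECT granted_role FROM dba_role_privs WHERE grantee = 'YOUR_EXPORT_USER';
and, if the roles are missing, grant them:
GRANT DATAPUMP_EXP_FULL_DATABASE TO your_export_user;   -- on the source database
GRANT DATAPUMP_IMP_FULL_DATABASE TO your_import_user;   -- on the target database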

How to create a dump?

I'm a junior DBA working in an IT company. In my company there are a lot of schemas. Now my question is: how do I create a dump file? (Sometimes I work from home; how do I use that dump file then?) Please advise.
NOTE: I am using Oracle SQL Developer.
expdp helps in exporting the database and impdp helps in importing it. You can copy one schema directly to another (in a different database as well) by using the network link concept.
If a network link is used, creating a separate export dump file is not required.
For example, if you have to export a schema called schema1 with password pwd1 from a source database to a target database, then you first need admin privileges on your target and source schemas.
You can create a database link between the source and target schemas:
CREATE PUBLIC DATABASE LINK example_link
CONNECT TO schema1 IDENTIFIED BY pwd1
USING 'server_name:port/service_name'; -- (put the source database server_name, port and service_name here)
Then create a directory on your target server:
CREATE OR REPLACE DIRECTORY exp_dir AS 'F:/location';
grant read,write on directory exp_dir to schema1;
After this, log in to your target server and run the command below from the command line:
impdp dba_username/dba_pwd network_link=example_link directory=exp_dir remap_tablespace=source_tbs:target_tbs remap_schema=schema1:schema1 parallel=2
You should use the Oracle Data Pump tool. The tool allows you to export your data into a .dmp file and import it into any database. Here is a video showing how to use the Data Pump tool in SQL Developer. I think this is a relatively new feature in SQL Developer, so make sure you have an appropriate version.
Video Tutorial HERE
From the command line, you can use Data Pump with the expdp and impdp commands, like so:
Set your Oracle environment by running the command below and providing your Oracle SID:
. oraenv
Then you can run your export command:
expdp directory=DATA_PUMP_DIR dumpfile=myexport.dmp logfile=mylog.log schemas=users,products,sales
The parameters are as follows:
directory - the name of an Oracle directory object where the dump file and log will be created
dumpfile - name of the dump file (should end in .dmp)
logfile - name of the log file (should end in .log)
schemas - comma-separated list of the schemas you want to export
NOTE: you need DBA privileges to use Data Pump. It will prompt you for the credentials.
Data Pump Documentation is here
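For completeness, a matching import on the target database might look like this (the directory object DATA_PUMP_DIR, the file names and the schema list are assumptions for illustration):
impdp directory=DATA_PUMP_DIR dumpfile=myexport.dmp logfile=myimport.log schemas=users,products,sales
Again, you will be prompted for credentials with sufficient privileges.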
Exporting of ORACLE database objects is controlled by parameters. To get familiar with EXPORT parameters type:
exp help=y
You will get a short description and the default settings will be shown.
The EXPORT utility may be used in three ways:
Interactive dialogue
Controlled through parameters passed on the command line
Controlled through a parameter file
Example of the 2nd option:
exp scott/tiger file=empdept.expdat tables=(EMP,DEPT) log=empdept.log
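For the 3rd option, the same export could be driven by a parameter file (the file name and contents are illustrative):
exp scott/tiger parfile=empdept.par
where empdept.par contains:
file=empdept.expdat
tables=(EMP,DEPT)
log=empdept.log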
Take a look at these links for further reading:
Original Export and Import
The ORACLE Import/Export Utilities
