I had to import a table from a schema of one database into a schema of another database. I created a .dmp file of the table using the "exp" command, transferred the file to the other machine using ftp (binary transfer), and when I tried to import the file on the other machine using the "imp" command, I got an error saying the .dmp file is corrupt (IMP-00010). One of my machines is hosting Oracle 11g and the other machine, to which the data has to be imported, is hosting 10g. Could that be the reason for the error I received? If yes, what else could I do to get my work done?
Commands used
exp flstd05/flstd05#flow30 tables=cust_order_lines file=cust_order_lines_data1.dmp
imp flmad01/flmad01#flow04 file=cust_order_lines_data1.dmp
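For what it's worth, the usual rule for original exp/imp across versions is to export with the exp client of the lower (target) version: run the 10g exp against the 11g database over the network, so the dump is written in a format the 10g imp understands. A sketch, assuming flow30 is a TNS alias for the 11g database reachable from the 10g machine (the alias and password are placeholders):
exp flstd05/password@flow30 tables=cust_order_lines file=cust_order_lines_data1.dmp
The 10g imp can then read the resulting file locally, as in your second command.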
Related
Migrating from Oracle Database 11g to AWS RDS Oracle Database 19c, using the Oracle Data Pump tool for export and the RDS Data Pump API for import, resulted in some nasty errors:
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-39143: dump file "/rdsdbdata/datapump/test.dmp" may be an original export dump file
What I have tried
Changing the ownership of the dmp file
Using the full schema option
Adding the credentials
Migrating from Oracle Database 11g to Oracle Database 12c using traditional Data Pump functionality to resolve the compatibility issues resulted in the same errors mentioned above.
I resolved this issue by doing the following (both steps are sketched below):
Granting the schema permission to access the directory on the server where I wanted to store the exported DMP file.
Exporting the DMP using expdp (Data Pump)
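A minimal sketch of those two steps, assuming a privileged account, a server-side path of /u01/app/oracle/dump, and a schema named MYSCHEMA (the path and all names are placeholders):
-- as a DBA, create the directory object and grant the schema access to it
CREATE DIRECTORY dump_dir AS '/u01/app/oracle/dump';
GRANT READ, WRITE ON DIRECTORY dump_dir TO myschema;
-- export with Data Pump rather than the original exp
expdp myschema/password schemas=myschema directory=dump_dir dumpfile=myschema.dmp logfile=myschema.log
Because the resulting file is a Data Pump dump rather than an original export dump, impdp on the target no longer raises ORA-39143.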
I currently use Oracle Database 11g Express Edition Release 11.2.0.2.0, and I've downloaded a dataset from the internet.
It seems like I successfully imported the .JSON file, but the tables don't match: there is only one table in the dataset I downloaded, yet in SQL Developer there are dozens of tables that I can't understand.
What should I do?
[Process of importing .JSON file]
XE is the connection I created.
I guess the .json file just holds the connection details for a database, and the XML might define the tables. Or the tables were created during the installation process of the database by the DBA.
I have a dump from a database in CSV format (with the '|' character as the delimiter), and I want to import it into a remote Oracle database. I am using AWS: the CSV is on an EC2 instance running Amazon Linux, and the remote Oracle database is an RDS instance. This is the first time I'm touching an Oracle database.
I expected this to be fairly simple, but trying to find info I kind of got lost. Some people say to use SQL*Loader, but I can't manage to even install that thing. Others say that SQL*Loader isn't even supposed to be installed on anything that isn't the actual database server. So far I've only managed to install sqlplus and connect to the database; no importing yet.
Basically I'm looking for an equivalent of \COPY in psql, but for Oracle. And how on earth would I use it in this context?
If you don't want to use SQL*Loader and only need to import CSV as a one time task, I would recommend using SQL Developer. If you want an automated process, SQL*Loader would probably be your best bet.
Here are some links (note that an Excel import and a CSV import are nearly identical):
How to move csv file into oracle database using sql developer? [closed]
How to Import from Excel to a New Oracle Table with SQL Developer
How to Import from Excel to Oracle with SQL Developer
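If you do end up on the SQL*Loader route, a minimal sketch might look like the following; the table name, column list, and connection string are assumptions, and sqlldr has to be available on the EC2 instance holding the CSV:
-- load.ctl: control file for a hypothetical table, using '|' as the delimiter
LOAD DATA
INFILE 'data.csv'
APPEND INTO TABLE my_table
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
(id, name, created_at DATE "YYYY-MM-DD")
Then invoke it against the RDS endpoint:
sqlldr myuser/mypass@//my-rds-endpoint:1521/ORCL control=load.ctl log=load.log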
I have installed the Cognos 10.2 Trial version. The installation comes with a sample schema called "GoSales".
I would like to import the GoSales schema into my Oracle database. This will enable me to view the GoSales schema in a database query tool (e.g. TOAD).
My ultimate goal is to:
- Understand the GoSales schema.
- Make sure that the reports I develop using Cognos Studio are correct by checking the data in the database.
So, how can I import the "Go Sales" schema into my Oracle database?
There is a compressed file with the Cognos BI samples. I guess you can download this file from IBM; its name is bi_smps_10.2.1_mp_ml.tar.gz.
You have to uncompress this file and execute the script ...\win\GOSaleConfig.bat (for a Windows machine). This script will ask you for the connection parameters of your Oracle database and create the schemas with sample data.
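Once the script has finished, a quick sanity check from TOAD or SQL*Plus could look like this (the owner name GOSALES is an assumption; use whatever schema names the script actually created):
SELECT owner, table_name FROM all_tables WHERE owner LIKE 'GOSALES%' ORDER BY owner, table_name;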
I am using Oracle Data Pump export (expdp) for my application. My Oracle client and server are on different machines. When I use my application, which is on the Oracle client machine, the exported dump is always created on the Oracle server machine. Is this a limitation of Oracle Data Pump export? Or is there a workaround?
That's the way Data Pump works by design.
If you need the dump file created on the client side, you have to use the older exp and imp; otherwise use Data Pump and ftp the file to your local machine afterwards (see the sketch after the quoted documentation below).
http://docs.oracle.com/cd/E11882_01/server.112/e22490/dp_overview.htm#i1010293
Note: All Data Pump Export and Import processing, including the reading and writing of dump files, is done on the system (server) selected by the specified database connect string. This means that for unprivileged users, the database administrator (DBA) must create directory objects for the Data Pump files that are read and written on that server file system.
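A sketch of the client-side route: the original exp utility runs wherever you invoke it and writes its dump file there, so running it on your client machine keeps the file local. Assuming the exp client is installed locally and MYDB is a TNS alias for the remote database (the user, table, and paths are placeholders):
exp scott/password@MYDB tables=emp file=/home/me/emp.dmp log=/home/me/emp.log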