I have a SQL*Loader Control File on my computer.
The input data file on the server is C:\data\myfile.csv.
In the .ctl file I added INFILE 'C:\data\myfile.csv'.
I ran it with this command:
sqlldr admin/admin@//192.10.1.1:1521/orcl control=myctlfile.ctl
The log file:
SQL*Loader: Release 11.2.0.1.0 - Production on Tue Sep 18 16:09:00 2018
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
SQL*Loader-500: Unable to open file (C:\data\myfile.csv)
SQL*Loader-553: file not found
SQL*Loader-509: System error: The system cannot find the file specified.
When I move the file C:\data\myfile.csv from the server to my computer, it works fine.
I want to use a file that stays on the server. Is this feasible?
SQL*Loader is a client application. It needs to be able to open the file, otherwise it can't load it.
If the file is on a remote server, you might look at network folder sharing (Windows public folder sharing, SkyDrive, Samba etc) to make the remote file accessible from your desktop.
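As a sketch (the share name and drive letter here are assumptions, not something your server necessarily exposes):

```
:: Hypothetical: the server exposes C:\data as a share named "data".
:: Map it to a drive letter on the client machine:
net use Z: \\192.10.1.1\data

:: The control file can then reference the mapped drive
:: (or the UNC path \\192.10.1.1\data\myfile.csv directly):
::   INFILE 'Z:\myfile.csv'
```

The alternative is to run sqlldr on the server itself, where the path C:\data\myfile.csv is local.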
My customer has provided a dmp file (10 GB), and I tried the following:
Create a user:
create user USERNAME identified by PASSWORD;
Grant read/write access.
Import the dump file (using both imp and impdp):
impdp or imp system/password@db dumpfile=EXPDAT.DMP FULL=Y logfile=dice.log
and here's the error message:
Import: Release 18.0.0.0.0 - Production on Tue Feb 23 11:46:07 2021
Version 18.4.0.0.0
Copyright (c) 1982, 2019, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 18c Express Edition Release 18.0.0.0.0 - Production
ORA-39002: invalid operation
ORA-39059: dump file set is incomplete
ORA-39246: cannot locate master table within provided dump files
Can anyone help on that?
First, imp and impdp are not interchangeable; they have different file formats and options. You need to know exactly which was used to create the file you have.
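If you're not sure which tool produced the file, one quick heuristic (an assumption based on the classic exp banner, not an official check) is to look at the first few bytes of the dump:

```shell
#!/bin/sh
# Heuristic only (an assumption): classic exp dump files start with an
# "EXPORT:V..." banner in their first bytes; Data Pump (expdp) files do not.
# dump_kind FILE -> prints "classic exp" or "likely Data Pump"
dump_kind() {
    if head -c 16 "$1" | grep -q "EXPORT:"; then
        echo "classic exp"
    else
        echo "likely Data Pump"
    fi
}
# usage: dump_kind EXPDAT.DMP
```

The definitive answer, of course, is the export log or the customer telling you which utility they ran.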
Second, assuming the file was created with expdp (aka Data Pump, the more modern choice), so that impdp is the right tool to load it, the error indicates a problem with the dump file set itself:
ORA-39246 cannot locate master table within provided dump files
Cause: A Data Pump IMPORT or SQL_FILE operation was being performed
but not all of the files from the Data Pump export dump file set were
included. In particular, the dump file containing the export job's
master table was not provided.
Action: Check the export log file and make sure all of the files that
were exported are included in the current job.
It seems likely that your customer has not provided you with the complete data dump and you should have received additional files. This is possible if either the "parallel" or "filesize" option was used during the export. Confirm with them the number and size of the files you should have.
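As a sketch, a pre-flight count of the pieces you received against the number the export log reports (the directory, file pattern, and expected count are assumptions to adjust):

```shell
#!/bin/sh
# Hypothetical pre-flight check before impdp: count the dump pieces present
# against the number the export log says were written (8 here as an
# example -- adjust the directory, pattern, and count to your case).
# check_dump_set DIR EXPECTED
check_dump_set() {
    dir=$1
    expected=$2
    actual=$(ls "$dir"/exp_*.dmp 2>/dev/null | wc -l | tr -d ' ')
    if [ "$actual" -ne "$expected" ]; then
        echo "incomplete dump file set: $actual of $expected pieces present"
    else
        echo "all $expected pieces present"
    fi
}
# usage: check_dump_set /restore 8
```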
I intend to create a DB Gateway from Oracle to Postgres. I have the appropriate setup file for the installation.
After I executed the setup, I proceeded with the following steps:
Chose a new directory in ORACLE_HOME where the DB Gateway will be installed.
Selected the option for the "ODBC" gateway.
The installation fails and says the following:
DB Gateway Error:
If I look in the error log file, configToolFailedCommands, it says:
rem Copyright (c) 1999, 2009, Oracle. All rights reserved.
C:\Windows\system32\cmd /c call
C:\oracle\product\11.2.0\client_1\OraGtw11g_home2/bin/netca.bat /orahome C:\oracle\product\11.2.0\client_1\OraGtw11g_home2 /orahnam OraGtw11g_home2 /instype custom /inscomp client,oraclenet,server /insprtcl tcp,nmp /cfg local /authadp NO_VALUE /responseFile
C:\oracle\product\11.2.0\client_1\OraGtw11g_home2\network\install\netca_typ.rsp
I do not know how to read this file, or how to tell whether it contains an error.
The file mentioned by Oracle (configToolFailedCommands) is not an error log file but a script file that lets you re-run the commands which previously failed.
I'm trying to import a few files with a published Oracle Data Pump perl script: dumpinfo.pl
After successfully importing several dump files from the same export process, another file failed with:
# impdp system/****** DIRECTORY=RESTORE_DIR DUMPFILE=exp_%u.dmp PARALLEL=8
Import: Release 11.2.0.2.0 - Production on Mon Jul 7 11:40:37 2014
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39002: invalid operation
ORA-39059: dump file set is incomplete
ORA-39246: cannot locate master table within provided dump files
The script reports that it can't find the master table. Assuming that the master table is lost, is there any mechanism to recover it?
Thanks...
ORA-39246 cannot locate master table within provided dump files
Cause: A Data Pump IMPORT or SQL_FILE operation was being performed
but not all of the files from the Data Pump export dump file set were
included. In particular, the dump file containing the export job's
master table was not provided.
Action: Check the export log file and make sure all of the files that
were exported are included in the current job.
Check that you have all the dump files in that directory and that they are all accessible to the impdp utility. I got this error when our backup team had restored the first five dmp files and the last two were missing.
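A small sketch of that check (paths hypothetical): report any piece of the set that is missing or not readable by the account that will run impdp, before starting the import:

```shell
#!/bin/sh
# Hypothetical sanity check: flag any dump piece that is missing or
# unreadable by the account running impdp.
# check_readable FILE...
check_readable() {
    for f in "$@"; do
        if [ ! -r "$f" ]; then
            echo "missing or unreadable: $f"
        fi
    done
}
# usage: check_readable /restore/exp_*.dmp
```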
I have been handed an Oracle database (10.1.0.5.0) with no documentation and very little RMAN information, and I need to change the existing backup location drive for RMAN backups.
Before I do that I want to check whether the database has a recovery catalog. How do I do this?
If no recovery catalog exists, how do I query the existing script names and script content?
What platform are you on?
The CATALOG option will be on the RMAN command line or in the recovery script file. Just "grep" for catalog as a start.
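A minimal sketch of that, assuming the scripts live in one directory (the path is hypothetical):

```shell
#!/bin/sh
# Sketch: list every file under a (hypothetical) script directory that
# mentions a catalog -- either CATALOG on the rman command line or a
# "connect catalog" inside the script body.
find_catalog_refs() {
    grep -ril "catalog" "$1" 2>/dev/null
}
# usage: find_catalog_refs /home/oracle/rman_scripts
```

No hits is a good (though not conclusive) sign that the backups run with NOCATALOG, i.e. against the control file only.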
Please check your RMAN backup log file, where you will see whether you are connecting to a catalog or not.
The output will look like the below:
Recovery Manager: Release 12.1.0.2.0 - Production on Thu May 2 08:03:55 2019
Copyright (c) 1982, 2014, Oracle and/or its affiliates. All rights reserved.
RMAN>
connected to recovery catalog database
recovery catalog schema release 12.02.00.01. is newer than RMAN release
Check your backup script, where you will see whether you connect to an RMAN catalog or not. Grep for "connect catalog":
$ grep 'connect catalog' script.sh
My .ctl file is:
LOAD DATA
INFILE "C:\Users\nkb1\Desktop\fnames.txt"
INTO TABLE MDB.TEACHERS
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(first_name,last_name)
I am executing this from the Windows command prompt as
sqlldr system#mdb/mdb control=C:\Users\nkb1\Desktop\load.ctl
and I am getting an error like this:
C:\Users\nkb1>sqlldr system#mdb/mdb control=C:\Users\nkb1\Desktop\load.ctl
SQL*Loader: Release 10.2.0.1.0 - Production on Wed May 4 14:44:22 2011
Copyright (c) 1982, 2005, Oracle. All rights reserved.
SQL*Loader-704: Internal error: ulconnect: OCIEnvCreate [-1]
and I have set the ORACLE_HOME variable to C:\oracle\product\10.2.0\db_2, where the database is stored.
Bad news: this is a bug in 10.2.0.1.
Check Metalink document 361325.1 (hope you have access)!
Maybe you'll just have to grant additional privileges, but it is possible that you'll need to apply the 10.2.0.3 patch set.
I hit the same issue. I closed the command prompt and opened one with administrator rights (Open CMD as Administrator).
That's it. This solved all the issues I had been facing.
If you have saved the SQL*Loader/SQL*Plus scripts as a batch file, then run the batch file as administrator.
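For example (the path is hypothetical, and runas will prompt for the Administrator password):

```
:: Hypothetical: launch the loader batch file under the Administrator
:: account from an existing, non-elevated prompt.
runas /user:Administrator "cmd /c C:\Users\nkb1\Desktop\load.bat"
```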
Hope this helps.