COPY test
FROM 'C:\Users\micro\Downloads\test.csv' CSV HEADER;
I have shared the download directory (right-click, Properties, etc.), and I get this error:
ERROR: could not open file "C:\Users\micro\Downloads\test.csv" for reading:
Permission denied
HINT: COPY FROM instructs the PostgreSQL server process to read a file. You
may want a client-side facility such as psql's \copy.
SQL state: 42501
Using "\COPY" gives a syntax error
Many thanks
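For what it's worth, a minimal client-side sketch of what the HINT suggests. \copy is a psql meta-command, so it must be run from psql itself (not, for example, from pgAdmin's query tool) and on a single line; forward slashes avoid psql's backslash-escape handling:
\copy test FROM 'C:/Users/micro/Downloads/test.csv' CSV HEADER
Because the file is then read by the client under your own Windows account rather than by the PostgreSQL service account, the server-side permission problem does not arise.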
Related
I'm trying to do an import using the impdp utility, but I'm seeing:
impdp system/system remap_schema=ieulive:ieusystem directory=pump_dir dumpfile=IEULIVE.DMP logfile=imp.log
ORA-39002: invalid operation
ORA-39070: Unable to open the log file
ORA-29283: invalid file operation
ORA-29283: invalid file operation
And this is what I get when I use the nologfile=y option:
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31640: unable to open dump file "F:\data_pump\IEULIVE.DMP"
ORA-27041: unable to open file
OSD-04002: unable to open file
O/S-Error: <os 5> access is denied
I read that it's a permission issue. I have imported onto this DB before; now I'm not able to.
I tried everything I found on the internet but couldn't find a solution.
Thank you for your help
Update:
When I installed Oracle, I used the Windows virtual account (which I don't know exactly what it is).
Either the permissions for the folder or for the specific file are not set correctly. Both must be accessible (read/write for the directory, at least read for the file) by the account which is running the Oracle processes, which is most likely the Windows SYSTEM account. Is F: a local disk drive or a network-mounted drive? On Windows, network-mounted drives generally cannot be accessed using directory objects...
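One way to check which account the Oracle service actually runs under (the service name OracleServiceORCL is an assumption; substitute your instance's service name):
sc qc OracleServiceORCL
The SERVICE_START_NAME line in the output is the account that needs access to F:\data_pump.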
directory=pump_dir
Your default data pump directory location and the actual directory where you have placed your export dump files could be different.
ORA-31640: unable to open dump file "F:\data_pump\IEULIVE.DMP"
Verify that the data pump directory matches "F:\data_pump\IEULIVE.DMP":
SELECT DIRECTORY_PATH FROM dba_directories WHERE DIRECTORY_NAME = 'PUMP_DIR';
If it matches, then you need to set the appropriate permissions to the directory and files.
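If it does not match, one option (a sketch, assuming DBA privileges and the names from the question) is to repoint the directory object at the actual location of the dump file and re-grant access to the importing user:
CREATE OR REPLACE DIRECTORY pump_dir AS 'F:\data_pump';
-- only needed if the importing user is not a DBA
GRANT READ, WRITE ON DIRECTORY pump_dir TO system;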
I want to load data from a CSV file using UTL_FILE, but an error has occurred (see below). Please note that I'm connecting to the database remotely and the CSV file is on my local machine.
29283. 00000 - "invalid file operation"
*Cause: An attempt was made to read from a file or directory that does
not exist, or file or directory access was denied by the
operating system.
*Action: Verify file and directory access privileges on the file system,
and if reading, verify that the file exists.
Is it necessary to put the CSV file where the DB is mounted?
If the file is local to your machine, your options are:
transfer the file to the server, or
make a location on your machine visible/mountable by the server, or
use a client tool to load the data from your machine to the server
Assuming we'll be going with the last one, you can do this with:
SQL Developer
- Expand the "Tables" tab, right-click on your table and choose Import
SQL Loader
- SQL Loader can be run locally (assuming you have the Oracle client installed on your machine); a minimal sketch follows below
Plenty of SQL Loader examples on https://asktom.oracle.com, or via the standard documentation
https://docs.oracle.com/en/database/oracle/oracle-database/12.2/sutil/oracle-sql-loader.html
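A minimal SQL Loader sketch, assuming a hypothetical table test(col1, col2, col3) and a CSV with a header row. Control file:
-- load_test.ctl; table and column names are hypothetical
LOAD DATA
INFILE 'test.csv'
APPEND
INTO TABLE test
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(col1, col2, col3)
Run it from your machine with the Oracle client (connection details are placeholders); skip=1 skips the header row:
sqlldr dbuser/dbpwd@SID control=load_test.ctl log=load_test.log skip=1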
I am trying to export a dump file and log file on a remote machine using Oracle expdp.
However I am getting the following error:
Connected to: Oracle Database 11g
Enterprise Edition Release 11.2.0.4.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing
options
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE", line 536
ORA-29283: invalid file operation
The commands run on the remote machine with hostname 'Local', using the Oracle client, are:
SQL> create directory expdp_dir as '/vault2/expdp_dir';
SQL> grant read,write on directory expdp_dir to dbuser;
expdp dbuser/dbpwd@SID SCHEMAS=dbuser DIRECTORY=expdp_dir DUMPFILE=testDB24NOV17.dmp logfile=testDB24NOV17.log EXCLUDE=STATISTICS
Note: /vault2 is mounted on the remote machine with hostname 'Local'. The database is on a machine with hostname TestDB.
The OS is RHEL6.
Any thoughts /ideas on making this operation successful would be appreciated.
Please check this one, as per Oracle Doc ID 1305166.1:
The errors can have multiple causes. Known causes are listed below.
One of the usual reasons for this problem is that the listener process was not started under the same account as the database instance service. The listener forks the new server process, and when this runs under a different security context than the database, access to directories and files is likely impacted.
Please verify the following information:
1) the output of:
ps -ef | grep SMON
2) the output of:
ps -ef | grep tnslsnr
3) the output of:
ps -ef | grep LIST
4) the output of:
ls -ld
Note:
When using ASM, the listener may have been started from the ASM Home instead of the RDBMS Home. Depending on your security settings, this may lead to this issue.
One more known cause: the directory path/folder exists, but CREATE DIRECTORY was executed by one database user and the import is run by a different user.
Solution:
1. Make sure the listener and instance services are started from the same account.
2. Make sure the directory is shared between nodes so that it can be accessed from any instance, or create a matching folder locally on each node; if a folder with the same directory path structure already exists locally on all the nodes, check that its permissions are correct.
3. Make sure the folder exists as specified in the CREATE DIRECTORY command.
4. Grant the required permission to the importing user to use the directory:
GRANT READ, WRITE ON DIRECTORY <directory_name> TO <user>;
If the above four causes and solutions do not apply in your case, check whether the user has the proper permissions to run the UTL_FILE package for the export.
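As a quick check on that last grant, directory privileges are listed under the directory's name in DBA_TAB_PRIVS (EXPDP_DIR is the directory from the question):
SELECT grantee, privilege
FROM dba_tab_privs
WHERE table_name = 'EXPDP_DIR';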
Hope it helps.
I am trying to run the file from a network drive but am getting an error message: Unable to open file.
I am trying to run the below command in SQL Developer:
@\\chllrog.amastre.com\Shares\test\execute_report.sql
I also tried it like this, but get the same error message:
@"\\chllrog.amastre.com\Shares\test\execute_report.sql"
When I open the execute_report.sql file from this location and execute its contents (which create a new Excel file in the same directory), the file is created and it works.
Map the network drive in your operating system (in Windows, right-click the shared folder in Explorer and choose Map Network Drive). Then you can run the script from SQL Developer using the mapped drive, for example: @z:\Shares\test\execute_report.sql
You can use the same file addressing for SPOOL and output files.
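The same mapping can also be done from a command prompt (the drive letter Z: is an assumption):
net use Z: \\chllrog.amastre.com\Shares\test
After that, @Z:\execute_report.sql should work from SQL Developer as well.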
It did work for me without drive mapping:
sqlplus /nolog "@\\server\path\file.sql"
SQL*Plus version 12.2.0.1.
The quotes and the '@' are important, apparently.
I have the following error when I try to run the report without the full path:
REP-110: File abc.rdf cannot be opened.
REP-1070: An error occurred while opening or saving a document.
REP-0110: File abc.rdf cannot be opened.
but with the full path it's OK.
Reports 11g is installed with WebLogic Server.
Make sure that you added the location of the reports in the Registry Editor, under the following key:
Computer\HKEY_LOCAL_MACHINE\SOFTWARE\ORACLE\KEY_OH#########\REPORTS_PATH
Add
...ows\Fonts;C:\Oracle\Middleware\Oracle_FRHome1\reports