I understand this is a cliché, but while doing an export dump in Oracle I am getting these errors.
I have followed pretty much all of the documentation online and performed the following steps:
CREATE OR REPLACE DIRECTORY export_meta as '/C:/oracle/'; (where C:/oracle is my local path)
GRANT READ,WRITE ON export_meta to HR
expdp username/password DIRECTORY=export_meta DUMPFILE=hr.dmp
But getting these errors:
ORA-39002: invalid operation
ORA-39070: Unable to open the log file.
ORA-29283: invalid file operation
ORA-06512: at "SYS.UTL_FILE"
Can someone please tell me if this directory needs to be created on local and any idea as to why I am getting this error?
Most probably your path '/C:/oracle/' is wrong (it starts with "/"); try 'C:/oracle/' instead. If that is just a copy/paste error, then (see the corrected sketch after these points):
1) See which user started the oracle.exe process; most probably that user does not have permission to write to that directory. (To test this, try creating the Oracle DIRECTORY in the $ORACLE_HOME folder, or pointing it at the datafile folder.)
2) The file hr.dmp should not already exist in that directory when you start expdp.
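For reference, a corrected sequence might look like the sketch below. It assumes the Windows folder C:\oracle exists and is writable by the account that runs the Oracle service, and that the export user is HR; adjust the names, path and password to your environment. Run the SQL part as a privileged user such as SYS:
CREATE OR REPLACE DIRECTORY export_meta AS 'C:\oracle';
GRANT READ, WRITE ON DIRECTORY export_meta TO hr;
and then run the export from an operating-system prompt:
expdp hr/password DIRECTORY=export_meta DUMPFILE=hr.dmp LOGFILE=hr_export.log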
I am trying to copy an existing CSV file into a SQL table in pgAdmin4 1.5.
I am running the following query to copy the data from the CSV file:
COPY console_games FROM '/users/user1/Desktop/ConsoleGames.csv' DELIMITER ',' CSV HEADER;
And I get this result:
********** Error **********
ERROR: could not open file "/Users/user1/Desktop/ConsoleGames.csv" for reading: Permission denied
SQL state: 42501
I have changed the permissions of this file for all users to be read and write, but I still get the error.
I faced the same problem and here is how I solved it.
On Mac:
Open 'System Preferences'
Select 'Security and Privacy' option
Select 'Full Disk Access' from the list
Give access to PgAdmin AND postgres
Then, close and reopen your pgadmin
Use PgAdmin4's bulk-load features to import the CSV. This does a COPY ... FROM STDIN behind the scenes, so PgAdmin4 will access the file with your user's permissions, not with those of the postgres server as a direct COPY from a file would.
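If you would rather stay on the command line, psql's \copy meta-command gives the same effect: the client opens the file with your user's permissions and streams it to the server. A minimal sketch, assuming the table and file from the question:
\copy console_games FROM '/Users/user1/Desktop/ConsoleGames.csv' DELIMITER ',' CSV HEADER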
I solved this problem by creating a folder named Database in the PostgreSQL bin directory, where I save all the data files that I want to work on. I use pgAdmin4 on both Mac and Windows. Whereas it is rather straightforward to find the bin on Windows, it is a bit tricky on Mac because the Library folder where the bin is kept is hidden. Try saving your files in C:\Program Files\PostgreSQL\10\bin\Database if you are using Windows, and in /Library/PostgreSQL/10/bin/Database/ if you are using Mac. The Library can be found by pressing Command+Shift+G and typing /Library in the Go to Folder dialog. Your Mac will require your password to make changes in the Library. After saving your data files here, your code in pgAdmin4 should look like this:
COPY console_games FROM 'C:\Program Files\PostgreSQL\10\bin\Database\ConsoleGames.csv' DELIMITER ',' CSV HEADER;
or
COPY console_games FROM '/Library/PostgreSQL/10/bin/Database/ConsoleGames.csv' DELIMITER ',' CSV HEADER;
On a Mac
Close PGAdmin
Open 'System Preferences'
Select 'Security and Privacy' option
Select 'Privacy'
Select 'Files and folders' from the list
Grant PgAdmin access to the Documents folder
Reopen PGAdmin
When I log in, the error below shows up. I have already given permissions to src/storage.
file_put_contents(/var/www/html/catalog/src/storage/framework/cache/55/f9/55f98a9ae16c0c5f7c41c2f6d5435d3f37274a71): failed to open stream: No such file or directory
Create a cache directory inside the storage directory and give it read & write permissions; the error is saying that your cache directory does not exist.
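A minimal sketch of the shell commands, assuming the application root is /var/www/html/catalog/src and the web server runs as www-data (adjust both to your setup):
# create the missing cache directory and make it writable by the web server
mkdir -p /var/www/html/catalog/src/storage/framework/cache
chown -R www-data:www-data /var/www/html/catalog/src/storage
chmod -R 775 /var/www/html/catalog/src/storage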
I have installed Oracle 10g on my machine, but I am not able to run ed for the edit option. The error is:
SP2-0110: Cannot create save file "afiedt.buf"
Please help me out with solving this. I have already tried many options to fix it.
Check the file path where SQL*Plus is trying to save the buffer: make sure the folder structure exists and that you have write permission on it.
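If the current directory is not writable, one workaround (a sketch, assuming C:\temp exists and is writable on your machine) is to point the edit buffer somewhere else with SQL*Plus's SET EDITFILE command, or simply to start SQL*Plus from a directory you can write to:
SET EDITFILE C:\temp\afiedt.buf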
Can I run the VSDBCMD command remotely? I mean without copying the files to the SQL server? I am trying to create a dbschema file to use it as a reference in a database project.
I tried to run the command on my machine, and I get the following error: "TSD An error was received from SQL Server while attempting to reverse engineer elements of type Microsoft.Data.Schema.Sql.SchemaModel.ISql100DatabaseEncryptionKey: The user does not have permission to perform this action. An unexpected failure occurred: Access to the path 'C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\Pivotal_dev_ed.dbschema' is denied."
Do I need special permission on the SQL server?
I found the answer: it seems you can run it remotely; you just have to specify the path to a folder where you want the schema to be saved. I got the error mentioned above because I didn't have permission to write to that location, but specifying a path to a folder where I could write solved the problem.
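For illustration, the command generally takes a shape like the sketch below; the server name, database name and output path are placeholders, and the switches should be checked against the VSDBCMD documentation for your Visual Studio version:
vsdbcmd /a:Import /dsp:Sql /cs:"Data Source=MYSERVER;Integrated Security=True;Initial Catalog=MyDatabase" /model:C:\Temp\MyDatabase.dbschema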
I am trying to import data from a dump file created by Oracle 10g data pump utility. The command that I am issuing is
impdp \"username/password#DB as sysdba\" remap_schema=SRC_SCHEMA:TARGET_SCHEMA remap_tablespace=source_tablespace:target_tablespace DUMPFILE=db.dmp
I am getting the following error message:
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-39088: file name cannot contain a path specification
What is the cause of this error?
From the documentation:
ORA-39088: file name cannot contain a path specification
Cause: The name of a dump file, log file, or sql file contains a path specification.
Action: Use the name of a directory object to indicate where the file should be stored.
This suggests that the parameter you've shown as DUMPFILE=db.dmp is really something like DUMPFILE=C:\some\dir\path\db.dmp, which is not allowed. You have to use a directory that is recognised by the database and specify it with a DIRECTORY parameter.
As @ruffin notes from that directory parameter link, you can put the dump file in the default DATA_PUMP_DIR directory, which you can find from the dba_directories view or, if you have permission to use that object, the all_directories view. The user you're importing as has to have been granted read and write privileges on that directory for you to be able to use it. You also need to be able to move your dump file into the operating-system directory, so permissions may be an issue there too.
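For example, a quick way to see where DATA_PUMP_DIR points on the server (a sketch, assuming you can query dba_directories; use all_directories otherwise) is:
SELECT directory_name, directory_path
FROM dba_directories
WHERE directory_name = 'DATA_PUMP_DIR';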
If you don't have a suitable directory object that you have database privileges for and operating-system access to, you'll need to create one and grant suitable privileges. This needs to be done by someone with the appropriate privileges, usually as SYS:
create directory my_data_pump_dir as 'C:\some\dir\path';
grant read, write on directory my_data_pump_dir to <username>;
Then the import is modified to have:
... DUMPFILE=db.dmp DIRECTORY=my_data_pump_dir
Note that the operating system directory has to be available to the Oracle user account (whoever is running the database processes, pmon etc.) on the database server. You cannot import to a remote database using a local file, unless the local directory is somehow mounted on the remote server. The old imp command was a client-side application that often ran on the server but didn't have to; impdp is a server-side application.