I have lost my database on the server because drive C was formatted. But the Oracle folders, such as oradata, were located on drive E. Can I restore the Oracle database as it was before?
Well, first you need to install the Oracle server software of the same version that was there before the format.
Follow these steps:
Install the same Oracle database version with the starter database.
Copy all control files and datafiles to the oradata directory.
Copy the init.ora file to the ADMIN folder.
Revise the init.ora file for the changed control file, archive, and dump locations (keep the instance, SID, and global DB name the same).
Create the instance by running oradim with the pfile (init.ora) location.
Connect to the DB (CONNECT INTERNAL on older versions, or / AS SYSDBA) and mount it.
Check the existing datafile and logfile locations by typing:
SELECT name FROM V$DATAFILE;
SELECT member FROM V$LOGFILE;
Now change the file locations returned above by using:
ALTER DATABASE RENAME FILE '<old file location>' TO '<new file location>';
After renaming them, open the database with ALTER DATABASE OPEN.
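For illustration, here is a rough sketch of the create-instance, mount, and rename steps on Windows. The SID ORCL and every path here are assumptions; adjust them to your own layout. First create the Windows service/instance from the restored pfile:
oradim -NEW -SID ORCL -PFILE E:\oracle\admin\ORCL\pfile\init.ora
Then, from SQL*Plus (sqlplus / as sysdba), mount and repoint the files:
STARTUP MOUNT PFILE='E:\oracle\admin\ORCL\pfile\init.ora';
ALTER DATABASE RENAME FILE 'C:\oracle\oradata\ORCL\system01.dbf' TO 'E:\oracle\oradata\ORCL\system01.dbf';
ALTER DATABASE OPEN;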
And you should be set.
How can I migrate all my data and configuration for Matrix Synapse and Riot.im, installed on one system, to another VM?
Can I back up and restore all the rooms (created with Riot.im), the chat logs, and the users, and migrate all the content to another machine?
The old system is configured without using Docker.
Thank you
Information
All the applications are decentralized; there are configuration files holding your server and connection information, and all the remaining data is stored in the database you are using. So we have three things to migrate: the client (Riot, in your case), Matrix Synapse, and the database.
Riot Migration
Riot has a configuration file named config.json (by default) which holds the URLs of your Synapse server. While migrating, copy the values from your existing Riot config file to your new Riot config file.
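For example, the server section of config.json typically looks something like the snippet below; the hostnames are placeholders, and the exact keys depend on your Riot version:
{
    "default_server_config": {
        "m.homeserver": {
            "base_url": "https://matrix.example.com",
            "server_name": "example.com"
        }
    }
}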
Synapse Migration
Similar to Riot, there are homeserver.yaml and conf.d/server_name.yaml files in the matrix-synapse installation folder, which hold all the configuration. Copy the contents of these files to the new machine's Matrix files and you are done with the client and interface. Let's get into data migration.
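A minimal sketch of that copy, assuming the default Debian/Ubuntu install path /etc/matrix-synapse (your installation folder may differ):
scp /etc/matrix-synapse/homeserver.yaml newhost:/etc/matrix-synapse/homeserver.yaml
scp /etc/matrix-synapse/conf.d/server_name.yaml newhost:/etc/matrix-synapse/conf.d/server_name.yaml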
Database Migration
To go from SQLite3 to PostgreSQL, follow these commands.
Create a dump file from SQLite:
sqlite3 database.db .dump > /the/path/to/sqlite-dumpfile.sql
Load that SQL dump file into PostgreSQL:
/path/to/psql -d database -U username -W < /the/path/to/sqlite-dumpfile.sql
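For a concrete example with Synapse, assuming the default SQLite file /var/lib/matrix-synapse/homeserver.db and a target database named synapse (both assumptions; check your homeserver.yaml for the real values):
sqlite3 /var/lib/matrix-synapse/homeserver.db .dump > /tmp/sqlite-dumpfile.sql
psql -d synapse -U synapse_user -W < /tmp/sqlite-dumpfile.sql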
Old PostgreSQL to new PostgreSQL
Create a dump file as a backup from the older PostgreSQL:
pg_dump dbname > outfile
Restore the data from this dump on the new server:
psql dbname < infile
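For example, assuming the database is named synapse (again an assumption; check homeserver.yaml) and that you create an empty database on the new server first:
pg_dump synapse > synapse_backup.sql
createdb synapse
psql synapse < synapse_backup.sql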
You can also use database migration GUI tools such as Pentaho or dbsoft; follow the dbsoft article.
You can refer to the Element docs on migration, the Matrix docs, and SQLite to PostgreSQL.
I have developed a file-to-table (Oracle) scenario in ODI 12c, and everything is fine on my local machine when I run it individually. I want to pass the file name and location dynamically as startup parameters. I built a package with two declared variables (name and location), followed by the file interface (the file location is included in the physical schema and the corresponding logical schema), and in the model resource I only included #PROJECT.variable for the location. When I ran the scenario it picked up the two variables (file name and file location) and reported success, but here comes the issue: nothing gets loaded into the table, 0 records. What am I missing here? The agent used is the local agent, and the file is in my local C: directory. Is the local agent not able to read the file? If this were 11g I might put in an absolute path, assuming the agent has permission to read the file, but it's the local agent and the file is in the local C: folder, and the file is picked up when the scenario is run individually, as mentioned. Please share your thoughts.
I am trying to copy an existing CSV file into a SQL table in pgAdmin4 1.5.
I am running the following query to copy the data from the CSV file:
COPY console_games FROM '/users/user1/Desktop/ConsoleGames.csv' DELIMITER ',' CSV HEADER;
And I get this result:
********** Error **********
ERROR: could not open file "/Users/user1/Desktop/ConsoleGames.csv" for reading: Permission denied
SQL state: 42501
I have changed the permissions of this file for all users to be read and write, but I still get the error.
I faced the same problem and here is how I solved it.
On Mac:
Open 'System Preferences'
Select 'Security and Privacy' option
Select 'Full Disk Access' from the list
Give access to PgAdmin AND postgres
Then close and reopen pgAdmin
Use PgAdmin4's bulk-load features to import the CSV. This will do a COPY ... FROM STDIN behind the scenes. PgAdmin4 will access the file with your user's permissions, not those of the postgres server like a direct COPY from file will.
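Equivalently, from psql you can use the client-side \copy meta-command, which also reads the file with your own user's permissions rather than the server's:
\copy console_games FROM '/Users/user1/Desktop/ConsoleGames.csv' DELIMITER ',' CSV HEADER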
I solved this problem by creating a folder named Database in the PostgreSQL bin, where I save all the data files that I want to work on. I use pgAdmin4 on both Mac and Windows. Whereas it is rather straightforward to find the bin on Windows, it is a bit tricky on Mac because the Library folder where the bin is kept is hidden. Try saving your files in C:\Program Files\PostgreSQL\10\bin\Database if you are using Windows, or /Library/PostgreSQL/10/bin/Database/ if you are using Mac. The Library can be found by pressing Command+Shift+G in Finder and typing /Library in the 'Go to the folder' box. Your Mac will require your password to make changes in the Library. After saving your data files here, your code in pgAdmin4 should look like this:
COPY console_games FROM 'C:\Program Files\PostgreSQL\10\bin\Database\ConsoleGames.csv' DELIMITER ',' CSV HEADER;
or
COPY console_games FROM '/Library/PostgreSQL/10/bin/Database/ConsoleGames.csv' DELIMITER ',' CSV HEADER;
On a Mac:
Close PGAdmin
Open 'System Preferences'
Select 'Security and Privacy' option
Select 'Privacy'
Select 'Files and folders' from the list
Grant PgAdmin access to the Documents folder
Reopen PGAdmin
If we create a directory using CREATE OR REPLACE DIRECTORY and another directory object already exists with the same path, will the original directory get deleted?
Nope. You can have many Oracle directories which point to the same place. Creating or removing an Oracle directory does nothing at the OS level; the OS directory doesn't even have to exist in order to create the Oracle directory.
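A quick illustration (the directory names and the /data/files path are made up):
CREATE OR REPLACE DIRECTORY dir_a AS '/data/files';
CREATE DIRECTORY dir_b AS '/data/files';
SELECT directory_name, directory_path FROM all_directories WHERE directory_path = '/data/files';
Both dir_a and dir_b now exist and point at the same path, and neither statement touched the OS directory.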
I am trying to import data from a dump file created by Oracle 10g data pump utility. The command that I am issuing is
impdp \"username/password#DB as sysdba\" remap_schema=SRC_SCHEMA:TARGET_SCHEMA remap_tablespace=source_tablespace:target_tablespace DUMPFILE=db.dmp
I am getting the following error message:
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-39088: file name cannot contain a path specification
What is the cause of this error?
From the documentation:
ORA-39088: file name cannot contain a path specification
Cause: The name of a dump file, log file, or sql file contains a path specification.
Action: Use the name of a directory object to indicate where the file should be stored.
This suggests that the parameter you've shown as DUMPFILE=db.dmp is really something like DUMPFILE=C:\some\dir\path\db.dmp, which is not allowed. You have to use a directory that is recognised by the database and specify it with a DIRECTORY parameter.
As @ruffin notes from that directory parameter link, you can put the dump file in the default DATA_PUMP_DIR directory, which you can find from the dba_directories view or, if you have permission to use that object, the all_directories view. The user you're importing as has to have been granted read and write privileges on that directory for you to be able to use it. You also need to be able to move your dump file into the operating-system directory, so permissions may be an issue there too.
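For example, to see where DATA_PUMP_DIR points:
SELECT directory_path FROM dba_directories WHERE directory_name = 'DATA_PUMP_DIR';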
If you don't have a suitable directory object that you have database privileges for and operating-system access to, you'll need to create one and grant suitable privileges. This needs to be done by someone with the appropriate privileges, usually as SYS:
create directory my_data_pump_dir as 'C:\some\dir\path';
grant read, write on directory my_data_pump_dir to <username>;
Then the import is modified to have:
... DUMPFILE=db.dmp DIRECTORY=my_data_pump_dir
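Putting it together with your original command, the full invocation would look something like this (using the my_data_pump_dir object created above):
impdp \"username/password@DB as sysdba\" remap_schema=SRC_SCHEMA:TARGET_SCHEMA remap_tablespace=source_tablespace:target_tablespace DIRECTORY=my_data_pump_dir DUMPFILE=db.dmp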
Note that the operating system directory has to be available to the Oracle user account (whoever is running the database processes, pmon etc.) on the database server. You cannot import to a remote database using a local file, unless the local directory is somehow mounted on the remote server. The old imp command was a client-side application that often ran on the server but didn't have to; impdp is a server-side application.