I have successfully loaded a flex table before, but this time I am getting an error:
SQL Error [2886]: Vertica: couldn't open file 'path....' for reading, no such file or directory
Notes:
Tried moving the file to the desktop; it didn't help.
Checked the file's security settings; it is open to everyone, no restrictions.
Flex table name: flex_flights
SQL query:
COPY flex_flights
FROM '/na-dev-nas-1/unix_inst/software/files/flight_data.csv'
PARSER fcsvparser();
I tried both types of slashes; it still didn't help.
Thank you for your time.
Are you connected to a Vertica node as dbadmin?
Then,
FROM '/na-dev-nas-1/unix_inst/software/files/flight_data.csv'
would work.
At least if dbadmin has reading privileges on that file.
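If you have shell access to that node, one quick sanity check (just an illustration, assuming you can sudo to the dbadmin OS account) is to try reading the file as dbadmin:
sudo -u dbadmin head -n 1 /na-dev-nas-1/unix_inst/software/files/flight_data.csv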
In all other cases (you might be on a Vertica node but not as dbadmin while still having read privileges on the file, or you might be sitting on an altogether different computer and logging in via the vsql of a client-side Vertica installation), try COPY FROM LOCAL:
COPY flex_flights
FROM LOCAL -- < NOTE THE LOCAL KEYWORD
'/na-dev-nas-1/unix_inst/software/files/flight_data.csv'
PARSER fcsvparser();
It will be slower, because your connection to Vertica is the only parsing thread and the data travels to Vertica over your login connection, but it will work.
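Just as an illustration (the host, database, user name, and password here are made up), running the same statement through vsql from a client machine could look like this:
vsql -h vertica-host.example.com -d mydb -U myuser -w mypassword \
     -c "COPY flex_flights FROM LOCAL '/na-dev-nas-1/unix_inst/software/files/flight_data.csv' PARSER fcsvparser();"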
Related
My instructor gave me a username, a password, and a .dbf file and told me to open it and try to retrieve the data with SQL*Plus and an Oracle database.
I tried to open the .dbf file with Excel, MySQL, and MS SQL Server, but it gave me an error.
Speaking as a DBA: As Littlefoot stated, you can't just read a data file from an Oracle DB. At best it is a proprietary binary file format, assuming it isn't encrypted on top of that. Nor can you take a data file from one database instance and just plug it into another database instance. You also can't import it into MySQL or any other database engine: as a stand-alone data file it can only be properly read by its original database installation (i.e. the specific database instance that created it).
Oracle has specific tools available to copy data and/or files from one database to another, but those would generally use the RMAN backup manager (used to make physical backups) or (more likely in your case) the Datapump "Transportable Tablespace" feature.
To restore it from an RMAN backup you would need a complete full backup of the entire source database instance: RMAN backup sets including all data files, redo logs (and perhaps archived logs), control files, parameter files, encryption keys, and possibly more.
To restore a transportable tablespace dump you would need your own running Oracle database instance, the correct parameters to run the impdp import utility, and the assistance/cooperation of the DBA.
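For reference, a transportable-tablespace import typically looks something like the sketch below (the directory object, dump file name, and datafile path are placeholders, and the exact parameters depend on how the export was produced):
impdp system@targetdb DIRECTORY=dp_dir DUMPFILE=tts_metadata.dmp TRANSPORT_DATAFILES='/u01/app/oracle/oradata/TARGET/users01.dbf'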
You need to confirm if the file you were given is such an export dump (though the .dbf file extension would suggest not), and how you are expected to access the data. You won't be able to just "open the file".
The .DBF extension probably indicates a datafile; I don't think you can read it with any tool (at least, I don't know of any).
You should find an Oracle DBA who might try to help; in order to restore a database (which is contained in that file), they might need the control file(s), redo log files and ... I can't name what other files (I'm not a DBA).
Then, if everything goes OK, the database might be started up so that you'd be able to connect to it using the credentials you were given.
I'm using JDBC to transfer data from a delimited file to a Db2 database table. Initially I encountered SQLCODE=-104, SQLSTATE=42601, so on further debugging I found this, which referred me to calling the stored procedure SYSPROC.ADMIN_CMD.
I modified the call and tried running the procedure version, but I'm still getting the same error:
com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-104, SQLSTATE=42601, SQLERRMC=CLIENT;LOAD;FROM, DRIVER=4.26.14
    at com.ibm.db2.jcc.am.b7.a(b7.java:810)
    at com.ibm.db2.jcc.am.b7.a(b7.java:66)
I'm not sure what exactly I am doing wrong.
Code:
CALL SYSPROC.ADMIN_CMD('LOAD CLIENT FROM "<PATH_TO_FILE>" OF DEL MODIFIED BY COLDEL0X09 INSERT INTO <SCHEMA_NAME>.<TABLE_NAME> NONRECOVERABLE')
I ran the LOAD command on the db2 command prompt and it ran without any issues.
Db2 version: 11.5
The load client command is intended for use at a client workstation, not at a Db2-server, so sysproc.admin_cmd will reject the client keyword.
The stored procedure executes at the Db2-server; it does not have access to files that are stored on the client workstation.
Therefore any file you mention in parameters to sysproc.admin_cmd stored procedure must be relative to the Db2-server file-system and must be accessible (readable) to the Db2-instance owner account.
If your data-file is already located on the Db2-server, just reference its fully qualified filename and run the sysproc.admin_cmd procedure with the load command. Carefully check the documentation for the load command to understand all of the implications of using LOAD, especially if the target Db2-server is highly-available. This is an administration matter.
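For example (the path, schema, and table names below are placeholders; the file must live on the Db2-server and be readable by the instance owner):
CALL SYSPROC.ADMIN_CMD('LOAD FROM /db2/data/incoming/mydata.del OF DEL MODIFIED BY COLDEL0X09 INSERT INTO MYSCHEMA.MYTABLE NONRECOVERABLE')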
If your data-file is not already located at a Db2-server, then first copy the file to the Db2-server and retry with load (or slower import).
You can also run the import command via sysproc.admin_cmd when the data file is accessible to the Db2-instance owner on the Db2-server that runs that stored procedure.
If your Db2-version is recent you can also consider the ingest command, refer to the documentation for details.
If your data file is located on a client workstation, and you are unable or unwilling to transfer it to a Db2-server, then you can use your local workstation Db2-client (provided you installed a suitable Db2-client that includes the db2 CLP) to run a script/batch file that performs the relevant commands. You cannot use JDBC for that specific purpose, although you can exec/shell out from Java to run the required commands in one script (db2 connect to ... user ... using ..., then db2 load client ..., or db2 ingest ..., or db2 import ...), as in the sketch below.
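A minimal sketch of such a script (database name, credentials, path, and table are made up):
db2 connect to MYDB user myuser using mypassword
db2 "LOAD CLIENT FROM /home/me/mydata.del OF DEL MODIFIED BY COLDEL0X09 INSERT INTO MYSCHEMA.MYTABLE NONRECOVERABLE"
db2 terminate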
If your target Db2-server is already at version 11.5 or higher then it should support insert from external table, and remote external table, and since INSERT is plain SQL then you can do that via jdbc.
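As a rough sketch of that route (file name, table, and options are invented; you would adjust the USING options, e.g. the delimiter, to match your file, and REMOTESOURCE 'YES' is meant to indicate that the file sits on the client):
INSERT INTO MYSCHEMA.MYTABLE
  SELECT * FROM EXTERNAL '/home/me/mydata.del' LIKE MYSCHEMA.MYTABLE
  USING (DELIMITER ',' REMOTESOURCE 'YES');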
Apart from the above, most DBAs would arrange for direct Db2-server to Db2-server transfers if both are Db2-LUW, they have IP connectivity, and security rules permit it; this avoids the slow and insecure business of taking the data outside of Db2. That is not a programming matter, but rather an administrative one.
My server was attacked by .rapid ransomware and all my data has been encrypted. Luckily for me, the Oracle home folder is not encrypted (yet) and most of the files, including the datafiles folder and tablespaces, are still accessible.
Can anyone please tell me how to recover my database objects?
No backup is available, only the Oracle home folder (most of it).
EDIT :
The system is broken. I am trying to find out which files to collect and copy so that I can recover my database files on another system.
When I try to log into sqlplus through cmd I get the following error:
'sqlplus' is not recognized as an internal or external command ,
operable program or batch file.
EDIT:
Files that I still have access to (not encrypted):
Okay. If you can find an init.ora file on your server, that's the PFILE - initialization parameter file - that's the last thing missing to easily copy your database to a new server. If you can't find it, that's ok - it'll just be a little harder. As long as you have the datafiles, you can eventually get your database back.
Basically, you'll want to follow steps 2-8 in the link I posted. You can also find some helpful info in the Oracle guide to manually creating a database in Windows. I'll walk you through them.
Shut down your old database (if it's still running). This will make sure your datafiles are in a consistent state for copying. If you can't access sqlplus, stopping the Windows service is probably the easiest way to do that.
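For example, from an elevated command prompt on the old server (the Windows service name follows the OracleService<SID> pattern, so substitute your actual SID):
net stop OracleServiceMY_DB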
Copy the data to your new server. I'm assuming it'll be in the same location, D:\app\Administrator\oradata\VTC\
Make a copy of the control file CONTROL01.CTL and name it create_db.sql (EDIT: I was assuming that this was a backup to trace ascii version of the control file, but it sounds like this is the binary file)
Edit create_db.sql. Where it says CREATE CONTROLFILE REUSE DATABASE "MY_DB" NORESETLOGS, change it to CREATE CONTROLFILE SET DATABASE "MY_DB" RESETLOGS. Make note of whatever "MY_DB" is - this is your database name. Most people make it the same as the SID. I normally do RESETLOGS which throws out the old redo logs, but you could try keeping them with NORESETLOGS if that works for you.
Remove or comment out the lines that say RECOVER DATABASE and ALTER DATABASE OPEN;. Make sure the paths for the datafiles and logfiles look correct. Save the file.
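As a very rough sketch of what the edited script might end up looking like (the log file and datafile names here are placeholders; yours come from your own control file contents):
CREATE CONTROLFILE SET DATABASE "MY_DB" RESETLOGS NOARCHIVELOG
    MAXLOGFILES 16
    MAXDATAFILES 100
LOGFILE
    GROUP 1 'D:\app\Administrator\oradata\VTC\REDO01.LOG' SIZE 50M,
    GROUP 2 'D:\app\Administrator\oradata\VTC\REDO02.LOG' SIZE 50M
DATAFILE
    'D:\app\Administrator\oradata\VTC\SYSTEM01.DBF',
    'D:\app\Administrator\oradata\VTC\SYSAUX01.DBF',
    'D:\app\Administrator\oradata\VTC\USERS01.DBF'
CHARACTER SET AL32UTF8;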
If you couldn't find your init.ora to copy, I think this very minimal one will work for you, although you'll want to fix your memory settings later. Create it in the same folder.
DB_NAME=MY_DB
INSTANCE_NAME=MY_DB
SERVICE_NAMES=MY_DB
CONTROL_FILES = ("D:\app\Administrator\oradata\VTC\CONTROL01.CTL")
DB_FILES=100
Create an Oracle Database Windows Service. Afterwards check Services to make sure it's running.
oradim -NEW -SID MY_DB -STARTMODE manual -PFILE "D:\app\Administrator\oradata\VTC\init.ora"
Log in to your new Oracle instance as SYSDBA. There's no database yet.
cd D:\app\Administrator\oradata\VTC\
set ORACLE_SID=MY_DB
sqlplus / as sysdba
Create the database, using the control file from the old server as a script.
@create_db.sql
If everything comes back OK, run:
alter database open;
I have a specific need for data pump and I am having a hard time searching for a solution.
Currently, I have an exp/imp program that exports tables (selectively, based on queries) from one database and imports that same data into another database. This program and the dump files reside on a common server that can access both the source and destination databases. This is a totally automated process. It works well, albeit slowly.
Due to various reasons, I must migrate this program to use data pump. The biggest change now is the location of the dmp files. I also have very limited access to the database servers themselves, but I can run data pump.
The process will be run from the same common server, but the exported files will now reside on the database server for the source database. No issue there. I can create dmp files using expdp.
My issue is how to get that same data into the destination database. When I run impdp, it is expecting a data_pump_dir in the destination area (not source area). Again, this is automated, and I don't have the luxury of being able to transfer dmp files using scp or ftp or anything like that.
What can I use to overcome this problem using datapump?
No reason you cannot configure an external directory on BOTH databases:
CREATE DIRECTORY mydumpdir AS '/whatever/the/path/is';
Then impdp and expdp can take DIRECTORY=mydumpdir as an argument.
Make sure you grant the Oracle schemas/users read/write on the directory, AND the oracle OS account must have OS-level rights to read/write that location as well. The expdp server should also have write access, as it might be trying to write its log/report files to the location, or you might be using it to do file cleanup.
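For completeness, a hedged sketch of the grant and of passing the directory to the utilities (the user, connect strings, file name, and table are placeholders):
GRANT READ, WRITE ON DIRECTORY mydumpdir TO myuser;
expdp myuser@sourcedb DIRECTORY=mydumpdir DUMPFILE=mydata.dmp TABLES=myschema.mytable
impdp myuser@destdb DIRECTORY=mydumpdir DUMPFILE=mydata.dmp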
We are having a problem while trying to export an Oracle DB. OS: CentOS ~5.2. DB: Oracle 10g.
The exp command exports DB files only in this location:
/home/oracle/OraHome_1/oradata/master/xxx.dbf
but the tool can't export files in a different location (we found out about these files after getting a trace), like these:
'/disk1/dblog06.dbf',
'/home/disk2/system01.dbf',
Please advise me how to get a dump file, or how to back it up.
Thanks.
You appear to have misunderstood what exp does, and particularly what the file parameter is for. The file is the output dump file, normally given a .dmp extension. Export takes data out of the database instance, it does not work under the hood on the datafiles - you have to tell it which data you want (full, user, tables, or tablespaces) and where to put it, not where it comes from.
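For example, a normal schema-level export looks something like this (the user, password, and paths are placeholders):
exp scott/tiger owner=scott file=/home/oracle/scott.dmp log=/home/oracle/scott.log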
If you really did try to exp file=/home/disk2/system01.dbf then what you actually asked it to do was trash your database; you're lucky that it did not overwrite the datafile and cause a catastrophic failure. Oracle seems to have saved you from yourself there, though possibly only thanks to having exclusive locks on the files at the time.
You need to read up on how it works and see if it actually does what you want - as APC notes, it's not a backup tool. Look at the Oracle documentation for your version, or somewhere like http://www.orafaq.com/wiki/Import_Export_FAQ, and also consider using Data Pump instead of exp.
I am not sure if that is the question, but the exp command will export database objects according to their logical schema (user name, table name). It does not matter which physical database file the data is coming from.
exp works through an Oracle instance, which needs to have mounted the datafiles.
Are these other files part of the Oracle database? Maybe another database? You need to find out which Oracle server uses them, and then run exp against that instance.
EXPORT is not a backup tool. It is meant for transferring data from one database to another, or perhaps from one schema to another.
If you want to recover your data in the event of a database crash or corruption then you need to use the appropriate tool. There are OS solutions to this, but Oracle comes with a sophisticated backup and recovery tool: RMAN. Find out more.
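As a minimal illustration only (a real backup strategy depends on your configuration and retention requirements), a basic RMAN full backup can be as simple as:
rman target /
BACKUP DATABASE PLUS ARCHIVELOG;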