How to copy data encrypted by dbms_obfuscation_toolkit.DESEncrypt - oracle

I have an Oracle (10.2.0.4) database table with a column that is encrypted with the dbms_obfuscation_toolkit.DESEncrypt routine.
Some of our data has been corrupted by being re-encrypted with a different key.
I want to do some testing on this data to try to recover it, so I want to copy the data from our live system into a test system.
I've tried simply exporting the data from SQL Developer (in various text-based formats), but the "binary" nature of the encrypted data seems to break the file format.
I tried exp, but this reported errors (although I'm not sure whether this is to do with the encrypted data or not).
How can I copy just this one table's data from one database to another?
Thanks.
The errors I got when exporting the table are below. I was doing this from my local machine connecting to a remote database:
c:\>exp <user>/<password>@<sid> FILE=export.dmp TABLES=(TABLE1)
Export: Release 11.1.0.6.0 - Production on Thu Oct 14 20:46:51 2010
Copyright (c) 1982, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Export done in WE8ISO8859P1 character set and AL16UTF16 NCHAR character set
server uses WE8ISO8859P15 character set (possible charset conversion)
About to export specified tables via Conventional Path ...
. . exporting table TABLE1
EXP-00008: ORACLE error 904 encountered
ORA-00904: "MAXSIZE": invalid identifier

I would try a database link. If you can't create a database link, you could try the COPY command of SQL*Plus, although I'm not sure whether it works with encrypted columns (the command appears to be deprecated in the newest releases).
If that fails, the best tool for exporting/importing data from Oracle to Oracle would probably be Data Pump (included in the DB). Incidentally, the EXP-00008 / ORA-00904 "MAXSIZE" error in your log is a typical symptom of running a newer exp client (11.1 here) against an older 10.2 database; using the 10.2 exp binary usually avoids it.
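For example, a database-link approach could look roughly like this (a sketch only; the link name, TNS alias and credentials are placeholders, and TABLE1 is the table from the question):
-- On the test database: create a link that points at the live system
-- (link name, TNS alias and credentials are placeholders).
CREATE DATABASE LINK live_link
  CONNECT TO live_user IDENTIFIED BY live_password
  USING 'LIVE_TNS_ALIAS';
-- Pull the table across in one statement.
CREATE TABLE table1_copy AS
  SELECT * FROM table1@live_link;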

It turned out that my Windows test database had a slightly different character set from our live (Unix) system - WE8ISO8859P1 vs. WE8ISO8859P15. I did a character set conversion on my test database, using the instructions here, and was then able to import the data.
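For anyone hitting the same thing, the character sets can be compared up front with a standard data-dictionary query (nothing specific to this setup):
-- Run on both databases and compare the results.
SELECT parameter, value
  FROM nls_database_parameters
 WHERE parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET');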

Related

invalid argument value and bad dump file specification when I try to impdp in Oracle 11g

I'm trying to get the SQL DDL statements from a Data Pump dmp file. However, I'm getting this when I run impdp:
Connected to: Oracle Database 11g Express Edition Release 11.2.0.2.0 - 64bit Production
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31619: invalid dump file "D:\Oracle\dmp\mydmp.dmp"
I'm trying to run this command:
impdp myuser/password directory=mydir dumpfile=mydmp.dmp sqlfile=myddl.sql logfile=mylog.log;
My user has pretty much all privileges and read/write access to the directory.
I'm doing this to retrieve the tablespaces from this dmp file.
My end goal is to fully restore this database, which was dumped from another machine. I was told that before importing this DB onto a new machine I need to get the tablespaces and re-create the same ones on the new machine. The old machine also had a different user.
Sorry if all this sounds basic. It's my first time playing with an Oracle DB.
Any advice or guidance would be greatly appreciated.
Thank You
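For reference, ORA-31619 usually means the file at that path is not a valid Data Pump dump (for instance, it was produced by the original exp utility, or it is truncated), or the server process cannot read it. A quick way to check where the directory object really points is sketched below (mydir, myuser and the path are taken from the question; this assumes DBA privileges):
-- Where does the directory object point on the database server?
SELECT directory_name, directory_path
  FROM dba_directories
 WHERE directory_name = 'MYDIR';
-- If needed, re-point it and grant access.
CREATE OR REPLACE DIRECTORY mydir AS 'D:\Oracle\dmp';
GRANT READ, WRITE ON DIRECTORY mydir TO myuser;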

Disable foreign key in dbunit test

I am using dbunit 2.5.4 with junit 4, Java 8, and an Oracle DB (11 something). I successfully downloaded my test DB to a flat file (xml) following online tutorials. I now want to do a CLEAN_INSERT but I get a CyclicTablesDependencyException. The solution appears to be to turn off the foreign key checks, but I am not sure how to do this. How can I disable foreign key checks in my dbunit test when I am doing a CLEAN_INSERT?
I don't know what all that (up to "Oracle") is (yes, I know, Google is my friend, but only if I use it).
However, if you want to move that "test DB" (is it really a database? Or is it a schema? I presume the latter, but - even if it is the former, no problem), I'd suggest you use
Data pump Export (and Import), or
original EXP (and IMP) utilities
The first one is more powerful, but EXP & IMP are somewhat simpler to use (they don't require access to the DB server, you don't have to create a directory (an Oracle object), and the DMP file resides on your computer).
What is the export/import benefit? In your case, Oracle will take care of the constraints. Besides, you'd export all objects by default (tables, views, procedures, triggers, packages, sequences, ... - everything) in a simple manner. I'd suggest you look into it.
Documentation is, as usual, on OTN (pick your 11 something version, although - for such simple requirements - any version will do).
Here's a short demonstration: I'm exporting MIKE's objects (not that many of them) and importing them into SCOTT's schema.
M:\>exp mike/lion@orcl file=mike.dmp
Export: Release 11.2.0.2.0 - Production on Uto Tra 10 07:11:42 2018
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
With the Partitioning, Real Application Clusters, Automatic Storage Management, OLAP, Data Mining and Real Application Testing options
Export done in EE8MSWIN1250 character set and AL16UTF16 NCHAR character set
. exporting pre-schema procedural objects and actions
. exporting foreign function library names for user MIKE
. exporting PUBLIC type synonyms
. exporting private type synonyms
. exporting object type definitions for user MIKE
About to export MIKE's objects ...
. exporting database links
. exporting sequence numbers
. exporting cluster definitions
. about to export MIKE's tables via Conventional Path ...
. . exporting table DEPT 4 rows exported
. exporting synonyms
. exporting views
. exporting stored procedures
. exporting operators
. exporting referential integrity constraints
. exporting triggers
. exporting indextypes
. exporting bitmap, functional and extensible indexes
. exporting posttables actions
. exporting materialized views
. exporting snapshot logs
. exporting job queues
. exporting refresh groups and children
. exporting dimensions
. exporting post-schema procedural objects and actions
. exporting statistics
Export terminated successfully without warnings.
M:\>imp scott/tiger@orcl file=mike.dmp full=y
Import: Release 11.2.0.2.0 - Production on Uto Tra 10 07:13:51 2018
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
With the Partitioning, Real Application Clusters, Automatic Storage Management, OLAP, Data Mining and Real Application Testing options
Export file created by EXPORT:V11.02.00 via conventional path
Warning: the objects were exported by MIKE, not by you
import done in EE8MSWIN1250 character set and AL16UTF16 NCHAR character set
. importing MIKE's objects into SCOTT
. . importing table "DEPT" 4 rows imported
Import terminated successfully without warnings.
M:\>
Try disabling useSequenceFiltering: useSequenceFiltering = false
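If it really is the foreign keys that are in the way, one Oracle-side workaround (independent of dbunit itself) is to disable the referential constraints around the CLEAN_INSERT and re-enable them afterwards. A minimal sketch, assuming the test user owns the tables:
-- Disable every foreign key (referential) constraint owned by the current user.
BEGIN
  FOR c IN (SELECT table_name, constraint_name
              FROM user_constraints
             WHERE constraint_type = 'R') LOOP
    EXECUTE IMMEDIATE 'ALTER TABLE ' || c.table_name ||
                      ' DISABLE CONSTRAINT ' || c.constraint_name;
  END LOOP;
END;
/
-- ... run the CLEAN_INSERT ...
-- then repeat the loop with ENABLE instead of DISABLE to restore the constraints.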

Oracle 12C: ORA-00406: COMPATIBLE parameter needs to be 12.2.0.0.0

We have 2 current Oracle 10G (10.2.0.1) production databases, and we are planning to migrate to a new database server with Oracle 12C. Since the data for each database is only around 5GB, the best way seemed to be to create a new instance and use Data Pump to transfer the data. To achieve this, I created a database link from the Oracle 12C database to the 10G one, and used that to expdp the data from the 12C database. However, when I import the exported data, some tables fail with an error like this:
ORA-39083: Object type TABLE:"USER"."WH_SEARCH_ACT" failed to create with error:
ORA-00406: COMPATIBLE parameter needs to be 12.2.0.0.0 or greater
ORA-00722: Feature "Partition Read Only"
Is there any solution to this other than adding the COMPATIBLE parameter in the production database? This is production, so I can't really just update/modify the current production database. Are there any other solutions, because I'd rather not just create the tables by hand before importing the Data Pump file.
Use the version parameter in the data pump utility. For example:
expdp hr/hr TABLES=hr.employees VERSION=10.2
DIRECTORY=data_pump_dir DUMPFILE=emp10g.dmp LOGFILE=emp.log
Just make sure the version of the export utility is 10g and the import utility is 12c. Hope this helps.
In the production database, you can use the exp program:
exp username/password buffer=64000 file=/path/to/path.dmp full=y
Copy the exp dump file to the Oracle 12c server and use:
imp username/password buffer=64000 file=/path/to/path.dmp full=y

Import dmp file created in Oracle 11g (WE8ISO8859P1) to Oracle 11g XE database (AL32UTF8)

I am regularly given a dmp file that gets created using oracle v.11g (using the exp utility).
I used to import this file to the Western European edition of Oracle 10g XE.
The import would terminate successfully without warnings, but there was an error log (alert_xe.log) that would constantly increase in size because I was using the 32-bit Oracle Database on a 64-bit Windows OS.
I have now installed 11g XE and I am trying to import the same dmp file but I am seeing the following in the import log file:
import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
import server uses AL32UTF8 character set (possible charset conversion)
export client uses WE8ISO8859P1 character set (possible charset conversion)
and the import terminates with warnings as I have a lot of the following errors:
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column XXX (actual: 256, maximum: 255)
I understand that the cause of the problem is that the source database is using byte semantics and my new 11g XE database is using multibyte character set.
I have no control over the source database so I cannot change anything there.
Moreover, I can't pre-create the tables with column definitions using character-length semantics instead of byte semantics (as indicated, for example, in Character set in Oracle 11g r2 XE), as sometimes the source database schema gets changed (columns might get added) and I am not notified, and that breaks the import.
Is there a solution to this problem?
Is there any way to use WE8MSWIN1252 with Oracle 11g XE?
No, unfortunately you can't do it.
As stated in the official Oracle documentation:
9 Oracle Database XE Character and Language Configurations
Oracle Database XE is available only in Universal character set and language configurations:
The database is created using Unicode(AL32UTF8) character set, which is suitable for global data in any language.
http://docs.oracle.com/cd/E17781_01/install.112/e18803/toc.htm#XEINW138
The only workaround is to pre-create the tables in advance.
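For what it's worth, pre-creating with character-length semantics could look like the sketch below (the table and column names are made up, following the ORA-12899 message above); imp can then be run with IGNORE=Y so the rows are loaded into the pre-created table:
-- Make new columns default to character-length semantics for this session,
-- so VARCHAR2(255) means 255 characters rather than 255 bytes.
ALTER SESSION SET NLS_LENGTH_SEMANTICS = CHAR;
-- Or spell it out per column when pre-creating the table.
CREATE TABLE my_table (
  xxx VARCHAR2(255 CHAR)
);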
I had the same issue.
When I ran this command
imp <username>/<password>@<hostname> file=<filename>.dmp log=<filename>.log
it showed:
import done in AL32UTF8 character set and AL16UTF16 NCHAR character set
export client uses AR8MSWIN1256 character set (possible charset conversion)
IMP-00031: Must specify FULL=Y or provide FROMUSER/TOUSER or TABLES arguments
IMP-00000: Import terminated unsuccessfully
That means my Oracle server uses the AL32UTF8 character set and the export dump file was created with the AR8MSWIN1256 client character set.
So I just changed the Oracle character set to AR8MSWIN1256 by using the following commands.
SHUTDOWN IMMEDIATE;
STARTUP MOUNT;
ALTER SYSTEM ENABLE RESTRICTED SESSION;
ALTER SYSTEM SET JOB_QUEUE_PROCESSES=0;
ALTER SYSTEM SET AQ_TM_PROCESSES=0;
ALTER DATABASE OPEN;
ALTER DATABASE CHARACTER SET INTERNAL_USE AR8ISO8859P6;
SHUTDOWN IMMEDIATE;
STARTUP;
Then run again
imp <username>/<password>@<hostname> file=<filename>.dmp log=<filename>.log FULL=Y
I hope this answer will help someone.
I was facing the same problem transferring data from Oracle 11g to Oracle XE. The steps I used at that time are shown below:
CONN SYS/(YOUR sys user PASSWORD) AS SYSDBA
SHUTDOWN IMMEDIATE;
STARTUP RESTRICT;
ALTER DATABASE CHARACTER SET INTERNAL_USE WE8MSWIN1252;
SHUTDOWN IMMEDIATE;
STARTUP;
Once Oracle has started, re-create the user and then try to import your .dmp file.

SSIS (VS2008) with Oracle OLE DB Source

I have a strange problem where, in the DEV environment, everything is fine, but in PROD, SSIS reports an error about conversion from Unicode to non-Unicode. This error happens at the OLE DB Source task level, so I don't even get the chance to use Derived Columns to perform the conversion.
We installed BIDS on the production server and the task is flagged as in error. When I try to open the properties, it tells me that the metadata is different from what is in the DTSX file. When I accept the offer to automatically correct, all faulty columns (external and output columns) have their type switched from DT_WSTR to DT_STR.
The descriptions of the tables involved are identical in DEV and PROD (same types for the columns). If I query for the character sets, they are identical in both environments.
For your information, here is the query:
SELECT *
FROM v$nls_parameters
where parameter
like '%CHARACTERSET'
which returns:
NLS_CHARACTERSET WE8MSWIN1252
NLS_NCHAR_CHARACTERSET AL16UTF16
on both environments.
Any ideas on how to solve this?
Thank you,
Michel
You are likely running into a difference in the drivers installed/configured on the different machines.
I ran into a similar issue with MySQL drivers a long while back. The package was developed against version X.Y.Z.13; the server got driver version X.Y.Z.14 and, boom, invalid metadata.
You will want to examine the dev and prod versions of the metadata and determine which one is right for you. In my case, the dev driver produced varchar (non-Unicode) strings, which aligned with the target system, whereas the newer driver installed in prod decided they should have been nvarchar (Unicode) strings. Reworking the nvarchar to varchar, or changing the tables, was outside the allowable timeframe for the project and the insane rules the data management team used for table creation.
