I have a MariaDB database which uses dynamic columns.
There are around 10 such columns, because the data comes from many different devices and each device has different attributes. The devices send binary data which is converted into CSV and then inserted. I don't have control over this at all.
Now I am planning to migrate to Oracle Database 12.2, but I'm not sure how to migrate the dynamic columns to Oracle. Any ideas, please?
Oracle RDBMS doesn't support this feature natively, so you will have to write some procedures to implement something analogous to the MariaDB calls.
The closest functionality to dynamic columns is JSON. You're moving to Oracle 12.2, which has pretty good JSON support. Unless your data is very complicated with lots of nesting, it should be trivial to turn the CSV into JSON. Once you have JSON, it is easy to insert, maintain and retrieve the data using Oracle's functionality.
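As a rough sketch (table and column names here are made up for illustration), the device attributes could land in a single JSON column guarded by an IS JSON check constraint, and be queried with JSON_VALUE:

```sql
-- Hypothetical table: one JSON column holds the per-device attributes.
-- The IS JSON check constraint is what enables Oracle 12c's JSON features.
CREATE TABLE device_readings (
  id         NUMBER GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
  device_id  VARCHAR2(50),
  attributes VARCHAR2(4000) CONSTRAINT chk_attr_json CHECK (attributes IS JSON)
);

INSERT INTO device_readings (device_id, attributes)
VALUES ('sensor-17', '{"temperature": 21.5, "humidity": 40}');

-- Pull individual attributes back out with JSON_VALUE:
SELECT JSON_VALUE(attributes, '$.temperature') AS temperature
FROM   device_readings
WHERE  device_id = 'sensor-17';
```

Each device type can then store whatever attributes it has, without schema changes, which is essentially what MariaDB's dynamic columns gave you.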
Related
In my current project I have a challenge regarding data redaction. We do not get data from the client because they have sensitive information in some of the columns. So, to get the data from them, we decided to encrypt the data before it comes to us.
I am struggling to find a built-in algorithm in Oracle which encrypts the data. My main objectives are:
1.) The length of my original input should remain the same after data redaction.
2.) The data type of my original input should remain the same after data redaction.
Can you please give me your input on how to achieve this?
Thanks and Regards
Ankit.
There are various options to encrypt your data. You can build your own PL/SQL procedures using the built-in DBMS_CRYPTO package:
https://docs.oracle.com/cd/E11882_01/appdev.112/e40758/d_crypto.htm#ARPLS65670
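A minimal DBMS_CRYPTO sketch might look like this (function name and key handling are illustrative; note the caveat in the comments about your length/type requirements):

```sql
-- Sketch only: AES-128 / CBC / PKCS5 encryption of a VARCHAR2 via DBMS_CRYPTO.
-- Caveat: the output is RAW and longer than the input, so this alone does NOT
-- satisfy the "same length / same data type" objectives from the question --
-- that would require format-preserving encryption, which DBMS_CRYPTO does
-- not provide out of the box.
CREATE OR REPLACE FUNCTION encrypt_val (p_plain IN VARCHAR2,
                                        p_key   IN RAW)
  RETURN RAW
IS
BEGIN
  RETURN DBMS_CRYPTO.ENCRYPT(
           src => UTL_I18N.STRING_TO_RAW(p_plain, 'AL32UTF8'),
           typ => DBMS_CRYPTO.ENCRYPT_AES128
                + DBMS_CRYPTO.CHAIN_CBC
                + DBMS_CRYPTO.PAD_PKCS5,
           key => p_key);
END;
/
```

Executing DBMS_CRYPTO requires an explicit grant from a DBA (`GRANT EXECUTE ON DBMS_CRYPTO TO your_user`).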
Or, if you use the Enterprise Edition of Oracle, Transparent Data Encryption:
https://docs.oracle.com/cd/E11882_01/network.112/e40393/asotrans.htm#ASOAG600
And if you have 12c, you can use Oracle Data Redaction:
http://www.oracle.com/technetwork/articles/database/data-redaction-odb12c-2331480.html
The advantage of the last one is that the stored data is not touched inside the database: Oracle Data Redaction acts as an upper layer that masks the data on the fly before it is returned to the application.
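For a flavour of what a redaction policy looks like, here is a sketch using DBMS_REDACT (schema, table, and column names are made up; this particular format string masks all but the last four digits of a 16-digit card number):

```sql
-- Illustrative partial-redaction policy: every query against APP.CUSTOMERS
-- sees CARD_NO with positions 1-12 replaced by '*', e.g. ****-****-****-1234.
BEGIN
  DBMS_REDACT.ADD_POLICY(
    object_schema       => 'APP',
    object_name         => 'CUSTOMERS',
    column_name         => 'CARD_NO',
    policy_name         => 'redact_card_no',
    function_type       => DBMS_REDACT.PARTIAL,
    function_parameters => 'VVVVFVVVVFVVVVFVVVV,VVVV-VVVV-VVVV-VVVV,*,1,12',
    expression          => '1=1');  -- apply to all users/sessions
END;
/
```

Note that partial redaction like this does preserve the length and data type of the column, which matches objectives 1 and 2 in the question — but only at query time, not in the stored data.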
I am trying to copy an Oracle table into Postgres. One of the columns in the Oracle table is of the SDO_GEOMETRY type. I want to know what the equivalent data type is in PostgreSQL, how I can copy that data into my Postgres table, and whether it requires any transformation. I am not looking for a migration tool, as it is just one table and I am manually copying the data, which is very small in size.
Thanks!
PostgreSQL has a number of geometrical data types built in, see the documentation. These offer limited functionality, but if these types and the functions and operators available for them do the trick for you, it would be the simplest solution.
If you need more advanced geometry support, you'll have to install the PostGIS extension.
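For a small manual copy, well-known text (WKT) is a convenient interchange format between the two databases. A sketch with PostGIS on the receiving side (table and column names, and the SRID, are assumptions — match the SRID to whatever your Oracle data uses):

```sql
-- PostgreSQL side: install PostGIS and create a geometry column.
CREATE EXTENSION IF NOT EXISTS postgis;

CREATE TABLE places (
  id   integer PRIMARY KEY,
  geom geometry(Point, 4326)  -- SRID 4326 assumed; use your Oracle SRID
);

-- On the Oracle side, SDO_UTIL.TO_WKTGEOMETRY(geom_col) exports each
-- SDO_GEOMETRY value as WKT; paste the result into ST_GeomFromText here:
INSERT INTO places (id, geom)
VALUES (1, ST_GeomFromText('POINT(2.3522 48.8566)', 4326));
```

With only a handful of rows, selecting the WKT out of Oracle and pasting it into INSERT statements like the above is usually the path of least resistance.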
I'm learning how to implement change data capture (CDC) in Oracle. However, not being a DB specialist but rather a dev, I find the process tedious compared to the other things I have to do. I end up doing it because my DBAs/DevOps don't want to take care of it.
Hence I was wondering if there is any tool that can help set up Oracle change data capture. In other words, a simple graphical user interface that would write all the code for me: creation of change tables, PL/SQL scripts, etc.
Many thanks
topic duplicated in: dba.stackexchange
What problem are you trying to solve?
How (and when) will the CDC data be consumed?
Are you planning to use something akin to: Oracle 11.1 CDC doc
Be sure to heed: Oracle 11.2 CDC Warning
"Oracle Change Data Capture will be de-supported in a future release of Oracle Database and will be replaced with Oracle GoldenGate. Therefore, Oracle strongly recommends that you use Oracle GoldenGate for new applications."
The company I work for, Attunity, has a pretty slick GUI CDC tool called "Replicate".
It can directly apply changes to a selected target DB, or store changes to be applied later.
Many sources (Oracle, SQL Server, DB2, ...), many targets (Oracle, SQL Server, Netezza, Vertica, ...).
Define your source and target DB, Search/Select source table, and one click to go.
Optional transformations such as: renaming tables and columns, dropping and adding columns, calculating values.
Regards,
Hein.
I have been using Teradata for a while. Why would Oracle "migrate" into Teradata?
What are the advantages of Oracle UDFs
Supposing that you are not creating functions and stored procedures yourself, what is so cool about Oracle's UDFs? It appears that almost all of these functions could be replaced with longer code in Teradata. Is there something that is not supported natively by Teradata (except for regular expressions) that makes Oracle UDFs so necessary?
Those UDFs mainly add functions which didn't exist in Teradata before TD14, and because Oracle is the most commonly used DBMS, and the most common migration path to Teradata is from Oracle, they're called Oracle UDFs.
In fact, functions like REPLACE and TRANSLATE are quite basic string functions in other DBMSes, too, and they ease the migration. And you definitely don't want to rewrite an oREPLACE or oTRANSLATE in plain old standard SQL using SUBSTRING and POSITION :-)
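To illustrate, on a pre-TD14 Teradata system a call to one of these UDFs reads exactly like the Oracle function it mirrors (the literal below is just an example value):

```sql
-- Teradata, with the Oracle UDFs installed: oREPLACE behaves like
-- Oracle's REPLACE. From TD14 onwards OREPLACE is built in.
SELECT oREPLACE('2017-01-31', '-', '/');
-- yields '2017/01/31'
```

Reimplementing that with SUBSTRING/POSITION would mean a loop or a chain of nested calls, one per occurrence of the search string, which is exactly the pain these UDFs remove.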
From here:
Ability to access industry-standard CD-ROM and DVD-ROM media when they contain a UDF file system.
Flexibility in exchanging information across platforms and operating systems.
A mechanism for implementing new applications rich in broadcast-quality video and high-quality sound, along with rich interactivity, using the DVD video specification based on the UDF format.
Also, from Functions Not Supported by Teradata:
Oracle SQL functions with no equivalent function in Teradata are not supported in DELETE, INSERT, or UPDATE statements, but are evaluated by the Oracle database server if the statement is a SELECT statement. That is, the Oracle database server performs post-processing of SELECT statements sent to the gateway.
If an unsupported function is used in a DELETE, INSERT, or UPDATE statement, the following Oracle error occurs:
ORA-02070: database db_link_name does not support function in this context
I am new to Oracle. Since we have rewritten an earlier application, we have to migrate the data from the earlier database in Oracle 9i to a new database, also in 9i, with totally different structures. The column names and types are totally different. We need to map the tables and columns, try to export as much data as possible, eliminate duplicates, and fill empty values with defaults.
Are there any tools which can help in mapping the elements of the two databases, with rules to handle duplicates and default values, and migrate the data?
Thanks,
Chak.
If your goal is to migrate data between two very different schemas you will probably need an ETL solution (ETL=Extract Transform Load).
An ETL will allow you to:
Select data from your source database(s) [Extract]
apply business logic to the selected data [Transform] (deal with duplicates, default values, map source tables/columns to destination tables/columns, ...)
insert the data into the new database [Load]
Most ETLs also allow some kind of automation and reporting of the loads (bad/discarded rows, ...).
Oracle's ETL is called Oracle Warehouse Builder (OWB). It is included in the database licence and you can download it from the Oracle website. Like most Oracle products, it is powerful, but the learning curve is a bit steep.
You may want to look into the [ETL] section here in SO, among others:
What ETL tool do you use?
ETL tools… what do they do exactly? In layman's terms, please.
In many cases, creating a database link and some scripts à la
insert into newtable select distinct foo, bar, 'defaultvalue' from oldtable@olddatabase where xxx
should do the trick.
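Slightly expanded, the same idea handles the duplicate and default-value requirements from the question directly in SQL (the link name and all table/column names below are made up):

```sql
-- Sketch: copy over a database link named olddb, deduplicating with
-- DISTINCT and filling empty values with NVL defaults.
INSERT INTO new_customers (cust_name, region, status)
SELECT DISTINCT
       cust_name,
       NVL(region, 'UNKNOWN'),   -- default for missing region
       NVL(status, 'ACTIVE')     -- default for missing status
FROM   old_customers@olddb
WHERE  cust_name IS NOT NULL;    -- skip rows with no usable key
```

For one-off migrations between a manageable number of tables, a handful of statements like this is often simpler than standing up a full ETL tool.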