ODI (Oracle Data Integrator) = ELT tool
OCDM (Oracle Communications Data Model) = data warehouse.
Is my understanding of the above correct? More information/explanation is welcome.
Now my question is:
Is it possible to load into OCDM's pre-existing tables via ODI when the sources for ODI are flat files or XML? If so, how?
Any links related to the above are also welcome.
It is indeed possible. OCDM is a solution using an Oracle 11g database to store the data, so ODI can definitely load it.
Actually OCDM comes out-of-the-box with adapters to load the data from NCC (Oracle Communications Network Charging and Control) and BRM (Oracle Communications Billing and Revenue Management), and these adapters are using ODI 11g – and optionally Golden Gate.
Each of these adapters is composed of some models and one ODI project holding interfaces and packages.
If you want to build your own integration process, it is just a standard load from flat file to Oracle or from XML to Oracle. Both of these are covered by the tutorials in the ODI 11g series in the Oracle Learning Library: https://apexapps.oracle.com/pls/apex/f?p=44785:24:0::NO:24:P24_CONTENT_ID,P24_PREV_PAGE:5185,29
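To give a feel for what such a load boils down to, here is a minimal sketch, not ODI itself (which would do this through interfaces and knowledge modules), that bulk-inserts a flat file into an Oracle table with python-oracledb. The staging table DWB_CUST_STG, its columns, the connection details and customers.csv are all hypothetical placeholders.

```python
# Minimal sketch of the kind of flat-file load an ODI interface would generate.
# Assumptions: python-oracledb is installed, DWB_CUST_STG is a hypothetical
# OCDM staging table, and customers.csv has a header row matching COLUMNS.
import csv
import oracledb

TARGET_TABLE = "DWB_CUST_STG"                   # hypothetical staging table
COLUMNS = ["CUST_ID", "CUST_NAME", "SGMT_CD"]   # hypothetical columns

conn = oracledb.connect(user="ocdm_stg", password="***", dsn="dbhost/ocdmpdb")
insert_sql = (
    f"INSERT INTO {TARGET_TABLE} ({', '.join(COLUMNS)}) "
    f"VALUES ({', '.join(':' + str(i + 1) for i in range(len(COLUMNS)))})"
)

with conn, conn.cursor() as cur, open("customers.csv", newline="") as f:
    reader = csv.DictReader(f)
    batch = [tuple(row[c] for c in COLUMNS) for row in reader]
    cur.executemany(insert_sql, batch)   # array insert, one round trip per batch
    conn.commit()
```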
It's possible, using the OCDM add-ons to load the data:
NCC (Oracle Communications Network Charging and Control),
BRM (Oracle Communications Billing and Revenue Management).
These adapters use ODI 11g or ODI 12c, and optionally GoldenGate as well.
ODI is mostly used to load historical data. For near-real-time data they use Oracle GoldenGate. However, OGG only loads the data into staging; for the sync from staging to the presentation layer they still use ODI.
Yes, it's possible. Standard ODI interface-writing techniques are all you need to implement this.
I have 3 different databases (Oracle, SAP, DB2) and would like to implement data masking on the Oracle DB. Since the data flows to SAP and DB2, how can I solve this issue? Data in Oracle is compared with DB2 and SAP, so if, for example, I mask the first name in Oracle, the same value will not be masked in SAP and DB2. Is there a way to unmask the data and send it to the downstream systems?
Generally the task can be solved with vendor tools like IBM Optim Data Privacy. Such tools provide consistent masking: the same input produces the same masked output, provided equivalent algorithms and parameters are used.
By SAP you probably mean SAP HANA. That can be a bit tricky due to missing SQL compatibility and a lack of integration, but it is doable with the very same tools, just with a bit more work to implement.
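To make "consistent masking" concrete: the same input always yields the same masked token, so independently masked copies in Oracle, DB2 and SAP still match on joins and comparisons. The sketch below illustrates the idea with a keyed HMAC in Python; it is not any vendor's implementation, and the key and field name are made up. A hash like this is one-way; if downstream systems genuinely need the original values back, you would need a reversible technique such as format-preserving encryption or a secured lookup table, which is what the vendor tools typically offer.

```python
# Illustration of deterministic (consistent) masking: the same input and key
# always produce the same masked value, so Oracle, DB2 and SAP copies still
# line up after masking. The secret key and field name here are made up.
import hashlib
import hmac

MASKING_KEY = b"shared-secret-kept-out-of-source-control"

def mask_first_name(value: str) -> str:
    """Return a stable pseudonym for a first name."""
    digest = hmac.new(MASKING_KEY, value.upper().encode("utf-8"), hashlib.sha256)
    return "FN_" + digest.hexdigest()[:12].upper()

# Masking the same value in any system yields the same token.
print(mask_first_name("Alice"))
print(mask_first_name("alice"))  # same token, because the input is normalised first
```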
I need to start a long-term project in mapping out data tables so that we can get a high-level view of what information we store in our Oracle database and how the tables are linked to each other. This is largely for GDPR preparation.
Since our organization has been around for a number of decades, its database is massive. With TOAD for Oracle I'm able to see all the columns in our tables easily, so I started looking at different database mapping tools (ER/ONE, DDM, Astah), but it looks like they all require me to manually create the tables and columns and draw their relationships by hand.
I'm hoping to minimize as much manual labor as possible and am wondering if using TOAD Data Modeler would help, since I'm using TOAD for Oracle anyway. Could I somehow automate the table, column, and relationship creation process?
Our organization only has Oracle's base version, unfortunately (I think the premium bundle maybe has a data mapper included in it... not sure). Any thoughts on the options I have?
Bundle: Toad for Oracle Base (64-bit), Add-Ons: <-none->
Our organization only has Oracle's base version
Note: TOAD is not an Oracle product; it is owned and developed by Quest.
they all look like I need to manually create all the tables and columns and draw their relationships out by hand
Any decent data modelling tool supports reverse-engineering a physical data model from an existing schema. How good the derived model is will depend on how good your schema is (my bet: decades of development without a data modelling tool? not good). For instance, if your schema has foreign keys, the reverse-engineering process will use them to draw the relationships between tables (even if they are disabled). But if there are no foreign keys then you're on your own.
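If you want to check up front how much relationship information a reverse-engineering tool would actually find, you can ask the data dictionary yourself. A minimal sketch with python-oracledb, where the connection details and schema name are placeholders:

```python
# Sketch: list foreign-key relationships straight from the Oracle data dictionary.
# This is roughly the information a modelling tool reverse-engineers; if it
# returns nothing, the tool will have no relationships to draw.
import oracledb

SQL = """
SELECT fk.table_name     AS child_table,
       pk.table_name     AS parent_table,
       fk.constraint_name,
       fk.status
FROM   all_constraints fk
JOIN   all_constraints pk
       ON  pk.owner = fk.r_owner
       AND pk.constraint_name = fk.r_constraint_name
WHERE  fk.constraint_type = 'R'    -- 'R' means referential, i.e. a foreign key
AND    fk.owner = :owner
ORDER  BY fk.table_name
"""

with oracledb.connect(user="report_ro", password="***", dsn="dbhost/orcl") as conn:
    with conn.cursor() as cur:
        cur.execute(SQL, {"owner": "APPSCHEMA"})   # placeholder schema name
        for child, parent, name, status in cur:
            print(f"{child} -> {parent}  ({name}, {status})")
```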
As you're already using TOAD you are right to want the TOAD modelling extension; it can be bought as a standalone purchase. But if your company won't spring for the extra licenses, you should check out Oracle SQL Developer Data Modeler. It's free and it has the most comprehensive support for idiomatic Oracle (I'm not saying it's the best data modelling tool of them all, but it's very good for something which is free). Find out more.
How do I load Oracle table data into a Kafka topic? I did some research and learned that I should use a CDC tool, but all the CDC tools seem to be paid products. Can anyone suggest how to achieve this?
You'll find this article useful: No More Silos: How to Integrate your Databases with Apache Kafka and CDC
It details all of your options and currently-available tools. In short, you can do bulk (or query-based CDC) with the Kafka Connect JDBC Connector, or you can use a log-based CDC approach with one of several CDC tools that support Oracle as a source, including Attunity, GoldenGate, SQ Data, and IBM's IIDR.
You'll generally find that if you've paid for your database (e.g. Oracle, DB2, etc) you're going to have to pay for a log-based CDC tool. Open source CDC tools are available for open source databases. For example, Debezium is open source and works great with MongoDB, MySQL, and PostgreSQL.
You might be interested in the Debezium project, which provides open-source CDC connectors for a variety of databases. Amongst others, we provide one for Oracle DB. Note that this connector currently is based on the XStream API of Oracle, which itself requires a separate license, but we hope to add a fully free alternative soon.
Disclaimer: I'm the lead of Debezium
Please refer to the Kafka Connect JDBC source connector. The link is below:
https://docs.confluent.io/current/connect/connect-jdbc/docs/index.html
You don't need a Change Data Capture (CDC) tool in order to load data from an Oracle table into a Kafka topic.
You can use Confluent's Kafka Connect JDBC Source Connector to load the data (a minimal registration sketch follows the list of tools below).
However, if you need to capture deletes and updates you must use a CDC tool, for which you need to pay a licence. Confluent has certified the following CDC tools (source connectors):
Attunity
Dbvisit
Striim
Oracle GoldenGate
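For reference, here is a minimal sketch of registering the JDBC Source Connector against an Oracle table through the Kafka Connect REST API. The Connect host, credentials, table and column names are assumptions; incrementing mode picks up new rows but, as noted above, it will not capture deletes.

```python
# Sketch: register a Kafka Connect JDBC source connector for an Oracle table
# via the Connect REST API. Host names, credentials, table and column names
# are assumptions. Incrementing mode captures new rows; it does not see deletes.
import json
import requests

CONNECT_URL = "http://connect-host:8083/connectors"   # assumed Connect worker

config = {
    "name": "oracle-orders-source",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1",
        "connection.user": "kafka_reader",
        "connection.password": "***",
        "table.whitelist": "ORDERS",              # assumed source table
        "mode": "incrementing",
        "incrementing.column.name": "ORDER_ID",   # assumed ever-increasing key
        "topic.prefix": "oracle-",                # rows land on topic "oracle-ORDERS"
        "poll.interval.ms": "10000",
    },
}

resp = requests.post(CONNECT_URL, data=json.dumps(config),
                     headers={"Content-Type": "application/json"})
resp.raise_for_status()
print(resp.json())
```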
As others have mentioned, CDC requires paid products. If you'd just like to try something out, Striim is available for free for the first 30 days.
https://www.striim.com/instant-download/
The 'free' options include JDBC, but you would be introducing a significant load on your database if you actually wanted to use triggers to capture changes.
Disclaimer: I work at Striim.
There's a custom Kafka source connector for Oracle database which is based on logminer here:
https://github.com/erdemcer/kafka-connect-oracle
This project is in development.
You might be interested in OpenLogReplicator. It is an open source GPL-licensed tool written completely in C++. It reads binary format of Oracle Redo logs and sends them to Kafka.
It is very fast - you can achieve low latency without much effort, since it operates fully in memory. It supports all Oracle database versions since 11.2.0.1 and requires no additional licensing.
It can work on the database host, but you can also configure it to read the redo logs using sshfs from another host, with minimal load on the database.
disclaimer: I am the author of this solution
We have two divisions in our company: one uses E1 on Oracle 11g, the other uses SAP on Oracle 11g.
We also have a SQL Server system we use to warehouse data from both systems once a night to run our report server against.
The question I have is: for pooled tables in SAP, such as A016, how would I get that information out of SAP?
Currently we have SSIS packages set up with linked servers to the two Oracle servers, which pull the data we need; I just don't have the SAP knowledge to find the pooled tables.
If I can't pull the pooled tables because they don't physically exist, is there a tool I can use in SAP to find out which tables the pooled table gets its information from? That way I can rebuild that table in SQL Server using an OPENQUERY and some fun joins.
Thanks
You have to access those tables using the application server. They can't be accessed directly from the database.
You'll probably want to write an ABAP program to extract the data you need and go from there.
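As an alternative to a custom ABAP extract, pooled tables such as A016 can also be read through the application server with the standard RFC_READ_TABLE function module. Below is a minimal sketch using the PyRFC library; the connection parameters and the WHERE clause are placeholders, and RFC_READ_TABLE truncates rows at 512 bytes, so a small custom ABAP function module remains the more robust option for wide tables.

```python
# Sketch: read the pooled table A016 through the SAP application server with
# RFC_READ_TABLE via PyRFC. Connection parameters and the filter are placeholders.
from pyrfc import Connection

conn = Connection(
    ashost="sap-app-host", sysnr="00", client="100",
    user="rfc_user", passwd="***",
)

result = conn.call(
    "RFC_READ_TABLE",
    QUERY_TABLE="A016",
    DELIMITER="|",
    ROWCOUNT=100,                         # limit rows for a quick test
    OPTIONS=[{"TEXT": "KAPPL = 'M'"}],    # example WHERE clause, 72 chars per line
)

# FIELDS describes the columns; DATA holds delimited rows in the WA field.
field_names = [f["FIELDNAME"] for f in result["FIELDS"]]
for row in result["DATA"]:
    values = row["WA"].split("|")
    print(dict(zip(field_names, values)))
```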
I have 2 databases, Oracle and SQLite, and I want to create exact copies of some of the Oracle tables in SQLite in one of my applications. Most of these tables contain more than 10,000 rows, so copying each table by going through it row by row programmatically is not efficient. Also, the table structure may change in the future, so I want to achieve this in a generic way without hard-coding the SQL statements. Is there any way to do this?
P.S. This application is being developed using the Qt framework. All the queries and databases are represented by QtSql module objects.
Can't help with the Qt framework, but for large amounts of data it is usually better to use bulk-copy operations (see the generic batched-copy sketch after the links below).
Export data from Oracle
http://download.oracle.com/docs/cd/B25329_01/doc/admin.102/b25107/impexp.htm#BCEGAFAB
Import data into SQLite
http://www.sqlite.org/cvstrac/wiki?p=ImportingFiles
IHTH
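If the export/import route doesn't fit, the "generic, no hard-coded SQL" requirement can also be met by reading the column metadata at runtime and copying in batches. The sketch below uses Python's oracledb and sqlite3 modules purely to illustrate the pattern (the same idea maps onto QSqlQuery/QSqlRecord in QtSql); connection details and the table name are placeholders, and the type handling is deliberately simplified.

```python
# Generic, batched copy of one Oracle table into SQLite without hard-coded SQL.
# Column names are discovered from cursor metadata; everything is created as
# TEXT because SQLite is dynamically typed. LOB columns and exotic types would
# need extra handling. Connection details and the table name are placeholders.
import sqlite3
import oracledb

def copy_table(table_name: str, batch_size: int = 5000) -> None:
    src = oracledb.connect(user="app_ro", password="***", dsn="dbhost/orcl")
    dst = sqlite3.connect("local_copy.db")
    with src, dst:
        cur = src.cursor()
        cur.execute(f"SELECT * FROM {table_name}")
        cols = [d[0] for d in cur.description]        # column names from metadata

        dst.execute(f'DROP TABLE IF EXISTS "{table_name}"')
        dst.execute(
            'CREATE TABLE "{0}" ({1})'.format(
                table_name, ", ".join(f'"{c}" TEXT' for c in cols)
            )
        )

        placeholders = ", ".join("?" for _ in cols)
        insert = f'INSERT INTO "{table_name}" VALUES ({placeholders})'
        while True:
            rows = cur.fetchmany(batch_size)          # batched reads, not row by row
            if not rows:
                break
            dst.executemany(insert, rows)             # batched writes

copy_table("CUSTOMERS")   # hypothetical table name
```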
What you probably really want to use is the Oracle Database Mobile Server, which can automatically synchronize a SQLite and an Oracle Database.
The recent release of the Oracle Database Mobile Server (formerly called Oracle Database Lite Mobile Server) supports synchronization between an Oracle database and a SQLite or Berkeley DB database running on the client. It supports both synchronous and asynchronous data exchange, as well as secure communications between client and server. You can configure the Mobile Server to synchronize based on several options without the need to modify the application that is accessing the database.
You can also find an excellent discussion forum for questions from developers and implementers using the Mobile Server.