Import Neo4j data into Oracle relational database

I would like to keep the Neo4j database as the master database, and I would like to keep the Oracle relational tables synchronized with the Neo4j data, as a read-only materialized view.
I can only find links and articles explaining how to import relational data into Neo4j, not the other way around.
Conceptually, I am looking for a kind of materialized view in Oracle that uses a Cypher query as its source. Maybe I could write a custom merge program, mapping Oracle tables to Cypher queries. Ideally I would like to run this program in Oracle (PL/SQL).
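For illustration, here is a rough sketch of the kind of merge program I have in mind, assuming PL/SQL can reach Neo4j's transactional HTTP endpoint through UTL_HTTP (the host, credentials, endpoint path, Cypher query, and PERSON_MIRROR table are all placeholders, and the endpoint path differs between Neo4j versions):

DECLARE
  -- placeholders: host, credentials, database name, Cypher query, mirror table
  l_url   VARCHAR2(200)  := 'http://neo4j-host:7474/db/neo4j/tx/commit';
  l_json  VARCHAR2(4000) := '{"statements":[{"statement":"MATCH (p:Person) RETURN p.id AS id, p.name AS name"}]}';
  l_req   utl_http.req;
  l_resp  utl_http.resp;
  l_body  CLOB;
  l_chunk VARCHAR2(32767);
BEGIN
  -- POST the Cypher statement to the transactional endpoint
  l_req := utl_http.begin_request(l_url, 'POST', 'HTTP/1.1');
  utl_http.set_authentication(l_req, 'neo4j', 'secret');
  utl_http.set_header(l_req, 'Content-Type', 'application/json');
  utl_http.set_header(l_req, 'Content-Length', length(l_json));
  utl_http.write_text(l_req, l_json);

  -- collect the JSON response
  l_resp := utl_http.get_response(l_req);
  BEGIN
    LOOP
      utl_http.read_text(l_resp, l_chunk, 32767);
      l_body := l_body || l_chunk;
    END LOOP;
  EXCEPTION
    WHEN utl_http.end_of_body THEN
      utl_http.end_response(l_resp);
  END;

  -- parse l_body (e.g. with JSON_TABLE on 12c+) and upsert into the
  -- mirror table, roughly:
  --   MERGE INTO person_mirror t
  --   USING (...rows parsed from l_body...) s ON (t.id = s.id)
  --   WHEN MATCHED THEN UPDATE SET t.name = s.name
  --   WHEN NOT MATCHED THEN INSERT (id, name) VALUES (s.id, s.name);
END;
/

A DBMS_SCHEDULER job could then run this periodically, much like a materialized view refresh.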
Thanks in advance,

Related

Generate logical model from Oracle database

I know I can generate a SQL script from a logical model in Data Modeler; can I do the opposite?
I created the database from Django and migrated it to Oracle, so generating a logical model would be really useful.
Yes.
In Data Modeler, File - Import
Choose either your SQL file(s) or use a database connection to import a data dictionary from the database.
This will give you a relational model.
Then you can use the Engineer to Logical Model button to get your logical design and ERD.

Neo4j: how to import data from Oracle

I have 5 tables: 3 for nodes and 2 for relationships between them (relationship = child). How can I transfer the data from Oracle to Neo4j?
The Neo4j site has an entire documentation section on moving data from relational databases to Neo4j. There are a bunch of different possibilities.
The simplest way, though, is to use your chosen database tools to export your tables to CSV format, and then to use Cypher's LOAD CSV command to pull the data in.
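For the Oracle side, a minimal sketch using SQL*Plus (12.2 or later, where SET MARKUP CSV is available; the table and column names are placeholders) to produce a file that LOAD CSV can then read:

REM spool one node table to CSV; repeat per table
SET MARKUP CSV ON
SET FEEDBACK OFF
SPOOL persons.csv
SELECT person_id, name, birth_date FROM persons;
SPOOL OFF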
The data can't be transferred directly, though: your tables represent entities and the relationships between them, and moving the data to Neo4j requires you to consider what you want your graph data model to look like.
Because the flexibility and power Neo4j gives you will ultimately have a lot to do with how you model your data, give this careful consideration before you dump the CSV and try to import it into Neo4j.

Oracle to Apache Cassandra data migration

I am working on an Apache Cassandra data migration. I have a couple of tables which I need to move, with their data, to Cassandra column families - what is the best way to do this?
I have seen Apache Sqoop; will it help me? If yes, then what are the steps?
There is no silver bullet for migrating data from Oracle (or any RDBMS) to Cassandra. The way your data is modeled in Cassandra is fundamentally different from a relational database schema. Tools might help you to some degree, but you'll first have to create a new data model that matches the way you're going to read and write data in Cassandra. This article gives you a good start with Cassandra data modeling: http://www.datastax.com/dev/blog/basic-rules-of-cassandra-data-modeling

Index check in OpenEdge 10.2B which uses an Oracle schema

How can I find the index usage of a particular module in OpenEdge 10.2B which uses an Oracle DB schema?
I have used XREF, but the .xrf file does not give any index details for my module, so I ran the simple query below and then checked the .xrf file again, but no index details were available.
FOR EACH tablename NO-LOCK USE-INDEX indexname:
    DISPLAY tablename.field.
END.
Please help me understand how to get index details for a Progress DB that uses an Oracle schema.
First, I assume you are using the Oracle DataServer from Progress.
If that is the case, bear in mind that USE-INDEX is basically translated into an ORDER BY in the resulting query, so it is mostly used to order the data, not to access it.
If you want to know how your information is accessed, you'll need to enable qt_debug when connecting to the schema holder. That will let you print a lot of information about how your Progress code is translated into the SQL that accesses the Oracle DB. You'll then need to analyze that SQL (with EXPLAIN PLAN, for example) to see how your queries perform and how they access the DB.
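For example, a minimal sketch on the Oracle side, assuming you have captured one of the generated statements from the qt_debug output (the query below is a placeholder):

EXPLAIN PLAN FOR
SELECT *
FROM owner.tablename
ORDER BY indexed_column;

SELECT * FROM TABLE(dbms_xplan.display);

The plan output will show whether an index is actually used for access, or whether Oracle performs a full scan followed by a sort.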

Migrating data between 2 databases in Oracle 9i

I am new to Oracle. Since we have rewritten an earlier application, we have to migrate the data from the earlier Oracle 9i database to a new database, also in 9i, with a totally different structure. The column names and types are totally different. We need to map the tables and columns, try to export as much data as possible, eliminate duplicates, and fill empty values with defaults.
Are there any tools which can help in mapping the elements of the 2 databases, with rules to handle duplicates and default values, and migrate the data?
Thanks,
Chak.
If your goal is to migrate data between two very different schemas, you will probably need an ETL solution (ETL = Extract, Transform, Load).
An ETL tool will allow you to:
select data from your source database(s) [Extract]
apply business logic to the selected data [Transform] (deal with duplicates, apply default values, map source tables/columns to destination tables/columns...)
insert the data into the new database [Load]
Most ETL tools also allow some kind of automation and reporting of the loads (bad/discarded rows...).
Oracle's ETL tool is called Oracle Warehouse Builder (OWB). It is included in the database licence, and you can download it from the Oracle website. As with most Oracle products, it is powerful, but the learning curve is a bit steep.
You may want to look into the [ETL] tag here on SO, among others:
What ETL tool do you use?
ETL tools… what do they do exactly? In laymans terms please.
In many cases, creating a database link and some scripts à la

insert into newtable
select distinct foo, bar, 'defaultvalue'
from oldtable@olddatabase
where xxx;

should do the trick.
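For completeness, a sketch of creating such a link, assuming a TNS alias for the old instance (the names and password are placeholders):

-- run in the new database; OLDDB is a TNS alias for the old 9i instance
CREATE DATABASE LINK olddatabase
CONNECT TO olduser IDENTIFIED BY oldpassword
USING 'OLDDB';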
